OpenAI Said to be Considering Developing its Own AI Chips



Introduction:

OpenAI, one of the most heavily financed companies in the AI industry, is exploring the possibility of developing its own AI hardware chips. According to Reuters, the company has been debating AI chip options since at least last year, as the scarcity of chips for training AI models has intensified. OpenAI is reportedly examining a number of options to further its chip ambitions, including acquiring an AI chip maker or starting an internal chip design effort.

According to Reuters, OpenAI CEO Sam Altman has made acquiring more AI chips a top priority for the company. Currently, OpenAI, like most of its rivals, trains models such as ChatGPT, GPT-4, and DALL-E 3 on GPU-based hardware. GPUs are well suited to training today's most advanced AI because of their capacity to perform vast numbers of computations in parallel.
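As a rough illustration of that parallelism: one dense layer of a neural network is essentially a single large matrix multiplication, in which every output element is an independent multiply-accumulate that can run simultaneously. The minimal NumPy sketch below runs on the CPU, but it shows the same data-parallel pattern that GPUs accelerate (the matrix sizes are illustrative, not drawn from any real model):

```python
import numpy as np

# One dense layer of a neural network is a single matrix multiply:
# every element of the output is an independent dot product, which is
# why GPUs, with thousands of cores, can compute them all at once.
rng = np.random.default_rng(0)
weights = rng.standard_normal((512, 256))   # layer weights (illustrative sizes)
batch = rng.standard_normal((64, 512))      # a batch of 64 input vectors

activations = batch @ weights               # 64 * 256 independent dot products

print(activations.shape)  # (64, 256)
```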

The generative AI boom, lucrative for GPU manufacturers like Nvidia, has put severe pressure on the GPU supply chain. Microsoft warned in a summer earnings report that the server hardware required to run AI is in such short supply that service interruptions could result. And Nvidia's best-performing AI chips are reportedly sold out until 2024.

GPUs are also necessary for running and serving OpenAI's models; the company handles customer workloads on clusters of GPUs in the cloud. However, they are costly.

According to an analysis by Bernstein analyst Stacy Rasgon, if ChatGPT queries grew to a tenth the volume of Google Search, OpenAI would initially need around $48.1 billion worth of GPUs and about $16 billion in chips annually to keep the service running.
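To put those figures in perspective, here is a back-of-the-envelope sketch using the article's numbers plus one outside assumption: Google is commonly estimated to handle roughly 8.5 billion searches per day (that figure is not from the article):

```python
# Back-of-the-envelope chip cost per query, based on Rasgon's estimates.
GOOGLE_SEARCHES_PER_DAY = 8.5e9  # assumption, not from the article
SCALE_FACTOR = 0.1               # "a tenth the volume of Google Search"
ANNUAL_CHIP_SPEND = 16e9         # ~$16B in chips per year (Rasgon)

annual_queries = GOOGLE_SEARCHES_PER_DAY * 365 * SCALE_FACTOR
cost_per_query = ANNUAL_CHIP_SPEND / annual_queries

print(f"~{annual_queries:.2e} queries/year")                    # ~3.10e+11
print(f"~${cost_per_query:.3f}/query in chip spend alone")      # ~$0.052
```

Even under these rough assumptions, chip spending alone works out to several cents per query, which helps explain why a custom chip could be attractive.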


OpenAI wouldn't be the first to develop its own AI processors. Google uses a processor called the TPU (short for "tensor processing unit") to train massive generative AI systems like PaLM-2 and Imagen. Amazon offers AWS customers its own chips for both training (Trainium) and inference (Inferentia). And Microsoft is reportedly working with AMD on an internal AI processor called Athena, which OpenAI is said to be evaluating.

[Image: OpenAI Said to be Considering Developing its Own AI Chips. Source: Techcrunch.com]

OpenAI has significant financial resources and is contemplating a share sale that, per a recent Wall Street Journal report, could value the company at $90 billion on the secondary market. With more than $11 billion in venture capital funding and annual revenue approaching $1 billion, the company stands on a robust foundation for such an endeavor.

But the hardware industry, especially the AI chip industry, is unforgiving. AI chipmaker Graphcore reportedly saw its valuation cut by $1 billion after a deal with Microsoft fell through, and the company announced job cuts last year in response to the "extremely challenging" macroeconomic environment. (The situation grew more serious as Graphcore reported falling revenue and mounting losses over the past six months.)

Meanwhile, Habana Labs, the Intel-owned AI chip maker, laid off about 10% of its workforce. And Meta's efforts to develop a custom AI chip have been plagued by problems, leading the company to scrap some of its experimental hardware.


Even if OpenAI commits to developing a custom chip, the effort could take years and cost hundreds of millions of dollars annually. It remains to be seen whether the startup's backers, Microsoft among them, have the appetite for such a risky bet.


Sai Sandhya