Friday, November 22, 2024
Technology

OpenAI said to be considering developing its own AI chips

OpenAI, one of the best-funded AI startups in the business, is reportedly exploring making its own AI chips.

Discussions of AI chip strategies within the company have been ongoing since at least last year, according to Reuters, as the shortage of chips to train AI models worsens. OpenAI is reportedly considering a number of strategies to advance its chip ambitions, including acquiring an AI chip manufacturer or mounting an effort to design chips internally.

OpenAI CEO Sam Altman has made the acquisition of more AI chips a top priority for the company, Reuters reports.

Currently, OpenAI, like most of its competitors, relies on GPU-based hardware to develop models such as ChatGPT, GPT-4 and DALL-E 3. GPUs’ ability to perform many computations in parallel makes them well suited to training today’s most capable AI.

But the generative AI boom — a windfall for GPU makers like Nvidia — has massively strained the GPU supply chain. Microsoft is facing a shortage of the server hardware needed to run AI so severe that it might lead to service disruptions, the company warned in a summer earnings report. And Nvidia’s best-performing AI chips are reportedly sold out until 2024.

GPUs are also essential for running and serving OpenAI’s models; the company relies on clusters of GPUs in the cloud to perform customers’ workloads. But they come at a sky-high cost.

An analysis from Bernstein analyst Stacy Rasgon found that if ChatGPT queries grew to a tenth the scale of Google Search, it’d require roughly $48.1 billion worth of GPUs initially and about $16 billion worth of chips a year to keep operational.
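The article doesn’t show how an estimate like Rasgon’s is built, but this kind of figure typically follows a familiar back-of-envelope shape: query volume × GPU-time per query × cost per GPU, plus an ongoing spend to keep the fleet running. The short Python sketch below illustrates only that structure. Every input in it (daily query volume, peak-to-average ratio, GPU-seconds per query, unit cost, annual replacement fraction) is a hypothetical placeholder of mine, not Bernstein’s actual assumptions, so its outputs are illustrative rather than a reconstruction of the analysis.

```python
# Back-of-envelope GPU cost model for serving a ChatGPT-scale workload.
# Illustrative only: all inputs are hypothetical placeholders, not the
# assumptions behind the Bernstein analysis cited above.

GOOGLE_QUERIES_PER_DAY = 8.5e9      # rough public estimate of daily Google searches
TARGET_SHARE = 0.10                 # "a tenth the scale of Google Search"
PEAK_TO_AVERAGE = 2.0               # assumed headroom for peak traffic (hypothetical)
GPU_SECONDS_PER_QUERY = 120         # assumed GPU-time to generate one response (hypothetical)
GPU_UNIT_COST_USD = 20_000          # assumed cost per data-center GPU (hypothetical)
ANNUAL_SPEND_FRACTION = 0.33        # assumed yearly chip spend as a share of initial fleet cost

# Average and peak queries per second at the target scale.
avg_qps = GOOGLE_QUERIES_PER_DAY * TARGET_SHARE / 86_400
peak_qps = avg_qps * PEAK_TO_AVERAGE

# GPUs needed to sustain peak load, and the resulting capital/ongoing spend.
gpus_needed = peak_qps * GPU_SECONDS_PER_QUERY
initial_capex = gpus_needed * GPU_UNIT_COST_USD
annual_chip_spend = initial_capex * ANNUAL_SPEND_FRACTION

print(f"GPUs needed:        {gpus_needed:,.0f}")
print(f"Initial GPU capex:  ${initial_capex / 1e9:,.1f}B")
print(f"Annual chip spend:  ${annual_chip_spend / 1e9:,.1f}B")
```

With these placeholder inputs the sketch lands in the tens of billions of dollars, which is the point: at that query volume, even modest per-query GPU time multiplies into an enormous hardware bill.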

OpenAI wouldn’t be the first to pursue creating its own AI chips.

Google has a processor, the TPU (short for “tensor processing unit”), to train large generative AI systems like PaLM-2 and Imagen. Amazon offers proprietary chips to AWS customers for both training (Trainium) and inference (Inferentia). And Microsoft is reportedly working with AMD to develop an in-house AI chip called Athena, which OpenAI is said to be testing.

Certainly, OpenAI is in a strong position to invest heavily in R&D. The company, which has raised more than $11 billion in venture capital, is nearing $1 billion in annual revenue. And it’s considering a share sale that could see its secondary-market valuation soar to $90 billion, according to a recent Wall Street Journal report.

But hardware is an unforgiving business — particularly AI chips.

Last year, AI chipmaker Graphcore, which allegedly had its valuation slashed by $1 billion after a deal with Microsoft fell through, said that it was planning job cuts due to the “extremely challenging” macroeconomic environment. (The situation grew more dire over the past few months as Graphcore reported falling revenue and increased losses.) Meanwhile, Habana Labs, the Intel-owned AI chip company, laid off an estimated 10% of its workforce. And Meta’s custom AI chip efforts have been beset with issues, leading the company to scrap some of its experimental hardware.

Even if OpenAI commits to bringing a custom chip to market, such an effort could take years and cost hundreds of millions of dollars annually. It remains to be seen if the startup’s investors, one of which is Microsoft, have the appetite for such a risky bet.

