The company behind ChatGPT is weighing alternatives to Nvidia's accelerators, possibly by developing its own.
It goes without saying that Nvidia dominates the market for the hardware needed to train large language models for AI applications. However, with demand far outstripping supply, prices have soared and the wait to buy the chips rivals the line at the local DMV. According to reports, OpenAI, the company behind ChatGPT, is now considering making its own chips as an alternative to Nvidia's hardware.
A Reuters report describes the strategies OpenAI is weighing to break the hardware deadlock gripping the AI industry. One option is to chart its own course without relying on Nvidia's hardware, likely by acquiring one of Nvidia's rivals so it can eventually produce its own processors. According to the report, OpenAI has only gotten as far as evaluating acquisition targets. Besides building its own processors, it is also exploring working more closely with chipmakers, including Nvidia, or diversifying its chip suppliers beyond Nvidia.
According to the report, OpenAI CEO Sam Altman has made acquiring more AI chips the company's top priority. That is easier said than done. According to TSMC, capacity constraints at its manufacturing facilities mean Nvidia will be unable to produce enough H100 AI chips to meet demand for another year and a half. OpenAI reportedly used 10,000 Nvidia GPUs to train ChatGPT, and it wants to scale further but is struggling with short supply and "eye-watering" costs.
Reuters estimates that scaling OpenAI to handle one-tenth of Google's query traffic would require roughly $48 billion worth of GPUs up front, plus about $16 billion in annual spending just to keep up with demand. The company, and the industry as a whole, is facing an existential problem. Meanwhile, the party that benefits from the situation is Nvidia, which reportedly makes margins of up to 1,000% on each H100 it sells.
The route OpenAI is considering isn't novel, either: companies like Amazon, Google, and Meta already use custom chips built to meet their specific needs, and Microsoft, OpenAI's largest partner, is reportedly developing its own silicon. Unfortunately, it would take OpenAI years to design, produce, and deploy new silicon, so it would remain dependent on Nvidia and the broader supply crunch in the meantime.