The GenAI boom has exposed just how reliant the big tech companies have become on Nvidia, the leading global manufacturer of high-end graphics processing units (GPUs). Surging demand for specialized AI chips has deepened that dependency. Now, however, the competitive landscape is set to be disrupted.
Big tech companies such as Amazon, Microsoft, Google, and Meta are increasingly seeking to reduce their dependence on Nvidia by bringing their own AI chips to the table.
In September 2023, Amazon announced an investment of $4B in Anthropic to accelerate the development of Anthropic’s future foundation models and make them widely accessible to AWS customers.
One of the major reasons for this massive investment is that Anthropic has reportedly agreed to build its AI solutions using AI chips designed by Amazon. This move is a clear indicator that Amazon is staking its claim in the escalating AI arms race.
Facebook owner Meta Platforms also plans to develop a custom chip to support its artificial intelligence (AI) push. The social media giant has been striving to boost its computing power for power-hungry GenAI products.
As the company pushes more AI into its social media platforms, it will need many more specialized chips to support that growth. With a successful deployment of its custom chip, Meta could save hundreds of millions of dollars in AI chip purchases and energy costs.
Other major tech players are also developing in-house computing power. Microsoft has announced its first custom AI chip to train models. The chip design team at Google is using DeepMind AI running on Google servers to design AI processors.
According to Bloomberg, Sam Altman, CEO of OpenAI, is pushing to raise billions for a network of AI chip factories. Altman has been in touch with global investors and unnamed top chip manufacturers for the fabrication of AI chips.
Nvidia established its dominant position in the AI chip market (estimated to reach $140B by 2027, according to Gartner) by mastering the design and manufacture of the specialized chips that chatbots and other AI systems require. While the big tech players have invested heavily in Nvidia hardware, the chipmaker has not kept up with demand, which is a primary reason the major tech companies are now building AI chips of their own.
Some of the most popular AI chips, such as the Nvidia H100, have become highly sought after and extremely expensive. Nvidia not only faces stiff challenges from the big tech companies; it is also under pressure to build more powerful and efficient AI chips to fend off industry rivals AMD and Intel.
To extend its dominance, Nvidia has announced its next-generation GH200 Grace Hopper chips. According to Nvidia, the new chip will have three times the memory capacity of the popular H100 GPU.
Interestingly, some of the big tech companies that are developing their own AI chips are also some of the most important strategic partners to Nvidia. While these companies will continue to power most of their AI systems using Nvidia chips, the goal is to reduce the dependency. It is going to be a challenging balancing act for all the stakeholders.
Nvidia is having its AI moment, with its stock price up almost 30 percent year-to-date at the end of January. Last year, Nvidia's market value soared past $1 trillion, an all-time high. However, with its major buyers building their own AI chips, the chipmaker's long-term revenue growth faces some uncertainty. As demand for GenAI use cases continues to grow, AI chips are set to be the next big battleground for AI supremacy.