Nvidia dominates AI compute and sets the prices. The big AI companies are developing alternative strategies.
Following in the footsteps of Google and Amazon, Microsoft is now looking to develop its own AI chips, reports The Information. The chip, code-named "Athena," could be unveiled at Microsoft's Ignite developer conference in Seattle from November 14 to 17. It has reportedly been in development since 2019.
AI chip for cloud data centers
The Athena chip has already been shown to select groups at Microsoft and OpenAI, according to The Information's source. It is designed to be competitive with Nvidia's H100 GPU, which would put Microsoft on a similar footing to Google and Amazon.
Microsoft is also considering offering the chip as a cloud solution for AI development teams. This would be a challenge because most developers are trained on Nvidia's CUDA platform, and the major machine learning frameworks are optimized for Nvidia hardware.
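To illustrate that lock-in, here is a minimal PyTorch sketch (a common framework choice used here purely as an example, not something named in the report) of the kind of CUDA-centric code found throughout existing training and inference pipelines. The hard-coded "cuda" backend is Nvidia-specific; any alternative accelerator needs its own framework backend before code like this can run on it.

```python
import torch
import torch.nn as nn

# Typical framework code targets Nvidia's CUDA backend explicitly.
# A competing chip has to ship its own backend (and get developers
# to change lines like these) before it can run this workload.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Linear(4096, 4096).to(device)  # weights placed on the Nvidia GPU, if present
x = torch.randn(8, 4096, device=device)   # input tensor allocated on the same device
y = model(x)                               # the matrix multiply runs through CUDA kernels
print(y.shape, y.device)
```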
Currently, Microsoft's AI data centers use Nvidia graphics cards to train large language models and serve them to customers. Google, Amazon, and OpenAI also rely on Nvidia GPUs.
Big AI is fighting back against Nvidia's dominance
Microsoft is also working closely with Advanced Micro Devices (AMD) on AMD's next AI chip, the MI300X, according to The Information's source.
AMD's chip, designed specifically for AI applications, is set to be released in the fourth quarter. According to AMD CEO Lisa Su, the company is already working with major cloud providers, large enterprises, and many leading AI companies.
In any case, the major AI companies appear to be trying to reduce Nvidia's dominance in the AI chip sector, or at least to put themselves in a better negotiating position. Since the ChatGPT hype began, Nvidia's sales have skyrocketed and the industry has suffered from persistent supply shortages.
Google, for example, is pushing its TPUs (Tensor Processing Units) and signing exclusive deals with AI companies such as Midjourney and Character AI to attract more developers to its chip infrastructure and secure high-value real-world use cases.
Amazon is investing in Anthropic, OpenAI's largest competitor. Anthropic will use Amazon's Trainium and Inferentia chips for training and inference for its Claude chatbot. OpenAI, too, is considering developing its own AI chips.