Update:
According to the Financial Times, Altman is in talks with Middle Eastern investors, including Sheikh Tahnoon bin Zayed al-Nahyan of the United Arab Emirates.
The goal is to develop chips needed for training and running AI models, and to reduce OpenAI's dependence on Nvidia. Altman is also in talks with Taiwan Semiconductor Manufacturing Co (TSMC) about a chip manufacturing partnership.
It is unclear whether the chip venture will operate as a subsidiary of OpenAI or as a separate entity. However, OpenAI is expected to be the main customer of the new company.
According to the FT, OpenAI is working on a new AI model that will be a "major upgrade" to GPT-4 and is expected to be released later this year. At the World Economic Forum in Davos, Altman described GPT-4 as a "preview" of future developments. This also means that OpenAI's demand for AI chips will continue to grow.
Original article:
OpenAI's Sam Altman seeks billions to create a global AI chip factory network
According to Bloomberg, OpenAI CEO Sam Altman wants to build a global network of AI chip factories.
OpenAI CEO Sam Altman is planning to raise billions of dollars for a chip company to build a global network of semiconductor fabs, according to Bloomberg. The publication cites several people with knowledge of the matter.
Altman is in talks with various potential large investors, including Abu Dhabi-based G42 and SoftBank Group Corp. The project would partner with leading chipmakers.
Not enough chips for an AGI world
Altman is reportedly concerned that as AI becomes more widespread, there will not be enough chips available for large-scale deployment. Some current forecasts for AI chip production fall short of projected demand.
Altman believes the industry needs to act now to ensure there is sufficient supply by the end of the decade. But building a global network of chip factories would require significant investment and take years.
Building and maintaining semiconductor fabs is far more capital-intensive than the fabless approach taken by other companies in the industry. A single state-of-the-art fab can cost tens of billions of dollars to build.
Companies like Amazon, Google, and Microsoft instead focus on designing their own custom chips and outsourcing manufacturing to external foundries.
Microsoft unveiled its first AI chips at the end of November 2023, Meta before that in the spring of 2023. Google (TPU) and Amazon (Trainium) have been developing their custom AI chips for years.
Nvidia is the biggest winner of the AI hype so far
Google, Amazon, Meta, OpenAI, and Microsoft are all using Nvidia GPUs to train AI and deploy models to customers. Meta alone plans to install 340,000 Nvidia H100 GPUs in its server farms by the end of the year.
As a result, Nvidia currently dominates the market for AI computing power and sets the prices, driving extreme revenue growth. Chip startups like Graphcore find it difficult to compete because Nvidia's dominance rests on the tight interplay between its software and hardware, so it is not enough to build a faster chip, which would be hard enough on its own.
The major AI players are unhappy with this development and are working on alternative solutions. Microsoft, for example, is working closely with Advanced Micro Devices (AMD) on AMD's MI300X AI chip, which has been available since early December 2023.
Altman is also an investor in Rain AI, which is developing a neuromorphic processing unit (NPU) that mimics functions of the human brain and promises significantly higher processing power and energy efficiency than today's GPUs. OpenAI will purchase $51 million worth of AI chips from Rain AI, a tiny fraction of the total amount being spent on AI chips.