SoftBank invests nearly one billion US dollars in compute for generative AI models
SoftBank is investing $960 million to expand its computing power by 2025, with the goal of developing world-leading generative AI models, reports Nikkei Asia. The Japanese telecommunications company is buying Nvidia GPUs and plans to complete a large language model (LLM) with 390 billion parameters this year. SoftBank first aims to train a Japanese-language model with top global performance, followed next year by a state-of-the-art model with more than one trillion parameters. The investment gives SoftBank the largest AI processing capacity in Japan, which it also plans to make available to other companies. In addition, SoftBank is building one of Japan's largest AI data centers in Hokkaido to offer new services and to ensure the data sovereignty required by the government. OpenAI recently released a version of GPT-4 optimized for the Japanese language.