SoftBank is investing $960 million by 2025 to expand its computing power, with the goal of developing world-leading generative AI models, reports Nikkei Asia. The Japanese telecommunications company is buying Nvidia GPUs and plans to complete a large language model (LLM) with 390 billion parameters this year.

SoftBank's first goal is a Japanese-language AI model with top global performance, to be followed by a state-of-the-art model with over a trillion parameters next year. The investment would give SoftBank the most processing power of any company in Japan, and it plans to make that capacity available to other companies as well.

In addition, SoftBank is building one of Japan's largest AI data centers in Hokkaido to offer new services and to ensure the data sovereignty required by the government. OpenAI recently released a version of GPT-4 optimized for the Japanese language.
