Short

SoftBank is investing $960 million through 2025 to expand its computing power, with the goal of developing world-leading generative AI models, reports Nikkei Asia. The Japanese telecommunications company is buying Nvidia GPUs and plans to complete a large language model (LLM) with 390 billion parameters this year. SoftBank first plans to train a Japanese-language AI model with top global performance, followed next year by a state-of-the-art model with over a trillion parameters. The investment makes SoftBank the leader in processing power in Japan, and it will make that capacity available to other companies. In addition, SoftBank is building one of Japan's largest AI data centers in Hokkaido to provide new services and ensure the data sovereignty required by the government. OpenAI recently released a version of GPT-4 optimized for the Japanese language.

Short

Google is consolidating its generative AI teams under the leadership of DeepMind to accelerate the development of more advanced models, building on earlier consolidation efforts such as last year's merger of DeepMind and Google Brain. All teams working on AI models are now under one roof, including the Responsible AI team, which will play a larger role in developing new models, Alphabet CEO Sundar Pichai wrote on Thursday. Google also wants to standardize the requirements for rolling out AI capabilities and invest more in testing to ensure that the models' responses are accurate and appropriate. The consolidation also extends to AI hardware development: the Platforms and Ecosystems team and the Devices and Services team are being combined into a new group called Platforms and Devices.
