Nvidia, Amazon, and Microsoft could invest up to $60 billion in OpenAI

OpenAI's latest funding round might hit peak circularity. According to The Information, the AI company is in talks with Nvidia, Microsoft, and Amazon about investments totaling up to $60 billion. Nvidia could put in as much as $30 billion, Amazon more than $10 billion—possibly even north of $20 billion—and Microsoft less than $10 billion. On top of that, existing investor SoftBank could contribute up to $30 billion. If these deals go through, the funding round could reach the previously rumored $100 billion mark at a valuation of around $730 billion.

Critics will likely point out how circular these deals really are. Several potential investors, including Microsoft and Amazon, also sell servers and cloud services to OpenAI. That means a chunk of the investment money flows right back to the investors themselves. These arrangements keep the AI hype machine running even though the financial benefits of generative AI have yet to show up in what end users actually pay.

China greenlights 400,000 Nvidia H200 chip imports for tech giants, according to Reuters

China has authorized ByteDance, Alibaba, and Tencent to purchase Nvidia's H200 AI chips, Reuters reports, citing four people familiar with the matter. The three tech giants can import more than 400,000 H200 chips combined. Additional companies are on a waiting list for future approvals.

The approvals came during Nvidia CEO Jensen Huang's visit to China. Huang arrived in Shanghai last Friday and has since traveled to Beijing and other cities. The Chinese government is attaching conditions to the approvals, though the details are still being finalized. A fifth source told Reuters that the licenses are too restrictive and customers aren't converting approvals into orders yet. Beijing has previously discussed requiring companies to buy a certain quota of domestic chips before they can import foreign semiconductors.

The H200 is Nvidia's second most powerful AI chip, delivering roughly six times the performance of the H20. Chinese companies have ordered more than two million H200 chips, according to Reuters, far more than Nvidia can deliver. Beijing had previously held off on allowing imports to support its domestic chip industry. The U.S. approved exports in early January.

Nvidia pours $2 billion into CoreWeave

Nvidia is investing $2 billion in cloud provider CoreWeave, buying shares at $87.20 each. The two companies are expanding their existing partnership to build AI data centers with more than 5 gigawatts of capacity by 2030.

As part of the deal, CoreWeave will deploy multiple generations of Nvidia hardware, including the Rubin platform, Vera processors, and BlueField storage systems. The partners also plan to integrate CoreWeave's software into Nvidia's reference architectures for cloud providers and enterprise customers.

CoreWeave went public in March 2025 and specializes in AI-optimized cloud services. The company is involved in expanding OpenAI's Stargate project, and OpenAI has also invested several billion dollars in CoreWeave.

China reportedly tightens Nvidia H200 restrictions, limits purchases to special cases

The AI race between the US and China enters a new phase: Washington loosens Nvidia export rules, but Beijing reportedly halts purchases. China wants to shield its chip industry and may require buyers to also purchase domestic chips.

UK startup turns planetary biodiversity into AI-generated drug candidates

UK company Basecamp Research has developed AI models together with researchers from Nvidia and Microsoft that generate potential new therapies against cancer and multidrug-resistant bacteria from a database of over one million species.

Global AI compute hits 15 million H100 equivalents, Epoch AI finds

Epoch AI has released a comprehensive database of AI chip sales showing that global computing capacity now exceeds 15 million H100 equivalents. This metric compares the performance of various chips to Nvidia's H100 processor. The data, published on January 8, 2026, reveals that Nvidia's new B300 chip now generates the majority of the company's AI revenue, while the older H100 has dropped below ten percent. The analysis covers chips from Nvidia, Google, Amazon, AMD, and Huawei.
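
For illustration, the metric can be read as a weighted chip count: each accelerator's throughput is divided by the H100's and multiplied by the number of units installed, then summed across the fleet. The sketch below shows that calculation in Python; the performance ratios and fleet sizes are made-up placeholders, not figures from Epoch AI's dataset.

```python
# Sketch of the "H100 equivalents" idea: normalize each accelerator's
# performance against an H100 baseline and sum over the installed fleet.
# All per-chip numbers below are illustrative placeholders.

H100_BASELINE = 1.0  # the H100 defines the unit of measurement

# hypothetical fleet: chip -> (relative performance vs. H100, units installed)
fleet = {
    "H100": (1.0, 4_000_000),
    "B300": (2.5, 1_500_000),    # placeholder ratio and count
    "TPU_v5": (0.8, 2_000_000),  # placeholder ratio and count
}

h100_equivalents = sum(perf / H100_BASELINE * count for perf, count in fleet.values())
print(f"{h100_equivalents / 1e6:.1f} million H100 equivalents")
```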

Epoch AI estimates this hardware collectively requires over 10 gigawatts of power, roughly twice what New York City consumes. The figures are based on financial reports and analyst estimates, since exact sales numbers are often not disclosed directly. The dataset is freely available and aims to bring transparency to computing capacity and energy consumption.
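
As a rough sanity check on the order of magnitude, the power figure follows from multiplying the fleet size by an average draw per chip. The sketch below assumes 700 watts per H100-equivalent, which is an illustrative value rather than a number from the report.

```python
# Back-of-the-envelope estimate: fleet size times an assumed average power
# draw per H100-equivalent. The wattage is an assumption for illustration.

chips_h100_equiv = 15_000_000   # ~15 million H100 equivalents (from the report)
avg_watts_per_chip = 700        # assumed average draw per H100-equivalent

total_gw = chips_h100_equiv * avg_watts_per_chip / 1e9
print(f"~{total_gw:.1f} GW")    # ~10.5 GW, consistent with the >10 GW estimate
```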