OpenAI is buying the health app Torch for around $100 million: $60 million upfront, with the rest in retention shares, The Information reports. Torch unifies scattered health records into what the founders call a "medical memory for AI" - "a context engine that helps you see the full picture, connect the dots, and make sure nothing important gets lost." The app runs on OpenAI models. All four employees, including CEO Ilya Abyzov, are joining OpenAI.
Apple will use Google's Gemini models for its AI features, including a revamped version of Siri. The multi-year partnership means Apple will rely on Google's Gemini and cloud technology for its upcoming products, according to CNBC. The new features are expected to roll out later this year.
The move comes as Apple continues to struggle with Siri's underlying architecture. Internal reports describe Siri as a technically fragmented system built from old rule-based components and newer generative models - a combination that makes updates difficult and leads to frequent errors. Apple is also working on an entirely new in-house LLM architecture and a model with roughly one trillion parameters, aiming to eventually break free from external providers. Google faced similar challenges early on keeping pace with OpenAI's rapid progress but managed to catch up.
Leading figures in China's AI industry are tempering expectations: China won't overtake the US in the AI race anytime soon. Justin Lin, head of Alibaba's Qwen model series, puts the odds of a Chinese company surpassing OpenAI or Anthropic in the next three to five years at less than 20 percent. Tang Jie from Zhipu AI warned at the AGI Next Summit in Beijing that the gap with the US may actually be widening, though recent open-source releases suggest otherwise.
At the conference, executives cited limited computing capacity and US export controls on advanced chips as the key hurdles. With US infrastructure one to two orders of magnitude larger, Chinese companies are forced to concentrate their limited resources on current projects.
Yao Shunyu, a former OpenAI researcher and now Tencent's AI chief scientist, was more optimistic. He cited three to five years as a realistic timeframe for China to catch up but said the lack of advanced chipmaking machines was the main technical hurdle.
OpenAI is bringing in the team behind Convogo, an AI startup that built software for evaluating executives, as part of its broader cloud strategy. Founder Matt Cooper announced the news on LinkedIn. Convogo's software used AI to automatically analyze interviews, surveys, and psychometric tests.
According to OpenAI (via TechCrunch), the acquisition is about the people, not the product. The three founders - Matt Cooper, Evan Cater, and Mike Gillett - will help drive OpenAI's AI cloud efforts. The deal was settled entirely in shares; the amount remains undisclosed. Convogo's software is being shut down.
The founding team's strong product focus likely made them attractive. Cooper writes that the key to closing the gap between AI's potential and its actual use lies in well-designed, purpose-driven applications, a "usage gap" narrative that Microsoft and OpenAI have both pushed before.
Last fall, OpenAI reportedly set aside a stock pool for employees worth about ten percent of the company. Based on the $500 billion valuation from October 2024, that comes to around $50 billion, according to The Information, citing two people familiar with the plans.
OpenAI has already issued about $80 billion in shares to employees. Combined with the new stock pool, employees now own about 26 percent of the company. Meanwhile, OpenAI is in early talks with investors about a new funding round at a valuation of roughly $750 billion.
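The 26 percent figure follows directly from the reported numbers: the new pool is about ten percent of the $500 billion valuation, i.e. roughly $50 billion, on top of the $80 billion already issued. A quick sanity check:

```python
# Sanity check on the reported employee-ownership figure.
valuation = 500e9            # October 2024 valuation, per The Information
new_pool = 0.10 * valuation  # reported ~10% stock pool -> $50 billion
already_issued = 80e9        # shares already issued to employees

employee_share = (new_pool + already_issued) / valuation
print(f"{employee_share:.0%}")  # -> 26%
```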
Epoch AI has released a comprehensive database of AI chip sales showing that global computing capacity now exceeds 15 million H100 equivalents. This metric compares the performance of various chips to Nvidia's H100 processor. The data, published on January 8, 2026, reveals that Nvidia's new B300 chip now generates the majority of the company's AI revenue, while the older H100 has dropped below ten percent. The analysis covers chips from Nvidia, Google, Amazon, AMD, and Huawei.
Epoch AI estimates this hardware collectively requires over 10 gigawatts of power - roughly twice what New York City consumes. The figures are based on financial reports and analyst estimates, since exact sales numbers are often not disclosed directly. The dataset is freely available and aims to bring transparency to computing capacity and energy consumption.
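The H100-equivalent metric is essentially a weighted sum: each chip's installed count is scaled by its performance relative to one H100, then everything is added up. A minimal sketch of the idea - the performance factors and unit counts below are made up for illustration and do not come from Epoch AI's dataset:

```python
# Illustrative sketch of the "H100 equivalents" normalization.
# All numbers here are hypothetical, not Epoch AI's estimates.

# assumed performance of each chip relative to one Nvidia H100
relative_performance = {
    "H100": 1.0,
    "B300": 4.0,     # assumed multiple, not an official figure
    "TPU_v5e": 0.4,
}

# hypothetical installed unit counts per chip type
units = {
    "H100": 1_000_000,
    "B300": 500_000,
    "TPU_v5e": 2_000_000,
}

# weighted sum: units of each chip scaled to H100-equivalent performance
h100_equivalents = sum(units[c] * relative_performance[c] for c in units)
print(f"{h100_equivalents:,.0f} H100 equivalents")  # -> 3,800,000 H100 equivalents
```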
In a glorious AI future, you'll order pizza directly from Excel. Microsoft and Stripe are teaming up to bring shopping to the AI assistant Copilot. US users will soon be able to buy products directly in the chat without ever leaving the app. At launch, the feature includes Etsy retailers and brands like Urban Outfitters and Anthropologie.