Anthropic's Claude Cowork was built in under two weeks using Claude Code to write the code

Anthropic has launched Claude Cowork, a new AI tool that builds on Claude Code but adds a user-friendly interface for non-programmers. According to Claude Code inventor Boris Cherny, "pretty much" all of its code was generated with Claude Code.

Claude Code inventor Boris Cherny says his tool wrote nearly all the code for Claude Cowork. | Screenshot via X

Product manager Felix Rieseberg says the app came together in just a sprint and a half, roughly one and a half weeks. The team had already built prototypes and explored ideas beforehand, though, and the current release is still a research preview with a few rough edges, Rieseberg says. Claude Code also provided an extensive foundation to build on, so Rieseberg is likely referring mainly to the front-end work.

OpenAI acquires Torch to build a "medical memory for AI"

OpenAI is buying health app Torch for around $100 million. The deal includes $60 million upfront and the rest in retention shares, The Information reports. Torch unifies scattered health records into what the founders call a "medical memory for AI", "a context engine that helps you see the full picture, connect the dots, and make sure nothing important gets lost." The app runs on OpenAI models. All four employees, including CEO Ilya Abyzov, are joining OpenAI.

The deal signals OpenAI's push toward a personalized health assistant in ChatGPT. Last week, the company launched a ChatGPT Health section and an offering for healthcare companies. Anthropic recently added health features to Claude as well. The moves reflect a shared bet on a massive market: hundreds of millions of weekly chatbot conversations already focus on health.

Apple turns to Google's Gemini as Siri's technical debt becomes too much to handle

Apple will use Google's Gemini models for its AI features, including a revamped version of Siri. The multi-year partnership means Apple will rely on Google's Gemini and cloud technology for its upcoming products, according to CNBC. The new features are expected to roll out later this year.

In a statement, Apple said that after careful evaluation, Google's technology offers the most capable foundation for its applications. Rumors about talks between the two tech giants first surfaced in March of last year. Later reports suggested the switch would cost Apple more than one billion dollars annually.

The move comes as Apple continues to struggle with Siri's underlying architecture. Internal reports describe Siri as a technically fragmented system built from old rule-based components and newer generative models - a combination that makes updates difficult and leads to frequent errors. Apple is also working on an entirely new in-house LLM architecture and a model with roughly one trillion parameters, aiming to eventually break free from external providers. Google faced similar challenges early on in keeping pace with OpenAI's rapid progress, but managed to catch up.

Source: CNBC
Chinese AI industry admits US remains ahead for now

Leading figures in China's AI industry are tempering expectations: China won't overtake the US in the AI race anytime soon. Justin Lin, head of Alibaba's Qwen model series, puts the odds of a Chinese company surpassing OpenAI or Anthropic in the next three to five years at less than 20 percent. Tang Jie from Zhipu AI warned at the AGI Next Summit in Beijing that the gap with the US may actually be widening, though recent open-source releases suggest otherwise.

At the conference, executives cited limited computing capacity and US export controls on advanced chips as key hurdles. US infrastructure is one to two orders of magnitude larger, forcing Chinese companies to concentrate their scarcer resources on current projects.

Yao Shunyu, a former OpenAI researcher and now Tencent's AI chief scientist, was more optimistic. He cited three to five years as a realistic timeframe for China to catch up but said the lack of advanced chipmaking machines was the main technical hurdle.

The cautious outlook follows a strong week on the stock market. Startups Zhipu AI and MiniMax Group together raised over one billion dollars in Hong Kong, with MiniMax shares doubling on their first day of trading.

Convogo's founders join OpenAI to close the gap between AI potential and actual use

OpenAI is bringing in the team behind Convogo, an AI startup that built software for evaluating executives, as part of its broader cloud strategy. Founder Matt Cooper announced the news on LinkedIn. Convogo's software used AI to automatically analyze interviews, surveys, and psychometric tests.

According to OpenAI (via TechCrunch), the acquisition is about the people, not the product. The three founders, Matt Cooper, Evan Cater, and Mike Gillett, will help drive OpenAI's AI cloud efforts. The deal was settled entirely in shares, though the amount remains undisclosed. Convogo's software is being shut down.

The founding team's strong product focus likely made them attractive. Cooper writes that the key to closing the gap between AI's potential and its actual use lies in well-designed, purpose-driven applications, a "usage gap" narrative that Microsoft and OpenAI have both pushed before.

The acquisition also fits OpenAI's strategy of controlling the entire value chain, from infrastructure to models to the end product. This push likely reflects how differentiating on model capabilities alone is getting harder as performance converges and cheaper open-source alternatives catch up.

OpenAI reportedly sets aside $50 billion for employee stock program

Last fall, OpenAI reportedly set aside a stock pool for employees worth about ten percent of the company. Based on the $500 billion valuation from October 2024, that comes to around $50 billion, according to The Information, citing two people familiar with the plans.

OpenAI has already issued about $80 billion in shares. Combined with the new stock pool, employees now own roughly 26 percent of the company. Meanwhile, OpenAI is in early talks with investors about a new funding round at a valuation of roughly $750 billion.

A previous analysis found that OpenAI pays its employees more than any tech startup in history, with stock-based compensation averaging about $1.5 million per employee. That level of spending complicates the path to profitability: the company is targeting around $20 billion in ARR, but on top of hefty payroll, development costs, and day-to-day operations, it faces about $1.4 trillion in data center commitments over the next eight years.

Global AI compute hits 15 million H100 equivalents, Epoch AI finds

Epoch AI has released a comprehensive database of AI chip sales showing that global computing capacity now exceeds 15 million H100 equivalents. This metric compares the performance of various chips to Nvidia's H100 processor. The data, published on January 8, 2026, reveals that Nvidia's new B300 chip now generates the majority of the company's AI revenue, while the older H100 has dropped below ten percent. The analysis covers chips from Nvidia, Google, Amazon, AMD, and Huawei.

Epoch AI estimates this hardware collectively requires over 10 gigawatts of power - roughly twice what New York City consumes. The figures are based on financial reports and analyst estimates, since exact sales numbers are often not disclosed directly. The dataset is freely available and aims to bring transparency to computing capacity and energy consumption.
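The "H100 equivalents" metric described above is simply a normalization: each chip's throughput is divided by the H100's, then weighted by the number of units deployed. A minimal sketch of that calculation, using entirely hypothetical chip names, throughput figures, and unit counts (not Epoch AI's actual data or methodology):

```python
# Reference throughput of one Nvidia H100 (placeholder value,
# not an official spec; units cancel out in the ratio anyway).
H100_FLOPS = 1.0e15

# Hypothetical fleet: (chip name, per-chip throughput, units deployed).
fleet = [
    ("H100",     1.0e15, 1_000_000),
    ("ChipA",    3.0e15, 2_000_000),  # 3x an H100 per unit
    ("ChipB",    0.5e15, 4_000_000),  # half an H100 per unit
]

def h100_equivalents(fleet, reference=H100_FLOPS):
    """Sum each chip's throughput expressed in H100 units."""
    return sum(flops / reference * count for _, flops, count in fleet)

print(f"{h100_equivalents(fleet):,.0f} H100 equivalents")
# This toy fleet totals 9,000,000 H100 equivalents:
# 1M + (3x * 2M) + (0.5x * 4M).
```

Because the metric is a ratio, faster chips count for more than one H100 each, which is why a fleet's H100-equivalent total can exceed its physical chip count.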