UK government taps Anthropic AI to help citizens find jobs

The British government has chosen Anthropic to develop an AI assistant for the GOV.UK website. The Department for Science, Innovation and Technology (DSIT) plans to use the system to help citizens navigate government services and receive personalized guidance. The initial focus will be on jobseekers - helping them with career advice, connecting them to training opportunities, and explaining available programs.

The partnership builds on a declaration of intent signed in February 2025. Anthropic engineers are collaborating directly with UK officials to ensure the government can eventually run the system on its own. Users will keep full control over their data and can opt out at any time.

Anthropic's regional head Pip White said the collaboration demonstrates how AI can be deployed safely for the public good. The company isn't the only US tech firm making moves in the UK - Microsoft, OpenAI, and Nvidia committed over 31 billion pounds to British AI infrastructure last year.

There's one notable difference between Anthropic and some of its competitors: while OpenAI holds a $200 million contract with the US Department of Defense, Anthropic prohibits US law enforcement agencies from using its models for domestic surveillance.

Former Tesla AI chief Andrej Karpathy now codes "mostly in English" just three months after calling AI agents useless

Just last October, Andrej Karpathy dismissed AI agents: “They just don’t work.” Now he says 80 percent of his coding is agent-based and calls it the “biggest change to my basic coding workflow in ~2 decades.” A typically measured voice is joining the agent coding hype, but with some warnings attached.

OpenAI reportedly launches ChatGPT ads at premium TV prices

OpenAI is charging around $60 per 1,000 impressions for its initial ChatGPT ads, according to The Information. That is far above typical online advertising rates of a few dollars per 1,000 impressions and closer to what advertisers pay for premium TV spots like NFL games. The ads show up below ChatGPT responses in the free and lower-cost "Go" tiers.

OpenAI is also reportedly charging per impression rather than per click. Advertisers typically prefer click-based billing because it's easier to measure results. The decision to go with impressions likely reflects how AI chatbot users behave differently than traditional search users: they click on external links far less often. Perplexity uses the same approach, also charging per 1,000 impressions.
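To illustrate why the billing model matters, here is a minimal sketch converting an impression-based price into an effective cost per click. The $60 figure is the reported CPM; the click-through rates are hypothetical placeholders for illustration only, not figures from the report.

```python
def effective_cpc(cpm_usd: float, click_through_rate: float) -> float:
    """Convert a cost per 1,000 impressions (CPM) into an effective
    cost per click, given a click-through rate (CTR)."""
    cost_per_impression = cpm_usd / 1000
    return cost_per_impression / click_through_rate

# Reported ChatGPT ad price (The Information): ~$60 per 1,000 impressions.
CPM = 60.0

# Hypothetical CTRs for illustration only -- not reported figures.
for ctr in (0.01, 0.002):  # 1% vs. 0.2% of viewers clicking
    print(f"CTR {ctr:.1%}: effective cost per click ${effective_cpc(CPM, ctr):.2f}")

# At a 1% CTR, a $60 CPM works out to $6 per click; at 0.2% it is $30.
# The fewer clicks a chatbot audience generates, the more an advertiser
# effectively pays per click under impression-based billing.
```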

The move toward advertising—at premium prices and in a format that's less appealing to advertisers—suggests OpenAI needs to ramp up revenue quickly to justify its high valuation to investors. Sam Altman previously called ChatGPT advertising a last resort and a potential dystopia.

Microsoft's Maia 200 AI chip claims performance lead over Amazon and Google

Microsoft has unveiled its new AI inference chip, Maia 200. Built specifically for inference workloads, the chip delivers 30 percent better performance per dollar than current-generation chips in Microsoft's data centers, the company claims. It's manufactured using TSMC's 3-nanometer process, packs over 140 billion transistors, and features 216 GB of high-speed memory.

According to Microsoft, the Maia 200 is now the most powerful in-house chip among major cloud providers. The company claims it delivers three times the FP4 performance of Amazon's Trainium 3 while also outperforming Google's TPU v7 in FP8 calculations—though independent benchmarks have yet to verify these figures.

Microsoft's comparison shows the Maia 200 outperforming Amazon's Trainium 3 and Google's TPU v7 across key specifications. | Image: Microsoft

Microsoft says the chip already powers OpenAI's GPT-5.2 models and Microsoft 365 Copilot. Developers interested in trying it out can sign up for a preview of the Maia SDK. The Maia 200 is currently available in Microsoft's Iowa data center, with Arizona coming next, and Microsoft has published further technical details about the chip.

Nvidia pours $2 billion into CoreWeave

Nvidia is investing $2 billion in cloud provider CoreWeave, buying shares at $87.20 each. The two companies are expanding their existing partnership to build AI data centers with more than 5 gigawatts of capacity by 2030.
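For scale, a quick back-of-the-envelope sketch of the implied share count, which is not stated in the source but follows from the reported figures:

```python
# Implied share count from the reported deal terms.
investment_usd = 2_000_000_000   # reported $2 billion investment
price_per_share = 87.20          # reported purchase price per share

shares = investment_usd / price_per_share
print(f"Implied shares purchased: ~{shares / 1e6:.1f} million")
# -> roughly 22.9 million CoreWeave shares at the reported terms.
```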

As part of the deal, CoreWeave will deploy multiple generations of Nvidia hardware, including the Rubin platform, Vera processors, and BlueField storage systems. The partners also plan to integrate CoreWeave's software into Nvidia's reference architectures for cloud providers and enterprise customers.

CoreWeave went public in March 2025 and specializes in AI-optimized cloud services. The company is involved in expanding OpenAI's Stargate project, and OpenAI has also invested several billion dollars in CoreWeave.

Emergency meetings and failed billion-dollar talks reveal the chaos behind Apple's pivot to Google Gemini

Internal crisis meetings, a leader who cried “bullshit” and convinced no one, and billion-dollar negotiations that fell apart: Bloomberg reveals the backstory behind Apple’s decision to partner with Google.