Maximilian Schreiner
Google is integrating Anthropic's Model Context Protocol (MCP) directly into its cloud infrastructure. MCP serves as a universal standard for connecting AI models with external data and tools, eliminating the need to program new interfaces for every application.
Starting immediately, Google is offering managed servers that give AI agents direct access to services like Google Maps, BigQuery, Compute Engine, and Kubernetes Engine. This allows AI agents to handle tasks like independently managing infrastructure or planning travel routes. Through the Apigee platform, companies can also expose their own internal APIs as AI tools. Google plans to expand support to additional services, such as Cloud Storage and databases, in the near future.
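Under the hood, MCP frames every tool interaction as a JSON-RPC 2.0 message; the `tools/call` method name comes from the MCP specification, while the tool name and arguments below are hypothetical and for illustration only. A minimal sketch of how a client might frame such a request, using only Python's standard library:

```python
import json

def make_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Frame an MCP tool invocation as a JSON-RPC 2.0 request string."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",  # method name from the MCP specification
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical tool name and arguments, not a real Google Maps MCP tool.
request = make_tool_call(1, "maps_compute_route",
                         {"origin": "Berlin", "destination": "Munich"})
print(json.loads(request)["method"])
```

Because every tool speaks this same framing, a managed server only has to advertise its tools once (via `tools/list` in the spec) rather than each application shipping a bespoke integration.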
US aviation startup Boom Supersonic, best known for developing a supersonic passenger jet, is making a surprise entry into the energy business to capitalize on the AI boom.
Founder Blake Scholl unveiled "Superpower," a 42-megawatt gas turbine designed specifically to handle the massive energy loads of AI data centers. With the US power grid struggling to meet demand, companies are increasingly turning to independent power sources like these turbines to keep their facilities running.
The system uses the core of the "Symphony" engine, which was originally built for the company's planned Overture jet. Scholl notes that unlike older models, the turbine can maintain full power in high heat without requiring water cooling.
Crusoe has signed on as the launch customer, and Boom has secured $300 million to begin production. The company plans to use the revenue from the turbine business to help fund the development of its aircraft.
Essential AI's new open-source model, Rnj-1, outperforms significantly larger competitors on the "SWE-bench Verified" test. This benchmark is considered particularly challenging because it evaluates an AI's ability to independently solve real-world programming problems. Despite being a compact model with just eight billion parameters, Rnj-1 scores 20.8 points.

By comparison, similarly sized models like Qwen 3 (8B, without reasoning) reach only 4.5 points in Essential AI's testing. The model was introduced by Ashish Vaswani, co-founder of Essential AI and co-author of "Attention Is All You Need," the paper that introduced the Transformer architecture. Rnj-1 is also Transformer-based, building on the Gemma 3 architecture. According to the company, development focused primarily on better pre-training rather than post-training methods like reinforcement learning. These improvements also lower pre-training computational costs, thanks in part to the Muon optimizer.
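The Muon optimizer mentioned above differs from Adam mainly in how it conditions the update: rather than scaling gradients elementwise, it (roughly) orthogonalizes the momentum matrix with a Newton-Schulz iteration before applying it. The NumPy sketch below uses the quintic coefficients from Muon's public reference implementation; `muon_update` and its learning rate are simplified illustrations, not Essential AI's actual training code:

```python
import numpy as np

# Quintic coefficients from Muon's published Newton-Schulz iteration.
A, B, C = 3.4445, -4.7750, 2.0315

def orthogonalize(m: np.ndarray, steps: int = 5) -> np.ndarray:
    """Push the singular values of a matrix toward 1 via Newton-Schulz steps."""
    x = m / (np.linalg.norm(m) + 1e-7)  # normalize so singular values are <= 1
    if m.shape[0] > m.shape[1]:         # iterate on the smaller Gram matrix
        x = x.T
    for _ in range(steps):
        gram = x @ x.T
        # x <- A*x + B*(x x^T) x + C*(x x^T)^2 x
        x = A * x + (B * gram + C * gram @ gram) @ x
    if m.shape[0] > m.shape[1]:
        x = x.T
    return x

def muon_update(weight: np.ndarray, momentum: np.ndarray,
                lr: float = 0.02) -> np.ndarray:
    """One illustrative Muon step: descend along the orthogonalized momentum."""
    return weight - lr * orthogonalize(momentum)

# Singular values of the result cluster near 1 regardless of the input's scale.
demo = orthogonalize(np.diag([3.0, 2.0, 0.5]))
```

Because each step is a handful of matrix multiplications, this conditioning is cheap relative to a training step, which is consistent with the company's claim of lower pre-training compute.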