OpenAI claims four engineers and Codex built the Sora Android app in just 28 days

OpenAI claims its team built the Sora Android app in just 28 days by leveraging its code-generation AI, Codex. According to a report from OpenAI employees Patrick Hum and RJ Marsan, a team of four engineers used an early version of the GPT-5.1 Codex model to build the application, processing around five billion tokens along the way.

According to the authors, the AI handled the bulk of the actual code writing—tasks such as translating existing iOS code into Android-compatible form. This freed the human developers to focus on high-level architecture, planning, and verifying the results. The team described Codex as acting like a new, experienced colleague that just needed clear instructions to get the job done. Despite the rapid timeline, OpenAI reports the app is 99.9 percent stable. A detailed breakdown of the process is available on the OpenAI blog.

Google improves "Search Live" with new AI voice

Google has updated the voice for "Search Live." A new Gemini audio model powers the feature, producing responses that sound more natural and fluid, according to a blog post. Search Live lets users hold real-time voice conversations with Search while it displays relevant websites. The feature is part of Google Search's "AI Mode."

The update rolls out to all Search Live users in the US over the coming week. Users can open the Google app on Android or iOS, tap the Live icon, and speak their question.

The update fits into Google's broader push to build a voice-controlled assistant capable of handling everyday tasks—a goal shared by OpenAI and other major AI companies.

Anthropic places $21 billion order for Google chips via Broadcom

AI lab Anthropic has placed orders totaling $21 billion with Broadcom for Google's AI chips. Broadcom CEO Hock Tan confirmed that the startup is purchasing "Ironwood Racks" equipped with Google's Tensor Processing Units (TPUs).

The move follows a massive cloud partnership between Anthropic and Google announced in late October. That deal grants Anthropic access to up to one million TPUs and is expected to bring over one gigawatt of new AI compute capacity online by 2026. Anthropic maintains a multi-cloud strategy, spreading its workloads across Google TPUs, Amazon's Trainium chips, and Nvidia GPUs.

Google opens its infrastructure for AI models via MCP

Google is integrating Anthropic's Model Context Protocol (MCP) directly into its cloud infrastructure. MCP serves as a universal standard for connecting AI models with external data and tools, eliminating the need to program new interfaces for every application.

Starting immediately, Google is offering managed servers that give AI agents direct access to services like Google Maps, BigQuery, Compute Engine, and Kubernetes Engine. This lets AI handle tasks such as independently managing infrastructure or planning travel routes. Through the Apigee platform, companies can also expose their own internal APIs as AI tools. Google announced plans to expand support to additional services, such as Cloud Storage and databases, in the near future.
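Under the hood, MCP messages are JSON-RPC 2.0 payloads: an agent lists a server's tools and then invokes one with a `tools/call` request. As a rough illustration of what such a request looks like on the wire, the sketch below builds one with the Python standard library; the tool name `maps.compute_route` and its arguments are hypothetical placeholders, not the actual interface of Google's managed servers.

```python
import json

# MCP is layered on JSON-RPC 2.0, so a tool invocation is a JSON object
# with "jsonrpc", "id", "method", and "params" fields. "tools/call" is the
# MCP method for invoking a tool; the tool name and arguments below are
# made-up placeholders for illustration.
def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    request = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }
    return json.dumps(request)

# Example: ask a (hypothetical) Maps tool for a route.
payload = make_tool_call(1, "maps.compute_route",
                         {"origin": "Berlin", "destination": "Munich"})
print(payload)
```

Because every MCP server speaks this same envelope, an agent needs one client implementation rather than a bespoke integration per service—which is the point of Google adopting the standard.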