Short

Anthropic's Claude now integrates directly with Microsoft 365, letting users pull in content from Outlook, SharePoint, OneDrive, and Teams. Emails, calendar entries, and documents can appear directly in a Claude chat, making it easier to bring company data into the conversation.

Anthropic is also adding a centralized search that covers company resources like HR documents and team guidelines. These features are available to Claude Team and Enterprise users, but admins have to enable them first.

The rollout follows Microsoft's recent Copilot integration in Windows, showing just how closely the competition for AI in the workplace is tied to the digital office ecosystem.

Short

AI startup Anthropic expects to nearly triple its annualized revenue by 2026, Reuters reports, citing two people familiar with the matter. The company projects annualized revenue of $9 billion by the end of 2025 and is targeting more than $20 billion in 2026 under its base scenario, or as much as $26 billion in the most optimistic case.

Anthropic told Reuters that its annualized revenue stood at around $7 billion as of October. Most of the company's growth comes from enterprise clients, which account for roughly 80 percent of its revenue. The startup was recently valued at $183 billion after raising $13 billion in new funding.

Short

Google Labs Creative Director Henry Daubrez says the new Veo 3.1 update is being overhyped. Though it adds helpful features like image-to-image animation, he sees the changes as minor. Daubrez blames financial pressure in the AI industry for turning small updates into big marketing moments. Veo is the video model behind Flow, Google's AI filmmaking tool.

"The bigger issue is that the enormous financial stakes around AI are turning timelines into marketing noise, with rumors inflated to the point where every incremental update gets treated like a paradigm shift."

Henry Daubrez

Short

Apple's new M5 chip is aimed at speeding up AI features on the MacBook Pro, iPad Pro, and Apple Vision Pro. Apple says the M5 delivers more than four times the GPU performance for AI tasks compared to the previous M4 chip. Each of the ten GPU cores includes a built-in "Neural Accelerator." The 16-core Neural Engine can process up to 38 trillion operations per second, making it 60 percent faster than the Neural Engine in the M4. The CPU is also up to 15 percent faster in multi-threaded tasks.

The chip uses a third-generation 3 nm manufacturing process, offers 153 GB/s of shared memory bandwidth (about 30 percent more than before), and supports up to 32 GB of unified memory. This allows larger AI models to run directly on the device. Apple expects that existing AI apps and tools like Apple Intelligence will see faster performance. Developers can access the AI hardware using Core ML, Metal 4, and Tensor APIs.
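
As a rough illustration (not from Apple's materials), the usual way an app opts into this hardware is through a Core ML model configuration: setting computeUnits to .all lets Core ML schedule work across the CPU, GPU, and Neural Engine. The model file name below is a hypothetical placeholder.

import CoreML
import Foundation

// Minimal sketch: load a compiled Core ML model and let the framework
// decide whether layers run on the CPU, GPU, or Neural Engine.
// "MyModel.mlmodelc" is a placeholder for a compiled model bundle.
let config = MLModelConfiguration()
config.computeUnits = .all

do {
    let modelURL = URL(fileURLWithPath: "MyModel.mlmodelc")
    let model = try MLModel(contentsOf: modelURL, configuration: config)
    print(model.modelDescription)
} catch {
    print("Failed to load model: \(error)")
}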
