Matthias Bastian
Yann LeCun, Chief Scientist at Meta's FAIR lab, is the focus of a new "AI Stories" documentary. In the film, LeCun, speaking in Paris, discusses his early work on neural networks, his collaboration with Geoffrey Hinton, and the evolution of deep learning and open-source AI.
LeCun believes the real race in AI is about openness, not national borders. "What we're seeing is not a competition between regions but more a competition between the open research, open-source world and the proprietary world," he says. For LeCun, real progress in AI comes from open systems that make innovation widely accessible.
The timing is notable, as Mark Zuckerberg recently suggested that Meta could reconsider its open-source approach with Llama. If that happens, it's unclear whether Meta would still be the right place for LeCun.
Cohere has raised $500 million in a new funding round, pushing its valuation to $6.8 billion. The Canadian company builds AI models and services for enterprise customers.
Alongside the funding news, Cohere is bringing in Joelle Pineau as its new Chief AI Officer. Pineau previously served as VP of AI Research at Meta, where she led the FAIR team. She is also a professor at McGill University and a member of the Mila Institute in Montréal, and will continue her work with both organizations while joining Cohere.
Cohere's previous funding round was in July 2024, when it also raised $500 million at a $5.5 billion valuation. Major backers include Canadian pension fund PSP Investments, Cisco, Fujitsu, AMD, and export credit agency EDC.
Anthropic has raised the context window for Claude Sonnet 4 to one million tokens on the Anthropic API, Amazon Bedrock, and soon Google Cloud Vertex AI. That is five times the previous 200,000-token limit, letting users process entire codebases or large collections of research documents in a single request. The change is aimed primarily at developers working with extensive source code or those who need to summarize large volumes of text. The one-million-token context is currently in public beta for customers with Tier 4 or custom API limits.
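For developers, the sketch below shows roughly what a long-context request might look like with Anthropic's Python SDK. The model ID and the beta header used to opt into the one-million-token window are assumptions based on Anthropic's naming conventions, not values confirmed by this article; check the official documentation before using them.

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Hypothetical input: a concatenated dump of a repository to analyze in one request.
codebase_text = open("repo_dump.txt").read()

# NOTE: the model ID and the "anthropic-beta" header value below are assumptions;
# consult Anthropic's documentation for the exact identifiers.
response = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=4096,
    extra_headers={"anthropic-beta": "context-1m-2025-08-07"},
    messages=[
        {"role": "user", "content": "Summarize this codebase:\n\n" + codebase_text},
    ],
)
print(response.content[0].text)
```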

The expanded context window comes with steeper pricing. For requests over 200,000 input tokens, Anthropic charges $6 per million input tokens, double the standard $3 rate, and $22.50 per million output tokens, up from the standard $15. Anthropic points to prompt caching and batch processing as ways to cut costs, with batch processing potentially lowering expenses by up to 50 percent.
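As a rough illustration of this tiered pricing, the sketch below estimates a request's cost from the figures in this article (standard rates of $3 input and $15 output per million tokens, rising to $6 and $22.50 once a request exceeds 200,000 input tokens). It ignores prompt caching and batch discounts.

```python
def estimate_cost_usd(input_tokens: int, output_tokens: int) -> float:
    """Rough per-request cost estimate using the rates cited in the article.

    Requests above 200,000 input tokens are billed at $6 per million input
    tokens and $22.50 per million output tokens; smaller requests use the
    standard $3 / $15 rates. Caching and batch discounts are not modeled.
    """
    LONG_CONTEXT_THRESHOLD = 200_000
    if input_tokens > LONG_CONTEXT_THRESHOLD:
        input_rate, output_rate = 6.00, 22.50
    else:
        input_rate, output_rate = 3.00, 15.00
    return (input_tokens / 1e6) * input_rate + (output_tokens / 1e6) * output_rate

# Example: a 500,000-token codebase prompt with a 10,000-token answer
print(f"${estimate_cost_usd(500_000, 10_000):.2f}")  # about $3.23
```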