OpenAI plans to nearly double its workforce by 2026 as it ramps up enterprise push

The AI lab wants to grow from 4,500 to 8,000 employees by the end of 2026, the Financial Times reports, citing two people familiar with the plans. Most new hires will go into product development, engineering, research, and sales. OpenAI is also bringing on "technical ambassadorship" specialists to help companies integrate its tools.

Much of this hiring likely ties back to OpenAI's Frontier, an agent-based AI platform designed to embed deeply into company workflows, the kind of integration that requires hands-on development at the customer's site. OpenAI has already launched the Frontier Alliance with consulting firms like McKinsey, and partnerships with private equity firms are in the works.

The broader context is OpenAI's push to win enterprise customers, particularly in coding, where Anthropic has been steadily gaining ground. While OpenAI was focused on ChatGPT features, image generation, video models, and all the weird outcomes that came with people actually using this technology, Anthropic quietly carved out a bigger share of the enterprise space. OpenAI is now reportedly building a desktop super app that bundles all its key features into one platform.

OpenAI's chief scientist trusts AI with experiments but says it's not at the level to design complex systems

OpenAI Chief Scientist Jakub Pachocki used to write every line of code by hand. Now AI handles experiments that once took him a week, but he’s not ready to let it run the show.

OpenAI acquires Astral to bring Python's most popular dev tools into its Codex AI coding platform

OpenAI is acquiring Astral, the company behind the widely used Python tools Ruff, uv, and ty. Astral founder Charlie Marsh announced that his team is joining OpenAI's Codex team, the company's platform for agentic AI coding. According to Marsh, Astral's tools are downloaded hundreds of millions of times each month and have become a core part of modern Python development. Marsh says integrating them with Codex gives both projects the most room to grow. Astral was backed by Accel and Andreessen Horowitz, among others.

Our goal with Codex is to move beyond AI that simply generates code and toward systems that can participate in the entire development workflow—helping plan changes, modify codebases, run tools, verify results, and maintain software over time. Astral’s developer tools sit directly in that workflow.

OpenAI

OpenAI says it will keep the tools open source after the acquisition closes. Astral's Douglas Creager wrote on Hacker News that the tools are under a permissive license, so in a worst-case scenario, the community could fork the software and continue developing it independently.

No one can guarantee how motives, incentives, and decisions might change years down the line. But that's why we bake optionality into it with the tools being permissively licensed.

Douglas Creager

OpenAI overhauls ChatGPT's model selection

OpenAI has redesigned how model selection works in ChatGPT. Instead of individual model names, users now see up to three tiers at first glance, depending on their subscription: "Instant" for quick, everyday responses, "Thinking" for more complex tasks, and "Pro" for the most powerful models. The new menu lets users pick a specific model version from a dropdown - options include "Latest" (currently 5.4), 5.2, 5.0, or o3.

More granular settings are available under "Configure." That's where users can turn on the old Auto function, which lets ChatGPT switch from Instant to Thinking when it detects a more complex question. OpenAI has also recently simplified the menu for regenerating answers and added the "Nerdy" personality style. On top of that, the company is rolling out GPT-5.4 mini and improving GPT-5.3 Instant, which now uses less sensationalized wording, according to the changelog.

The so-called routing system—where ChatGPT decides which model handles a given request—has been a sore spot for OpenAI for a while now. Many users found the system opaque when it first launched, since the router didn't always pick the most capable model. That fueled suspicion that OpenAI was quietly steering expensive requests toward cheaper models to save on compute costs.

OpenAI's AWS deal may undermine Microsoft's Azure exclusivity rights

Microsoft fears OpenAI's AWS deal may violate its Azure exclusivity contract.

"We are confident that OpenAI understands and respects the importance of living up to [its] legal obligation," a Microsoft spokesperson told The Information. A statement that sounds less like confidence and more like a warning.

Microsoft holds the exclusive rights to sell OpenAI's models directly to cloud customers through its Azure platform. But OpenAI and AWS are planning a new product, what they call a "stateful runtime environment," that runs OpenAI models entirely on AWS infrastructure without relying on the Microsoft-hosted versions.

AWS doesn't intend to sell model APIs directly but rather to offer tools for developing custom AI applications, effectively sidestepping the contractual exclusivity on a technical level.

OpenAI turns model compression into a talent hunt with its 16 MB "Parameter Golf" challenge

OpenAI challenges researchers to build the best language model in just 16 MB - and uses the competition to scout talent. In an open research competition called "Parameter Golf," OpenAI is asking developers to build the best possible language model under tight constraints: weights and training code combined must stay under 16 MB, and training can take no longer than ten minutes on eight H100 GPUs. Submissions are judged on compression performance against a fixed FineWeb dataset.

OpenAI is putting up one million dollars in computing credits through its partner Runpod. Top performers may get invited for job interviews - the company plans to hire a small group of junior researchers in June, including students and Olympiad winners. The GitHub repository includes baseline models, evaluation scripts, and a public leaderboard. Anyone 18 or older in supported countries can participate through April 30.
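To get a feel for how tight the constraints are, here is a rough back-of-the-envelope sketch (my own arithmetic, not part of the official rules): how many parameters fit in 16 MB at common weight precisions, and the bits-per-byte metric typically used to score language models on compression benchmarks. The code-overhead figure is an assumption for illustration.

```python
import math

BUDGET_BYTES = 16 * 1024 * 1024  # 16 MB cap on weights + training code combined


def max_params(bytes_per_param: float, code_overhead_bytes: int = 64 * 1024) -> int:
    """Parameters that fit once a (hypothetical) training-code overhead is subtracted."""
    return int((BUDGET_BYTES - code_overhead_bytes) / bytes_per_param)


print(f"fp32: ~{max_params(4.0):,} params")  # ~4.2M
print(f"fp16: ~{max_params(2.0):,} params")  # ~8.4M
print(f"int8: ~{max_params(1.0):,} params")  # ~16.7M


def bits_per_byte(total_nll_nats: float, num_bytes: int) -> float:
    """Standard compression score: average cross-entropy in bits per byte of text.

    Lower is better: a model assigning the corpus probability p would
    compress it to -log2(p) bits, so this measures how well the tiny
    model 'compresses' the evaluation data.
    """
    return total_nll_nats / (num_bytes * math.log(2))
```

Either way, the budget caps models at a few million parameters, orders of magnitude below even small production LLMs, which is what makes the challenge a test of training and compression ingenuity rather than scale.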

The competition for AI talent among big tech companies is more intense than ever. Meta has repeatedly poached top researchers from OpenAI, in some cases offering compensation packages reportedly worth up to 300 million dollars.

Pentagon plans to let AI companies train models on classified data

The US Department of War is working to set up secure environments where AI companies can train their models on classified data. Until now, models were only allowed to read classified data, not learn from it.

OpenAI ships GPT-5.4 mini and nano, faster and more capable but up to 4x pricier

OpenAI has released two new compact models—GPT-5.4 mini and nano—built for coding assistants, subagents, and computer control. GPT-5.4 mini nearly matches the full model’s performance, but both new models come with a steep price hike over their predecessors.