Maximilian Schreiner
Cognitive scientist Melanie Mitchell is pushing back against recent New York Times columns by writer Thomas Friedman, criticizing his framing of advanced AI.
In his pieces, Friedman calls for close U.S.-China collaboration on AI regulation and warns of an approaching "superintelligence." Much of his argument leans on comments from his friend Craig Mundie, the former Microsoft executive, as well as media reports. Mitchell says these claims lack scientific evidence. According to her, examples Friedman cites - like AI "teaching itself" new languages or chatbots pursuing their own hidden agendas - can be explained by training data and have been debunked.
Mitchell describes Friedman's outlook as "magical thinking." In her view, he attributes mysterious powers to AI that actually stem from human data and relatively simple mechanisms. She warns that because of Friedman's broad reach, these myths risk shaping public understanding of AI. Instead of speculative scenarios, Mitchell argues for fact-based realism and human-led regulation.
AI artist Xania Monet has signed a $3 million record deal with Hallwood Media, according to Billboard. The project is led by 31-year-old designer Telisha Jones from Mississippi, who uses the Suno platform to turn her lyrics into music. With Suno currently facing copyright lawsuits from major labels, those labels' subsidiaries reportedly held back from making offers.
Monet entered the Billboard charts for the first time last week. Her track "Let Go, Let God" reached No. 21 on the Hot Gospel Songs chart, while another single hit No. 1 on the R&B Digital Song Sales chart. Manager Romel Murphy said around 90 percent of her lyrics draw on personal experiences, and that upcoming projects will include collaborations with human producers.
Hallwood had previously signed another AI artist from Suno back in July.
OpenAI is planning an additional $100 billion in spending on reserve servers over the next five years, according to The Information. By 2030, the company expects to have spent around $350 billion on rented server capacity.
At a Goldman Sachs conference, CFO Sarah Friar explained that OpenAI often has to delay product launches or hold back features because of severe limits on available compute.
The extra servers are meant to protect against sudden spikes in usage and to support future model training. Projections suggest OpenAI will spend about $85 billion per year on servers, nearly half of what Amazon, Microsoft, Google, and Oracle combined earned in 2024. Taken together, these investments push OpenAI's expected cash outflow through 2029 to $115 billion.
Huawei introduced its new AI supercomputer, the Atlas 950 SuperCluster, at the Connect 2025 conference. The system uses more than 524,000 Ascend 950DT chips and, according to Huawei, can deliver up to 524 FP8 exaFLOPS for training and 1 FP4 zettaFLOP for inference, enough, the company says, to handle models with trillions of parameters.
In contrast to Nvidia's Rubin systems, Huawei continues to focus on scale over individual chip performance. As reported by Tom's Hardware, the setup requires about 64,000 square meters of floor space. Huawei is already planning a follow-up: the Atlas 960, slated for 2027, will include more than one million chips.