Hume AI open-sources TADA, a speech model five times faster than rivals with zero hallucinated words

Hume AI has open-sourced TADA, an AI system for speech generation that processes text and audio in sync. Unlike previous systems, which generate significantly more audio frames than text tokens, TADA maps exactly one audio token to each text token. The result, according to Hume AI: TADA is over five times faster than comparable systems and produced zero transcription hallucinations (no made-up or skipped words relative to the source text) across tests with more than 1,000 samples. In human evaluations, the system scored 3.78 out of 5 for naturalness.
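The speedup claim follows from the alignment scheme: if every text token yields exactly one audio token, the decoder runs far fewer steps than a conventional TTS loop that expands each token into many frames. A minimal sketch of that difference, with purely illustrative function names and an assumed frame ratio (not Hume AI's actual implementation):

```python
# Illustrative sketch of 1:1 text-to-audio alignment versus a conventional
# TTS loop. The names, and the 6-frames-per-token figure, are assumptions
# for illustration only, not details of TADA's architecture.

def conventional_tts(text_tokens, frames_per_token=6):
    """Conventional approach: each text token expands into many audio frames."""
    audio = []
    for tok in text_tokens:
        audio.extend(f"frame({tok},{i})" for i in range(frames_per_token))
    return audio

def one_to_one_tts(text_tokens):
    """TADA-style approach: exactly one audio token per text token."""
    return [f"frame({tok},0)" for tok in text_tokens]

tokens = ["hel", "lo", "wor", "ld"]
assert len(one_to_one_tts(tokens)) == len(tokens)            # 1:1 mapping
assert len(conventional_tts(tokens)) == 6 * len(tokens)      # 6x more decode steps
```

Fewer decoding steps per utterance is what makes the "over five times faster" figure plausible, assuming per-step cost stays comparable.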

Hume AI says TADA is compact enough to run on smartphones, though longer texts can cause the voice to occasionally drift. The system comes in two sizes—1B and 3B parameters—both based on Llama. The smaller model supports English, while the 3B version covers seven additional languages. All code and models are available on GitHub and Hugging Face under the MIT license, and the full technical details can be found in the paper.

Hallucinated references are passing peer review at top AI conferences and a new open tool wants to fix that

Fake citations are slipping past peer review at top AI conferences, and commercial LLMs can’t spot the fakes they generate. A new open-source tool called CiteAudit allegedly catches what GPT, Gemini, and Claude miss.

OpenAI offers open-source maintainers six months of free ChatGPT Pro and Codex access

OpenAI is launching a new support program for open-source developers. Core maintainers of public software projects can apply for six months of free access to ChatGPT Pro with Codex, API credits, and Codex Security. Access to Codex Security, a new AI tool for code security checks, will be reviewed case by case and granted only selectively because of the capabilities of GPT-5.4, according to OpenAI.

Developers who prefer other programming tools like OpenCode, Cline, or OpenClaw can also apply. Projects that don't meet all the criteria but play an important role in the broader software ecosystem are also welcome to apply. The program builds on OpenAI's existing Codex Open Source Fund, which the company has backed with one million dollars.

Google's new open TranslateGemma models bring translation for 55 languages to laptops and phones

TranslateGemma shows how targeted training helps Google squeeze more performance out of smaller models: the 12B version translates better than a base model twice its size and runs on a regular laptop. With the growing Gemma family, Google is staking its claim in the race for open AI models.

Abu Dhabi's TII claims its Falcon H1R 7B reasoning model matches rivals seven times its size

The Technology Innovation Institute (TII) from Abu Dhabi has released Falcon H1R 7B, a compact reasoning language model with 7 billion parameters. TII says the model matches the performance of competitors two to seven times larger across various benchmarks, though as always, benchmark scores only loosely correlate with real-world performance, especially for smaller models. Falcon H1R 7B uses a hybrid Transformer-Mamba architecture, which lets it process data faster than comparable models.
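The speed advantage of hybrid Transformer-Mamba stacks comes from mixing attention layers (whose cost grows quadratically with sequence length) with state-space (Mamba) layers (whose cost grows linearly). A back-of-the-envelope cost model, with layer counts and constants that are illustrative assumptions rather than TII's actual configuration:

```python
# Rough cost sketch of why a hybrid Transformer-Mamba stack can process long
# sequences faster than a pure Transformer. All layer counts and constant
# factors below are illustrative assumptions, not Falcon H1R 7B's real design.

def layer_cost(kind, seq_len, d_model=1024):
    if kind == "attention":
        return seq_len ** 2 * d_model      # quadratic in sequence length
    return seq_len * d_model * 16          # linear, with an assumed state-size factor

def stack_cost(layers, seq_len):
    return sum(layer_cost(kind, seq_len) for kind in layers)

pure_transformer = ["attention"] * 32
# Assumed hybrid: every fourth layer is attention, the rest are Mamba blocks.
hybrid = ["attention" if i % 4 == 0 else "mamba" for i in range(32)]

# At long sequence lengths the hybrid stack is substantially cheaper.
assert stack_cost(hybrid, 8192) < stack_cost(pure_transformer, 8192)
```

The gap widens as sequences grow, which is why such hybrids are attractive for long-context reasoning workloads.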

Falcon H1R 7B averages 49.5 percent across four benchmarks, outperforming larger models like Qwen3 32B (46.2 percent) and Nemotron H 47B Reasoning (43.5 percent). | Image: Technology Innovation Institute (TII)

The model is available as a complete checkpoint and quantized version on Hugging Face, along with a demo. TII released it under the Falcon LLM license, which allows free use, reproduction, modification, distribution, and commercial use. Users must follow the Acceptable Use Policy, which TII can update at any time.

Meta is reportedly ditching open Llama models for Avocado, a closed model built for direct sales

According to Bloomberg's sources, Meta is shifting its focus to a new AI model codenamed "Avocado," with a release potentially coming next spring. Avocado is expected to launch as a closed model, letting the company sell access directly. This marks a major shift from Meta's established open-model strategy. Internally, the open-source approach reportedly lost steam after the disappointing performance of Llama 4. Management is betting big on Alexandr Wang, who joined Meta following the company's deal with Scale AI.

The development process involves some surprising ingredients. According to Bloomberg, the team is training Avocado using several external models, including Google's Gemma, OpenAI's gpt-oss, and Alibaba's Qwen. Using Chinese technology clashes with CEO Mark Zuckerberg's previous warnings about Chinese censorship.