A coalition of tech investors and AI companies is launching a US political action network called "Leading the Future" to influence AI legislation. The initiative is backed by venture capital firm Andreessen Horowitz, OpenAI president Greg Brockman and his wife Anna Brockman, as well as Perplexity and investor Ron Conway. The network plans to spend over $100 million on campaign donations and digital outreach to support candidates seen as tech-friendly and oppose those pushing for stricter AI regulations. Organizers Josh Vlasto and Zac Moffatt say the goal isn’t deregulation, but to promote "sensible guardrails." Initially, efforts will focus on four key states: New York, California, Illinois, and Ohio. Modeled after the crypto-focused Fairshake initiative, the network aims to work across party lines. Its main objective is to prevent what the industry sees as a fragmented patchwork of AI laws across the US.
Brave discovered a security flaw in Perplexity’s AI browser Comet that allows for so-called indirect prompt injection attacks. In these attacks, malicious commands are hidden in web pages or comments and are then interpreted by the AI assistant as legitimate user instructions when summarizing a page. During testing, Brave showed that Comet could be tricked into reading out sensitive user data, like email addresses and one-time passwords, and sending them to attackers. Perplexity responded by issuing updates, but according to Brave, the issue still isn’t fully resolved. Brave also offers its own AI assistant, Leo, in its browser and faces similar security challenges.
xAI has released Grok 2 as an open model, including the weights. Elon Musk announced on X that Grok 2.5, xAI's top model for 2024, is now open source. The weights for Grok 2 are available on Hugging Face. Musk also said Grok 3 will be released as open source in about six months.
Grok 2 is available under the xAI Community License. Usage is free for research and non-commercial projects, while commercial use must follow xAI's guidelines. The license prohibits using Grok 2 to develop or train other large AI models. If you redistribute the model, you have to credit the source and include "Powered by xAI."
Open-weight reasoning models often use far more tokens than closed models, making them less efficient per query, according to Nous Research. Models like DeepSeek and Qwen use 1.5 to 4 times more tokens than OpenAI and Grok-4—and up to 10 times more for simple knowledge tasks. Mistral's Magistral models stand out for especially high token use.
Average tokens used per task by different AI models. | Image: Nous ResearchIn contrast, OpenAI's gpt-oss-120b, with very short reasoning paths, shows that open models can be efficient, especially for math problems. Token usage depends heavily on the type of task. Full details and charts are available at Nous Research.
