Google's AI systems processed over 980 trillion tokens in June—more than double May's volume, according to Google product manager Logan Kilpatrick and DeepMind CEO Demis Hassabis. Tokens are the short chunks of text that AI models break language into when reading input or generating responses.
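To make the scale concrete, here is a minimal sketch of how token counts relate to text length. The ~4-characters-per-token figure is a common rule of thumb for English text, not a Google-reported number; real tokenizers vary by model.

```python
# Illustrative only: models split text into short chunks ("tokens"),
# often sub-word pieces. A rough heuristic for English is ~4 characters
# per token; actual tokenizers differ per model.

def estimate_tokens(text: str, chars_per_token: int = 4) -> int:
    """Crude token estimate from character count (assumed heuristic)."""
    return max(1, len(text) // chars_per_token)

sentence = "Google's AI systems processed over 980 trillion tokens in June."
print(estimate_tokens(sentence))  # a rough estimate, not an exact count
```

By this heuristic, 980 trillion tokens corresponds to on the order of quadrillions of characters of text processed in a single month.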

Image: via X

The increase likely reflects not just higher overall usage, but especially greater use of so-called reasoning models like Gemini 2.5 Flash, which consume many more tokens to produce more accurate responses. According to Artificial Analysis, Gemini 2.5 Flash uses about 17 times more tokens than its predecessor and is roughly 150 times more expensive to run on reasoning tasks.
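The two figures above can be combined in a back-of-the-envelope calculation. This is a sketch under the assumption that the 150× figure is total task cost and the 17× figure is token volume; the implied per-token price multiplier is an inference, not a reported number.

```python
# Hedged sketch: how total cost can balloon when a reasoning model both
# emits more tokens AND charges more per token. Numbers are the
# multipliers cited from Artificial Analysis, not official pricing.

def relative_cost(token_multiplier: float, price_multiplier: float) -> float:
    """Total cost multiplier = (tokens used) x (price per token)."""
    return token_multiplier * price_multiplier

tokens_x = 17       # ~17x more tokens than the previous version
total_cost_x = 150  # ~150x more expensive on reasoning tasks

# Implied per-token price increase (an inference from the two figures):
implied_price_x = total_cost_x / tokens_x
print(f"Implied per-token price multiplier: {implied_price_x:.1f}x")
print(f"Check: {relative_cost(tokens_x, implied_price_x):.0f}x total cost")
```

The point of the decomposition: even a modest per-token price increase compounds with a 17× jump in token volume, so reasoning workloads can dominate both the token statistics and the bill.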

Join our community
Join the DECODER community on Discord, Reddit or Twitter - we can't wait to meet you.
Matthias is the co-founder and publisher of THE DECODER, exploring how AI is fundamentally changing the relationship between humans and computers.