
A recent study of 666 participants reveals a troubling connection: the more people use AI tools, the worse they perform on critical thinking tests. The impact appears strongest among young users.


Researcher Michael Gerlich at the Swiss Business School used the Halpern Critical Thinking Assessment (HCTA) to measure participants' abilities. The test combines multiple-choice and open-ended questions to evaluate various aspects of critical thinking, from analyzing arguments to understanding probability.

The findings point to a clear pattern: frequent AI users consistently scored lower on critical thinking tests. Gerlich attributes this to "cognitive offloading," the habit of letting AI handle thinking tasks instead of working through problems ourselves. While the critical thinking scores came from a standardized test, the AI usage data relied on participants' self-reporting.

The study found that participants between 17 and 25 years old used AI tools the most and scored the lowest on critical thinking tests. In contrast, those over 46 used AI less frequently and demonstrated stronger critical thinking skills.


Education makes a difference

Higher education seems to offer some protection against these effects. People with more advanced degrees maintained better critical thinking skills even when using AI regularly. They tended to question AI-generated information more often and think problems through more thoroughly.

Many participants expressed worry about becoming too dependent on AI. "It’s great to have all this information at my fingertips, but I sometimes worry that I’m not really learning or retaining anything. I rely so much on AI that I don’t think I’d know how to solve certain problems without it," admitted one 25-year-old participant.

Balancing AI use in education

Gerlich suggests schools need to find the right balance with AI tools. Rather than letting students passively hand tasks over to AI, he recommends focusing on active learning strategies that develop critical thinking skills.

The study recommends training teachers to use AI in ways that encourage, rather than replace, student thinking. It also emphasizes teaching students when and how to use AI appropriately while maintaining their own cognitive abilities.

Trust in AI emerged as another key factor. The more people trust AI systems, the more likely they are to delegate their thinking to them, creating a cycle that further diminishes critical thinking skills.

Recommendation

Gerlich points out that the relationship between AI use and thinking skills isn't simple. While the study shows a clear association between heavier AI use and weaker critical thinking, many factors are at play. Even though AI tools make information faster and easier to access, they might be quietly discouraging people from engaging in deeper, more thoughtful analysis.

Gerlich says we need to look closer at these effects over time. The only way to really understand how AI shapes our thinking patterns is to follow people for several years, tracking both how they use AI and how their cognitive abilities develop. Right now, we're just seeing a snapshot of a much bigger picture.

Join our community
Join the DECODER community on Discord, Reddit or Twitter - we can't wait to meet you.
Summary
  • A study of 666 participants found that frequent use of AI tools is associated with lower scores on critical thinking tests, particularly among younger users aged 17-25.
  • The researchers attribute this decline in critical thinking skills to "cognitive offloading," where people rely on AI to handle thinking tasks instead of working through problems themselves.
  • Higher education appears to mitigate the negative effects of AI use on critical thinking, as participants with advanced degrees maintained better critical thinking skills even when using AI regularly and were more likely to question AI-generated information.
Max is managing editor at THE DECODER. As a trained philosopher, he deals with consciousness, AI, and the question of whether machines can really think or just pretend to.