
95% of UK students now use AI and their experiences couldn't be more divided

Image: Nano Banana Pro, prompted by THE DECODER

Key Points

  • According to a recent HEPI survey, 95 percent of UK undergraduates now use generative AI, with a growing number incorporating AI-generated text directly into exam submissions.
  • Student opinion is split: roughly half see AI as beneficial, while others fear losing independent thinking skills. A medical student case study found that those relying solely on AI without human feedback performed the worst, yet felt the most confident.
  • Most students consider AI skills essential, but fewer than half feel supported by their lecturers, with clear disparities across disciplines, socioeconomic backgrounds, and gender.

Generative AI is practically universal among British students. But a new survey reveals a major gap between how students use AI and how well universities support them.

In just three years, generative AI has gone from novelty to standard practice at UK universities. According to the Higher Education Policy Institute's (HEPI) Student Generative AI Survey 2026, 95 percent of full-time undergraduates now use AI in at least one form.

In 2024, that number was 66 percent. The survey is based on responses from 1,054 students surveyed in December 2025. The question is no longer whether students use AI, but how well they use it, the authors write.

Figure: Horizontal bar chart comparing generative AI use in assessed work across 2024, 2025, and 2026.
Students increasingly use generative AI for assessed work. The most common uses in 2026: having concepts explained (61 percent), summarizing articles (49 percent), and generating research ideas (40 percent). The share including AI-generated text directly in assessed work quadrupled from 3 percent in 2024 to 12 percent in 2026. | Image: HEPI

AI-generated text in exams quadruples as cheating fears grow

The vast majority of students use generative AI for assessed work: having concepts explained, summarizing material, and structuring ideas. About a third also use AI as a search engine. The share of students inserting AI-generated text directly into exam papers has quadrupled since 2024. At the same time, the use of general tools like ChatGPT is declining slightly, which the authors attribute to more specialized tools gaining ground.


Nearly two-thirds of students say exam formats have changed significantly. In free-text responses, several express fear of being falsely accused of cheating. "Constant worry that my work will flag AI detection, even though I have never used AI to write an assignment," one student wrote.

Students split between deeper learning and intellectual dependency

The survey reveals a deeply divided student body. Just under half say AI has improved their study experience, but others worry about fairness, losing their own skills, and what AI means for the job market. Some feel creative subjects are being devalued.

Two quotes from the study capture the divide. One student says AI has helped them "focus on critical analysis and deeper understanding" by "saving hours of tedious work." Another puts it bluntly: "I'm not using my brain at all."

When it comes to sources, students split into three roughly equal groups: those relying on traditional sources, those using AI, and those combining both. Nearly one in ten barely consults traditional sources anymore.


Around 15 percent of students use AI for companionship, advice, or to combat loneliness, according to the report. Some turn to purely AI-based therapy services. Overall, four in ten say AI affects their feelings of loneliness, with positive and negative effects roughly balanced. "It's because it's like having a friend close by," one student wrote. Another put it simply: "Just feel isolated."

Figure: Horizontal bar chart showing how UK students use generative AI across 2024, 2025, and 2026, including new 2026 categories for companionship and counseling.
Text generation remains the most common AI use case at 56 percent, while new applications for loneliness, counseling, and therapy are emerging. | Image: HEPI

Most students see AI skills as essential, but fewer than half feel supported

More than two-thirds of students consider AI skills essential, yet fewer than half feel supported by their lecturers. Only about a third say their university actively encourages them to use AI.

The availability of AI tools has multiplied since 2024, and significantly fewer universities now ban AI than the year before. Russell Group universities, which lagged in 2025, are now the most likely to encourage students to use AI.

But persistent inequalities remain across subject, background, and gender. Humanities students are significantly more skeptical and feel particularly unsupported. Students from wealthier households use AI more often, while male students tend to have more prior experience. A third of students arrive at university with no AI experience at all. Environmental concerns play a role, too. Nearly a quarter say the ecological impact of AI keeps them from using it.

Medical students using AI without oversight performed worst but felt most confident

A case study from Queen Mary University of London found that medical students who only used AI without human feedback during clinical exercises performed the worst, but were the most confident in their abilities. Professor Rakesh Patel compares this to giving students a sports car before they've learned to drive. As a counterexample, the report highlights Aston University, which made AI training mandatory across all programs and provided all staff with AI tools as early as 2023.

The HEPI report recommends introducing first-year students to AI in a structured way, creating clear exam guidelines with both AI-free and AI-supported formats, providing tools for all students, and conducting targeted research into how AI affects loneliness and mental well-being.

The debate around AI in education has been intensifying for months. Last year, an Anthropic study showed that in nearly half of all analyzed conversations with the AI assistant Claude, students outsourced higher-order thinking like analysis and creation to the AI.

At the same time, AI companies are pushing into higher education. Anthropic launched "Claude for Education" with a dedicated learning mode, while OpenAI has been courting students with "ChatGPT Edu" since 2024, aiming to lock in users early and keep them as they move into the workforce.

AI pioneer Andrej Karpathy recently argued that schools need a fundamental rethink: assume all work done outside the classroom was created with AI, and shift exams entirely to in-person settings.
