Students at Staffordshire University have said they are feeling “deprived of knowledge and enjoyment” after large parts of a course they hoped would launch their digital careers were taught by AI.
James and Owen were among 41 students who took a coding module at Staffordshire last year, hoping to change careers through a government-funded apprenticeship program designed to help them become cyber security experts or software engineers.
But after AI-generated slides were at times read out by an AI voiceover, James said he lost confidence in the program and the people running it, worrying that he had spent “two years” of his life on a course delivered “in the cheapest possible way”.
“If we handed in AI-generated stuff, we’d be kicked out of uni, but we’re being taught by AI,” James said during a confrontation with his lecturer, recorded as part of a course in October 2024.
James and other students confronted university officials several times about the AI materials. But it appears that the university is still using AI-generated materials to teach courses. This year, the university uploaded a policy statement to the curriculum website to justify the use of AI, offering “a framework for academic professionals to leverage AI automation” in scholarly work and teaching.
The university’s public-facing policies limit students’ use of AI, saying that students who outsource work to AI or claim AI-generated work as their own are violating its integrity policy and can be challenged for academic misconduct.
“I’m at the halfway point of my life, my career,” James said. “I don’t think I can leave here now and start another career again. I’m stuck on this course.”
The Staffordshire case comes as more universities use AI tools – to teach students, tailor course content and deliver personalized feedback. A policy paper from the Department for Education released in August lauded the development, saying that generative AI “has the power to transform education”. A survey (PDF) of 3,287 higher education teaching staff last year by educational technology firm Zysk found that almost a quarter were using AI tools in their teaching.
For students, AI teaching appears to be less transformative than discouraging. In the US, students post negative online reviews about professors who use AI. In the UK, graduate students have complained on Reddit that their lecturers are copying and pasting feedback from ChatGPT or using AI-generated images in courses.
One student wrote, “I understand the pressure on lecturers right now that might force them to use AI, it seems frustrating.”
James and Owen said they noticed the use of AI “almost immediately” in their Staffordshire course last year when, during their first class, the lecturer played a PowerPoint presentation that included an AI version of her voice reading from the slides.
Soon after, James said, he noticed other signs that some course content was AI-generated, including American English inconsistently edited into British English, suspicious file names and “general, surface-level information” that sometimes inexplicably referred to American law.
The signs of AI-generated content continued this year. In a course video uploaded to the website, a voiceover presenting the material suddenly changes to a Spanish accent for about 30 seconds before switching back to a British accent.
The Guardian reviewed the Staffordshire curriculum materials and scanned this year’s content with two AI detectors, Winston AI and Originality AI. Both found that many assignments and presentations “had a high probability of being AI-generated”.
At the beginning of the course, James said, he raised his concerns with the student representative during a monthly meeting. Then, in late November, he aired them during a lecture, which was recorded as part of the course material. In the recording, he asks the lecturer not to bother with the slides.
“I know these slides are AI-generated, I know everyone in this meeting knows these slides are AI-generated, I would like you to remove these slides,” he says. “I don’t want to be taught by GPT.”
Shortly afterwards, the course’s student representative said: “We have sent this back, James, and the response was that teachers are allowed to use a variety of devices. We were quite disappointed by this response.”
Another student says: “There is some useful stuff in the presentation. But it’s like, 5% are useful nuggets, and there’s a lot of repetition. There’s some gold in the bottom of this pan. But we can probably get the gold ourselves by asking ChatGPT.”
The lecturer laughs uncomfortably. “I appreciate people being candid…” he says, then turns the subject to another tutorial he created – using ChatGPT. “To be honest, I did it on short notice,” he says.
Eventually, the course leader told James that two human lecturers would teach the material for the final session, “so you don’t get the AI experience”.
In response to a query from the Guardian, Staffordshire University said that “academic standards and learning outcomes were maintained” on the course.
The statement continued: “Staffordshire University supports the responsible and ethical use of digital technologies in line with our guidance. AI tools can support elements of preparation, but they do not replace academic expertise and should always be used in ways that maintain academic integrity and sector standards.”
While the university did bring in a human lecturer for the final lecture of the course, James and Owen said it was too little, too late, especially because the university appears to have used AI in this year’s teaching materials as well.
“I feel like part of my life has been stolen,” James said.
Owen, who is in the midst of a career change, said he had chosen the course to gain underlying knowledge, not just qualifications – and he felt it was a waste of time.
“Sitting with this material in front of you is really not worth anyone’s time when you could be spending that time on something really meaningful, it’s really frustrating,” he said.