
‘Guard rails needed’ as big tech targets students with AI tutors

Fears raised over ‘culturally encouraged introversion’ but could study assistants help bridge gap between universities and Silicon Valley?

Published on August 25, 2025
Last updated August 25, 2025
[Image: A woman looks at a laptop screen displaying the OpenAI logo.]
Source: Didem Mente/Anadolu via Getty Images

AI-powered study assistants are “going to have a role, whether we like it or not”, as technology giants target higher education, and academics should consider how best to support students to use them or risk “being led by neophilia”.

ChatGPT last month launched its “Study Mode” option, which the company said helps students “work through questions step by step”.

Claude and Gemini have also introduced similar learning modes in recent months, while academic publishers have sought to add AI tools to help students use their resources.

The moves come after a wave of concern that students were using AI to cheat on their university work amid near-universal adoption among the current cohort of students.

Universities need to understand how the new tools work and set clear expectations of where they can be used effectively, and where “it’s not very good yet”, said Lee Mager, AI innovation development lead at the London School of Economics.

“The head in the sand approach is the worst possible approach a university can take,” he added.

AI assistants have “a lot of potential”, Mager said, but he cautioned that “off-the-shelf tools”, which are not developed alongside course leaders and academics, are limited in their effectiveness.

“Any out-of-the-box, MOOC app is just never going to be good enough. [These companies are] just trying to show the schools ‘look, we’re not a threat to you, and you can get this cool learning app if you buy a partnership with us’.”

He also raised concerns that such tools may perpetuate a “culturally encouraged introversion” whereby students are more comfortable speaking to an AI tutor rather than an academic.

Such assistants are most useful, he said, when they’re used as a “tool” alongside traditional learning models, rather than replacing them.

Phil Newton, a neuroscientist at Swansea University Medical School, said while “the jury’s still out on whether they’re any good”, AI study assistants could bring a “lot of positives” for students trying to better engage with academic life.

“For lots of people, in many ways, [AI tutors] feel a bit safer,” he said. “The tools are not going to judge you unless you ask them…they’ve got an almost limitless amount of very expert-level knowledge on whatever it is you want to ask them. It can be very rewarding to do that and to help yourself prep for an interaction with a tutor.”

Universities can increasingly expect big tech and AI firms to target the edtech sector, he said, as “not only is this already happening, but these companies really want to expand into higher education. They see it as a market that’s going to be a success for them. So it is happening, and it’s going to happen, whether we like it or not.”

While OpenAI insists that users are in charge of how their data is used and can choose to opt out of having it used for training purposes, Newton said his biggest concern with these technologies was what happens to the student data that is inputted.

“These companies are getting enormous amounts of data about individuals, about their learning, what they’re learning, and how they’re learning, and I don’t think there is good evidence that there are sufficient safeguards in place for those interactions, particularly in the US.”

Publishers are also increasingly adding AI features to academic papers and textbooks. Oxford University Press recently introduced an AI tool to its “Law Trove” online archive of law textbooks.

The tool helps summarise textbooks for students and generates quizzes based on the content. It warns: “As with any generative AI tool, hallucinations remain possible, so it’s good practice to make checks against original content.”

Michael Veale, associate professor in the Faculty of Law at UCL, said it’s “unclear whether these quizzes are pre-built, tested and validated”, or whether they are built “on the fly” by AI.

“If these are built on the fly it means that no human, particularly not the author, has actually had a chance to look at them and say, ‘yes, that quiz is both the correct thing to say, and also is pedagogically useful for the learning aim for the chapter’,” he said, adding that such features should be vetted and “verified” by the author.

“What I haven’t seen any of these publishers do is show any pedagogical research that indicates that these are going to help,” he said. “This stuff is being led, I think, by a kind of neophilia where the publishers want to show they’re doing something.”

An OpenAI spokesperson said that AI should complement the traditional classroom experience, not replace it, and noted that it worked alongside learning scientists, teachers, and pedagogy experts to develop its Study Mode tool.

Oxford University Press was approached for comment.

juliette.rowsell@timeshighereducation.com


Reader's comments (2)

Your concerns might be misplaced. The issue is not whether or not universities like them, but whether students will put up with being spoon-fed by this rubbish given the amount they are paying for "tuition". Decades ago there was a panic over computer-assisted learning [itself an offspring of the "AI" aficionados] replacing face-to-face teaching. Unis rushed to install CAL labs and were a little gobsmacked when the students simply declined to use the expensive new toys but preferred to work with their human tutors and peers. The technology might be different, but the underlying issues are the same. Students might try out new methods, but that is no guarantee they will embrace them in a way that gives a return on investment to the companies building or marketing them, or the unis trying to hop onto the latest hobbyhorse.
This is a good point, and it's true that all we get these days are demands for more contact hours with staff (which the students may or may not bother to attend), so they won't like being fobbed off. But tbh the CAL of the past was pretty rubbish and very much for the teaching nerds; no-one really bought into it in a big way. But as this article and others like it point out, this is not so much some crap add-on no-one wanted except the nerds who wasted their time investing in it (no disrespect to CAL colleagues intended, by the way) but more of a zeitgeist in the wider world, in my opinion, which will just impact everything, and the students will more and more come to expect this kind of thing.