AI-powered study assistants are “going to have a role, whether we like it or not”, as technology giants target higher education, and academics should consider how best to support students in using them or risk “being led by neophilia”.
ChatGPT last month launched its “Study Mode” option, which the company said helps students “work through questions step by step”.
Claude and Gemini have also introduced similar learning modes in recent months, while academic publishers have sought to add AI tools to help students use their resources.
The moves come after a wave of concern that students were using AI to cheat on their university work amid near-universal adoption among the current cohort of students.
Universities need to understand how the new tools work and set clear expectations of where they can be used effectively, and where “it’s not very good yet”, said Lee Mager, AI innovation development lead at the London School of Economics.
“The head in the sand approach is the worst possible approach a university can take,” he added.
AI assistants have “a lot of potential”, Mager said, but he cautioned that “off-the-shelf tools”, which are not developed alongside course leaders and academics, are limited in their effectiveness.
“Any out-of-the-box MOOC app is just never going to be good enough. [These companies are] just trying to show the schools ‘look, we’re not a threat to you, and you can get this cool learning app if you buy a partnership with us’.”
He also raised concerns that such tools may perpetuate a “culturally encouraged introversion” whereby students are more comfortable speaking to an AI tutor rather than an academic.
Such assistants are most useful, he said, when they’re used as a “tool” alongside traditional learning models, rather than replacing them.
Phil Newton, a neuroscientist at Swansea University Medical School, said while “the jury’s still out on whether they’re any good”, AI study assistants could bring a “lot of positives” for students trying to better engage with academic life.
“For lots of people, in many ways, [AI tutors] feel a bit safer,” he said. “The tools are not going to judge you unless you ask them…they’ve got an almost limitless amount of very expert-level knowledge on whatever it is you want to ask them. It can be very rewarding to do that and to help yourself prep for an interaction with a tutor.”
Universities can increasingly expect big tech and AI firms to target the edtech sector, he said, as “not only is this already happening, but these companies really want to expand into higher education. They see it as a market that’s going to be a success for them. So it is happening, and it’s going to happen, whether we like it or not.”
While OpenAI insists that users are in charge of how their data is used and can choose to opt out of having it used for training purposes, Newton said his biggest concern with these technologies was what happens to the student data that is inputted.
“These companies are getting enormous amounts of data about individuals, about their learning, what they’re learning, and how they’re learning, and I don’t think there is good evidence that there are sufficient safeguards in place for those interactions, particularly in the US.”
Publishers are also increasingly adding AI features to academic papers and textbooks. Oxford University Press recently added an AI-powered tool to its “Law Trove” online archive of law textbooks.
The tool helps summarise textbooks for students and generates quizzes based on the content. It warns: “As with any generative AI tool, hallucinations remain possible, so it’s good practice to make checks against original content.”
Michael Veale, associate professor in the Faculty of Law at UCL, said it’s “unclear whether these quizzes are pre-built, tested and validated”, or whether they are built “on the fly” by AI.
“If these are built on the fly, it means that no human, particularly not the author, has actually had a chance to look at them and say, ‘yes, that quiz is both the correct thing to say, and also is pedagogically useful for the learning aim for the chapter’,” he said, adding that such features should be vetted and “verified” by the author.
“What I haven’t seen any of these publishers do is show any pedagogical research that indicates that these are going to help,” he said. “This stuff is being led, I think, by a kind of neophilia where the publishers want to show they’re doing something.”
An OpenAI spokesperson said that AI should complement the traditional classroom experience, not replace it, and noted that it worked alongside learning scientists, teachers, and pedagogy experts to develop its Study Mode tool.
Oxford University Press was approached for comment.