
Australian funder bars grant reviewers from using ChatGPT

ARC responds to allegations that assessors were using chatbot to write feedback on applications

Published on July 7, 2023
Last updated July 7, 2023

Australia's main research funder has barred peer reviewers from using artificial intelligence chatbots to produce feedback, following allegations that this had been happening.

The Australian Research Council published new guidance on the issue after applicants for grants of up to A$500,000 (£262,000) awarded under the Discovery Projects scheme reported spotting the "tell-tale" signs of ChatGPT in assessors' comments.

The applicants said the reports were a "generic regurgitation" of their applications with little evidence of critique, insight or assessment, and that one reviewer had even forgotten to remove the "regenerate response" prompt that appears at the bottom of all ChatGPT-created text.

The new guidance, published on 7 July, says peer reviewers "are required to preserve the principles of confidentiality".


"Release of material into generative AI tools constitutes a breach of confidentiality and peer reviewers must not use generative AI as part of their assessment activities," the policy says.

It adds that reviewers "are asked to provide detailed high quality, constructive assessments that assist the selection advisory committees to assess the merits of an application. The use of generative AI may compromise the integrity of the ARC's peer review process by, for example, producing text that contains inappropriate content, such as generic comments and restatements of the application."

The policy says that, where the ARC suspects that reports are AI-generated, they will be removed from the review process, and that the ARC "may impose consequential actions in addition to any imposed by the employing institution".

Australian researchers had suggested that the use of ChatGPT to write feedback was a symptom of the time pressure that academics in the country are under.

In terms of grant applicants, the ARC guidance says that, while AI "presents an opportunity to assist researchers in the crafting of grant proposals", this "may raise issues around authorship and intellectual property including copyright. Content produced by generative AI may be based on the intellectual property of others or may also be factually incorrect."

As such, the ARC "advises applicants to use caution in relation to the use of generative AI tools in developing their grant applications" and notes that universities are required to certify that all applicants "are responsible for the authorship and intellectual content of the application".

chris.havergal@timeshighereducation.com
