AI and assessment integrity – threat or opportunity?

By ashton.wenborn, 26 June 2023
The maturation of AI chatbots has caused alarm that the integrity of assessments could be challenged. But there is a case to be made that such tools could support teaching and learning – and a compelling one that says higher education can’t afford not to use them

The fact that no one can predict with any great confidence how AI technologies will reshape the teaching and assessment of students is a cause for concern. As a sector, higher education must come to terms with these novel technologies and the challenges they pose for assessment integrity – and, in turn, consider what it means for the teaching and learning journey if a student can simply consult a large language model such as ChatGPT for the answers.

Aaron Yaverski, regional vice-president for EMEA at Turnitin, addressed some of these issues and more at THE Digital Universities UK 2023. He said AI technology is something the sector cannot ignore, and that it is incumbent on institutions to build a policy framework that tackles both questions of academic integrity and AI’s potential as a learning tool.

Some have heralded ChatGPT as the death of the formative essay. Yaverski disagrees. Students might use it to do their work for them, but some academics see it as a great ideation tool. “I don’t think AI is any different from anything else,” Yaverski said. “If we think back to the early days of any new tool, from calculators to the internet, they didn’t signal the end, but rather the beginning of a new approach or pathway to learning. I think [AI] will be the same. And, while some people will want to use it to avoid doing the work themselves, most people will figure out the right way to use it to support and enhance the writing process.”

Many issues of academic integrity can be addressed through clever assessment design. Programme-level assessment and group work are less about outcomes that can be sourced from AI and more about the teamwork and problem-solving skills students have developed. Understanding the incentives to cheat is also key. Yaverski believes most students do not want to cheat and don’t. Students want to learn; AI won’t change that.

Not all prognostications of AI’s impact need to be pessimistic. It could help international students with academic writing, and address access issues. “It could be a good inclusivity tool,” said Yaverski. “For a student with dyslexia or any other special needs, it [could offer] a level playing field. I could definitely see a world where it is not doom and gloom, and the Terminator is [not] coming to get us.”

Find out more about Turnitin.
