Higher education in the UK is going through one of the most challenging periods since the introduction of tuition fees in 1998. Student-staff ratios are up and student satisfaction is down. Money is tight. According to a June 2024 survey by the Higher Education Policy Institute, 56 per cent of full-time UK undergraduates work part-time to make ends meet, which often means skipping lectures. Staff can also be under the cosh: a 2022 report from the University and College Union found that two-thirds of university academics were considering leaving the sector, partly because of workload.
Universities want more money, students want to pay less, and academics need space to do research and support students. The government’s coffers are empty. Universities also want students to engage creatively and analytically with their subject. But really stretching students can fall by the wayside when resources are thin, and that is to the detriment of both lecturers and students.
So, what to do?
For a university that aspires to the Humboldtian ideal of holistic research and teaching, offers personalised education and serves students from all over the world, the answer, we believe, lies in generative artificial intelligence (GenAI).
The mention of GenAI in universities can trigger a certain amount of angst. It is often viewed as nothing more than a portal to plagiarism and the erosion of standards. And it can be. To make good use of GenAI that scrapes and recombines data from the internet, you need to already have an in-depth understanding of your subject – and probably of several other subjects, too. But, applied correctly, GenAI can transform the student experience in ways similar to access to expert, in-person tutorials.
A synthetic teaching assistant, for example, can respond in real time to student queries and probe the student’s knowledge with questions of its own. This is not an app. It is a fully integrated AI tool, embedded in the institution’s system. The learning offered is tailored to expand the knowledge and capabilities of the individual student. One of the first “learning buddies”, Syntea, was developed by IU International University of Applied Sciences in Germany and recently launched in the UK by IU’s UK subsidiary, LIBF.
Can an AI tutor match an academic discussion? In the first instance it does, of course, cover off core knowledge. If a student were to ask: “What is a bubble sorting algorithm?” they would be given an answer from the course material. If the student then asked, say, for the difference between bubble tea and a bubble sorting algorithm, the information from the course book would be augmented with information from large language models. Guardrails are needed to ensure that the machine stays on topic. Just as importantly, a human tutor should check every new answer from the AI bot. This both ensures that information is reliable and keeps the training of the system accurate.
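To make the flow concrete, here is a minimal sketch in Python of the tiered answering described above: reply from the course book where possible, fall back to an LLM-augmented draft that a human tutor must review, and refuse questions outside the module. The names, vocabulary lists and matching heuristics are invented for illustration; Syntea's actual design is not public.

```python
# Hypothetical sketch of a tiered AI-tutor answering pipeline.
# All data and function names are illustrative assumptions, not Syntea's design.

course_material = {
    "bubble sort": "Bubble sort repeatedly compares adjacent elements and "
                   "swaps them if they are out of order, until the list is sorted.",
}

COURSE_VOCAB = {"bubble", "sort", "sorting", "algorithm", "list", "swap", "complexity"}
STOPWORDS = {"what", "is", "a", "an", "the", "how", "does", "and", "of", "to",
             "difference", "between", "differ", "from"}

review_queue = []  # new drafts held here until a human tutor approves them


def is_on_topic(question: str) -> bool:
    """Guardrail: the question must mention at least one course term."""
    words = {w.strip("?.,!").lower() for w in question.split()}
    return bool(words & COURSE_VOCAB)


def off_syllabus_terms(question: str) -> set:
    """Content words that the course material does not cover."""
    words = {w.strip("?.,!").lower() for w in question.split()}
    return words - COURSE_VOCAB - STOPWORDS


def llm_answer(question: str, material: list) -> str:
    """Placeholder for a large-language-model call that expands on the course text."""
    base = " ".join(material)
    return f"{base} (LLM-drafted expansion for: {question!r})".strip()


def answer(question: str) -> str:
    if not is_on_topic(question):
        return "That is outside this module; please ask your tutor."
    q = question.lower()
    material = [text for topic, text in course_material.items() if topic in q]
    if material and not off_syllabus_terms(question):
        return " ".join(material)              # straight from the course book
    draft = llm_answer(question, material)     # augment course content with an LLM
    review_queue.append((question, draft))     # every new answer awaits tutor sign-off
    return draft + " [pending tutor review]"


print(answer("What is a bubble sorting algorithm?"))
print(answer("What is the difference between bubble tea and a bubble sorting algorithm?"))
```

In this toy version the first question is answered entirely from the course book, while the second picks up an off-syllabus term ("tea"), so the draft is held for a tutor to check before it is added to the system's approved answers.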
Delegating repetitive tasks to a bot can leave tutors free to tackle more complex questions. For example, at LIBF, we often find that tutors augment Syntea’s answers with content or explanations from other areas.
Exam setting and marking is another burden that a synthetic teaching assistant can lighten. To pose questions at the right level of difficulty, the system tracks how much knowledge each student has acquired and retained. Over time, exams could become superfluous.
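A hedged sketch of that adaptive questioning: the system keeps a running estimate of how much the student has retained and picks the next question at a matching level of difficulty. The mastery score, update rule and question bank below are assumptions made for illustration, not Syntea's actual model.

```python
# Hypothetical adaptive question selection based on a 0-1 mastery estimate.

question_bank = {
    1: "Define an algorithm in one sentence.",
    2: "Trace bubble sort on the list [3, 1, 2], showing each swap.",
    3: "Compare the worst-case complexity of bubble sort and merge sort.",
}


def next_question(mastery: float) -> str:
    """Map the mastery estimate to an easy, medium or hard question."""
    level = 1 if mastery < 0.4 else 2 if mastery < 0.75 else 3
    return question_bank[level]


def update_mastery(mastery: float, correct: bool) -> float:
    """Nudge the estimate up or down after each answer; a crude stand-in
    for tracking how much knowledge has been acquired and retained."""
    step = 0.15 if correct else -0.15
    return max(0.0, min(1.0, mastery + step))


mastery = 0.3
print(next_question(mastery))            # starts with an easy question
for correct in (True, True, True):
    mastery = update_mastery(mastery, correct)
print(next_question(mastery))            # moves to a harder one as mastery grows
```

If estimates like this were persistent and well calibrated, they could stand in for a final assessment, which is the sense in which exams could become superfluous.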
But will GenAI be a primrose path to the automation and depersonalisation of tertiary education?
No. Society, particularly a democratic society, needs to educate as many of its citizens as possible. The system as it stands is, arguably, not fit for that purpose. Technology will be needed to close the gap. Many academics will remember a time when research involved a pencil, paper and index cards, and people spent whole careers writing a concordance. Now, such “solutions” sound, at best, quaint.
Well-constructed AI-powered tutors will change universities and university research much more profoundly than online databases did. That’s because what a university can do will no longer be limited by the capacity of staff to manage basic support, marking, admin and the dozens of repetitive tasks that increasingly make teaching in many universities unfulfilling.
Universities tend to have a concept of the ideal student: a school-leaver who can study full-time and has been fully prepared for the course they are on (which is also the right one for them) by advanced exams taken at school. There are no basic knowledge gaps for the lecturer to fill, no need to make allowances for time spent working or caring for others, and no need to consider whether the student should choose a different course.
By focusing on the needs of that ideal student, universities are not always serving students or society as well as they might. We would argue that they are also failing to make the best use of the skills and knowledge of their teaching staff. Both of those have serious implications for overall economic productivity and human well-being.
GenAI won’t solve all of society’s problems – it will likely create new ones. But, when it is used well, it can make space for university teaching staff to support students – and wider society – by helping to offer a personalised tutor, and a Socratic dialogue, to all who want it.
Steve Hill is vice-chancellor and CEO of LIBF. Quintus Stierstorfer is director of synthetic teaching at IU Group.