Chatbots as a learning tool: Obstacles and opportunities

Is ChatGPT a helpful tool for being successful at university? Definitely, though it's by no means the only way to prepare for exams. But every student - and teacher - should know the risks. Christin Klose/dpa

Whether in healthcare or in industry - or something as trivial as crafting personalized birthday wishes - artificial intelligence (AI) is now a part of many people's daily lives.

What about higher education? Can AI in the form of large language models (LLMs) such as the chatbot ChatGPT be used to prepare for exams? And is it allowed?

Take Germany. Most universities there are only beginning to draw up guidelines or instructions in regard to AI, says Jens Tobor, project manager of the University Forum for Digitalization (HFD) at the Centre for Higher Education (CHE) in Gütersloh.

They're mainly recommendations and not yet binding, he notes, particularly given the many grey areas left by the recent approval of the European Union's Artificial Intelligence Act, the first law of its kind in the world.

As Jannica Budde, the HFD's senior project manager, points out: "In contrast to exams themselves, whatever you enjoy and works for you in preparing for them is allowed for the time being." But it's important to realize, she adds, that ChatGPT is a language model, not a knowledge model, so "you've got to be aware that the information may be false."

"Aside from data protection and copyright, there isn't yet a binding legal framework that universities could go by," Tobor says. These laws are the biggest hurdle in harnessing AI as a personal learning assistant, however, since students first have to feed the model with the knowledge they need for a certain subject.

Including copyrighted learning materials or old exams is problematic. "It would amount to reproduction and might be illegal," remarks Tobor, who says it's not yet clear whether, and to what extent, the AI companies behind the software applications further process the included data.

Instead, he recommends using the chatbot as a kind of Socratic dialogue partner. For example, ChatGPT asks the student reflective, individually predetermined questions about a set of facts, and checks whether the student has understood them.

"The beauty of this is AI doesn't provide suspect information, but rather promotes a more in-depth - and hence more conducive to learning - review of the subject matter," Tobor says.

Katharina Opper, an education scientist and e-learning developer, has experience with this method. Employed by the ancient Greek philosopher Socrates, it involves "posing questions without providing answers," Opper writes.

She has developed a prompt that enables AI to ask targeted questions and thereby encourage independent thinking. The person who enters the prompt is asked what the topic of discussion is, and then the dialogue can begin.
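Opper's actual prompt isn't published in the article. As a minimal sketch of what such a Socratic setup might look like - assuming a chat API that takes a list of role/content messages, with all wording here illustrative rather than Opper's:

```python
# Illustrative Socratic system prompt (not Opper's original wording).
SOCRATIC_PROMPT = (
    "You are a Socratic tutor. Never state facts or answers yourself. "
    "First ask the student which topic they want to discuss, then guide "
    "them with one reflective question at a time, building on each reply."
)

def start_dialogue(topic: str) -> list[dict]:
    """Build the opening message list for a chat-completion call."""
    return [
        {"role": "system", "content": SOCRATIC_PROMPT},
        {"role": "user", "content": f"My exam topic is: {topic}"},
    ]

messages = start_dialogue("photosynthesis")
```

Because the system prompt forbids the model from stating facts, the student supplies the content and the model only questions it - which is exactly why this mode is less exposed to hallucinated information.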

This approach, Opper says, is less prone to false information since no information is given that could be adopted uncritically.

A bit of background: Chatbots powered by LLMs, such as ChatGPT, are subject to "hallucinations," that is, the generation of plausible-sounding falsehoods. They occur because, drawing on patterns in their extensive training data, they simply predict the likelihood of the next word in a given sentence.
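That next-word mechanism can be illustrated with a toy bigram model - counting, in a tiny corpus, which word most often follows which. Real LLMs do this with neural networks over vast training data, but the prediction principle is the same:

```python
# Toy next-word predictor: count successors of each word in a tiny corpus,
# then return the most frequent one. The model has no notion of truth -
# only of what word is statistically likely to come next, which is why
# plausible-sounding falsehoods ("hallucinations") can emerge.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

successors = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    successors[current][nxt] += 1

def predict_next(word: str) -> str:
    """Return the statistically most likely next word."""
    return successors[word].most_common(1)[0][0]
```

Here `predict_next("the")` yields "cat" simply because "cat" follows "the" most often in the corpus - not because anything about cats is true.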

It's also possible to have ChatGPT play dumb, so to speak, and make the student explain the subject matter to it. "This process consolidates learning too," says Tobor. The chatbot is assigned its role as, say, a fellow student who's ignorant about the subject and is told what it needs to know.

Another option is to have the chatbot ask exam questions that the student must answer. "AI can do this fairly well, but the amount of false information it gives about fact-based exam knowledge is still considerable," says Malte Persike, scientific director of the Centre for Teaching and Learning Services at RWTH Aachen University in Germany.

"In regard to specialized content and knowledge, especially numbers and dates, I'd warn everyone not to rely on AI," he says.

While results are much better if the AI is coupled with a database from which it retrieves the specialized content, false information is still possible. "This is either because the question is misunderstood and the correct information isn't retrieved from the database, or the information from the database isn't understood," explains Persike.
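A rough sketch of how such a database coupling (retrieval-augmented generation) works - the details here are illustrative, not Persike's setup. The two failure modes he describes map directly onto this pipeline: the lookup can fetch the wrong entry, or the model can misread a correct one.

```python
import re

# A tiny local "database" of course notes (illustrative content).
notes = {
    "mitosis": "Mitosis produces two genetically identical daughter cells.",
    "meiosis": "Meiosis produces four haploid cells for sexual reproduction.",
}

def words(text: str) -> set[str]:
    """Normalize text to a set of lowercase words."""
    return set(re.findall(r"[a-z]+", text.lower()))

def retrieve(question: str) -> str:
    """Return the note with the greatest word overlap with the question."""
    return max(notes.values(), key=lambda note: len(words(question) & words(note)))

def build_prompt(question: str) -> str:
    """Prepend the retrieved note so the model answers from it, not from memory."""
    return f"Context: {retrieve(question)}\nQuestion: {question}"
```

Production systems replace the word-overlap lookup with semantic (embedding-based) search, but the structure - retrieve first, then answer from the retrieved context - is the same.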

If you want to download documents from a digital learning space as PDF files and load them back into the AI, though, you come up against the aforementioned copyright limitations - but only if you use a commercial system. According to Persike, there are now AI tools you can install on your own laptop that run locally and don't transfer any data over the internet.

"In all likelihood, this would be legally permissible," he says. These models so far lack the quality and capacity of ChatGPT-4, "but if you want a dialogue partner, the quality of open-source alternatives is good."

Are we allowed or not allowed to use artificial intelligence as a learning aid? Many universities do not yet have precise regulations on this. Markus Hibbeler/dpa