ChatGPT and studying – use it, don’t abuse it

ChatGPT’s ability to create plausible academic arguments raises serious questions for higher education. Walter Jansen of UM’s Centre for Teaching & Learning EDLAB and Peter Vermeer, chair of the Committee of Boards of Examiners, discuss how best to approach this new tool and safeguard the quality of education.

EDLAB’s Walter Jansen sees the release of ChatGPT as a big step forward, but still as part of an ongoing evolution. “I think GPT-3 is a bit of a game changer because of how convincing the conversational tone is. But of course, LLMs [Large Language Models] have been around for a while.” While there was a perceived need to act quickly to pre-empt cheating – that is, students having assignments written by GPT – Jansen and his colleagues have been thinking about what to do with LLMs for some time.

“You see a current of trying to fight it, to revert to traditional exams with pen and paper under supervision – but I don’t think that’s the way forward for us.” While acknowledging the sometimes hilarious flaws of GPT, Jansen thinks that from a strategic perspective we should assume that it will evolve quickly and be near-perfect soon enough. “That means it makes more sense to integrate it into the curriculum – and integrate it organically, so not just a module on how to use it, but learning how to use it in different situations.”

Digital literacy

Jansen thinks that UM’s PBL model, which relies on constructive and self-directed education as well as collective, contextual learning, could introduce LLMs as just another tool for aggregating information. “But two things are crucial: you have to learn how to prompt the system – so how to ask the right questions to get the most out of it – and you need a really high degree of digital literacy to make sense of what it produces.” The latter is increasingly recognised to be an essential citizenship skill.

“ChatGPT just generates a plausible text based on non-copyrighted information, but because of the convincing tone, it’s easy to forget that it might be complete nonsense.” GPT is constantly improving, partly thanks to user feedback, but it’s important to understand that it’s not factually reliable. When Jansen first asked it who UM President Rianne Letschert was, the response somewhat troublingly never mentioned UM. (But maybe the programme just knows something we don’t yet. Follow @Rianneletschert and @MaastrichtU for the latest transfer rumours…)

“Students have grown up in and will work in an information jungle. Rather than wish that away, we need to deal with it.” Thinking critically about information is crucial to that. In his own teaching activities in European Studies, Jansen has students ask ChatGPT for feedback on arguments they have developed; they then reflect on the output together and judge its quality. It’s an example of pragmatically embedding this and other technologies into the educational design with a view to developing critical thinking skills.

Assessments are a part of learning

So ChatGPT can be used as a tool in PBL – an unreliable partner with a specific skillset. That still leaves the problem of assessments, though. Jansen understands the necessity of quality control when awarding credits and eventually a degree, but conceives of assessments as part of learning. “Ideally it includes concrete feedback and gives you a way of improving, of growing.” Rather than improving invigilation, he wants to look for ways of linking students to the papers they produce, for example by having them present their line of argumentation.

Jansen thinks it’s problematic to conceive of students as potential frauds. After all, they are investing money and time to acquire specific knowledge and skills. “We shouldn’t underestimate their intrinsic motivation and sense of integrity.” Peter Vermeer, member of the Board of Examiners (BoE) at the Faculty of Science and Engineering (FSE), agrees: “The problem is more the perverse incentive structure of scaled-up, commodified education that has students compete with each other for grades.”

Instead, Vermeer stresses that assessments are an integral part of the educational design. “Students have a right to know where they stand with regard to the intended learning outcomes. And for those designing the curricula it’s also crucial to know whether the approach has worked or if it needs tweaking.” He notes that everyone at UM who works on guaranteeing the quality of education also has one eye on improving it. Exams aren’t an end in themselves.

Putting out the fire now; learning to cook with it soon

And yet, the BoE has to respond to ChatGPT and has to do so quickly. It has three tools at its disposal. “You can detect fraud by relying on software.” UM already makes use of several such programmes to detect plagiarism. However, as Vermeer points out, it doesn’t look like any of the available programmes can deliver meaningful results at the moment.

Then, there is prevention, for example asking students to do exams onsite under supervision instead of at home. “This is basically what we’ve been forced into doing this period. Lots of planning and organising regarding the exams has already been done, so this is difficult, but we need to secure academic integrity and the quality of our education.” Vermeer sees this more as putting a patch on the tyre to get home – or in this case, to the end of the academic year.

“These are drastic interventions; they’re necessary now but I don’t like them. You take away students’ ability to test themselves against a certain intended learning outcome, for example writing a paper.” Vermeer’s preferred strategy is integrating the new technology into the curriculum. “By next year, we want to include smart AI use as an intended learning outcome and organically embed it in the curriculum. This includes understanding how it works, learning how to prompt it and how to evaluate the output.”

A challenge to students first and foremost

Crucially, Vermeer thinks it’s absurd to regard the situation as a zero-sum game pitting educators against students. “Students are our partners in this: they don’t win by managing to avoid a learning outcome – they lose. Similarly, educators don’t win by detecting fraud – they win by teaching students.”

While Vermeer understands why students would be giddy about the potential of ChatGPT, he holds a much grimmer view of the situation. “Personally, I think this is a nightmare for students: AI is emerging as their competitor. The question shouldn’t be how they can use this to avoid acquiring skills and knowledge, but how they can distinguish themselves meaningfully from the existential threat that AI will pose to them on the labour market. Smart students should demand to learn a skillset tailored to staying relevant under these conditions.”

By: Florian Raith

Peter Vermeer

Peter Vermeer is chair of the Board of Examiners (BoE) of the Maastricht School of Liberal Arts and Sciences at FSE and teaches at University College Maastricht. He is also chair of the Committee of Boards of Examiners, which includes all BoE chairs.
