Katleen Gabriels

Episode 1: Ethics and AI

In our very first DSMINDSETS podcast episode, we invited Dr. Katleen Gabriels, a moral philosopher from the Faculty of Arts and Social Sciences, to talk to us about her career, interests and research. This year, during the opening of the academic year at Maastricht University, she was also awarded the prestigious Edmond Hustinx prize for science.

In the first segment of the podcast, Dr. Gabriels shared her hobby of gardening, which allows her to unwind and recharge away from her professional activities. She also told us how she became interested in ethics and philosophy, which all began with a quite different dream of becoming a journalist and a dabble in documentary filmmaking!

Dr. Gabriels then described her research interests in teaching morality to machines, as well as in online shaming and abuse. In her books "Regels voor Robots" and "Conscientious AI", she argued that technology is not neutral and that potentially biased human decision-making can influence the design of algorithms. The resulting behaviour of these algorithms can then have effects that are ethically and morally contentious. She illustrated this with a surprising and memorable example: a misbehaving soap dispenser!

In one of her articles on online shaming, she explained her view that online and offline abuse are related: online abuse can have its origins in the real world, and vice versa.

She also opined that, as a culture, we seem to quickly forget past mistakes and misapplications of technology in society. She warned against this and encouraged us to learn from these mistakes going forward, so that we apply technology in a responsible and ethical manner.

As advice for students interested in pursuing a similar career in academia, she recommends plenty of passion, combined with healthy doses of humour and humility, so as not to take yourself too seriously.

From the DSMINDSETS podcast team, it was a pleasure to bring you this first episode. Stay tuned for the next one!

Listen to podcast