Artificial intelligence and the law

Should AI be allowed to manipulate us on a daily basis? Should it be trained on people’s data without their knowledge or consent? How can we enforce laws concerning AI, privacy and competition? In RegTech4AI, Konrad Kollnig brings together AI and the law to answer these and other questions. 

Both the law and computer code are sets of human-made rules, Konrad Kollnig says. Having excelled at computer science, he decided to branch out. “Coding can get monotonous, and law shifted my focus towards how technology affects society. In computer science, we tend to be biased towards tech-based solutions and sceptical of the humanities.” His latest project brings the two fields together.

Supported by an AiNed Fellowship, RegTech4AI aims to improve the EU’s process for AI regulation, protect citizens in the age of AI, and make knowledge of AI regulation more accessible to SMEs and startups. Kollnig and colleagues will also make recommendations to mitigate excessive platform power in AI and develop new methods to contribute to regulatory enforcement. 

Large-scale manipulation

The EU’s new AI Act may be capturing the headlines, but many existing laws are pertinent to AI, including the GDPR and competition law. Regulations limit abuse of market power, Kollnig says, as well as the use of fully automated decisions about individuals. These are not, he stresses, hypothetical discussions about futuristic technologies. “Social media and digital advertising already use AI in a way that affects us all, in that they try to manipulate us at scale.

“Digital technology relies on data about people. Addictive social-media algorithms are trained on people’s scroll and click behaviour. Capturing your attention isn’t just about giving you what you want, but what your brain is most susceptible to,” Kollnig says. Connecting like-minded people is at best a secondary motivation; the business model is to generate profit by maximising time spent on the platform and selling personalised ads. Data harvesting is a means to achieving this. 

Opaque and probably illegal

For his PhD dissertation, which won an award from the Council of Europe, Kollnig studied the gap between law and practice when it comes to data collection. He analysed a dataset of 2.5 million apps for compliance with privacy laws. When testing a subset, he found that 70% sent data to third-party partners before requesting consent. Less than 3% were fully compliant. “It’s not robust enough to mount a legal challenge, but it does show that there’s a serious, far-reaching problem with implementing laws in everyday technology.”

To do his part, Kollnig built a privacy app that monitors other apps and now helps millions of users. But the problem goes beyond this. “A key principle of the GDPR is transparency. Yet, hardly any websites are sufficiently transparent. Even the data-protection authorities rarely understand who your data is being shared with. Many websites and apps send your data to thousands of companies to show a single ad on your device. This all happens in milliseconds. How could anyone understand it? The whole business around data is opaque and probably illegal.” 

Law enforcement

“One focus of RegTech4AI is legal enforcement, or trying to better protect citizens’ rights when it comes to digital technologies. If there’s no economic incentive for companies to comply with the law, they probably won’t bother. After all, compliance requires additional staff, which puts them at a competitive disadvantage.” Being too much of a crook could have negative consequences; not being enough of a crook means betraying your shareholders. “That’s why I’m so excited about digging deeper to find out how we can collect evidence of illegal practices. It’s clear that the current rules aren’t being properly enforced, so we have to ask ourselves how we can better translate between law and computer science.” 

“I shouldn’t have to be doing this,” he continues, half-jokingly, of his efforts to help regulators with enforcement in the digital space. Clearly, not all companies are bad. Principle-based legislation such as the GDPR offers wriggle room that wealthier companies can exploit; for everyone else, it can create uncertainty. And legal guidelines produced by regulators may be useful to lawyers, but are not the best way of communicating with programmers. 

Working across disciplines

“I’ve asked regulators why they don’t use GitHub [an open-source developer platform]. Why not share example code or mock-ups of consent forms? Speaking the ‘language’ of technology would help companies translate the laws into code. But the responsible authorities are usually made up of lawyers and economists,” Kollnig says. “Great people with great intentions, without a doubt—but if you want to regulate technology, you also need experts in those fields.” 
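A regulator-shared snippet of the kind Kollnig describes might look something like the following. This is a minimal, hypothetical sketch — the function and parameter names are illustrative, not drawn from any official guidance — showing the GDPR principle his app research found so widely violated: no data goes to third-party partners before the user has consented.

```python
# Hypothetical sketch of "law translated into code": gate any third-party
# data transfer behind explicit, prior consent (the practice Kollnig's
# app study found most tested apps violated).

def send_analytics(event: dict, consent_given: bool) -> bool:
    """Transmit an event to a third-party partner only after consent.

    Returns True if the event was (or would be) sent, False otherwise.
    """
    if not consent_given:
        # No consent yet: send nothing, not even device identifiers.
        return False
    # transmit(event)  # the actual network call would go here
    return True

# Before the user has answered the consent dialog, nothing is sent:
assert send_analytics({"screen": "home"}, consent_given=False) is False
# After explicit consent, transmission is allowed:
assert send_analytics({"screen": "home"}, consent_given=True) is True
```

The point is not the code itself but the medium: a worked example like this communicates a legal requirement to programmers more directly than a guidance document written for lawyers.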

With the AiNed Fellowship and funding from the law faculty, he will hire four PhD candidates and one postdoc in law, computer science and potentially psychology. The research falls under the auspices of UM’s Law and Tech Lab. “We apply technical methods to law, rather than the other way around. UM’s Faculty of Law has a strong technical edge; even our legal scholars have good coding skills, which is rare.”

The university’s interdisciplinary and international profile helps, too. “The faculty has long specialised in European law, which is great for this research. And we’re close to Brussels, Paris and London.” This might help Kollnig in his aim of organising the first European conference with legal and technical scholars from the field. “When it comes to AI, we need an interdisciplinary and international approach.”


Text: Florian Raith
Photography: Paul van der Veer