Our society is slowly becoming saturated with surveillance technology: not only camera surveillance and algorithms for crime detection, but also seemingly innocuous applications such as a smart fridge or gas meter, smart apps, a digital doorbell, or the cameras built into a Tesla. Most people seem aware of the benefits and conveniences that the integration of digital devices brings to our society. However, Marc Schuilenburg, Professor of Digital Surveillance at Erasmus School of Law, thinks that too little is said about the risks of AI and algorithms. In his view, digital ethics are lacking, and when criminology does discuss the downsides of digitalisation, it fails to reflect on how surveillance relates to power structures and to the concrete experiences of individuals.
In his inaugural lecture 'Why we need to talk to each other about big data and algorithms regarding security', Schuilenburg explains what he thinks the conversation about digital surveillance should be about. The legal framework is often considered only after a technology has been designed and implemented. According to the criminologist, this needs to change: ethics and legal frameworks should already be considered in the design phase.
Feudal security
According to Schuilenburg, we live in a surveillance-dominated society. “Everyone participates in this to a greater or lesser extent. Surveillance is fully integrated into our routines and lifestyles”, he explains to Vers Beton. “Digital surveillance is a present that can be used to fight serious crime but becomes a poison when it leads to systematic human rights violations.” It is not only the government that violates fundamental and human rights when applying digital surveillance. Big tech companies also occupy powerful positions and hold vast amounts of user data. In this context, Schuilenburg speaks of “feudal security”.
Social inclusion
Making digital surveillance a public matter would help address the risks surrounding AI and algorithms. This does not mean that the government should always be completely transparent about the algorithms it uses. However, according to Schuilenburg, public services should be clear about which public values they are promoting. Society should be able to question the government about these values, the tools, and their application. Such questions are particularly relevant in the design phase of digital technology. “Groups of people other than ICT experts should also be involved in the design process. Think of young people, minorities, and the experiences of practitioners. Approach it as a form of social inclusion, because designers need to learn to better empathise with the lifeworld of others”, Schuilenburg argues.
How to integrate this concretely into the practice of algorithm builders? That is the big question for Schuilenburg: “This is exactly what I want to investigate in Rotterdam as a Professor of Digital Surveillance. It will not be easy to dive into this domain of mathematicians, econometricians, and programmers. But we have to start somewhere. In my view, criminology should take full social responsibility and not wait until all sorts of things have been implemented. By then, it will be too little, too late, because the most important choices will have been made.”
More information
Read the full article in Vers Beton and the piece published by Sociale Vraagstukken (both in Dutch).