April 8th–9th, 2024, at Erasmus University Rotterdam, the Netherlands.
Keynote lectures by Professor Peter Fussey (University of Essex, director of CRISP) and Professor Antoinette Rouvroy (University of Namur).
AI is increasingly integrated into surveillance technologies for public safety, exemplified by biometric and emotional recognition, anomaly detection, DNA banks, crowd control, round-the-clock monitoring, and other forms of data-intensive surveillance. This development prompts crucial questions about its effects on the human experience. The ambition of the symposium is to explore the multidimensional facets of AI-surveillance and its implications for public safety, extending beyond technological and legal issues to its emotional, social, and cultural contours. Building on research in surveillance and AI, the symposium aims to explore how AI-surveillance is experienced by individuals who define, judge, and have emotions related to being watched or being a watcher.
The symposium will provide a platform for researchers, practitioners, artists and experts from various fields to share insights, findings, and perspectives related to the intersection of AI, surveillance, public safety and human experiences. Potential topics include, but are not limited to, the following:
- Cultural and social dimensions: Explore the intersection of AI-surveillance with cultural, social and historical contexts, encompassing gender, ethnicity, LGBTIQ+ experiences, citizens' viewpoints and children's perspectives across various surveillance practices.
- Chilling effects: Delve into the repercussions of AI-surveillance on individuals and societies at large, and how it may create chilling effects, whereby individuals alter their behavior out of concern for potential consequences.
- Resistance: Explore inventive tactics of individuals and groups to counter AI-surveillance, revealing the evolving landscape of resistance, including countersurveillance and gamification techniques.
- Inequality: Scrutinize the connection between AI-surveillance, algorithmic bias, and societal inequalities, highlighting the reproduction of social, economic, and political inequalities.
- Socio-technological imagination: Explore the ways in which AI-surveillance can be designed in human-centric ways that amplify public safety by fostering greater care and trust.
The deadline for proposals is January 15th, 2024. Please email the following in a single, combined PDF file (file name = your name) to storbeck@law.eur.nl: a title and a short abstract of no more than 500 words, in English. The symposium is free of charge and places are limited.
Please feel free to contact us with any questions.
Marc Schuilenburg (Erasmus University / AI MAPS) - schuilenburg@law.eur.nl
Majsa Storbeck (Erasmus University / AI MAPS) - storbeck@law.eur.nl
Martijn Wessels (Erasmus University / TNO) - wessels@law.eur.nl
Join us at Erasmus University in April 2024 to engage in thought-provoking discussions on the intricate relationship between surveillance, new technologies, chilling effects, and their consequences. We look forward to your participation.
Sincerely, Dutch Surveillance Studies, Erasmus University Rotterdam, the Netherlands