On June 27 at 20:00, we are organizing a public event at SPUI25 on the theme of algorithmic injustice.
Algorithms play an increasingly important role in our daily lives, but they come with serious societal risks. In recent years, for instance, we have seen many cases of algorithms exhibiting unfair, biased behavior towards particular groups or individuals. This has led to growing concerns about harmful discrimination and the reproduction of structural inequalities once these technologies become institutionalized in society; consider, for example, the Dutch Toeslagenaffaire (the childcare benefits scandal).
In AI research and policy, however, remedies for algorithmic discrimination are often narrowly framed as design challenges rather than as complex, structural, socio-political problems. But is the solution always technological? Can we address the harmful consequences of algorithms simply by fixing the data? And should engineers be the ones to determine what is fair?
At this event, we bring together researchers from various disciplines: Su Lin Blodgett has been working on AI and fairness, Erin Beeghly on the wrong of stereotyping, and documentary maker Nirit Peled has documented firsthand stories of people who suffered at the digits of algorithms. In an interview-style conversation, we will explore pressing questions around algorithmic injustice.
You can register for the SPUI25 event here.
Dr. Su Lin Blodgett is a senior researcher in the Fairness, Accountability, Transparency, and Ethics in AI (FATE) group at Microsoft Research Montréal. She examines the social and ethical implications of natural language processing technologies, developing approaches for anticipating, measuring, and mitigating the harms arising from language technologies. Her work focuses on the complexities of language and language technologies in their social contexts, and on supporting NLP practitioners in their ethical work. She has also used NLP approaches to examine language variation and change (computational sociolinguistics), for example by developing models to identify language variation on social media.
Dr. Erin Beeghly is Associate Professor of Philosophy at the University of Utah. Her research interests lie at the intersection of ethics, social epistemology, feminist philosophy, and moral psychology. Her current book project, What's Wrong With Stereotyping? (under contract with OUP), examines the conditions under which judging people by group membership is wrong. She and Alex Madva are co-editors of the first philosophical introduction to implicit bias: An Introduction to Implicit Bias: Knowledge, Justice, and the Social Mind (Routledge 2020). Beeghly also writes and teaches about topics within legal theory, including discrimination law.
With this art, I tell stories.
Nirit Peled is an independent filmmaker and writer based in the Netherlands. She weaves stories across multiple media to examine societal structures, both intimate and institutional. Drawing on techniques from journalism and documentary, she investigates the social impact of new technologies, structures of legality, systemic abuses of power, and the nature of violence. Her latest documentary, MOTHERS, tells the story of four women whose lives were forever changed when their adolescent sons entered a youth crime prevention program. TV archives and government documents reveal how their lives were shaped by an algorithmic reality that aims to assess the risk of their sons turning to crime. But can anyone’s life really be captured by data? Can these women challenge the statistics that mark them as dangerous?
Naomi Appelman is a PhD candidate in law and philosophy at the Institute for Information Law (IViR), interested in the role of law in online exclusion, speech governance, and platform power. Her research asks how European law should facilitate contestation of the content moderation systems governing online speech. The aim of facilitating this contestation is to minimise undue exclusion, often of already marginalised groups, from online spaces, and to democratise the power over how online speech is governed. Appelman is one of the founders of the Racism and Technology Center and, together with bioinformatician Robin Pocornie, filed a complaint with the Dutch Human Rights Institute over the use of online proctoring software that discriminates against people of color.