The Danish welfare authority, Udbetaling Danmark (UDK), risks discriminating against people with disabilities, low-income individuals, migrants, refugees, and marginalized racial groups through its use of artificial intelligence (AI) tools to flag individuals for social benefits fraud investigations, Amnesty International said today in a new report.

The report, Coded Injustice: Surveillance and Discrimination in Denmark’s Automated Welfare State, details how the sweeping use of fraud detection algorithms, paired with mass surveillance practices, has led people to unwillingly, or even unknowingly, forfeit their right to privacy, and has created an atmosphere of fear.

“This mass surveillance has created a social benefits system that risks targeting, rather than supporting, the very people it was meant to protect,” said Hellen Mukiri-Smith, Amnesty International’s Researcher on Artificial Intelligence and Human Rights.
- Pages: 3
- Published in: United Kingdom
Table of Contents
- Denmark: AI-powered welfare system fuels mass surveillance and risks discriminating against marginalized groups – report
- ‘Sitting at the end of a gun’
- Unfair, discriminatory algorithms
- A Social Scoring System?
- Background
- Related Content
- Denmark
- Denmark: “Syria is not safe” nationwide demonstrations against return of Syrian refugees
- European governments’ and donors’ discriminatory funding restrictions to Palestinian civil society risk deepening human rights crisis
- Denmark: NGOs sue the Danish state to stop arms exports to Israel