In August, NERDS faculty Roberta Sinatra and Vedran Sekara were invited to Rigsrevisionen, Denmark's national audit office, to give a talk on bias in algorithms.
They presented their latest research project, focusing on how algorithmic bias can affect decisions in sensitive areas of public policy. In particular, they discussed their FAccT paper on a Decision Support System tested in the Danish social sector, where algorithms were used to assess children's risk of maltreatment. Their work showed that the algorithm was biased with respect to age: for instance, a 16-year-old shoplifter was rated at higher risk than a 2-month-old baby living with two parents struggling with substance abuse. This illustrates how algorithmic outputs can amplify bias if not critically examined, and why human oversight remains crucial.
The talk at Rigsrevisionen also raised the broader question of whether algorithms can be used responsibly for risk assessments in complex social contexts, and emphasized the need for careful scrutiny when deploying algorithmic solutions, especially AI-driven ones, in the public sector.
The event was organized by Rigsrevisionen’s internal data analytics network and drew a large audience, underscoring the relevance of these issues for public accountability and governance.
Read Rigsrevisionen’s post about the talk on LinkedIn →
Read the Danish news piece “Kan algoritmer se ind i et barns fremtid?” (“Can algorithms see into a child's future?”) →