Another variable, “presumed partner,” is used to determine whether someone has a concealed relationship, since single people receive more benefits. This involves searching data for connections between welfare recipients and other Danish citizens, such as whether they have lived at the same address or raised children together.
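To make that description concrete, here is a minimal, purely hypothetical sketch in Python of how a “presumed partner” check of this kind could work. Nothing here comes from the actual Danish system: the Citizen record, its fields, and the matching rule are assumptions built only from the article’s description of linking recipients to other citizens through shared addresses or children raised together.

```python
# Hypothetical sketch of a "presumed partner" style check.
# This is NOT the actual Danish welfare system's model; all names,
# fields, and rules are illustrative assumptions based solely on the
# article's description: linking a benefits recipient to other
# citizens via shared addresses or children raised together.

from dataclasses import dataclass, field


@dataclass
class Citizen:
    citizen_id: str
    # Addresses the person has been registered at (assumed field).
    address_history: set[str] = field(default_factory=set)
    # IDs of children the person is registered as a parent of.
    children: set[str] = field(default_factory=set)


def presumed_partner_flag(recipient: Citizen, other: Citizen) -> bool:
    """Flag a possible concealed relationship between a benefits
    recipient and another citizen, per the article's description."""
    shared_address = bool(recipient.address_history & other.address_history)
    shared_children = bool(recipient.children & other.children)
    return shared_address or shared_children


if __name__ == "__main__":
    recipient = Citizen("A", address_history={"Nørrebrogade 1"}, children={"C1"})
    neighbor = Citizen("B", address_history={"Nørrebrogade 1"})
    # A shared past address alone is enough to trip this sketch's flag.
    print(presumed_partner_flag(recipient, neighbor))  # True
```

In a real deployment, a flag like this would presumably feed into a risk score reviewed by human investigators rather than trigger action on its own, which is exactly the hand-off the rest of this section examines.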
“The ideology that underlies these algorithmic systems, and [the] very intrusive surveillance and monitoring of people who receive welfare, is a deep suspicion of the poor,” says Victoria Adelmant, director of the Digital Welfare State and Human Rights Project.
For all the complexity of machine learning models, and all the data amassed and processed, there is still a person with a decision to make at the hard end of fraud controls. This is the fail-safe, Jacobsen argues, but it is also the first place where these systems collide with reality.
Morten Bruun Jonassen is one of these fail-safes. A former police officer, he leads Copenhagen’s control team, a group of officials tasked with ensuring that the city’s residents are registered at the correct address and receive the correct benefits payments. He has been working for the city’s social services department for 14 years, long enough to remember a time before algorithms assumed such importance, and long enough to have observed the change of tone in the national conversation on welfare.
While the war on welfare fraud remains politically popular in Denmark, Jonassen says only a “very small” number of the cases he encounters involve actual fraud. For all the investment in it, the data mining unit is not his best source of leads, and cases flagged by Jacobsen’s system make up just 13 percent of the cases his team investigates, half the national average. Since 2018, Jonassen and his unit have softened their approach compared to other units in Denmark, which tend to be tougher on fraud, he says. In a case documented in 2019 by DR, Denmark’s public broadcaster, a welfare recipient said that investigators had trawled her social media to see whether she was in a relationship before wrongfully accusing her of welfare fraud.
While he gives credit to Jacobsen’s data mining unit for trying to improve its algorithms, Jonassen has yet to see significant improvement in the cases he handles. “Basically, it’s not been better,” he says. In a 2022 survey of Denmark’s towns and cities conducted by the unit, officials scored their satisfaction with it, on average, between 4 and 5 out of 7.
Jonassen says people claiming benefits should get what they are due: no more, no less. And despite the scale of Jacobsen’s automated bureaucracy, he starts more investigations based on tips from schools and social workers than on machine-flagged cases. And, crucially, he says, he works hard to understand the people claiming benefits and the difficult situations they find themselves in. “If you look at statistics and just look at the screen,” he says, “you don’t see that there are people behind it.”
Additional reporting by Daniel Howden, Soizic Penicaud, Pablo Jiménez Arandia, and Htet Aung. Reporting was supported by the Pulitzer Center’s AI Accountability Fellowship and the Center for Artistic Inquiry and Reporting.