
I am not a number

03.07.23
Obscure government algorithms are making life-changing decisions about millions of people around the world. Now, for the first time, we can reveal how one of these systems works. For Suspicion Machine, a four-part series from WIRED and Lighthouse Reports, we gained unprecedented access to one of the world’s most sophisticated welfare fraud detection algorithms.

 

We obtained not only the algorithm itself, but also the data that powered it and the handbook used by the data scientists who ran it. This allowed us to see exactly how such a sophisticated system decided who to flag as a potential benefits fraudster. What we found was deeply concerning.

 

The system, used by the city of Rotterdam in the Netherlands until 2021, discriminated against people based on their ethnicity and gender. We also found evidence of fundamental flaws that made the system inaccurate and unfair.

 

From the length of your last romantic relationship to how often you play sports, risk-scoring algorithms combine invasive and banal data points to decide whether you are likely to have committed welfare fraud—and whether your benefits should be taken away. Our reporting allows you to see how the algorithm works for yourself, and to experiment with its risk-ranking system to understand how it discriminates against certain people based on characteristics they have no control over.
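To give a rough sense of how a system like this turns such data points into a decision, here is a minimal, hypothetical sketch of a risk-scoring function in Python. The feature names, weights, and scores below are invented purely for illustration; they are not Rotterdam's actual variables or coefficients, and the real model weighs far more inputs than this.

# Hypothetical illustration of a welfare-fraud risk score. The feature
# names and weights are invented for this sketch; they are NOT the
# actual variables or coefficients used by Rotterdam's system.
import math

# Each claimant is reduced to a handful of numeric features.
WEIGHTS = {
    "months_since_last_relationship_ended": 0.02,  # invasive personal detail
    "plays_sports_weekly": -0.30,                  # banal lifestyle detail
    "speaks_dutch_fluently": -0.80,                # can act as a proxy for origin
    "age_under_30": 0.50,
    "is_single_parent": 0.60,                      # can act as a proxy for gender
}
BIAS = -1.5

def risk_score(claimant: dict) -> float:
    """Combine weighted features into a 0-1 'fraud risk' score (logistic)."""
    z = BIAS + sum(WEIGHTS[k] * claimant.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

# Two claimants who differ mainly in characteristics they cannot control
# end up with very different scores -- and only the higher score gets
# pulled in for a fraud investigation.
claimant_a = {"plays_sports_weekly": 1, "speaks_dutch_fluently": 1}
claimant_b = {"age_under_30": 1, "is_single_parent": 1,
              "months_since_last_relationship_ended": 6}

for name, person in [("A", claimant_a), ("B", claimant_b)]:
    print(f"Claimant {name}: risk = {risk_score(person):.2f}")

Even in this toy version the pattern is visible: attributes a person cannot change push the score up or down, and the score determines who gets investigated.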

 

In our first story, we show you how algorithms discriminate against people based on their age and ethnicity. In our second story, we reveal the human impact of biased risk-scoring algorithms. Our third story interrogates the politics that has led to the rise of these broken systems. Finally, in our fourth story, we show how a combination of secretive governments and yet more secretive private companies has created a system in which lives are ruined—with little hope of justice.

 

What we found doesn’t just affect people living in Rotterdam. Risk-scoring algorithms are commonplace around the world, yet there has been little transparency around how they work—until now. Read all the stories in our four-part series and delve deep inside the Suspicion Machine.

 

James Temperton, News Editor

Find me on Twitter @jtemperton

THE SCORE | 16-MINUTE READ

Inside the Suspicion Machine

BY DHRUV MEHROTRA, JUSTIN-CASIMIR BRAUN, EVA CONSTANTARAS, AND GABRIEL GEIGER

Obscure government algorithms are making life-changing decisions about millions of people around the world. Here, for the first time, we reveal how one of these systems works.
THE SCORE | 10-MINUTE READ

This Algorithm Could Ruin Your Life

BY MATT BURGESS, EVALINE SCHOT, AND GABRIEL GEIGER

A system used by the Dutch city of Rotterdam ranked people based on their risk of fraud. The results were troubling.
THE SCORE | 8-MINUTE READ

How Denmark’s Welfare State Became a Surveillance Nightmare

BY GABRIEL GEIGER

Once praised for its generous social safety net, the country now collects troves of data on welfare claimants.
THE SCORE | 10-MINUTE READ

The Fraud-Detection Business Has a Dirty Secret

BY MORGAN MEAKER

When systems designed to catch welfare cheats go wrong, people find themselves trapped between secretive governments and even more opaque private companies.
More from WIRED
In case you missed it
Follow the untold story of the biggest dark web market takedown of all time with exclusive commentary from senior writer Andy Greenberg. Tap the button below to receive this limited series of six newsletters.
THE RISE AND FALL OF ALPHABAY

 

STAY IN THE KNOW

Sign Up for Our Daily Newsletter

Our biggest stories, delivered to your inbox every day. Get your glimpse of the future.

 
