Charlotta Kronblad's lawsuit over Gothenburg school algorithm dismissed by court

2 articles · The Guardian · Apr 30
  • The court ruled against Kronblad, whose 2020 lawsuit challenged Gothenburg’s school placement algorithm that misallocated roughly 700 children, including her son, to distant schools due to flawed distance calculations.
  • Kronblad argued the algorithm's placements were unlawful, but the court required her to prove the system's unlawfulness without access to its code, and ultimately dismissed her evidence, which was based on reconstructed placements.
  • The case highlights difficulties in contesting algorithmic decisions, with courts deferring to technology and placing the burden of proof on affected citizens, echoing similar algorithmic scandals in Europe.
An algorithm wronged 700 children. Why was the mother who exposed it asked for impossible proof?
The UK had the Post Office scandal. Is Gothenburg’s school crisis the next major example of algorithmic injustice?
When a government algorithm fails, who truly pays the price: the city, or the children it harmed?
How can you win a legal battle against an algorithm when you are not even allowed to see the code?
With the EU AI Act mandating transparency, are 'black box' government systems living on borrowed time?

The Gothenburg School Algorithm Scandal: How a Flawed System Disrupted 700 Students and Exposed Legal Failures

Overview

In 2020, Gothenburg introduced a school placement algorithm that measured straight-line distances, ignoring real obstacles such as rivers. As a result, hundreds of children were assigned to schools they could not reasonably walk to, triggering a cascade of displacements that affected around 700 students. Despite mounting complaints, city officials denied access to the algorithm's code, turning it into a black box shielded from scrutiny. Legal challenges failed because existing laws placed the burden of proof on affected individuals, preventing courts from examining the algorithm's legality. The scandal shows how opaque algorithms can cause widespread harm, deepen inequalities, and erode public trust, underscoring the urgent need for transparency, accountability, and legal reform in public algorithmic systems.

...