Democratizing oversight


Should we move toward a more collective approach to supervising the algorithms in our city systems? Thijs Turèl and Sjoerd Verhagen wrote an article on how citizens could play an important role in making society’s algorithms more transparent and democratic.
Transparante laadpaal (transparent charging station), designed by The Incredible Machine
© Studio Masha Bakker

As algorithms become ever more ubiquitous in our lives, there is a growing need to democratize their supervision. By this we mean that governments and organizations should not be the only ones granting insight into these systems; citizens need to be involved as well. Society must expand the opportunities for everyone to both examine and modify these algorithms.

Algorithms are everywhere

Algorithms are becoming omnipresent: there are algorithms that decide where and when cabs can drive through inner cities; algorithms that decide when the garbage in the street is picked up; algorithms that turn off your electric share scooter if you ride down the wrong street. If you want to drive past an elementary school around 08:30 in the morning, an algorithm may in the future decide that you take a different route. These smart infrastructure systems are valuable, as they help us measure, analyze, and steer processes in desired directions. They make our cities safer, more efficient, and more comfortable.

Yet there are plenty of examples showing that things can go wrong when building these smart systems. One recent example is the hack that paralyzed the municipality of Hof van Twente in November 2020. Or think of last January, when malicious employees of the GGD (Dutch public health organization) resold corona test results for 50 euros each.

Outside attackers are not the only source of these missteps. Overambitious government institutions have misused digital smart systems too. Since 2017, the CBS (Dutch national statistics institute) and T-Mobile have used the telecom network as a digital track-and-trace system for T-Mobile customers. CBS was probably doing its job with the right intentions: gathering statistics that serve Dutch citizens. Still, tracking citizens is a slippery slope toward larger abuses of power. The end does not justify all means.

What is especially striking is that only a very small number of people can detect when a digital infrastructure is malfunctioning. In a worst-case scenario, an algorithm could be making errors for years without anyone noticing. Although there has been some progress, such as the doubling of the Dutch Data Protection Authority’s budget, there are still 10,000 complaints waiting for the administrative body to address. All of this shows that a lot can and will go wrong in developing and deploying these smart infrastructure systems. To combat this, we need strong supervision of digital domains, and institutional supervisors are not going to provide it alone.

Where do we go from here?

Our solution: we democratize the supervision of digital services, so that not a few, but many can contribute to identifying and correcting improperly functioning software. We envision a world in which supervisors are assisted by an ecosystem of citizens, advocacy groups and politicians, all playing their role in the supervisory process. But how do we build a society with a self-correcting capability for digital systems?

The SyRI (System Risk Indication) scandal, where people were given a risk score without being aware of it, shows that transparency is a prerequisite for democratizing supervision. By opening the black box of digital systems and providing insight into algorithmic decision-making, citizens who experience the effects of algorithms first-hand are given tools to supervise, control, and detect (malfunctioning) digital systems.

One example of what this could look like is the transparent charging station. In the future, the majority of cars will be electric, but because the capacity of the electricity grid is limited, not every car can be charged at the same time and at the same rate. The transparent charging station is outfitted with an interface that tells you exactly why it charged your car the way it did (or didn’t). In this way, citizens can determine whether the algorithm serves their interests fairly.
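To make the idea concrete, here is a minimal sketch of what "explainable" charging allocation could look like. This is a hypothetical illustration, not the actual logic of the Transparante laadpaal: the priority field, the capacity figure, and the explanation strings are all invented for the example. The point is simply that every allocation decision carries a human-readable reason a citizen could inspect.

```python
# Hypothetical sketch: divide limited grid capacity across charging sessions
# and record a human-readable explanation for each allocation decision.
# The 'priority' field and capacity numbers are assumptions for illustration.

def allocate_charging(sessions, grid_capacity_kw):
    """Split available power across sessions, explaining each decision.

    sessions: list of dicts with 'car', 'requested_kw', and 'priority'
              (e.g. shared cars might get a higher priority).
    Returns a list of (car, allocated_kw, explanation) tuples.
    """
    # Higher-priority sessions are served first; ties keep input order.
    ordered = sorted(sessions, key=lambda s: -s["priority"])
    remaining = grid_capacity_kw
    decisions = []
    for s in ordered:
        allocated = min(s["requested_kw"], remaining)
        if allocated == s["requested_kw"]:
            why = "full power: enough grid capacity remained"
        elif allocated > 0:
            why = (f"reduced from {s['requested_kw']} kW to {allocated} kW: "
                   "grid capacity limit reached")
        else:
            why = "postponed: capacity used by higher-priority sessions"
        remaining -= allocated
        decisions.append((s["car"], allocated, why))
    return decisions

decisions = allocate_charging(
    [{"car": "A", "requested_kw": 11, "priority": 2},
     {"car": "B", "requested_kw": 11, "priority": 1},
     {"car": "C", "requested_kw": 11, "priority": 1}],
    grid_capacity_kw=25,
)
for car, kw, why in decisions:
    print(car, kw, why)
```

With 25 kW to share, cars A and B get their full 11 kW, while car C is throttled to 3 kW, and the interface can say exactly why.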

Algorithms of the people, by the people, for the people

A problem is that most people are not interested enough to concern themselves with algorithmic malfunctioning. How do we go about this?

Fortunately, we do not have to reinvent the wheel. We can take inspiration from the public transport domain. For decades, our interests as public transport passengers have been looked after by the Rover travelers’ association. Its volunteers devote themselves to checking whether trains run on time. They have acquired such a prominent position that they report to the State Secretary, who uses their insights to reprimand the NS (Dutch railway operator) from time to time.

Let us imagine a Rover-like association that checks the operation of everyday algorithms. Steps in this direction are already being taken: the City of Amsterdam is currently developing an Algorithm Register that gives insight into its algorithmic systems, allowing feedback and participation in building human-centred algorithms. In addition, the German NGO AlgorithmWatch is researching how automated processes and decisions are currently impacting everyday life in 16 European countries.

Moreover, organizing can help us pool our experiences of being mistreated by algorithms. For instance, Uber was found to sometimes pay delivery drivers for the straight-line ("as the crow flies") distance rather than the distance actually travelled. In response, ‘UberCheats’ was developed: an app for Uber delivery drivers that keeps records of the trips made, the actual distance travelled, and what Uber owes them. By aggregating data, delivery drivers are in a stronger position to get Uber to change the algorithm.
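The discrepancy UberCheats surfaced is easy to illustrate: the straight-line (great-circle) distance between pickup and drop-off can be far shorter than the route a courier actually rides, for example when the route has to loop around a river. The sketch below is an illustration only; the GPS coordinates are invented, and the haversine formula is a standard approximation of great-circle distance, not necessarily what either Uber or UberCheats computes.

```python
# Illustration: straight-line (great-circle) distance between the endpoints
# of a trip versus the distance actually ridden along the GPS trace.
# Coordinates are invented for the example.
from math import radians, sin, cos, asin, sqrt

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(h))  # 6371 km = mean Earth radius

def route_km(waypoints):
    """Distance actually travelled: sum of the legs of the recorded trace."""
    return sum(haversine_km(p, q) for p, q in zip(waypoints, waypoints[1:]))

# A ride whose start and end points are close, but whose route is not.
trace = [(52.370, 4.890), (52.380, 4.900), (52.390, 4.890), (52.372, 4.880)]
straight = haversine_km(trace[0], trace[-1])
actual = route_km(trace)
print(f"paid for {straight:.1f} km, rode {actual:.1f} km")
```

Logging the full trace, as UberCheats did, is exactly what turns an individual suspicion ("this payout seems low") into collective, checkable evidence.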

Just as the transparent charging station makes its algorithm transparent, the black boxes of other digital systems currently in use need to be opened. By laying bare algorithmic decision-making, citizens can be given the tools to exercise supervision. In this way, an ecosystem of informed citizens and citizen groups can play an important role in making our society more democratic.

This article is also available on LinkedIn.
Read it here and feel free to leave your opinion.