
Our new toolkit poses critical questions about government technologies

Published on 27 May 2020
Written by Peaks Krafft

As in many countries, the U.K. government is racing to adopt technological approaches to the Coronavirus crisis. The government is pursuing in-house software-driven approaches as well as partnerships with major tech firms such as Apple and Google (of the Five Eyes international spying program), and the CIA-backed surveillance firm Palantir.

These developments raise concerns about how data collected for the crisis response may cause more harm than good. Coronavirus technologies can be used for other purposes such as enforcing criminal laws or migration laws; uneven enforcement may disparately impact marginalized groups; and devoting substantial resources to untested or ineffective technology presents a substantial opportunity cost.

With increasingly invasive digital surveillance technologies being considered and deployed at this time, the public must be ready to hold the government accountable for recklessly adopting technologies that threaten civil liberty and social justice. Key critical questions about intentions and safeguards must be continually posed to vendors, policy makers, and government agencies.

Today the Oxford Internet Institute is releasing a new toolkit to do just that: the Algorithmic Equity Toolkit.

The Algorithmic Equity Toolkit (AEKit) is a new initiative by OII researcher, Peaks Krafft, together with their research collective the Critical Platform Studies Group, the American Civil Liberties Union of Washington (ACLU-WA), and the University of Washington Data Science for Social Good program.

The AEKit includes four components designed to help people better understand artificial intelligence (AI) and surveillance technologies and to support posing critical questions about them: (1) a flowchart to identify AI, (2) a list of key definitions, (3) a worksheet to think through the stakeholders involved in a technology, and (4) a list of critical questions to ask about AI or surveillance software.

“Invisible, powerful, and often biased tools are being adopted without public oversight or accountability to make important, and even life-or-death decisions, including whether you get a job or housing, what you pay for health care, how your community is policed, how much bail is set, and how long your sentence is,” said Jennifer Lee, ACLU-WA’s Technology & Liberty Manager. “Being able to identify automated decision systems and their impacts is an important first step to intervening in their use.”

Peaks Krafft, Meg Young, and Mike Katell, of the Critical Platform Studies Group, said, “We co-designed this toolkit with community advocates to broaden their efforts at exposing threats raised by new digital technologies. We encourage anyone concerned with fairness to use this new resource to pose critical questions about technology oversight to your local representatives, lawmakers, and government officials.”

The AEKit is available for free online. Also see the Frequently Asked Questions about the AEKit.
