Facing the Risk

Mapping the Human Rights Risks in the Development and Deployment of Facial Recognition Technology

Thanks to a decade of rapid progress in computer vision, facial recognition technology (FRT) has become a commercial product available to almost any government or business in the world. Proponents hope that facial recognition can support public safety initiatives and improve access to services, but errors and abuse mean that FRT deployments pose substantial risks to a variety of fundamental rights and freedoms unless accompanied by strong operational safeguards and comprehensive legal protections.

The first part of our report looks at the facial recognition development process. Though developers do not determine how and when FRT is used, they nonetheless bear responsibility for ensuring that their products and services do not contribute to violations of fundamental rights and freedoms through failure or abuse. Because this technology has emerged only recently, its ecosystem of developers and suppliers remains poorly understood by those working on the risks of surveillance technologies. This report seeks to illuminate the structure of the industry and examine how FRT developers can contribute to or mitigate human rights impacts through their decisions. Using the UN Guiding Principles on Business and Human Rights (UNGPs) as a framework, the report presents a set of recommendations for each group of actors in the FRT supply chain, demonstrating how firms can change their business practices to respect internationally recognized human rights.

The second part of our report examines current trends in FRT deployment around the world, with the aim of understanding how FRT will come to be used by different actors, how operators are weighing the risks the technology poses, and what more can be done to ensure that deployments respect fundamental human rights and freedoms. The current lack of governance for FRT has helped prompt discussions of bans and moratoria on multiple continents. More robust discussions are urgently needed to determine whether certain use cases are fundamentally incompatible with human rights, and how operators and policymakers can craft an appropriate, tailored governance framework that takes into account the full spectrum of potential impacts. This report contributes to that discussion.

This report was made possible through a grant from the U.S. State Department Bureau of Democracy, Human Rights, and Labor.

Amy K. Lehr
Senior Associate (Non-resident), Human Rights Initiative

William Crumpler

Research Associate, Strategic Technologies Program