May 16, 2019 - Privacy Center Releases Two Major Reports on the Use of Facial Recognition Technology

Today, the Georgetown Center on Privacy & Technology released two reports detailing the widespread use and misuse of face surveillance by police departments nationwide. The reports build on the Center’s groundbreaking work on face recognition, The Perpetual Line-Up, which drew national attention for its finding that half of all American adults are included in police face recognition databases.

In the first new report, Garbage In, Garbage Out, the Center reveals that police departments regularly submit altered images to face recognition systems when photos of suspects fail to return matches. To generate leads, investigators submit blurred photos, composite images with facial features copied in from other people’s photos, and, in some cases, photos of celebrities they believe resemble the suspects. The report explains that most police departments place no restrictions on using face surveillance systems in this way:

“There are no rules when it comes to what images police can submit to face recognition algorithms to generate investigative leads... The stakes are too high in criminal investigations to rely on unreliable—or wrong—inputs… Unfortunately, police departments’ reliance on questionable probe photos appears all too common.”

The second report, America Under Watch, documents the widespread deployment of face surveillance systems in major U.S. cities, including Chicago, New York, and Washington. The report demonstrates that major police departments have the ability to “scan live video from cameras located at businesses, health clinics, schools, and apartment buildings.” Even departments that claim not to use face surveillance have “paid to acquire and maintain the technology for years.”

The reports come as policymakers and industry grapple with the reality of face surveillance. Earlier this week, the City of San Francisco voted to ban the use of facial recognition technology by police and city agencies. Microsoft, itself a developer of facial recognition systems, has called for regulation of the technology.

The reports are the work of the Privacy Center’s Clare Garvie (L’16) and Executive Director Laura Moy.

Garvie will testify about the report at a hearing on Wednesday, May 22, before the U.S. House of Representatives’ Committee on Oversight and Reform.

Read the reports at and . For a robust conversation on Twitter, check out @ClareAngelyn, @LauraMoy, and @GeorgetownCPT.