By Ally Mark, AAAF Fellow
Facial recognition technology (FRT) is everywhere these days. It unlocks your phone and puts filters over your Snapchats. It watches out for stalkers at Taylor Swift concerts and even helps you find love. But the deeper FRT integrates into our social fabric, the more we risk sliding towards darker uses of the technology.
A Regulatory Wild West
Fears of a “Big Brother” scenario fuel concern among privacy and civil rights advocates and ordinary citizens alike. Yet too few of those in power seem worried enough to address the lack of regulations and policies governing its use. Instead, cities like San Francisco, Oakland, and Somerville, MA, are leading the charge: in the absence of state and federal action, these municipalities recently enacted moratoriums on FRT.
This means no cohesive set of regulations exists to let people know if a technology platform uses FRT. Nothing requires police to disclose its use during arrests or trials. And no laws protect your data once companies like Amazon capture your face. Nothing. It’s essentially the wild west in the land of FRT.
Despite unusually bipartisan concern in Congress, legislative efforts died in the Senate and slowed in the House. Through their hesitation, our governments have allowed FRT to spread rapidly into every corner of society. Hints of overreach and erosion of privacy rights continue to pile up, about 28 stories below impeachment, coronavirus, the 2020 election, and Brexit.
Yes, FRT is probably in your life
These are a few ways that FRT is slowly working its way into your every day:
- Remember when Facebook helped you tag your friends in photos? FRT enabled that feature, but you probably didn’t even know that’s what Facebook used. Yes, Facebook, a company not known for protecting user privacy, collects facial data from about two billion people across the planet. The recent subject of all kinds of criticism, Facebook operates without regulations on how it can use our facial data.
- Schools and universities are increasingly drawn to using FRT to monitor their classrooms and campuses. Why? Well, FRT can take attendance and watch for suspended students. Colleges hope to automate entry into cafeterias. Disturbingly, some school districts desperate to assuage fears of mass shootings have turned to FRT for surveillance on school grounds.
- A wide spectrum of businesses now integrates FRT into their systems. FRT helps retailers catch known shoplifters, lets restaurants remember customer orders, enhances hotel security, and cracks down on rowdiness and illegal betting at sports stadiums.
- The FBI can search somewhere around 640 million facial images without probable cause or a warrant. This translates to a repository covering roughly half of all American adults. While the FBI itself collects only criminal mugshots, agents can also access state databases of driver’s license photographs, thanks to the cooperation of unelected state officials.
I can get arrested by FRT?
Not in America. In China, however, omnipresent cameras can catch you jaywalking, and FRT systems help officials issue you a ticket. Most sinister of all, the crackdown on Uighur Muslims in Xinjiang Province is made possible by advances in surveillance technology, namely FRT.
Law enforcement use of FRT is particularly in need of oversight because misuse carries such serious consequences. Back in the US, the GAO, a nonpartisan federal watchdog agency, reported that the FBI may have set up its FRT system without properly following privacy laws and public-notice policies. Despite GAO recommendations issued back in 2016, the Bureau had still not adequately validated the system’s accuracy as of June 2019.
More concerning, the FBI relies heavily on local and state law enforcement partners but does not audit the accuracy of those partners’ systems. Similarly, these local and state agencies often acquire and use FRT with very little oversight. The lack of clear usage policies results in improper practices, like submitting forensic sketches or celebrity lookalikes as search images. Some agencies even cooperate with ICE to catch illegal immigrants through informal, unregulated relationships.
FRT in the AAPI Community
On top of all that, FRT systems struggle to accurately recognize members of the AAPI community in particular. NIST, the federal standards-making agency, released a report on the “demographic effects” that FRT algorithms displayed during testing. Of the nearly 200 algorithms tested, the agency found that recognition of AAPI faces performed among the worst.
While such effects may seem insignificant now, TSA and CBP have big plans for FRT. DHS aims to install FRT in the nation’s 20 largest airports for all international flights. And because some of the busiest flight routes and airports serve areas with large AAPI populations, we may see a disproportionate impact. AAPIs consistently and historically face under-recognition in algorithm testing and academic studies. As a result, we may open ourselves up to the risk of false matches by systems that currently operate with little to no oversight.
FRT may bring about extremely useful, positive changes to our society, like fulfilling its potential to catch dangerous criminals. But the technology does not exist in a vacuum or in a perfect society. It exists in a flawed and biased world, populated both by powerful people who operate FRT systems and by vulnerable people who may suffer from the erosion of civil and privacy rights.