Why Facial Recognition Technology Is Flawed

By Vivian D. Wesson

July 7, 2020


What do Steve Talley and Robert Julian-Borchak Williams have in common? Both men share the dubious distinction of having been falsely arrested by law enforcement relying on facial recognition technology. In December 2015, Denver police falsely arrested Mr. Talley after facial comparison software identified him as a suspect in an armed bank robbery.[1] Before making the arrest, the police had not investigated whether Mr. Talley had an alibi, relying instead on the software match. Similarly, Mr. Williams, accused of shoplifting watches, had an alibi for the time of his alleged offense, which the Detroit police did not verify before his arrest.[2] In both cases, law enforcement erroneously treated the technology as dispositive, but facial recognition software is not magic – "It's meant to serve as an investigative tool not evidence."[3]

While facial recognition technology arguably has benefits, its present unregulated application by law enforcement raises grim concerns about its role in exacerbating systemic racism and infringing civil liberties. Following the tragic murder of George Floyd, an unarmed Black man suffocated by a white police officer,[4] opponents of this technology have subjected it to even greater scrutiny, stressing the need both to regulate the technology and to reform its use.[5] In a country that champions the right to free speech, peaceful assembly, and social equality, we must question whether (and how) the police should deploy facial recognition systems in our democratic society.

Why facial recognition technology is flawed

Facial recognition technology owes its genesis to the development of computer vision in the 1960s.[6] The software underlying a facial recognition system relies on machine learning and complex algorithms. Algorithms are ubiquitous in this digital age. In the case of facial recognition, however, bias has developed within the systems because they were not trained on adequately diverse data.[7] Algorithmic bias, compounded by the tendency to accept a computer's assessment without verifying its accuracy, has led to unfairness and inequity in seeking employment, securing a loan, and, most egregiously, being misidentified for a crime.[8]

How law enforcement applies facial recognition

Facial recognition technology has assisted law enforcement in locating and identifying missing people and in identifying deceased persons. In an opinion supporting the technology's use for public safety, then-New York City Police Commissioner James O'Neill wrote, "Used properly, the software effectively identifies crime suspects without violating rights."[9] He further noted that facial recognition technology has cleared suspects, stating, "No one can be arrested on the basis of the computer match alone."[10] As witnessed in Mr. Talley's and Mr. Williams' cases, though, the technology can be improperly applied with tragic results. In those cases, law enforcement failed to recognize that "the technology is no magic bullet."[11]

How facial recognition should be reformed

Recognizing the inherent biases in facial recognition systems and how they can be (and have been) misused, the question is how these systems can be reformed and regulated to ensure their constructive use by law enforcement.[12] An answer to this question will take several forms: (1) limitations on facial recognition software use, (2) state and federal legislation, (3) judicial intervention, and (4) diversification in the technology workforce.

Although a few large technology companies have announced halts on facial recognition technology sales to police departments (e.g., IBM,[13] Amazon,[14] and Microsoft[15]), the true players in this market, NEC, Idemia, and Clearview AI, have failed to take any action with respect to their sales practices. Clearview AI's founder and CEO, Hoan Ton-That, has remarked that his company would not offer the technology to certain countries,[16] but believes that the system can be used to "combat racism."[17] Notwithstanding Clearview AI's anti-racism assertion, it has been sued in multiple states for its claim to "end your ability to walk down the street anonymously,"[18] a chilling specter of a panopticon society.[19] The courts' ultimate rulings in the pending Clearview AI litigation will serve as a litmus test of our ability to curb this technology's malevolent applications through judicial intervention.

Federal and state legislators must also play a critical role in this reform. The laws they adopt should protect the right to privacy and other civil liberties imperiled by facial recognition technology. Many states have introduced or adopted bills to address these concerns in some form,[20] but Congress has yet to act. Washington State leads the way in legislative reform, noting, "[L]egislation is required to establish safeguards that will allow state and local government agencies to use facial recognition services in a manner that benefits society while prohibiting uses that threaten our democratic freedoms and put our civil liberties at risk."[21] Acknowledging the chilling effect that mass surveillance has on peaceful protest, Washington's law and California's proposed measure forbid the use of facial recognition technology on any individual participating in "a particular noncriminal organization or lawful event."[22] Further, the Washington law expressly forbids profiling and the use of predictive law enforcement tools.[23] New York's proposed law is more circumscribed, banning the use of facial recognition and other biometric surveillance only in police officers' cameras.[24] Boston has joined San Francisco in voting unanimously to "ban surveillance technologies that automatically identify and track people based on their face."[25]

Federal legislation entitled the "Facial Recognition and Biometric Technology Moratorium Act of 2020" will soon be introduced to "prohibit biometric surveillance by the Federal Government without explicit statutory authorization and to withhold certain Federal public safety grants from State and local governments that engage in biometric surveillance."[26] Citing the December 2019 report from the National Institute of Standards and Technology[27] in support of their proposed bill, legislators noted that "a growing body of research points to systematic inaccuracy and bias issues in biometric technologies."[28] Senator Merkley stated, "At a time when Americans are demanding that we address systemic racism in law enforcement, the use of facial recognition technology is a step in the wrong direction. Studies show that this technology brings racial discrimination and bias."[29] Congresswoman Pressley, a co-author of the bill, emphasized, "Facial recognition technology is fundamentally flawed, systemically biased, and has no place in our society."[30]

Transformation of facial recognition systems should also include reform at the developmental stage. We can achieve this by educating and training programmers from diverse backgrounds. As author and optics research scientist Janelle Shane writes, "Programmers who are themselves marginalized are more likely to anticipate where bias might be lurking in the training data and to take these problems seriously."[31] In her TED Talk, Joy Buolamwini, a computer scientist at the MIT Media Lab, encourages inclusivity in coding, noting that it matters who codes, how we code, and why we code, and that technology should "work for all of us, not some of us."[32]

Conclusion

Absent issuing everyone a pair of anti-facial recognition glasses[33] or employing creative face make-up techniques,[34] we need immediate, practical solutions to the negative repercussions of law enforcement's use of facial recognition technology. Limiting use, imposing strict regulations, and diversifying the technology workforce are logical places to start with a technology that does not magically work in every instance.

Vivian D. Wesson serves as Chief Intellectual Property Counsel to Marsh & McLennan Companies, for which she manages and protects the IP assets of its businesses, as well as advises on data strategy. Vivian is also Chair of the New York State Bar Association’s Committee on Attorney Professionalism and its Technology Subcommittee.


[1] Allee Manning, A False Facial Recognition Match Cost This Man Everything, Vocativ (May 1, 2017 12:35pm ET), https://www.vocativ.com/418052/false-facial-recognition-cost-denver-steve-talley-everything/index.html.

[2] Kashmir Hill, Wrongfully Accused by an Algorithm, N.Y. Times (June 24, 2020). See also Shira Ovide, When the Police Think Software Is Magic, N.Y. Times (June 25, 2020), https://www.nytimes.com/2020/06/25/technology/facial-recognition-software-dangers.html.

[3] Manning, supra note 1. See also Ovide, supra note 2 ("The police are supposed to use facial recognition identification only as an investigative lead.").

[4] Brittany Shammas et al., Murder charges filed against all four officers in George Floyd's death as protests against biased policing continue, Wash. Post (June 3, 2020), https://www.washingtonpost.com/nation/2020/06/03/george-floyd-police-officers-charges.

[5] See Kade Crockford, What you need to know about face surveillance, TEDxCambridge Salon (Nov. 2019), https://www.ted.com/talks/kade_crockford_what_you_need_to_know_about_face_surveillance ("Protecting our open society is much more important than corporate profit."). See also Irina Ivanova, Why face-recognition technology has a bias problem, CBS News Moneywatch (June 12, 2020 7:57 AM), https://www.cbsnews.com/news/facial-recognition-systems-racism-protests-police-bias (reporting how protests over racially biased policing are putting a spotlight on the tools of law enforcement, including widely used, but completely unregulated, facial recognition technology).

[6] See Stuart Russell & Peter Norvig, Artificial Intelligence: A Modern Approach 928–968 (discussing image formation and early image-processing operations).

[7] Joy Buolamwini, How I’m fighting bias in algorithms, TEDxBeacon Street (Nov. 2016), https://www.ted.com/talks/joy_buolamwini_how_i_m_fighting_bias_in_algorithms (Ms. Buolamwini speaks about her formation of the Algorithmic Justice League to combat the “coded gaze.”).

[8] Gary Marcus & Ernest Davis, Rebooting AI: Building Artificial Intelligence We Can Trust 57 (2019) (a company selling a deep learning based facial recognition system to a police department may not be able to tell the law enforcement agency much in advance about whether or how the system will work).

[9] James O’Neill, How Facial Recognition Makes You Safer, N.Y. Times (June 9, 2019), https://www.nytimes.com/2019/06/09/opinion/facial-recognition-police-new-york-city.html.

[10] Id.

[11] Jennifer Valentino-DeVries, How the Police Use Facial Recognition, and Where It Falls Short, N.Y. Times (Jan. 12, 2020), https://www.nytimes.com/2020/01/12/technology/facial-recognition-police.html.

[12] Geoffrey A. Fowler, Black Lives Matter could change facial recognition forever — if big tech doesn’t stand in the way, Wash. Post (June 12, 2020 11:13AM), https://www.washingtonpost.com/technology/2020/06/12/facial-recognition-ban; Rory Cellan-Jones, Tech Tent: Facial recognition faces a reckoning, BBC News (June 12, 2020), https://www.bbc.com/news/technology-53024985; Ari Levy, Activists want to ban police from using facial recognition — Amazon and Microsoft just opened the door, CNBC (June 12, 2020), https://www.cnbc.com/2020/06/12/amazon-microsoft-face-opponents-of-police-use-of-facial-recognition.html.

[13] Bobby Allyn, IBM Abandons Facial Recognition Products, Condemns Racially Biased Surveillance, NPR News (June 9, 2020 8:04 PM), https://flipboard.com/@npr/ibm-abandons-facial-recognition-products-condemns-racially-biased-surveillance/a-2_VF2BR2Q2iTXkdNeJSQ6A%3Aa%3A3195441-53019ecca4%2Fnpr.org; Arvind Krishna, IBM CEO's Letter to Congress on Racial Justice Reform, THINKPolicy Blog (June 8, 2020), https://www.ibm.com/blogs/policy/facial-recognition-sunset-racial-justice-reforms.

[14] Elinor Aspegren, Amazon bans police use of facial recognition software for one year amid national protests against racial inequality, USA Today (published June 10, 2020 8:26 PM, updated June 11, 2020 9:58 AM), https://www.usatoday.com/story/news/nation/2020/06/10/george-floyd-protests-amazon-police-use-facial-recognition/5338536002; Ari Levy & Lauren Hirsch, Amazon bans police use of facial recognition technology for one year, CNBC (published June 10, 2020 5:25 PM, updated June 11, 2020 3:08 PM), https://www.cnbc.com/2020/06/10/amazon-bans-police-use-of-facial-recognition-technology-for-one-year.html; Amazon, We are implementing a one-year moratorium on police use of Rekognition, Blog (June 10, 2020), https://blog.aboutamazon.com/policy/we-are-implementing-a-one-year-moratorium-on-police-use-of-rekognition.

[15] Jay Greene, Microsoft won’t sell police its facial-recognition technology, following similar moves by Amazon and IBM, Wash. Post (June 11, 2020 2:30 PM), https://www.washingtonpost.com/technology/2020/06/11/microsoft-facial-recognition.

[16] See John Oliver, Last Week Tonight (HBO broadcast Jun. 15, 2020), https://youtu.be/jZjmlJPJgug.

[17] Id.

[18] Calderon v. Clearview AI, Inc., No. 20 civ. 1296 (CM), 2020 BL 201620 at *2 (S.D.N.Y. May 29, 2020).

[19] See Crockford, supra note 5. A "panopticon" is a type of institutional building and a system of control designed by the English philosopher and social theorist Jeremy Bentham in the 18th century. A guard can see every cell and view all inmates, but the inmates cannot see into the tower. These inmates will never know whether or not they are being watched.

[20] See Susan Crawford, Facial Recognition Laws Are (Literally) All Over the Map, Wired (Dec. 16, 2019), https://www.wired.com/story/facial-recognition-laws-are-literally-all-over-the-map (“From Portland to Plano, local governments are placing different limits on the use of biometric data.”).

[21] H. S.B. 6280, 66th Leg., 2020 Sess. §1(1) (Wash. 2020) (hereinafter the "Washington FR Bill"). Beneficial uses of facial recognition technology are allowed under the Washington FR Bill, including locating or identifying missing persons (including missing or murdered indigenous women, subjects of Amber and Silver Alerts, and other possible crime victims) and identifying deceased persons, all for the purpose of keeping the public safe. In Washington State, using facial recognition technology for ongoing surveillance is prohibited unless the government agency has obtained a warrant, exigent circumstances exist, or the agency has a court order for locating or identifying a missing person or identifying a deceased person.

[22] Washington FR Bill at §11(2); A.B. 2261 §1798.360(b)(1)(B) (Ca. 2020).

[23] Washington FR Bill at §11(2).

[24] S.B. 6776 §837-u(2) (N.Y. 2019).

[25] Nik DeCosta-Klipa, Boston City Council unanimously passes ban on facial recognition technology, The Boston Globe (June 24, 2020), https://www.boston.com/news/local-news/2020/06/24/boston-face-recognition-technology-ban. See also Kate Conger et al., San Francisco Bans Facial Recognition Technology, N.Y. Times (May 14, 2019), https://www.nytimes.com/2019/05/14/us/facial-recognition-ban-san-francisco.html.

[26] S.B. 116th Cong. 2d Sess. (draft 2020).

[27] Patrick Grother et al., Nat’l Inst. of Standards & Tech., NISTIR 8280, at 2 (Dec. 2019).

[28] Sen. Ed Markey, Senators Markey and Merkley, and Reps. Jayapal, Pressley to Introduce Legislation to Ban Government Use of Facial Recognition, Other Biometric Technology (June 25, 2020), https://www.markey.senate.gov/news/press-releases/senators-markey-and-merkley-and-reps-jayapal-pressley-to-introduce-legislation-to-ban-government-use-of-facial-recognition-other-biometric-technology.

[29] Id.

[30] Id.

[31] Janelle Shane, You Look Like a Thing and I Love You 182 (2019).

[32] Buolamwini, supra note 7 (As part of her Algorithmic Justice League, Ms. Buolamwini inspires people to take action against algorithmic bias by reporting it, requesting software audits, and becoming a tester of the technology). See also Cathy O'Neil, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (2016) (discussing the proliferation of "WMDs" – widespread, mysterious, and destructive algorithms used to make decisions that affect every aspect of our lives).

[33] Luke Dormehl, 'Adversarial glasses' can fool even state-of-the-art facial-recognition tech, Digital Trends (Jan. 11, 2018), https://www.digitaltrends.com/cool-tech/facial-recognition-glasses-security (researchers at Carnegie Mellon University and the University of North Carolina at Chapel Hill have developed anti-facial-recognition glasses).

[34] James Tapper, Hiding in plain sight: activists don camouflage to beat Met surveillance, The Guardian (Feb. 1, 2020 1:39 PM), https://www.theguardian.com/world/2020/feb/01/privacy-campaigners-dazzle-camouflage-met-police-surveillance.

