United States Senate
WASHINGTON, DC 20510

September 17, 2018

The Honorable Victoria A. Lipnic
Acting Chair
U.S. Equal Employment Opportunity Commission
131 M St. N.E.
Washington, D.C. 20507

The Honorable Chai R. Feldblum
Commissioner
U.S. Equal Employment Opportunity Commission
131 M St. N.E.
Washington, D.C. 20507

The Honorable Charlotte A. Burrows
Commissioner
U.S. Equal Employment Opportunity Commission
131 M St. N.E.
Washington, D.C. 20507

Dear Acting Chair Lipnic, Commissioner Feldblum, and Commissioner Burrows:

Facial analysis technologies are gaining use in the workplace and are becoming popular tools in hiring.[1] While they may offer benefits, we are concerned by the mounting evidence that these tools may perpetuate gender, racial, age, and other biases.[2] We are writing to ask you how the Equal Employment Opportunity Commission (EEOC) is addressing the potentially discriminatory impacts of facial analysis technologies in its enforcement of workplace anti-discrimination laws.

As you know, the EEOC is responsible for enforcing the many federal laws that make it illegal to discriminate against a job candidate or employee based on race, color, religion, sex, national origin, age, disability, or genetic information.[3] In 2017 alone, the EEOC received 28,528 complaints related to racial discrimination, 25,605 complaints related to sexual discrimination, and 18,376 complaints related to age discrimination.[4]

Facial analysis technologies use images or videos of a person's face to identify them or to infer characteristics about them, such as their mood or health.[5] Prominent applications include identification for security[6] and tracking employee attendance,[7] as well as for screening job candidates for emotional, social, or other characteristics that employers believe may correlate with job performance.[8] While some have expressed hope that facial analysis can help reduce human biases,[9] a growing body of evidence indicates that it may actually amplify those biases.[10] Research from MIT, for example, shows that leading facial recognition algorithms are 30 times more likely to misidentify darker-skinned women than lighter-skinned men.[11] The American Civil Liberties Union recently showed that facial recognition technology incorrectly identified 28 members of Congress as people who had been arrested. Specifically, the technology disproportionately misidentified African American and Latino members.[12] Such disparities can encode and magnify gender, racial, and other biases that exist in our society and which the EEOC is working so hard to combat.

Suppose, for example, that an African American woman seeks a job at a company that uses facial analysis to assess how well a candidate's mannerisms resemble those of its top managers. First, the technology may interpret her mannerisms less accurately than those of a white male candidate. Second, if the company's top managers are homogeneous, e.g., white and male, the very characteristics being sought may have nothing to do with job performance but may instead be artifacts of belonging to this group. She may be as qualified for the job as a white male candidate, but facial analysis may not rate her as highly because her cues naturally differ. Third, if a particular history of biased promotions led to homogeneity in top managers, then the facial analysis technology could encode and then hide this bias behind a scientific veneer of objectivity.

Legal scholars have studied how algorithmic techniques such as facial analysis can violate workplace anti-discrimination laws, exacerbating employment discrimination while simultaneously making it harder to identify or explain.[13] They write: "Unthinking reliance on data mining can deny historically disadvantaged and vulnerable groups full participation in society. Worse still, because the resulting discrimination is almost always an unintentional emergent property of the algorithm's use rather than a conscious choice by its programmers, it can be unusually hard to identify the source of the problem or to explain it to a court."[14]

We request that the EEOC develop guidelines for employers on the fair use of facial analysis technologies and how this technology may violate anti-discrimination laws. We ask that by September 28, 2018, you provide us with a timeline of your plans to develop these guidelines.

_________________
[1] Riley, Tonya. "Get Ready, This Year Your Next Job Interview May Be With an A.I. Robot," CNBC, March 13, 2018. https://www.cnbc.com/2018/03/13/ai-job-recruiting-tools-offered-by-hirevue-mya-other-start-ups.html
[2] Garvie, Clare, and Jonathan Frankle. "Facial-Recognition Software Might Have a Racial Bias Problem," The Atlantic, April 7, 2016. https://www.theatlantic.com/technology/archive/2016/04/the-underlying-bias-of-facial-recognition-systems/476991/ ; Buolamwini, Joy. "When the Robot Doesn't See Dark Skin," New York Times, June 21, 2018. https://www.nytimes.com/2018/06/21/opinion/facial-analysis-technology…
[3] Equal Employment Opportunity Commission, "About EEOC," undated. https://www.eeoc.gov/eeoc/index.cfm
[4] Equal Employment Opportunity Commission, "EEOC Releases Fiscal Year 2017 Enforcement and Litigation Data," January 25, 2018. https://www.eeoc.gov/eeoc/newsroom/release/1-25-18.cfm
[5] Sandoiu, Ana. "Why Facial Recognition Is the Future of Diagnostics," Medical News Today, December 9, 2017. https://www.medicalnewstoday.com/articles/320516.php
[6] Maurer, Roy. "More Employers Are Using Biometric Authentication," Society for Human Resource Management, April 6, 2018. https://www.shrm.org/resourcesandtools/hr-topics/technology/pages/employers-using-biometric-authentication.aspx
[7] Smith, Michelle Lanter. "The HR Benefits of Biometric Time and Attendance," BenefitsPro, December 29, 2017. https://www.benefitspro.com/2017/12/29/the-hr-benefits-of-biometric-time-and-attendance/
[8] Chandler, Simon. "The AI Chatbot Will Hire You Now," Wired, September 13, 2017. https://www.wired.com/story/the-ai-chatbot-will-hire-you-now/
[9] Larsen, Loren. "HireVue Assessments and Preventing Algorithmic Bias," HireVue Blog, June 22, 2018. https://www.hirevue.com/blog/hirevue-assessments-and-preventing-algorithmic-bias
[10] Buolamwini, Joy. "Gender Shades: Intersectional Phenotypic and Demographic Evaluation of Face Datasets and Gender Classifiers," Master's Thesis, Massachusetts Institute of Technology, September 2017. https://dam-prod.media.mit.edu/x/2018/02/05/buolamwini-ms-17_WtMjoGY.pdf ; Ngan, Mei, and Patrick Grother. "Face Recognition Vendor Test (FRVT) Performance of Automated Gender Classification Algorithms," National Institute of Standards and Technology, NISTIR 8052, April 2015. https://dx.doi.org/10.6028/NIST.IR.8052 ; Phillips, P. Jonathon, et al. "An Other-Race Effect for Face Recognition Algorithms," ACM Transactions on Applied Perception (TAP), Vol. 8, No. 2, 2011. https://dl.acm.org/citation.cfm?id=1870082 ; Klare, Brendan F., et al. "Face Recognition Performance: Role of Demographic Information," IEEE Transactions on Information Forensics and Security, Vol. 7, No. 6, pp. 1789-1801, December 2012. http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6327355&isnumber=6342844
[11] Buolamwini, 2017.
[12] Singer, Natasha. "Amazon's Facial Recognition Wrongly Identifies 28 Lawmakers, A.C.L.U. Says," New York Times, July 26, 2018. https://www.nytimes.com/2018/07/26/technology/amazon-aclu-facial-recognition-congress.html
We further request that you address the following questions by September 28, 2018, so that we can better understand how the EEOC is prepared to act in the new frontier of algorithmic justice:

• Could the use of biased facial analysis technology in the workplace violate:
  o Title VII of the Civil Rights Act of 1964, which prohibits discrimination against someone on the basis of race, color, religion, national origin, or sex?
  o The Equal Pay Act of 1963, which prohibits sex-based wage discrimination?
  o The Age Discrimination in Employment Act of 1967, which prohibits discrimination because of age?
  o Title I of the Americans with Disabilities Act of 1990, which prohibits discrimination against a qualified individual with a disability?
  o Any other laws that the EEOC enforces?
• Has the EEOC received any complaints to date about facial analysis technology leading to discrimination in hiring or in other workplace decisions?
• How is the EEOC prepared to investigate such claims, given that detecting algorithmic discrimination can be difficult?
• In the case of a disparate impact, what must employers do to demonstrate the job relatedness of facial analysis tests?
• How does the EEOC approach validation of employment selection procedures based on facial analysis technology?
• Would the EEOC accept validations for selection procedures based on facial analysis technology that only consider their overall accuracy? Or must employers demonstrate comparable levels of accuracy across groups, including intersectional groups?[15]

Thank you for your attention to this important matter.

Sincerely,

Kamala D. Harris          Patty Murray          [signature illegible]
United States Senator     United States Senator     United States Senator

_________________
[13] Barocas, Solon, and Andrew D. Selbst. "Big Data's Disparate Impact," California Law Review, Vol. 104, No. 3, June 2016, pp. 671-732. https://heinonline.org/HOL/P?h=hein.journals/calr104&i=695
[14] Barocas and Selbst, 2016, p. 671.
[15] U.S. Equal Employment Opportunity Commission, "EEOC Compliance Manual Section 15: Race & Color Discrimination," April 19, 2006. https://www.eeoc.gov/policy/docs/race-color.pdf
