August 1, 2019 - Executive Director Alexandra Givens Speaks at National Employment Conference on Algorithmic Fairness & the Rights of People with Disabilities

Our Executive Director Alexandra Givens spoke today about her work on algorithmic fairness and the rights of people with disabilities at the National ILG Conference, the largest convening for federal contractors focused on affirmative action, equal employment opportunity, and diversity and inclusion.

Givens focused on the use of AI-driven tools that are currently being marketed for sourcing job candidates, evaluating applicants, and assessing employee performance. Drawing on case studies, she highlighted the particular risks such tools pose of excluding individuals with disabilities. Products purporting to conduct “sentiment analysis” of candidates’ personalities based on recorded video interviews, for example, are likely to disadvantage individuals whose facial presentation differs from perceived norms. Products that ask employers to create “success profiles” of their most effective employees may replicate any selection bias embedded in those profiles, such as the absence of people with disabilities among past hires.
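To make the “success profile” mechanism concrete, here is a minimal hypothetical sketch: a screening score built from the average attributes of past hires. Because no past hire in this invented dataset has a career gap (a pattern more common among people with disabilities), an equally qualified candidate who does is pushed down the ranking. All features and numbers below are invented for illustration; real products are more complex, but the bias-replication dynamic is the same.

```python
import numpy as np

# Hypothetical "success profile": average the attributes of past hires and
# score new candidates by similarity to that average. Invented features:
# [years_experience, test_score, career_gap_years]
past_hires = np.array([
    [5.0, 88.0, 0.0],
    [7.0, 92.0, 0.0],
    [4.0, 85.0, 0.0],   # note: no past hire has a career gap
])
success_profile = past_hires.mean(axis=0)

def screening_score(candidate: np.ndarray) -> float:
    """Negative distance to the success profile: closer = higher score."""
    return -float(np.linalg.norm(candidate - success_profile))

candidate_a = np.array([5.0, 90.0, 0.0])  # conventional career path
candidate_b = np.array([5.0, 90.0, 3.0])  # equally qualified, but with a
                                          # 3-year gap (e.g., disability-related)

print(screening_score(candidate_a))  # higher score
print(screening_score(candidate_b))  # lower score, due solely to the gap
```

The model never sees disability status, yet it penalizes a proxy for it, which is precisely why facially neutral “success profiles” can still screen people out.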

Givens cautioned that specific challenges make these biases difficult to detect and address for people with disabilities. For example, people with disabilities often do not disclose their disability status to a prospective employer, making it hard to track whether they are being disproportionately screened out. In addition, the sheer variety of disabilities makes it virtually impossible to assemble fully representative training data with which to train algorithms more inclusively.
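The standard way employers test for this kind of disproportionate screening is the “four-fifths rule” from the EEOC’s Uniform Guidelines (29 CFR 1607.4(D)), which compares selection rates across groups. The minimal sketch below, with hypothetical numbers, shows both the computation and why the non-disclosure problem breaks it: the group-level counts it depends on simply cannot be assembled when disability status is unknown.

```python
# A minimal sketch of the four-fifths rule adverse-impact check
# (EEOC Uniform Guidelines, 29 CFR 1607.4(D)). All numbers are hypothetical.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants in a group who pass the screening tool."""
    return selected / applicants

def impact_ratio(focal_rate: float, reference_rate: float) -> float:
    """Focal group's selection rate relative to the reference group's."""
    return focal_rate / reference_rate

# Hypothetical outcomes from an automated screening tool.
rate_no_disability = selection_rate(selected=300, applicants=1000)  # 0.30
rate_disability = selection_rate(selected=18, applicants=100)       # 0.18

ratio = impact_ratio(rate_disability, rate_no_disability)
print(f"Impact ratio: {ratio:.2f}")  # 0.60 < 0.80 -> evidence of adverse impact

# The catch Givens identifies: this arithmetic requires knowing which
# applicants have disabilities. When disability status goes undisclosed,
# the group counts above cannot be computed in the first place.
```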

The Institute recently announced an extensive new project to explore algorithmic fairness and the rights of people with disabilities. Among other things, the project is assessing how rights guaranteed under the Americans with Disabilities Act, Sections 501, 503, and 504 of the Rehabilitation Act, and other statutes protect employees in this context, and what obligations employers have to validate whether such tools may have discriminatory effects. To learn more about the project or join its mailing list, click here.