New York Case Highlights Risk Of AI As A Job-Screening Tool
Our Fairfax, VA business lawyer found a recent case from New York that offers two lessons to employers: first, that using artificial intelligence as a job-screening tool creates liability risk, and second, that the EEOC takes age discrimination seriously. In May 2022, the U.S. Equal Employment Opportunity Commission brought a complaint against iTutorGroup, a provider of online education and tutoring services to students in China, alleging that it violated the federal Age Discrimination in Employment Act, which protects workers 40 and older from job discrimination based on their age.
Using Discriminatory Tactics
According to the EEOC, the company programmed its AI-based job application software for prospective tutors to automatically reject female applicants age 55 or older and male applicants age 60 or older. The case arose when an applicant who was initially rejected resubmitted the same application with a more recent birthdate and then received an interview. The applicant filed a charge with the EEOC, which then sued iTutorGroup in federal court on behalf of that applicant as well as more than 200 other qualified U.S.-based applicants who were allegedly rejected due to their age.
Paying Out A Settlement
Though the company denied any wrongdoing, it agreed to settle soon after the EEOC filed suit. Under the terms of the settlement, iTutorGroup must pay $365,000, to be distributed to applicants who were automatically rejected due to their age. Additionally, although iTutorGroup has stopped hiring tutors in the U.S., if it resumes U.S.-based hiring it will be required to provide extensive ongoing training to those involved in the hiring process and to adopt a robust anti-discrimination policy. It also may no longer request applicants' birthdates.
Should iTutorGroup resume hiring in the U.S., it will be subject to EEOC monitoring for five years, and it will be required to notify and interview those applicants it allegedly rejected due to age. None of this suggests that employers should stop using AI tools altogether, but any employer using AI as part of its job-screening process should certainly consult with an employment attorney to ensure it’s not being deployed in a way that will get the company in hot water.
Get Help When You Are Hiring
While AI has made significant strides in automating the job-screening process, it carries several inherent problems that need careful consideration. First and foremost, bias in AI algorithms can produce discriminatory outcomes, perpetuating existing inequalities and hindering diversity and inclusion efforts. Additionally, AI systems may lack a nuanced understanding of human candidates, potentially overlooking valuable qualities and experiences that are not easily quantifiable.
The overreliance on AI for job screening can also diminish the human touch in recruitment, eroding the crucial element of empathy and intuition that human recruiters bring to the process. Moreover, the black-box nature of some AI algorithms makes it challenging to explain and justify hiring decisions, which can lead to legal and ethical issues. When you are overhauling your hiring system or have any concerns about your process, reach out to Mahdavi, Bacon, Halfhill & Young, PLLC for help.