A landmark lawsuit against Workday over its algorithm-based hiring technology could redefine the legal boundaries for AI in employment screening. The case, which a California judge has allowed to proceed as a collective action, represents a critical test of whether automated screening tools violate anti-discrimination laws when they disadvantage protected groups. As companies increasingly adopt AI for hiring decisions, this precedent-setting litigation highlights the tension between technological efficiency and workplace fairness.
The big picture: A California judge has green-lit a collective action lawsuit against HR software company Workday, alleging its algorithm-based applicant screening technology discriminates against older job seekers.
Why this matters: The case could establish legal precedent for how companies can use algorithms and AI in hiring decisions as automation becomes increasingly common in recruitment.
Key details: The lawsuit originated with Derek Mobley, who claims Workday's algorithms caused him to be rejected for more than 100 jobs over seven years because of his age, race, and disabilities.
What they’re saying: Workday has denied allegations that its technology discriminates, characterizing the judge’s decision as a “preliminary, procedural ruling” that “relies on allegations, not evidence.”
The broader context: AI hiring tools face mounting scrutiny from civil rights organizations concerned about algorithmic bias perpetuating workplace discrimination.
Between the lines: Workday markets its “HiredScore AI” as using “responsible AI” to grade top candidates and reduce application screening time, highlighting the tension between efficiency claims and potential discriminatory impacts.