Well, here it is: a federal judge just let a discrimination lawsuit move forward against Workday, the tech giant behind hiring software used by over 10,000 companies. Who is the plaintiff? A Black man over 40 with anxiety and depression who says he was auto-rejected more than 100 times by companies using Workday's AI.
He alleges the algorithm itself is biased, filtering out applicants based on race, age, and disability. And this isn't just speculation: he points to studies showing that AI hiring tools regularly replicate the very discrimination humans are supposed to avoid.
In July 2024, we blogged about New York’s DFS warning insurers: if you’re using AI and third-party tools, you’re responsible for making sure they don’t discriminate. That means audits, transparency, and clear legal accountability, even if the tool wasn’t built in-house.
In the New York Law Journal, we outlined exactly what a proper AI audit looks like, because when the lawsuits come (and they are coming), ignorance isn’t a defense. A proper audit and timely intervention are.
Workday says it “opposes discrimination.” Great. But denying wrongdoing doesn’t stop a lawsuit from moving forward, or a reputation from unraveling. If you’re using AI in hiring or other decision-making, the Workday case is a giant red flag. Start auditing NOW.
If your software is doing the sorting, you better know how it’s doing it, and who it’s leaving out.
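What does even a first-pass check look like? Here’s a minimal sketch of one classic test, the EEOC “four-fifths” rule, which compares selection rates across demographic groups. This assumes you have a hypothetical log of past screening decisions; the file name and column names below are illustrative, not any vendor’s actual format.

```python
# Minimal sketch of one classic audit check: the EEOC "four-fifths" rule.
# Assumes a hypothetical CSV log of past screening decisions with columns
# "group" (demographic category) and "advanced" (1 = passed the automated
# screen, 0 = auto-rejected). Column names are illustrative only.
import csv
from collections import defaultdict

def selection_rates(path: str) -> dict[str, float]:
    """Share of applicants in each group who made it past the screen."""
    counts = defaultdict(lambda: [0, 0])  # group -> [advanced, total]
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["group"]][0] += int(row["advanced"])
            counts[row["group"]][1] += 1
    return {g: adv / total for g, (adv, total) in counts.items() if total}

def adverse_impact_flags(rates: dict[str, float]) -> list[str]:
    """Flag groups selected at under 80% of the best-off group's rate."""
    best = max(rates.values())
    return [g for g, r in rates.items() if best and r / best < 0.8]

if __name__ == "__main__":
    rates = selection_rates("screening_log.csv")
    for group, rate in sorted(rates.items()):
        print(f"{group}: {rate:.1%} advanced")
    flagged = adverse_impact_flags(rates)
    if flagged:
        print("Possible adverse impact against:", ", ".join(flagged))
```

A ratio table like this is one signal, not a full audit; a serious review also probes proxy variables, intersectional effects, and whether your vendor can explain its model. But if you can’t produce even this much, you have a problem.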
So here’s the question: Have you audited your AI tools yet?