Rejected by design

Last week, a federal judge ruled that a class action lawsuit against Workday can move forward. The allegation? That Workday’s AI-powered hiring tools discriminated against older, Black, and disabled applicants—screening them out before a recruiter ever saw their résumé.

For many of us, this news doesn’t feel shocking. It feels familiar. It confirms something job seekers have quietly suspected for years: that rejection doesn’t always come after a fair evaluation. Sometimes, you’re simply never seen.

Workday is the one facing the lawsuit, but they’re not the only player in this story. Thousands of companies licensed their tools, and many more adopted similar systems. These platforms didn’t sneak in the back door. They were pitched, purchased, and rolled out—by teams tasked with making hiring more efficient. Someone scoped the project. Someone made the business case. Someone approved it. Someone watched the results and decided they were good enough.

The AI didn’t do this in isolation.

What makes this lawsuit possible is traceability. Most hiring systems operate like black boxes—opaque to applicants, legally insulated by complexity. But in this case, the plaintiffs were able to argue that Workday’s system screened them out before any human ever read their résumé. The court agreed it was worth examining further.

The assumption, especially in public discourse, is that discrimination only happens when it’s intentional. But that’s not how AI works. These systems are trained on historical data—who was hired, who advanced, who was deemed successful. If that history is biased—and in most organizations, it is—the system learns to replicate those patterns. No one has to explicitly instruct the AI to deprioritize older candidates. It simply picks up that people with certain age markers on their résumés don’t tend to get hired. It doesn’t need to be told to flag gaps in employment. It just notices that people with seamless career paths tend to move forward faster.

That’s not a hallucination. That’s a pattern. And when no one audits the outputs or asks who’s missing from the pool, it continues.

This isn’t a story about one vendor. It’s about a corporate culture that rewards speed and scale over fairness and accountability. We’ve seen public commitments to DEI quietly walked back. We’ve seen programs defunded and diversity roles eliminated. We’ve seen political cover for abandoning the work entirely. Now, conveniently, the market offers tools that automate exclusion while shielding leadership from responsibility. If the results are inequitable, blame the algorithm.

It’s cleaner that way.

The uncomfortable truth is that for many organizations, these outcomes aren’t a surprise. They’re acceptable. Technology didn’t invent bias—it just made it faster and easier to implement at scale. That’s not a glitch. That’s a reflection of priorities.

I’ve spent years trying to be seen by systems that were never designed to recognize what I bring. Optimizing, editing, refining. Always under the assumption that if I could just get through, I’d be evaluated on merit. But the more I’ve learned—and the more headlines like this one confirm—the clearer it becomes: the system isn’t neutral. It’s selective. And its selectivity is not always tied to performance.

I’m not shocked by this lawsuit. I’m not outraged. I’m not even especially hopeful that it will change much. But I am clear. I no longer believe that these systems are broken. I believe they’re working exactly as intended.

So I work around them. Sometimes within them. But never under the illusion that they see me clearly. This lawsuit won’t fix the system. It might not even dent it. But it gives language to what so many of us have experienced. It names a pattern that too often gets framed as personal failure.

We deserve better systems. But until they exist, we’ll build what we can outside of them—and we’ll keep telling the truth about how we got here.

Formal notices haven’t gone out yet, but you can read the case summary here.

Carol A. Tiernan

Carol Tiernan is a marketing strategist and systems builder with three decades of experience turning complexity into clarity. She’s led growth and transformation across cybersecurity, SaaS, fintech, higher ed, and more—building scalable demand engines, repositioning legacy brands, and aligning marketing with revenue. Through her consulting work and thought leadership, she helps founders and executives build marketing that actually works.
