Imagine applying for dozens of jobs, confident that your skills and experience are exactly what companies are seeking. But every time, all you get is silence or a quick rejection. What if it’s not about your abilities at all, but an unseen algorithm deciding your chances?
AI-powered hiring tools have quietly taken over the recruitment process at most companies. Instead of a person reading through applications, software now scans your resume. This technology promises efficiency, but it can easily let fairness fall by the wayside.
Take the example of Derek Mobley, an IT professional from North Carolina. After sending out more than 100 job applications, he noticed a disturbing pattern. Despite being highly qualified, he either heard back right away with bad news or never got a response at all. The companies he applied to all used the same recruiting software.
Derek, who is African-American and lives with anxiety and depression, began to suspect there was more at work than bad luck. The rejections felt as though they were coming from a server rather than a person, and nothing in his qualifications or experience explained the pattern. Convinced the software was at fault, Derek filed a lawsuit against Workday, claiming its hiring algorithm screened him out because of his age, race, and disabilities. His case puts a spotlight on the quiet ways technology can reinforce bias.
Nearly nine out of ten companies now use some form of AI in the hiring process. Platforms like Workable and BambooHR automatically decide which candidates advance. But who decides what “best fit” means? Studies, including one by the University of Washington, show these tools can reproduce the biases of the world outside the office. Gender, race, and social class often get filtered in ways that are hard to see.
One well-known example is Amazon’s experiment with an automated resume screener. Trained on years of past applications, the system taught itself to penalize resumes that mentioned the word “women’s,” and Amazon ultimately scrapped it. Other AIs give extra weight to resumes with elite colleges and certain kinds of language, sidelining talented applicants from marginalized communities.
How Bias Slips In: Three Hidden Pitfalls
Historical Data Bias: Algorithms learn from past hiring decisions. If a company hired mostly men for leadership roles, the AI “learns” that men are more suited to those jobs, turning yesterday’s inequality into tomorrow’s rule (a short sketch after this list shows how).
Feature Selection Bias: Even neutral information like college names or zip codes can act as stand-ins for race or privilege, making it harder for outsiders to get noticed.
Label Bias: Many AI systems classify people using past performance reviews or hiring records. If those were biased in the first place, the AI repeats those mistakes and labels certain groups as less qualified.
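To see how the first pitfall works in practice, here is a deliberately simplified sketch in Python. The data is synthetic and every number is an assumption; no real vendor’s model works exactly this way. A classifier is trained on “historical” decisions that favored men, then asked to score two equally skilled candidates:

```python
# A minimal sketch of historical-data bias, using synthetic data.
# All numbers here are illustrative assumptions, not real hiring data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# One genuine skill score per candidate, plus a gender flag (1 = male).
skill = rng.normal(0.0, 1.0, n)
male = rng.integers(0, 2, n)

# Biased "historical" labels: past recruiters favored men at equal skill.
hired = (skill + 0.8 * male + rng.normal(0.0, 0.5, n)) > 0.5

# Train on the biased history, with gender available as a feature.
X = np.column_stack([skill, male])
model = LogisticRegression().fit(X, hired)

# Score two candidates with identical skill but different gender.
candidates = np.array([[1.0, 1.0], [1.0, 0.0]])
print(model.predict_proba(candidates)[:, 1])
# The male candidate gets a noticeably higher hire probability: the
# model has learned the old preference and will reapply it to new people.
```

The same mechanics apply to the other two pitfalls: swap the gender flag for a zip code or a past performance rating, and the model quietly absorbs whatever bias shaped those signals.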
When these tools are used year after year, bias does not just stay the same. It can actually get worse. If the system keeps seeing one type of candidate succeed, it becomes even more likely to recommend that type and ignore everyone else. Over time, the process locks out new voices and talent.
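A toy simulation makes the compounding visible. Everything here is assumed for illustration: both groups have identical ability, and a “similarity bonus” stands in for the model’s preference for candidates who resemble past hires.

```python
# A toy simulation of the feedback loop (all parameters are assumptions
# chosen for illustration, not measurements of any real system).
import numpy as np

rng = np.random.default_rng(1)
share_a = 0.45  # group A's share of last year's hires (the training data)

for year in range(1, 6):
    # 1,000 applicants, half from each group, identical ability distributions.
    ability = rng.normal(0.0, 1.0, 1000)
    in_group_a = np.arange(1000) < 500
    # Similarity-style scoring: the tool boosts candidates who resemble the
    # people it was trained on. The 2.0 weight is an assumed strength.
    score = ability + 2.0 * np.where(in_group_a, share_a, 1.0 - share_a)
    hired = score >= np.sort(score)[-100]            # top 100 are hired
    share_a = hired[in_group_a].sum() / hired.sum()  # next year's training mix
    print(f"year {year}: group A share of hires = {share_a:.1%}")

# Despite equal ability, group A's share shrinks year after year:
# the gap the model inherits becomes the gap it enforces.
```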
Spotting the problem is just the beginning. Some companies avoid adding people to review AI decisions because they think it will slow things down. Many hiring systems are proprietary “black boxes”—even the people running them often can’t explain how decisions are made. In many places, there are not enough laws or rules to force companies to check for fairness.
Even good intentions are not always enough. It’s hard to explain how an AI made its choices, audits are rarely done, and concerns about privacy keep growing, especially with software that analyzes faces or voices.
How Can We Trust These Tools?
Experts say fairness starts with:
Being open about the data and features the system uses
Designing AI to spot and avoid bias
Making sure a human looks at the results before final decisions are made
Governments and regulators requiring regular checks and clear benchmarks, along the lines of New York City’s Local Law 144, which mandates bias audits of automated hiring tools (a minimal version of such a check is sketched below)
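For a sense of what such an audit measures, here is a minimal sketch of an impact-ratio check in the spirit of Local Law 144, which requires publishing selection rates and impact ratios for automated hiring tools. The data and group names are made up; the four-fifths (0.8) threshold is the traditional EEOC rule of thumb, not a legal bright line.

```python
# A minimal impact-ratio check: selection rate per group, compared
# against the best-off group. The outcome data below is invented.
from collections import defaultdict

# (group, was_selected) pairs; in practice these come from the tool's logs.
outcomes = [
    ("group_a", True), ("group_a", False), ("group_a", False), ("group_a", False),
    ("group_b", True), ("group_b", True), ("group_b", False), ("group_b", False),
]

counts = defaultdict(lambda: [0, 0])  # group -> [selected, total]
for group, selected in outcomes:
    counts[group][0] += int(selected)
    counts[group][1] += 1

rates = {g: sel / total for g, (sel, total) in counts.items()}
best = max(rates.values())

for group, rate in rates.items():
    ratio = rate / best
    flag = "  <-- below four-fifths threshold" if ratio < 0.8 else ""
    print(f"{group}: selection rate {rate:.0%}, impact ratio {ratio:.2f}{flag}")
```

A check like this is cheap to run; the hard part is getting vendors to expose the logs and demographics it needs, which is exactly what audit mandates are meant to force.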
Ultimately, every algorithm is built by people. If the teams making these tools are not diverse and accountable, old biases can become part of the code. Until the technology industry becomes more inclusive, hardwired bias could linger for a long time.
Each overlooked application and unfair rejection chips away at the trust we place in these systems. The cost is bigger than one role or one disappointment. It’s about making sure everyone gets a fair shot, and that talent rises to the top rather than whatever fits an algorithm’s narrow template.
If you have ever been left wondering why your job application was ignored, the real answer could lie deep in lines of code, far from your control.