
Miscommunication leads AI-based hiring tools astray

Originally posted on The Horizons Tracker.

Nearly every Fortune 500 company now uses artificial intelligence (AI) to screen resumes and assess test scores to find the best talent. However, new research1 from the University of Florida suggests these AI tools might not be delivering the results hiring managers expect.

The problem stems from a simple miscommunication between humans and machines: the AI thinks it is picking someone to hire, while hiring managers only want a list of candidates to interview.

Lack of understanding

“When we ask these algorithms to select the 10 best resumes, we know there will be a second stage of interviewing. The AI doesn’t understand that,” the researchers explain.

Without knowing about this next step, the AI might choose safe candidates. But if it knows there will be another round of screening, it might suggest different and potentially stronger candidates.

The researchers developed a new algorithm that aligns with the hiring process. Tested on data from thousands of employees at a Fortune 500 company, this improved algorithm saved 11% on interview costs while still identifying high-quality candidates and reducing bias.

Selection vs screening

The issue is one of selection versus screening. Current AI tools are designed to pick top candidates to hire, but hiring managers just want a pool of candidates to interview.

Most resume stacks contain a mix of strong and weak candidates. If the AI has to make the final choice, it will avoid the risky ones. But if it only has to suggest a shortlist for interview, it can include high-risk, high-reward candidates for human review.
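
To make the distinction concrete, here is a minimal, purely illustrative sketch in Python. It is not the researchers' algorithm: it simply contrasts a one-stage "selection" rule that ranks candidates by expected quality alone with a two-stage "screening" rule that shortlists candidates for a human interview and can therefore give some weight to upside potential. The candidate names, scores, and the upside_weight parameter are all invented for illustration.

```python
# Illustrative sketch only: contrasts a "selection" objective (pick hires
# directly, so favour safe expected quality) with a "screening" objective
# (pick a shortlist for human interviews, so upside potential matters more).
# Candidate scores and the upside weight are made-up assumptions, not the
# paper's model or data.

from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    expected_quality: float   # the AI's point estimate of on-the-job quality
    uncertainty: float        # spread of that estimate (risk)

def select_hires(candidates, k):
    """One-stage selection: no interview follows, so rank by expected quality
    alone; risky candidates with wide uncertainty tend to be passed over."""
    return sorted(candidates, key=lambda c: c.expected_quality, reverse=True)[:k]

def screen_for_interview(candidates, k, upside_weight=0.5):
    """Two-stage screening: a human interview follows and can filter out the
    misses, so high-risk, high-reward candidates become worth shortlisting."""
    def upside(c):
        # crude proxy for a candidate's best-case outcome
        return c.expected_quality + upside_weight * c.uncertainty
    return sorted(candidates, key=upside, reverse=True)[:k]

if __name__ == "__main__":
    pool = [
        Candidate("safe bet", expected_quality=7.0, uncertainty=0.5),
        Candidate("steady", expected_quality=6.8, uncertainty=0.6),
        Candidate("wildcard", expected_quality=6.0, uncertainty=3.0),
    ]
    print([c.name for c in select_hires(pool, 2)])          # ['safe bet', 'steady']
    print([c.name for c in screen_for_interview(pool, 2)])  # ['wildcard', 'safe bet']
```

On this toy pool, the selection rule keeps the two safe candidates, while the screening rule lets the wildcard onto the interview shortlist, where a human can weigh them up.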

Moreover, an AI set to select hires can introduce new biases. The researchers tweaked an existing algorithm to clarify that it was for screening, not hiring. In tests on nearly 8,000 employees, this updated algorithm found better candidates and cut hiring costs.

“We need to be clear about the tasks we set for these human-AI teams,” the researchers conclude. “The AI might optimize for poorly defined goals if we don’t specify the task clearly.”

Article source: Miscommunication Leads AI-Based Hiring Tools Astray.

Header image source: Mikhail Nilov on Pexels.

Reference:

  1. Xu, H., & Zhang, N. (2024). Goal Orientation for Fair Machine Learning Algorithms. Production and Operations Management, 10591478241234998.

