We’re digging into a question that feels particularly urgent as artificial intelligence tools make headlines for mimicking human talent with increasing precision, in everything from writing articles to creating art: What’s the ethical way to use these tools in hiring?

For time-strapped HR teams, there’s clear appeal in bringing some level of automation into the hiring process. AI technology can find talent by using the details of a job description to scour the Internet for matching LinkedIn profiles, resumes, GitHub histories, and more; screen and evaluate candidates using tests, surveys, and questionnaires; and handle onboarding tasks such as overseeing paperwork and managing trainings. At the same time, it can also be an instrument for perpetuating bias. “Anything that's wrong with our human-based systems, we will be teaching that to our AI children,” says Ramsey Isler, the director of product at The Markup, an online technology magazine. “And just like real children, sometimes you don't even realize we're teaching them our bad habits.”

One of the most prominent attempts to regulate AI in hiring, a 2021 New York City law requiring companies to conduct bias audits of their technology, has so far stalled. In December, the city pushed back enforcement of the law to April 2023 as employers called out a lack of clarity in how to comply. One thing, though, is clear: CHROs who want to use AI tools in hiring decisions would be smart to ease into them carefully and not over-rely on technology to do the job. “It's far too early in the age of AI to eliminate humans from the process,” says Isler.

Here’s what to consider if you want to use AI in talent acquisition:

Have an honest reckoning about whether the effort to implement it is worth the payoff.

One benefit of AI is helping organizations improve their own processes by showing them “what their cost per hire is, where their strongest performers come from, what channels they come from, where are folks dropping off in the process,” says Ryan Markman, CEO of Melior, a company that helps organizations use data in their hiring. “As long as you're hiring 10, 20 plus people per year, there's probably an opportunity for you to take advantage of some technology, to try to make your process more data-driven in general,” he says. Smaller companies, however, may not have enough raw data for an AI system to analyze effectively.
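
To make those metrics concrete, here’s a minimal sketch, in Python with made-up pipeline data, of the kind of reporting Markman describes. The field names, stages, and spend figure are illustrative assumptions, not any particular vendor’s schema:

```python
# Illustrative only: hypothetical applicant-tracking records showing
# how cost per hire, hires by sourcing channel, and stage drop-off
# could be computed. All names and numbers are assumptions.

from collections import Counter

candidates = [
    {"channel": "LinkedIn", "stage_reached": "offer", "hired": True},
    {"channel": "LinkedIn", "stage_reached": "screen", "hired": False},
    {"channel": "referral", "stage_reached": "onsite", "hired": False},
    {"channel": "referral", "stage_reached": "offer", "hired": True},
    {"channel": "job board", "stage_reached": "applied", "hired": False},
]
recruiting_spend = 30_000  # assumed annual recruiting spend, in dollars

hires = [c for c in candidates if c["hired"]]
print(f"cost per hire: ${recruiting_spend / len(hires):,.0f}")

# Which channels the strongest candidates come from
print("hires by channel:", Counter(c["channel"] for c in hires))

# Where folks drop off: how many candidates reach each stage
stages = ["applied", "screen", "onsite", "offer"]
order = {s: i for i, s in enumerate(stages)}
for stage in stages:
    reached = sum(1 for c in candidates
                  if order[c["stage_reached"]] >= order[stage])
    print(f"reached {stage}: {reached} of {len(candidates)}")
```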

“For a lot of companies, I just don't think it's worth it,” says Isler of The Markup. “By the time you try to weed out the biases, you might as well have just been having people who are well trained in diversity and equity and inclusion do the work and you'll come out better off. A lot of times these efforts to streamline the process and reduce the cost of it actually end up more costly than just hiring an expert person.”

Hold your vendors accountable.

Some 65% of companies that used AI tools in 2022 acquired them from an external source, according to a recent Deloitte report. If you are using an external vendor, be prepared to ask tough questions about their technology. “You have to be really careful and really watch the inputs that are going in to avoid that bias,” says Isler. Vendors should be able to show you how they control for bias in their systems. Markman of Melior mentions one company that regularly adjusts its models when it encounters biased results from product testing, something vendors in this space should be comfortable sharing with their clients. “If you can't get a straight answer, it's probably not a great sign,” he says.

💬
WHAT TO SAY

Here are a few questions to ask vendors:

For a general understanding of what their tools can do:

- What data do we need in order to optimize our time spent/cost per hire?
- What internal data can we use?
- What external data can we use?

For minimizing bias:

- What are you doing to prevent bias?
- What data sets should we stay away from? (Filtering by zip codes, for example, can lead to bias by selecting people who live in wealthier communities over those who live in poorer communities.)
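
One common way to test for the kind of bias these questions probe is an adverse-impact check based on the EEOC’s “four-fifths rule,” under which a group selected at less than 80% of the top group’s rate warrants review. Here’s a minimal sketch in Python; the screening data and group labels are made-up assumptions for illustration:

```python
# Illustrative sketch of an adverse-impact check per the EEOC's
# four-fifths rule: a group whose selection rate is below 80% of
# the top group's rate gets flagged for review. Data is made up.

from collections import defaultdict

def impact_ratios(outcomes):
    """outcomes: list of (group, was_selected) pairs."""
    totals = defaultdict(int)
    selected = defaultdict(int)
    for group, was_selected in outcomes:
        totals[group] += 1
        selected[group] += int(was_selected)
    rates = {g: selected[g] / totals[g] for g in totals}
    top = max(rates.values())
    return {g: rate / top for g, rate in rates.items()}

# Hypothetical screening outcomes for two applicant groups
results = [("A", True), ("A", True), ("A", False),
           ("B", True), ("B", False), ("B", False)]

for group, ratio in impact_ratios(results).items():
    status = "review" if ratio < 0.8 else "ok"
    print(f"group {group}: impact ratio {ratio:.2f} ({status})")
```
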
🔍
WHAT TO DO

Use “bias interrupters” in your hiring process.

Interviewing a diverse slate of candidates is not enough. In a previous interview with Charter, Iesha Berry, vice president and chief diversity and engagement officer at DocuSign, suggested training hiring managers as “bias interrupters” who can look past preconceived ideas about a candidate based on details like their educational pedigree, what sports they play, or where they live. These trained interrupters can also serve as an additional layer of defense against any biases hiring technology may introduce or perpetuate.

🗣️
OUR TAKE

AI will continue to be part of our work lives for the foreseeable future in ways that stretch beyond hiring: Its uses for HR include performance management, leadership training, and more. (Look out for more from us on those applications in the coming months.) As with hiring, be mindful that these tools shouldn't supplant the humans at the center of the decision-making process.

Key takeaways:

- AI works best for companies that have access to large data sets.
- CHROs have to ask tough questions of AI vendors to minimize bias in these systems.
- AI should not replace humans at the center of the decision-making process.
