
Why you should be using AI for hiring

A few weeks ago, VentureBeat published an article titled “Why you shouldn’t be using AI for hiring” that claimed shortcomings in AI-based hiring tools make them unfair. As someone who has worked in the recruiting tech sector for two decades and heads research and product innovation at an AI-based hiring platform company, I’d like to offer a counterpoint to that story.

The author of that story, CodePath CTO Nathan Esquenazi, presents several key points on why AI is problematic for high-stakes decisions about people, including:

  • AI has a risk of bias
  • Data used to train AI may be biased
  • You can match people to jobs without fancy AI

On these points, the author is completely wro … err, actually correct. Completely correct. But I want to clarify a few points about AI in hiring because it can be quite useful in the right contexts.

First of all, we need to demystify the term “artificial intelligence.” When this phrase first came to prominence in the 1950s, it referred to a burgeoning effort to create machines that could mimic human problem-solving. It made sense in that context, and in the decades since it has captured the popular imagination more than probably any other scientific concept. The Terminator movie franchise has made billions of dollars, and Hollywood’s ideas of ultrasmart AI have shaped the trajectories of countless young engineers who work to bring them off the silver screen and into the real world. As computer scientist Astro Teller says, “AI is the science of how to get machines to do the things they do in the movies.”

Today, the term “AI” refers to a broad range of techniques that process data of various types. While these techniques originated from the metaphor of a computer that can “think” like a human, they don’t necessarily seek to replicate the brain’s capabilities. So really, the AI that is transforming our world with self-driving cars, medical image interpretation, and so much more is just statistical analysis code. It can make sense of unstructured, complex, and messy data that traditional methods like correlation coefficients struggle with. And so there is nothing particularly “artificial” about most of the AI techniques used, nor could you call most of them “intelligent” on their own.

One of the awesome and scary parts of AI is that it allows researchers to study enormous sets of complex data and pull out predictive aspects of that data for use in various applications. This is what your fancy self-driving car is doing, and also what hiring-based AI can do. The dangerous part of this is that humans often don’t completely understand what factors AI is weighting in its predictions, so if there is bias in the dataset, it can and likely will be replicated at scale.

And here’s the thing: Bias is everywhere. It is a pervasive and insidious aspect of our world, and big datasets used to build AI reflect this. But while poorly developed AI may unknowingly amplify bias, the flipside of that coin is that AI also exposes bias. And once we know it is there, we can control it. (See, for example, the excellent documentary Coded Bias.) 

In my role at Modern Hire, I work with psychologists and data scientists who study candidate data to find ways to enhance what we call the “Four E’s of Hiring”: Efficiency, Effectiveness, Engagement, and Ethics. Essentially, every hiring process should save time, predict job/organization performance and retention, be engaging for candidates and recruiters, and be fair for all parties. With traditional, pre-AI statistics, we could easily score numerical data such as assessment responses, but we could not do the same for unstructured data such as resumes, background checks, typed responses, and interviews. Today, however, advanced AI techniques allow researchers to parse and score these types of data sources, and it’s game-changing.
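To make that concrete, here is a minimal, illustrative sketch of what “scoring unstructured data” can look like. It is not our production approach; the sample answers, ratings, and modeling choices are toy assumptions, using standard scikit-learn tools to turn interview transcripts into numeric features and fit them against expert ratings.

```python
# Hypothetical sketch: scoring free-text interview answers against expert ratings.
# This is NOT a production hiring model; it only illustrates the general idea of
# turning unstructured text into numbers that can then be validated statistically.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

# Toy data: transcribed candidate answers and ratings from trained interviewers.
transcripts = [
    "I resolved the customer's complaint by listening first and offering options",
    "I escalated the issue immediately without talking to the customer",
    "I coordinated with two teammates to meet the deadline and documented the fix",
    "I am not sure, I usually just wait for my manager to tell me what to do",
]
expert_ratings = [4.5, 2.0, 4.8, 1.5]  # e.g., a 1-5 competency scale

# Text -> TF-IDF features -> regularized linear regression.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), Ridge(alpha=1.0))
model.fit(transcripts, expert_ratings)

new_answer = ["I asked clarifying questions and proposed a plan to the customer"]
print(model.predict(new_answer))  # predicted interview score for an unseen answer
```

In practice the text representation and model would be far richer and trained on large samples, but the core move is the same: once an answer becomes a number, it can be validated and audited like any other predictor.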

We can now use AI to quantify qualitative data sources like interview responses. And once you can quantify something, you can see if it predicts outcomes that matter, like job and organizational performance — and you can also check to see if those predictions are biased against protected or other groups. Non-technology-enabled interviews have a long history of being biased; we humans are effectively bias machines, with all sorts of cognitive biases that help us quickly evaluate and interpret the massive amount of information our bodies take in every second. Traditional interviews are little more than dates: the interviewer chit-chats with the interviewee and builds a very unscientific impression of that person. But with AI, we can actually score interview responses automatically and evaluate those numerical results statistically.
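As a simple illustration of those two checks (predictive validity and group fairness), the sketch below uses invented numbers and a four-fifths-rule style comparison of selection rates. The cutoff, group labels, and data are assumptions for demonstration only, not an actual audit procedure.

```python
# Hypothetical sketch of the two checks described above:
# (1) do the AI-derived scores predict a job outcome, and
# (2) do selection decisions based on them differ across groups?
# Thresholds, group labels, and data are illustrative assumptions.
import numpy as np

scores = np.array([3.9, 2.1, 4.6, 1.8, 4.2, 3.1, 2.7, 4.4])       # AI interview scores
performance = np.array([4.1, 2.5, 4.8, 2.0, 4.0, 3.3, 2.9, 4.5])  # later job performance
group = np.array(["A", "B", "A", "B", "B", "A", "B", "A"])         # protected-class label

# Criterion validity: correlation between scores and the outcome that matters.
validity = np.corrcoef(scores, performance)[0, 1]

# Adverse impact: compare selection rates across groups (four-fifths rule heuristic).
cutoff = 3.5
rate_a = np.mean(scores[group == "A"] >= cutoff)
rate_b = np.mean(scores[group == "B"] >= cutoff)
impact_ratio = min(rate_a, rate_b) / max(rate_a, rate_b)

print(f"validity r = {validity:.2f}, adverse impact ratio = {impact_ratio:.2f}")
```

The point is not the specific metrics but that, unlike an unstructured chat, a quantified interview can be interrogated this way at all.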

At Modern Hire, we have developed a capability called Automated Interview Scoring (AIS) that does exactly this. What is important to understand is that we do not evaluate or score what a person looks like or sounds like. Those sources of data are filled with bias and irrelevant information. Our scoring begins with using only the transcribed words that a candidate speaks, because that content is what the candidate gives us to use in the hiring process. Our philosophy is that only data candidates consciously give to us for use in the decision should be scored. In addition, we provide a clear AI consent message to candidates, allowing them to opt out of AI scoring.

In the large samples of data we have studied with AIS, we have found that it can replicate the interview ratings of trained, subject matter expert interviewers. This is exciting because it happens instantaneously. But what about bias? Are these AIS scores biased against protected classes? In fact, our data has shown that AIS-generated scores are almost four times lower in bias than the scores from our trained subject matter experts. In this way, AIS reduces time and effort, replicates human ratings, and does all this with dramatically lower levels of bias. 
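(We will not go into the full methodology here, but one common yardstick for subgroup bias in industrial-organizational psychology is the standardized mean difference, or Cohen’s d, between groups. The sketch below shows how a human-versus-automated comparison on that metric could be computed. The data is entirely invented, and the metric is only one reasonable choice, not necessarily the one behind the figure above.)

```python
# Hypothetical sketch: comparing "how much bias" two sets of scores carry using the
# standardized mean difference (Cohen's d) between groups. All numbers are invented;
# the article above does not specify which bias metric was used in its analyses.
import numpy as np

def cohens_d(x, y):
    """Standardized mean difference between two groups (pooled standard deviation)."""
    nx, ny = len(x), len(y)
    pooled_sd = np.sqrt(
        ((nx - 1) * np.var(x, ddof=1) + (ny - 1) * np.var(y, ddof=1)) / (nx + ny - 2)
    )
    return (np.mean(x) - np.mean(y)) / pooled_sd

rng = np.random.default_rng(0)
# Invented example: human ratings show a larger group gap than automated scores.
human_a, human_b = rng.normal(3.6, 0.6, 200), rng.normal(3.2, 0.6, 200)
ai_a, ai_b = rng.normal(3.45, 0.6, 200), rng.normal(3.35, 0.6, 200)

print(f"human-rater d = {cohens_d(human_a, human_b):.2f}")
print(f"automated   d = {cohens_d(ai_a, ai_b):.2f}")
```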

This article is far from an endorsement of using AI indiscriminately in the hiring process. If anything, it is less a refutation of the original article and more an extension. A hammer is a tool that can be used to tear down a house or to build one. AI is also a powerful tool and, when applied in a thoughtful, careful, rigorous, scientific way, can lead to great improvements in hiring technology. But we must always be extremely careful that the solutions we create help not just organizations but also individuals. As a psychologist myself, I want to use technology to make hiring better for people, not just companies. And in this regard, we have never had a technology as useful as AI.

Eric Sydell is the EVP of Innovation at AI-based hiring platform company Modern Hire, where he oversees all research and product innovation initiatives. He is an industrial-organizational psychologist, entrepreneur, and consultant with more than two decades of experience in the recruiting technology and staffing industries. He is also coauthor of the new book Decoding Talent: How AI and Big Data Can Solve Your Company’s People Puzzle, published by Fast Company Press.

Author: Eric Sydell, Modern Hire
Source: VentureBeat
