How A.I. Can Help Remove Bias From the Recruiting Process

Whether they know it or not, everybody is influenced by some degree of bias. That bias can become an especially big problem during the recruiting process.

On average, a corporate job opening receives 250 resumes. That is a significant number of candidates to screen and compare. When unconscious bias enters the mix, recruiting and hiring pros may pass over perfectly qualified candidates for reasons that have nothing to do with their qualifications. As the Wall Street Journal explains, "Research has shown that men and women alike start to treat minorities differently within milliseconds of seeing them. Our brains automatically carve the world into in-group and out-group members and apply stereotypes within the blink of an eye."

Bias Affects Every Part of the Hiring Process

Education

According to research from Indeed, 37 percent of managers who went to top schools said they like to hire candidates who also went to prestigious universities. Only 6 percent of managers who didn't attend top schools said the same thing.

Meanwhile, 41 percent of managers who didn't go to top schools said they emphasize candidates' experience levels over their educational credentials. Eleven percent of managers who attended top schools said the same.

Gender

Using certain words in a job description can subtly encourage male applicants and discourage female applicants. Application behavior differs, too: an internal Hewlett-Packard report found that women apply to jobs only when they feel they meet 100 percent of the qualifications listed in the advertisement, whereas men apply when they feel they meet just 60 percent of those requirements.

Candidate Diversity

According to the Wall Street Journal article cited above, a major tech company incentivized recruiters to find more diverse engineering candidates in 2015. Recruiters made a valiant effort, reaching out to engineering schools in African countries, recruiting more diverse groups of interns, and making sure "that a 'buddy' from a similar demographic group took part in interviews."

Unfortunately, even these efforts didn't pay off, because final hiring decisions were still based on bias-prone metrics like the prestige of candidates' alma maters or how much previous experience they had at top tech firms.

Can A.I. Change the Game?

First, let's make it clear that artificial intelligence is not perfect – at least, not yet.

In theory, machines themselves have no biases. However, humans build the algorithms that power machine intelligences and select the historical data used to train them. Because humans are biased, their biases can seep into the A.I.s they build.

For example, some U.S. states use "crime-predicting" software to estimate how likely defendants are to reoffend. According to ProPublica, when courts used this software, "Black defendants were 77 percent more likely to be pegged as at higher risk of committing a future violent crime and 45 percent more likely to be predicted to commit a future crime of any kind."

This example shows how dangerous A.I. can be when the people programming it are not vigilant about their own biases. Fortunately, many HR tech companies are learning to be more careful, and several organizations in the field are actively working on ways to eliminate unconscious bias through A.I.

Unlike humans, A.I. programs can look objectively at large quantities of data and determine which candidates are best suited for a job. The algorithms powering recruiting and hiring A.I.s can be programmed to ignore gender, age, and ethnicity, eliminating much of the bias that plagues recruiting processes.
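As a concrete illustration, here is a minimal sketch of attribute-blind screening. The field names, the skill-overlap scoring rule, and the sample applicants are hypothetical, invented for this example rather than drawn from any real product.

    # Minimal sketch of attribute-blind candidate screening.
    # Field names, the scoring rule, and the sample data are hypothetical.

    PROTECTED_FIELDS = {"name", "gender", "age", "ethnicity", "date_of_birth"}

    def anonymize(candidate: dict) -> dict:
        """Drop fields that could reveal a protected characteristic."""
        return {k: v for k, v in candidate.items() if k not in PROTECTED_FIELDS}

    def skill_score(candidate: dict, required_skills: set) -> float:
        """Score a candidate purely on overlap with the job's required skills."""
        skills = set(candidate.get("skills", []))
        return len(skills & required_skills) / len(required_skills)

    applicants = [
        {"name": "A. Jones", "gender": "F", "age": 29, "skills": ["python", "sql"]},
        {"name": "B. Smith", "gender": "M", "age": 45, "skills": ["python"]},
    ]
    required = {"python", "sql"}

    # Anonymize first, so protected fields never reach the scoring step.
    ranked = sorted((anonymize(a) for a in applicants),
                    key=lambda c: skill_score(c, required),
                    reverse=True)
    print(ranked)  # highest skill overlap first, with no identifying fields

The key design choice is the ordering: stripping protected fields before scoring means the ranking step cannot use them, even accidentally. In practice, teams must also watch for proxy fields – a zip code or an alma mater – that correlate with the attributes being removed.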

Once we learn to develop A.I. systems perfectly, they will be indispensable tools for anyone who wants to make hires completely free of bias. While we're not quite there yet, existing A.I. technologies are making great strides.

There are a few things we can do to speed up the process. From the beginning, we need to train A.I.s to ask the right questions. Moreover, those questions need to be rigorously reviewed for potential bias, which can hide in seemingly innocuous language patterns.
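One way such a review can be partly automated is by scanning question text or job-ad copy for gender-coded terms. The short word lists below are an illustrative sample inspired by published research on gendered wording in job advertisements (Gaucher, Friesen, and Kay, 2011); a real reviewer would need a much larger, validated lexicon.

    import re

    # Sample gender-coded word lists; illustrative only, not exhaustive.
    MASCULINE_CODED = {"aggressive", "dominant", "competitive", "fearless", "ninja"}
    FEMININE_CODED = {"supportive", "nurturing", "collaborative", "interpersonal"}

    def flag_coded_terms(text: str) -> dict:
        """Return any gender-coded terms found in the text, grouped by category."""
        words = set(re.findall(r"[a-z]+", text.lower()))
        return {
            "masculine_coded": sorted(words & MASCULINE_CODED),
            "feminine_coded": sorted(words & FEMININE_CODED),
        }

    print(flag_coded_terms("We want an aggressive, competitive ninja developer."))
    # {'masculine_coded': ['aggressive', 'competitive', 'ninja'], 'feminine_coded': []}

A flagged term isn't automatically disqualifying, but it prompts a human reviewer to ask whether the wording is doing quiet filtering work.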

Using A.I. to focus on candidate skills is a step in the right direction. Here, we can learn a lesson from orchestras. In the 1970s, women accounted for only about 5 percent of the musicians in professional orchestras. Once orchestras adopted blind auditions – having musicians play behind screens so judges could only hear their performance – that figure rose to 25 percent.

While not every interview can be blind, allowing A.I. to assess candidates purely on their skills and abilities can help highly qualified applicants get further in the process than unconscious bias might otherwise have allowed.

A.I. chatbots are often scripted to ask the same questions of every candidate regardless of background, gender, age, or ability. A human could follow the same script exactly but still betray bias in tone or body language; a computer program has no such problem. The same goes for candidate experience: an A.I. follows the same protocol for every applicant, ensuring a consistent experience every time.
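A fixed script is also trivial to implement, which is part of its appeal. The sketch below is hypothetical – the questions and function names are invented – but it shows how consistency falls out of the design: every candidate gets the same questions, in the same order, with the same wording.

    # Hypothetical fixed-script screening bot: consistency by construction.
    SCRIPT = [
        "Which of the role's required skills have you used professionally?",
        "Describe a project where you applied those skills.",
        "When would you be available to start?",
    ]

    def run_screen(get_answer) -> list[tuple[str, str]]:
        """Ask every scripted question in order and record answers verbatim."""
        return [(question, get_answer(question)) for question in SCRIPT]

    # Example: wire the bot to console input for a quick test.
    if __name__ == "__main__":
        transcript = run_screen(input)
        print(transcript)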

Tech advisory firm Gartner predicts that by 2020, 85 percent of interactions between consumers and organizations will take place without a human on the enterprise side. As we hand over more and more responsibilities to computers, it becomes more important than ever to ensure our A.I.s don't share our biases. If we are vigilant, A.I. could change hiring for the better – forever.

A version of this article originally appeared on Money Inc.

Noel Webb is cofounder and CEO of Karen.ai, a "cognitive recruiting assistant."