Training Hiring AI Not to Be Biased

Artificial Intelligence (AI) and Machine Learning (ML) play integral roles in our lives.  In fact, many of you probably came across this blog post because of one of these systems.  AI is the broad idea that machines can be taught to do tasks (everything from running search engines to driving cars).  ML is an application of AI in which machines learn for themselves from available data.

ML is gaining popularity in the evaluation of job candidates because, given large enough datasets, the process can find small, but predictive, bits of data and maximize their use.  This idea of letting the data guide decisions is not new.  I/O psychologists used this kind of process when developing work/life history inventories (biodata) and when examining response patterns on test items (item response theory, or IRT).  These approaches have their advantages (being atheoretical, they are free from pre-conceptions) and their problems (the number of participants needs to be very large so that results are not driven by peculiarities of the sample).  ML accelerated the ideas behind both biodata and IRT, which I think has led to solutions that don’t generalize well.  But, that’s for another blog post.

What is important here is the data made available and whether that data is biased.  For instance, if your hiring algorithm includes zip codes or a classification of the college/university attended, it has race baked in.  This article has several examples of how ML systems are only as good as the data that goes into them, leading to all kinds of biases (and not just human ones).  So, if your company wants to avoid bias based on race, sex, and age, it needs to dig into each element the ML is looking at to see if it is a proxy for something else (for instance, many hobbies are sex specific).  You then have to ask yourself whether the predictive value of that bit is worth the bias it carries.  A sketch of one way to run that proxy check follows below.
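To make that audit concrete, here is a minimal sketch, in Python with pandas and scikit-learn, of one way to run the proxy check: try to predict the protected attribute from a single feature and treat above-chance accuracy as a red flag.  Everything here (the applicant DataFrame, column names like zip_code, and the 0.60 threshold) is a hypothetical illustration, not a reference implementation.

```python
# Sketch: flag features that act as proxies for a protected attribute.
# Assumes a pandas DataFrame of historical applicant data with a "race"
# column; all column names and the 0.60 threshold are illustrative only.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder

def proxy_strength(df: pd.DataFrame, feature: str, protected: str = "race") -> float:
    """Mean cross-validated accuracy of predicting the protected attribute
    from a single feature.  Near chance = weak proxy; well above chance
    means the feature quietly encodes the protected attribute."""
    model = make_pipeline(
        OneHotEncoder(handle_unknown="ignore"),  # treat the feature as categorical
        LogisticRegression(max_iter=1000),
    )
    return cross_val_score(model, df[[feature]], df[protected], cv=5).mean()

# Usage (hypothetical data): check every feature the hiring model sees.
# for col in ["zip_code", "college", "hobbies"]:
#     score = proxy_strength(applicants, col)
#     if score > 0.60:  # illustrative cutoff; chance level depends on class balance
#         print(f"{col} predicts race {score:.0%} of the time -- investigate before use")
```

The same idea works for sex or age.  Any feature that strongly predicts a protected attribute forces the trade-off question above: is its predictive value worth the bias it carries?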

Systemic bias in hiring is insidious and we need to hunt it down.  It is not enough to say, “We have a data-driven system” and presume that it is not discriminatory.  If the ML driving it was trained on inadvertently biased data, it will perpetuate that bias.  We need to check the elements that go into these systems to ensure that they are valid and fair to candidates.

I’d like to thank Dennis Adsit for recommending the article from The Economist to me.

Blacks Welcome to Apply

The aftermath of George Floyd’s murder has many of us asking, “What can I do better?” when it comes to ending racism.  This is critical because racial bias in hiring has changed little in 30 years.  HR and I/O psychology play a unique role in that we create the processes that allow for equal employment.

None of the suggestions below requires lowering standards.  Rather, they provide a framework for applying standards in an equitable way.  Science and good sense point us in this direction with these actions:

  1. Widen your recruitment net.  If you recruit from the same places, your workforce will always look the same.  There is talent everywhere, whether at a high school in a different part of town or at a historically black college/university, so go find it.
  2. Make resumes anonymous.  The science is very clear that anonymous resumes reduce racial and gender bias.  It is not an expensive process to implement, and it works for all kinds of businesses.
  3. Examine minimum qualifications carefully.  Whether based on job experience or education, these can serve as barriers to black job candidates.  The groundbreaking employment discrimination lawsuit, Griggs v. Duke Power, was based on an unvalidated high school diploma requirement.  Don’t get me wrong: I want my surgeon to be an M.D.  But do your entry-level positions really need a college degree?  Do your managers really need MBAs?  If you analyze the relationships between education/experience and job performance, you are likely to find that they are not as strong as you think (the first sketch after this list shows a quick way to run this check).
  4. Use validated pre-employment and promotional tests.  As a rule, validated pre-employment tests do not adversely affect black candidates and are certainly less biased than interviews (see below).  This is particularly true for work sample tests (show me what you can do) and personality tests.  However, cognitive ability tests, especially speeded ones, may lead to discrimination.  If you use them, analyze your cutting score to ensure that it is not set so high that qualified candidates are being screened out (the second sketch after this list applies the standard four-fifths check).
  5. Reduce reliance on interviews.  Interviews can be biased by race and ethnicity.  And, more often than not, they are far less valid than tests.  We need to convince hiring managers that they are not good judges of talent; very few people are.  Remember, interviewing someone to see if he or she is a “good fit” is another way of saying, “this person is like me.”
  6. Make your interviews more structured.  This can be achieved by asking candidates the same questions and using an objective scoring methodology.  Adding structure to the interview process can reduce bias (and improve validity).
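For point 3, a quick first pass is the point-biserial correlation between a yes/no credential and performance ratings.  This minimal sketch uses invented numbers purely for illustration; run the same calculation against your own performance data to see how weak the degree-to-performance link may actually be.

```python
# Sketch: how strongly does a degree requirement actually predict performance?
# Degree flags and performance ratings below are made up for illustration.
from statistics import mean, stdev

def point_biserial(flags, ratings):
    """Correlation between a yes/no credential and a continuous rating."""
    with_cred = [r for f, r in zip(flags, ratings) if f]
    without = [r for f, r in zip(flags, ratings) if not f]
    p = len(with_cred) / len(flags)  # proportion holding the credential
    return (mean(with_cred) - mean(without)) / stdev(ratings) * (p * (1 - p)) ** 0.5

has_degree = [1, 1, 0, 0, 1, 0, 1, 0, 1, 0]
performance = [3.8, 3.1, 3.5, 2.9, 3.4, 3.6, 3.0, 3.3, 3.7, 3.2]
print(f"degree vs. performance: r = {point_biserial(has_degree, performance):.2f}")
```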
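And for point 4, the standard check from the EEOC’s Uniform Guidelines is the four-fifths (80%) rule: no group’s selection rate at your cutoff should fall below 80% of the highest group’s rate.  In this minimal sketch the scores and group labels are made up; only the rule itself is standard.

```python
# Sketch: check a test cutoff against the four-fifths (80%) rule.
# Scores and group labels below are made up for illustration.
from collections import defaultdict

def selection_rates(scores, groups, cutoff):
    """Return each group's pass rate at the given cutoff."""
    passed = defaultdict(int)
    total = defaultdict(int)
    for score, group in zip(scores, groups):
        total[group] += 1
        passed[group] += score >= cutoff
    return {g: passed[g] / total[g] for g in total}

def four_fifths_check(rates):
    """Flag groups whose selection rate is under 80% of the highest rate."""
    top = max(rates.values())
    return {g: (r / top, r / top >= 0.80) for g, r in rates.items()}

rates = selection_rates(
    scores=[88, 72, 95, 61, 81, 84, 58, 90, 67, 73],
    groups=["A", "B", "A", "B", "B", "A", "B", "A", "B", "A"],
    cutoff=80,
)
for group, (ratio, ok) in four_fifths_check(rates).items():
    print(f"group {group}: impact ratio {ratio:.2f} -> {'ok' if ok else 'adverse impact'}")
```

If lowering the cutoff barely changes who succeeds on the job but pushes the impact ratio above 0.80, the original cut score was probably set higher than the job requires.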

You may already be doing some of the above.  I would encourage you to do all of them.  The outcome is fairness AND better hires.  What could be better than that?

Thanks for coming by!
