Do Women Have Less Managerial “Potential” Than Men?

Whether it is for succession planning or leadership development, many organizations are concerned about a person’s potential.  That is, in the future, will this person be performing at their current level (at least relative to peers), or is there something about them that will lead to an acceleration in their performance so that they’ll rise above many others?  This is often reflected in faster promotions as well.

The idea that managers can use their judgment to distinguish between who is a good performer now and who has potential is ridiculous.  What objective information would tell you that an average performer now will blossom in 5 years?  Initiative?  Curiosity?  Of course, these things impact current performance, so if we assume that performance and potential are not the same thing, they tell us little about potential.

The other example frequently given is whether a person in a given role can perform the next role up.  For instance, can someone in a technical job supervise others, or can a current manager be an effective executive?  This has less to do with potential than with whether a person has a skill set that they have not been given an opportunity to show in their current role.  Note that there are plenty of valid ways (Assessment Centers being one) to determine whether people have supervisory/management/executive skills, and these are far more accurate than judgment.

Organizations find potential sexier than evaluating skills, so potential gets a lot more attention.  And, since objectively measuring potential is time-consuming and expensive, and no one holds executives accountable for being wrong about potential, we get people sitting around making up reasons why employee A is a “high-po” and employee B is not.  You will not be shocked to find out that this approach is loaded with bias against women.

Whether it is due to unconscious bias or to stereotypes that penalize women for being as self-promoting as men, female employees (at least according to the large study cited above) get the short end of the potential stick.  For example, in that research, women had higher performance ratings but lower potential ratings.  This led to fewer promotions for women relative to men.

Now, it could be that the lower potential ratings were justified.  Perhaps men with similar potential ratings went on to perform better than the women down the road.  In the study, the opposite was true: women with lower potential ratings outperformed men in the future.

I think we can draw a few conclusions from this study (and common sense):

  1. Organizations that are using potential ratings should look at their impact on women (and minorities).  They may not like what they see.
  2. No matter how much rating systems are dressed up (I’m looking at you, Nine Box), they are subject to huge amounts of bias and stereotyping.  Organizations that use this approach should track the accuracy of the judgments and rely only on raters who are frequently right.
  3. If you are interested in potential, define it (e.g., promotions, pay increases, etc.) and look at what predicts it over the long term after controlling for sex (since those promotional decisions are likely biased as well).  Then you have a system for understanding potential.
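Point 3 above can be sketched as a toy analysis. Everything below is invented for illustration (the records, ratings, and outcomes are made up); the idea is simply to define the outcome (promotion within 5 years) and examine a predictor within each sex separately, so that sex differences in promotion decisions do not masquerade as a validity signal.

```python
# Hypothetical sketch: does a predictor relate to a defined "potential"
# outcome (promotion within 5 years) once we look within each sex?
from statistics import mean

# Toy records: (sex, performance_rating, promoted_within_5_years)
employees = [
    ("F", 4.5, 1), ("F", 4.2, 1), ("F", 3.1, 0), ("F", 2.8, 0),
    ("M", 4.4, 1), ("M", 3.9, 0), ("M", 3.0, 1), ("M", 2.7, 0),
]

def stratified_validity(records):
    """Within each sex, compare mean performance of promoted vs. passed over.

    A large within-group gap suggests the predictor carries real signal;
    comparing groups pooled together would confound it with biased
    promotion decisions.
    """
    out = {}
    for sex in {r[0] for r in records}:
        grp = [r for r in records if r[0] == sex]
        promoted = [r[1] for r in grp if r[2] == 1]
        passed_over = [r[1] for r in grp if r[2] == 0]
        out[sex] = round(mean(promoted) - mean(passed_over), 2)
    return out

print(stratified_validity(employees))
```

With real data you would use more predictors and a longer horizon, but the stratification logic is the same.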

Organizations should plan for succession so that those being considered get the training and development opportunities they need to succeed.  However, using human judgment to identify who has potential and who does not is a fool’s errand.  It is unfair to women, which means it is wrong and negatively impacts your business.  There are ways to predict future performance in your organization.  You just need to put in the work to discover them.

Many thanks to Dennis Adsit for sending me the cited article.

Training Hiring AI Not to be Biased

Artificial Intelligence (AI) and Machine Learning (ML) play integral roles in our lives.  In fact, many of you probably came across this blog post thanks to one of these systems.  AI is the idea that machines can be taught to do tasks (everything from running search engines to driving cars).  ML is an application of AI where machines learn for themselves based on available data.

ML is gaining popularity in the evaluation of job candidates because, given large enough datasets, the process can find small, but predictive, bits of data and maximize their use.  This idea of letting the data guide decisions is not new.  I/O psychologists used this kind of process when developing work/life inventories (biodata) and examining response patterns of test items (item response theory—IRT).  These approaches have their advantages (being atheoretical, they are free from pre-conceptions) and problems (the number of participants needs to be very large so that results are not subject to peculiarities of the sample).  ML accelerated the ideas behind both biodata and IRT, which I think has led to solutions that don’t generalize well.  But, that’s for another blog post.

What is important here is the data made available and whether that data is biased.  For instance, if your hiring algorithm includes zip codes or a classification of the college/university attended, it has race baked in.  This article has several examples of how ML systems are only as good as the data that goes in, leading to all kinds of biases (and not just human ones).  So, if your company wants to avoid bias based on race, sex, and age, it needs to dig into each element the ML is looking at to see if it is a proxy for something else (for instance, many hobbies are sex-specific).  You then have to ask yourself whether the predictive value of that bit is worth the bias it carries.
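One simple way to probe whether a feature (like zip code) is a demographic proxy is to check how skewed each feature level is relative to the overall applicant pool. The sketch below uses made-up applicants and group labels; it is a first-pass screen, not a full fairness audit.

```python
# Hedged sketch: flag candidate features whose levels are heavily skewed
# toward one demographic group. All data below is invented.
from collections import defaultdict

def group_skew_by_level(records, feature_key, group_key, group_value):
    """For each level of a feature (e.g., each zip code), report the share
    of applicants from one demographic group. Levels far from the overall
    base rate suggest the feature is acting as a demographic proxy."""
    counts = defaultdict(lambda: [0, 0])  # level -> [group hits, total]
    for r in records:
        c = counts[r[feature_key]]
        c[1] += 1
        if r[group_key] == group_value:
            c[0] += 1
    return {lvl: round(hits / total, 2) for lvl, (hits, total) in counts.items()}

applicants = [
    {"zip": "90001", "race": "A"}, {"zip": "90001", "race": "A"},
    {"zip": "90001", "race": "B"}, {"zip": "90210", "race": "B"},
    {"zip": "90210", "race": "B"}, {"zip": "90210", "race": "A"},
]
print(group_skew_by_level(applicants, "zip", "race", "A"))
```

If the overall pool is 50% group A but one zip code is 90% group A, the model can learn race from the zip code whether you intended it to or not.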

Systemic bias in hiring is insidious, and we need to hunt it down.  It is not enough to say, “We have a data-driven system” and presume that it is not discriminatory.  If the ML driving it was trained on inadvertently biased data, it will perpetuate that bias.  We need to check the elements that go into these systems to ensure that they are valid and fair to candidates.

I’d like to thank Dennis Adsit for recommending the article from The Economist to me.

Blacks Welcome to Apply

The aftermath of George Floyd’s murder has many of us asking, “What can I do better?” when it comes to ending racism.  This is critical in that racial bias in hiring has changed little in 30 years.  HR and I/O psychology play a unique role in that we create the processes that allow for equal employment.

None of the suggestions below requires lowering standards.  Rather, they provide a framework for applying standards in an equitable way.  Science and good sense point us in this direction with these actions:

  1. Widen your recruitment net.  If you recruit from the same places, your workforce will always look the same.  There is talent everywhere—go find it, whether at a high school in a different part of town or at a historically black college/university.
  2. Make resumes anonymous.  The science is very clear that anonymous resumes reduce racial and gender bias.  The process is not expensive to implement and works for all kinds of businesses.
  3. Examine minimum qualifications carefully.  Whether based on job experience or education, these can serve as barriers to black job candidates.  The groundbreaking employment discrimination lawsuit, Griggs v. Duke Power, was based on an invalid requirement that supervisors needed a high school diploma.  Don’t get me wrong—I want my surgeon to be an M.D.  But, do your entry-level positions really need a college degree?  Do your managers really need to be MBAs?  If you analyze the relationships between education/experience and job performance, you are likely to find that they are not as strong as you think.
  4. Use validated pre-employment and promotional tests.  As a rule, validated pre-employment tests do not adversely affect blacks and are certainly less biased than interviews (see below).  This is particularly true for work sample tests (show me what you can do) and personality tests.  However, cognitive ability tests, especially speeded ones, may lead to discrimination.  If you use them, analyze your cutting score to ensure that it is not set so high that qualified candidates are being screened out.
  5. Reduce reliance on interviews.  Interviews can be biased by race and ethnicity.  And, more often than not, they are far less valid than tests.  We need to convince hiring managers that they are not good judges of talent—very few people are.  Remember, interviewing someone to see if s/he is a “good fit” is another way of saying, “this person is like me.” 
  6. Make your interviews more structured.  This can be achieved by asking candidates the same questions and using an objective scoring methodology.  Adding structure to the interview process can reduce bias (and improve validity).
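Item 6 can be made concrete with a simple rubric scorer: every candidate gets the same questions, and each answer is rated against a behavioral anchor rather than a gut feeling. The questions, anchors, and scale below are invented for illustration.

```python
# Minimal structured-interview scorer. Question text and behavioral
# anchors are hypothetical examples, not a validated instrument.

RUBRIC = {
    "Describe a time you resolved a conflict on your team.": {
        3: "Named the conflict, took action, described the outcome",
        2: "Took action but outcome unclear",
        1: "Vague or no concrete example",
    },
    "Walk me through how you prioritize competing deadlines.": {
        3: "Concrete prioritization method with an example",
        2: "General method, no example",
        1: "No discernible method",
    },
}

def score_candidate(ratings):
    """Average the per-question anchor ratings (1-3 scale).

    Requiring a rating for every question forces interviewers to cover
    the same ground with every candidate.
    """
    assert set(ratings) == set(RUBRIC), "every candidate answers every question"
    return round(sum(ratings.values()) / len(ratings), 2)

print(score_candidate({q: 3 for q in RUBRIC}))
```

The point is not the arithmetic; it is that two interviewers looking at the same answer and the same anchor will land on similar scores.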

You may already be doing some of the above.  I would encourage you to do all of them.  The outcome is fairness AND better hires.  What could be better than that?

Are Organizations Becoming Less Biased?

I’ve written quite a bit about bias in this blog. It is an important topic to me because I believe that people in HR and industrial psychology can be gatekeepers to a fairer society while improving organizational performance. Of course, bias in employment is merely an extension of what happens in the greater society. One of the assumptions about bias is that it is fairly stable, so we have to almost trick people into being fair.

However, this study has some better news. The analysis indicates that over a 20-year period, bias based on skin color and sexual orientation has been reduced. However, bias against weight has increased, and attitudes toward age and disability have stayed the same. Strangely, gender bias is not addressed.

The study raises many interesting questions about whether these changes are being experienced across demographic groups or only within specific ones. It also suggests some questions for HR practice, such as:

  • What steps can we take to reduce bias in hiring based on weight? Phone interviews instead of live ones?
  • Do we need to change our anti-discrimination training to focus more on weight and less on other issues?

The data does seem to show stronger biases against characteristics we perceive as choices (being overweight) than against those we have always perceived as innate (skin color) or that the culture now thinks of as such (sexual orientation).

Each organization can see where its implicit bias “blind spots” are by analyzing its hiring and promotional data. I understand that this can surface some uncomfortable truths. But, it will also allow for focus on areas where bias can be reduced.
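One concrete starting point for analyzing hiring data this way is the EEOC’s four-fifths rule: compare selection rates across groups, and treat a ratio below 0.80 as a red flag worth investigating. The numbers below are invented for illustration.

```python
# Four-fifths (adverse impact) check on selection rates.
# Applicant and hire counts here are made up.

def adverse_impact_ratio(hired_a, applied_a, hired_b, applied_b):
    """Ratio of the lower selection rate to the higher one.

    Under the conventional four-fifths rule, values below 0.80
    suggest possible adverse impact against the lower-rate group.
    """
    rate_a = hired_a / applied_a
    rate_b = hired_b / applied_b
    return round(min(rate_a, rate_b) / max(rate_a, rate_b), 2)

# e.g., 30 of 100 group-A applicants hired vs. 18 of 100 group-B applicants
print(adverse_impact_ratio(30, 100, 18, 100))  # 0.6, below the 0.80 threshold
```

A low ratio doesn’t prove bias by itself, but it tells you exactly which stage of the process to dig into next.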

Reducing Bias Through Structure

Finding examples of racial or gender bias in hiring or job evaluations is not hard.  The latest comes from a survey of lawyers.  My sense is that the results did not come from a random sample of attorneys, so I would not quote the group differences as gospel.  The authors recommended some specific ways that law firms and companies that hire lawyers can correct the bias in their HR processes.  There were two things I took from the study:

  • Many, but not all, of the recommendations came from a solid research base. It was good to see that their hiring suggestions included behaviorally based interviews, skills-based assessments, and behavioral definitions of culture.  Each of these suggestions introduces objectivity and structure into the hiring process.
  • Given that attorneys have either brought employment lawsuits or have had to defend companies against them since 1964, did it really take this long to come up with some hiring process recommendations?

My consulting experience tells me that people who hire for professional jobs seem to think there is more magic and intuition in selection than those who staff for other types of positions.  This is especially true when hiring for a job they used to have.  They could not be more wrong.  Every job has a set of critical skills and abilities required to do it well.  It is possible to objectively measure these in candidates.  Doing so will likely reduce bias.

Eliminating Subtle Age Bias

Since age bias is something that could affect nearly all HR professionals, I am surprised that it does not get more attention. But, with the average age of employees in the U.S. going up (see here) and companies likely to recruit more older workers due to the unemployment rate being near recent lows, we are likely to see more attention paid to it, particularly in the technology field.

As with most bias, it can be introduced in a subtle way. For example, the term “digital native” describes those born roughly after 1990 who have had current technology (internet, smartphones, etc.) pretty much their whole lives. A quick Indeed.com search shows many jobs where “digital native” is part of the description. Put another way, those older than 35ish should think twice before applying. Similarly, there is a whole literature (this article is an example) on how gender-loaded terms in job postings affect who will respond to them.

Now, I get that when you are advertising for tech jobs, you are looking for employees who are completely comfortable in a digital environment and in communicating with others who are. But those are behaviors that can be assessed with valid pre-employment tests, without making assumptions about a person’s age.

And that is really the point about implicit bias—we make assumptions about groups without understanding people as individuals. We face a challenge in employee selection of creating processes that treat everyone fairly, but at the same time learn about them as individuals. It is a challenging needle to thread, but one that our businesses depend on us to do well. Using a combination of unbiased language and valid pre-employment tools can help us get there.

Or, if you would rather beat them than join them, you can open an art gallery that only focuses on artists ages 60 and older.

Blind Hiring

I wrote a few weeks ago about Intel’s drive to diversify its workforce. Regular readers know that I write about bias occasionally. It’s good that the topic makes it to the mainstream media occasionally when not related to a lawsuit.

The article talks about techniques to reduce bias. Some are old (truly blind auditions for musicians) and others are new, such as software that provides only the relevant hiring info without showing a person’s name, school attended, or other information that would potentially bias the hiring manager. This puts a premium on validated tests, which I like. Though, I’m sure that there are some readers who would argue that some of these tests are biased as well, but that’s a topic for another post.

This is all well and good, but as any logistics or customer service person will tell you, it’s the last mile that really matters. I can have as diverse of a candidate pool as I want, but if there is bias in the interviewing process, I will be rejecting qualified candidates for non-valid reasons. So, what’s a hiring manager to do?

First, give less weight to the interview and/or make it more valid. Why this barely-better-than-a-coin-flip technique makes or breaks hiring decisions while proven and validated techniques are shoved to the side is beyond me. OK—I get it. People want to feel in control and have buy-in to the hiring process. But can we at least be more rational about it? Interview scores should be combined with other data (with appropriate weighting), and the overall composite should drive the hiring decision, not one unreliable data point.
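The weighting idea above can be sketched in a few lines. The predictor names, scores, and weights below are invented; in practice the weights would come from a validation study, not from guesswork.

```python
# Hypothetical weighted composite of predictor scores, with the
# interview deliberately weighted lightly.

def composite_score(scores, weights):
    """Weighted average of predictor scores (assumed on a common scale)."""
    assert set(scores) == set(weights), "every predictor needs a weight"
    total_w = sum(weights.values())
    return round(sum(scores[k] * weights[k] for k in scores) / total_w, 2)

candidate = {"work_sample": 82, "cognitive_test": 74, "interview": 90}
# Weight the interview lightly, reflecting its lower validity
weights = {"work_sample": 0.5, "cognitive_test": 0.3, "interview": 0.2}
print(composite_score(candidate, weights))
```

Here a strong interview cannot rescue weak work samples on its own, which is exactly the point: the decision rests on the composite, not on one conversation.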

Second, why not blind interviewing? Hear me out. How many jobs really require someone to think on their feet and provide oral answers to complex questions? Sure, there are some (sales, for instance), but not that many. Why not have candidates submit written answers to interview questions? The scoring would be more reliable (evaluating grammar/spelling could be optional for jobs where it’s not critical), and accents, gender, and skin color would be taken out of the equation. Think about it.

Lastly, a diverse workforce is a result of a valid and inclusive selection process. When companies approach it the other way (working backwards from hiring goals by demographic group), they miss the point. Diversity isn’t about filling buckets. It’s about providing equal opportunity every step of the way when hiring.

For more information on valid pre-employment testing hiring practices, contact Warren Bobrow.
