When Convenience Gets Under Your Skin

Whether it is Amazon planning stores without cash registers, clubs letting you buy drinks without your wallet, or companies tracking the movement of just about any goods you can think of, RFID (radio-frequency identification) is part of our lives. But what if your CEO or CTO came to you and said, “What if our employees had an RFID chip implanted in them?”

As with a lot of tech, the argument in favor is convenience. Employees could access buildings, rooms, computers, vending machines, etc., just by walking past an RFID reader. No more reaching for or losing key cards.

So, a company in Wisconsin is trying it out. Non-squeamish volunteers will get the chip (about the size of a grain of rice) implanted in their hands between the thumb and forefinger.

I will avoid making ominous comparisons to 1984. But I am curious about the real productivity or engagement benefits of doing this. How much time is actually wasted fumbling for security cards? Does this help prevent any security breaches? I am just not seeing the ROI, so I doubt that many companies will adopt this.

I am not anti-technology, or else this blog would show up on a piece of paper. Nor do I assume every tech idea is inherently good or bad. However, business decisions that affect employees should be made on something beyond, “This would be cool!” Someone at the companies adopting this technology did just that (probably after an amazing sales pitch). Does it establish them as having a forward-thinking, tech-enabled culture? Sure. Does it also show them favoring style over substance? Again, I think the answer is, “Yes.”

We can help companies establish a culture of good decision making by facilitating data-driven discussions. Questions like, “What are our goals?” and “How will we determine whether this innovation is successful?” are a good way to separate a fad from an effective organizational initiative.

Eliminating Subtle Age Bias

Since age bias is something that could affect nearly all HR professionals, I am surprised that it does not get more attention. But with the average age of employees in the U.S. going up (see here) and companies likely to recruit more older workers while unemployment sits near recent lows, we are likely to see more attention paid to it, particularly in the technology field.

As with most bias, it can be introduced in subtle ways. For example, the term “digital native” describes those born roughly after 1990 who have had current technology (the internet, smartphones, etc.) pretty much their whole lives. A quick Indeed.com search shows many jobs where “digital native” is part of the description. Put another way, those older than 35ish should think twice before applying. Similarly, there is a whole literature (this article is an example) on how gender-loaded terms in job postings affect who will respond to them.
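To make that screening idea concrete, here is a minimal sketch in Python of how you might flag coded language before a posting goes live. The function name and the abbreviated term lists are my own illustrations, not a validated research instrument:

    import re

    # Illustrative (not validated) lists of age- and gender-coded phrases.
    CODED_TERMS = {
        "age": ["digital native", "young and energetic", "recent graduate"],
        "gender": ["rockstar", "ninja", "dominant", "aggressive"],
    }

    def flag_coded_language(posting: str) -> dict:
        """Return each category of coded terms found in a job posting."""
        found = {}
        for category, terms in CODED_TERMS.items():
            hits = [t for t in terms
                    if re.search(r"\b" + re.escape(t) + r"\b", posting, re.IGNORECASE)]
            if hits:
                found[category] = hits
        return found

    print(flag_coded_language("Seeking a digital native rockstar to join our team."))
    # -> {'age': ['digital native'], 'gender': ['rockstar']}

Even a simple screen like this, run before a posting is published, can catch phrases that quietly narrow your applicant pool.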

Now, I get that when you are advertising for tech jobs, you are looking for employees who are completely comfortable in a digital environment and communicating with others who are. But those are behaviors that can be assessed with valid pre-employment tests, without making assumptions about a person’s age.

And that is really the point about implicit bias: we make assumptions about groups without understanding people as individuals. In employee selection, we face the challenge of creating processes that treat everyone fairly while still learning about candidates as individuals. It is a difficult needle to thread, but one that our businesses depend on us to thread well. Using a combination of unbiased language and valid pre-employment tools can help us get there.

Or, if you would rather beat them than join them, you can open an art gallery that only focuses on artists ages 60 and older.

Thanks for coming by!
