“An algorithm must be seen to be believed.” -Donald Ervin Knuth
“We’re studying a phenomenon called ‘algorithm aversion’…We basically found that if you tell people, ‘You can go with an algorithm that is going to give you some advice, or you can go with your own opinion’…they’re actually OK with saying, ‘I’ll use the algorithm.’ However, once you give them some practice and let them see how their algorithm performs, all of a sudden they don’t want to use it anymore. That’s because they see the algorithm make mistakes. Once they see algorithms or computers make mistakes, they don’t want to use it anymore, even though the algorithm or computer is going to make a smaller mistake or more infrequent mistakes than they are going to make.” -Joseph Simmons
MBA students often refer to products and services that are approaching obsolescence as “horse and buggy whips” because the automobile industry severely disrupted that business. What that glib remark fails to acknowledge is that while whips had no place in a new driving society, many carriage part makers transitioned just fine into manufacturing automobile parts. Look up the Timken Company sometime if you don’t believe me. Today, many people are shouting that the dawn of automation is here, and that robots are going to displace humans in every industry, including HR. Not so fast.
Computers won’t replace humans in every industry quickly, particularly in HR. That said, some of the most exciting work in HR has to do with data. Firms are increasingly finding that, counter to what our gut tells us about the qualities a certain job candidate should have, the data often tells a different story. Xerox Services learned that when it came to predicting retention or productivity for call center employees, previous experience had no bearing on those qualities, but proximity of home to work definitely did. Here is what algorithm aversion seems to do, though: say our algorithm gives us the green light on a candidate, but we later find them to be a really bad hire. We circle back on our recollections about them: “I knew something was off about them. I’ve always said there’s no substitute for experience, and this hire had serious issues with training. I’m going to trust my own judgement instead of a bunch of zeroes and ones. You can’t quantify intangibles.” We should do close to the opposite: feed that outcome back into the algorithm so it can account for our observed experience and improve!
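The feedback loop described above can be sketched in a few lines. This is a toy illustration, not Xerox’s actual model: the feature names, weights, and learning rate are all hypothetical, and the “model” is a bare-bones logistic predictor updated with one stochastic-gradient step per observed outcome.

```python
# Toy sketch of folding an observed hiring outcome back into a model
# instead of abandoning the model after one bad call. All features
# and numbers are hypothetical illustrations.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(weights, features):
    """Predicted probability that a candidate will work out."""
    z = sum(w * x for w, x in zip(weights, features))
    return sigmoid(z)

def update(weights, features, outcome, lr=0.1):
    """One gradient step on log loss for a single observed
    outcome (1 = good hire, 0 = bad hire)."""
    error = predict(weights, features) - outcome
    return [w - lr * error * x for w, x in zip(weights, features)]

# Hypothetical features: [bias, lives_close_to_work, years_experience]
weights = [0.0, 1.2, 0.05]      # toy starting model
candidate = [1.0, 1.0, 8.0]     # lives close to work, 8 years' experience

before = predict(weights, candidate)          # model green-lights the hire
weights = update(weights, candidate, outcome=0)  # ...but the hire went badly
after = predict(weights, candidate)

# The updated model now rates similar candidates lower, rather than
# being thrown out wholesale.
assert after < before
```

The point is not the arithmetic but the habit: every surprising outcome is a training example, and a model that ingests its own mistakes gets better, while a manager who rationalizes them does not.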
But data doesn’t predict everything. Google Maps occasionally leads my car to roads that are closed, and I have to figure out a detour. Data can’t yet pick up on a candidate’s lack of confidence or manipulative personality. Data can’t pick up on an employee struggling to fit into a new business unit’s culture, or on when sexual harassment is taking place. Human behavior can often be very challenging to quantify.
Author – Taylor Bradford