- Biased data leads to biased analytics, and unlawful discrimination poses a significant risk to employers.
- People need to be at the core of decision making. In the eyes of the law, employers—not tools—are liable for discrimination.
- Plan for bias so you're on the lookout. Train users how to use the tool and interpret its findings. And carefully evaluate your results, particularly those based on gender, race, and other protected class information.
From time to time, we invite guest contributors to provide their personal perspectives about trending HCM topics. The views, opinions, and comments expressed below are solely those of the author and do not represent Ultimate Software. This post was commissioned by Ultimate Software and the author has or will receive compensation for their work.
Analytics are super cool. They can tell me how well I’m working out, what movie I might be interested in next, and when I can expect the t-shirt I just bought online. They can tell us a lot about our workplaces too, like how to better organize a hospital room to improve patient care, what makes a good hire, or which employees may be stealing our trade secrets. While these are all great, admirable goals, analytics pose risks too. One of the biggest is unlawful discrimination.
We’ve known for a while now that analytics can be discriminatory. We’ve seen discriminatory results in the justice system, in advertising, and in many other areas. If discrimination shows up elsewhere, it can show up when we use analytics to evaluate our people too. Take, for example, Amazon’s recruitment tool. Even after engineers tried to correct for its bias against women, the tool still discriminated so badly that the tech giant had to shut it off. Most vendors are working hard to avoid discriminatory results in their tools, but discrimination remains a risk.
Why? Because analytics incorporate the biases we all have. We know bias is in our performance data. We know bias is in our hiring data. We know bias can get into our analytics through the design and even the programming of the tool itself. Even when we work against these biases, it is almost impossible to eliminate bias completely. And when bias turns into discrimination, we have a real problem as employers.
Unfortunately, we don’t really know how discrimination laws, like Title VII, the Age Discrimination in Employment Act, or the Americans with Disabilities Act, will apply to analytics tools in situations like Amazon’s. (If we did, we would tell you exactly what to do to be compliant, but alas…) Employment attorneys are making educated guesses. In some situations, analytics will likely be evaluated the way we analyze facially neutral policies for discriminatory effect (disparate impact). In others, the question will be whether a particular individual was discriminated against (disparate treatment). While there are plenty of theories out there, what are employers to do while we wait for the law to catch up with available tech? Here are a few tips:
Keep people in people decisions
Analytics are tools, not decision makers. Analytics are great at assisting, but employers still make employment decisions, which means people make decisions. Consider the map app on your phone. Even though it can get us almost anywhere, we can still choose a different route when we see a problem or prefer to drive by a favorite landmark. The same is true for people analytics. Your diversity and inclusion initiatives are one example where you might want to go in a different direction than a tool indicates.
People still need to be at the core of decision making for another reason too: the law recognizes the employer, not the tool, as the decision maker. If discrimination happens, the employer will be liable. That is one more reason to use analytics tools carefully, with the understanding that they could cause problems too.
Plan for bias
It may seem silly to assume bias, but if we plan for it, we can correct for it. The tool says Jimmy is going to leave you? Talk with Jimmy and his manager. It may be that Jimmy needs a raise or should be considered for a promotion. The tool says Ana is engaging in behavior indicative of a security threat? Check with IT. Analytics tools are indicators, and indicators can be right, but we still need to do our own investigation to figure out whether they are. We cannot rely on the tools alone, because we know the tools could be biased.
Train tool users
Just like we can’t launch a new initiative without training, we can’t use an analytics tool without training. Train the people who will have access to the tool so they understand what the tool measures, how to interpret what it tells them, and that they are ultimately responsible for decisions related to the tool. Training should also cover how bias could play a role and how they can work against that bias.
Review, review, review
There’s really no way to completely eliminate bias in analytics, so we need to evaluate the results the tools give us and work against the bias. Evaluate results by gender, race, and other protected class information you may have. If you see a problem, tell your vendor. Your vendor wants to reduce the effect of bias just as much as you do, and one way they do that is by listening to customer feedback about the tool. Trust me, they want to know if their results are biased. So review your results, and share what you find.
Analytics are going to change how we do business. They will change the processes we use, the way we treat employees, and how we hire and fire. But before we jump wholeheartedly into the world of analytics, we need to understand the risks as well as the benefits. Ignoring the risks puts both us and the tools in peril. These tools shouldn’t be “use at your own risk”; they should be used carefully, as tools.