Hire Thinking

By Kate Bischoff, Attorney
tHRive Law & Consulting LLC

Technology is everywhere – on your wrist, in your pocket, on your nightstand, and in your car and office.  Depending on your relationship with technology, you either love or hate it – all while using it regularly.

Human resources is getting into the game with technology vendors promising to “find the BEST candidates,” “address ALL your compensation needs,” and “solve ALL your problems.”  (Well, maybe that last one is a smidge hyperbolic.)  While technology does offer HR some significant benefits, like getting great insights based on an employer’s own data or taking some tactical, paper-pushing activities off of HR’s desk, technology poses some risks.  It’s just that few vendors talk about those risks.

It is important to note that even when your information is stored elsewhere (i.e., within a vendor’s software platform or at the vendor’s offices), the employer still bears the risk.  You need to ensure that your providers have systems in place not only to protect the data itself, but also to help you avoid that risk.

Tech vendors are not on the hook for decisions that employers make.  The technology they create helps employers make decisions; it doesn’t make the decisions for them.  Your GPS may tell you where to turn, but you still decide where and when to turn — the GPS merely offers an option.  The same is true with HR technology.  Employers will be on the hook for their decisions and are not likely to be able to shift any blame onto the tech vendor.

Whether it’s artificial intelligence, analytics, machine learning, robots, chatbots, or any of the other multitudes of technology, here are but a few of the risks employers face when it comes to using technology:

  1. Discrimination.  Vendors may claim that their products can help reduce bias in decision making because of their algorithms, but it is nearly impossible for an algorithm to be bias-free or to remove bias from data.  Algorithms are created by people, fed data created by people, and people are biased.  When you use an algorithm (whether it is machine learning, AI, or analytics), the results produced can show evidence of bias.  This bias could result in a discrimination lawsuit.  Plaintiffs’ attorneys and groups are learning all they can about this, even creating their own technology that can detect discrimination.
  2. Fair Credit Reporting Act.  When technology gathers even publicly available data and analyzes it, and that data is then used to make an employment decision, the technology runs up against FCRA.  We know that FCRA has several notice provisions, yet most analytics vendors don’t know the law exists.  Employers need to ask questions to gauge a vendor’s knowledge of these laws and how they apply.
  3. Data security.  These days, technology often rests in “the cloud.”  As the saying goes, “There is no such thing as the cloud.  It’s just someone else’s computer.”  This saying is true.  A tech vendor is housing and securing your employees’ personally identifiable information, including Social Security numbers, W-2 data, and family information.  This is sensitive stuff that hackers are eager to get their hands on.  With W-2 information, they can file false tax returns and collect refund checks.  With direct deposit information, they could reach into employee bank accounts.  And with family benefit enrollment information, they could apply for credit cards.  All of this is bad news for employers.  While courts are currently split on who bears the liability when there’s a breach, you can bet your bottom dollar that the employer will be the one blamed.
  4. Failing to notice issues.  HR folks know that when an employee utters a few magic words (like “leave,” “harassing,” “illegal,” or “accommodation”), a bunch of different laws are triggered requiring specific and targeted attention.  Failing to notice those words could be costly for an employer.  Technology might not be aware of these triggers and could miss the issues.  Take, for example, a chatbot that can provide information on PTO hours.  If an employee asks for leave, will the chatbot know that how the employee intends to use the leave could implicate several different policies or laws?  Only if the employer and the tech vendor understand that it needs to.  In a recent demonstration by a high-profile vendor, its chatbot did not.  Missing an FMLA request is a costly lawsuit waiting to happen.

Technology is cool, amazing, and has lots of benefits to offer employers. But it carries risks that can be reduced by asking vendors the right questions, understanding how to use the tech in such a way that reduces the risk, and engaging vendors who understand the risks and want to help reduce those risks for the employer.  If employers don’t do these things, they could easily find themselves in a legal storm without an umbrella, raincoat, or rain boots.
