by Ronald Raether, CIPP/US and Mark Mao, CIPT & CIPP/US, Troutman Sanders
Cyber breaches continue to demonstrate that people are the greatest vulnerability for even the most sophisticated organizations. Although it is easy to feel prepared by talking about firewalls, detection software, and encryption methods, employees continue to be the easiest means for hackers to gain access. Accounting for the human factor requires that the organization take a holistic approach. Human nature should be factored into all aspects of data management, including product planning, incident response, and breach litigation.
Yes, Your Organization Can Survive Human Error
There is an increasing appreciation that a data breach does not necessarily mean an organization failed to adhere to the requisite standard of care. For example, in the regulatory context, the Federal Trade Commission (FTC) announced in August 2015 that it would not take any enforcement action against Morgan Stanley for an insider cyber breach. In that incident, Morgan Stanley had allegedly misconfigured the access controls for one limited set of reports, but corrected the problem as soon as it became aware of it.
The FTC was satisfied with Morgan Stanley’s efforts, noting: “[Morgan Stanley] had a policy limiting employee access to sensitive customer data without a legitimate business need, it monitored the size and frequency of data transfers by employees, it prohibited employee use of flash drives or other devices to download data, and it blocked access to certain high-risk apps and sites.” In closing, the FTC hinted that it might not pursue further action when an organization suffers a “human error” but has reasonably appropriate policies in place.
Similarly, in Lozano v. Regents of the University of California, BC55419 (L.A. Super. Ct., filed April 9, 2013), the plaintiff sought $1.25 million in damages from the UCLA health system, arguing that her medical records were improperly accessed by the current romantic partner of her ex-boyfriend, who allegedly used a doctor’s credentials to access and then publish the plaintiff’s protected health information (PHI). The plaintiff argued that the health system failed to adhere to the requisite standard of care by not requiring a second form of security for access. UCLA disagreed, arguing that it used security protocols consistent with existing standards and that it should not be held responsible for “inside jobs.”
On Sept. 3, 2015, the jury agreed with UCLA, finding it was not legally liable for the breach. In business-to-business contexts, courts have also found that a well-prepared organization may not be negligent when accounts are compromised. For example, the court in Choice Escrow & Land Title, LLC v. BancorpSouth Bank, 754 F.3d 611, 613-614 (8th Cir. 2014) found the bank’s security protocols adequate despite an alleged account takeover. The lesson of these relatively new cases is that organizations are not necessarily legally liable just because they suffer an incident as a result of human error.

Surviving The Human Factor

It is natural for people to panic and blame each other in a major crisis.
A well-prepared organization should be ready not only technologically, but also for human tendencies. First, well-prepared organizations should have written information security programs (WISPs) written in plain language as a guide for their employees. Regulators ask for WISPs almost as a matter of course. However, WISPs should be as descriptive as they are prescriptive. If the WISP does not actually describe how the organization handles data, it will not provide employees much-needed guidance.
Instead, the policies should reflect the practical realities of the business and describe the requirements and controls in a way that employees can understand and follow. Otherwise, the policies become nothing more than a standard by which regulators and class counsel measure how the company failed to act reasonably. Thus, it is critical that organizations map out their technology and take a data inventory when drafting their WISPs, which should also accord with their outward-facing privacy statements.
Organizations should interview both business and IT personnel to understand how they handle data. Indeed, WISPs should be tailored to the culture of the company so that compliance can be expected. A policy using common language, as opposed to legalese, will generally be best. Too often, WISPs are written solely by the legal department or by the technologists. If written by attorneys without meaningful technological experience, a WISP can miss important technical issues. WISPs written entirely by technologists can be overly technical and difficult for others to follow. Both legal and IT should therefore participate in the drafting process.
Without meaningful engagement by all stakeholders, it is too easy for drafters to protect only their interests. In such situations, it is not uncommon to find employees admitting that they did not or could not follow WISPs when interrogated during investigations or in a deposition.
Second, sufficient training is an important factor in a sound compliance program. Recent cases demonstrate that breaches often start with some form of human error. The most frequent attack vectors remain non-technical, such as unauthorized system access, misuse of privileges, use of stolen credentials, social engineering, and bribery and embezzlement. A common example is “spear phishing”: an employee with sufficiently high security credentials becomes a target and receives a request for “help” in an email that appears to come from another known employee in the organization. Once the target clicks on the link provided, malware is loaded onto the computer and the hacker gains access to critical parts of the organization.
While technology controls (such as intrusion detection devices or sound access controls) can limit what the hacker can do, without appropriate privacy policies and proper training, even employees in organizations with strong technological safeguards may be blamed for the unauthorized access, possibly creating legal exposure for the organization.
Third, compliance needs to be tested and audited. Testing and auditing serve as a regular reminder to employees and create awareness of emerging trends and threats. Auditing should include tests and practice runs, so that key personnel and employees can act rather than react when a real cyber incident occurs. Without meaningful practice, employees may be inclined to blame each other, finger-point, or attempt to hide facts that implicate them, rather than doing what is best for the company. Employees should become familiar with how to isolate incidents, preserve electronic evidence, handle public inquiries, and defer to responsible personnel.
Regular audits will remind employees to stay current on their training and obligations, and to be mindful of potential threats. As Morgan Stanley demonstrated, a well-managed breach contributed to the FTC’s decision not to take enforcement action.
The question for most organizations these days is not whether they will be breached, but when. As the cases teach us, even the best-prepared organizations cannot prevent malicious insiders and human error. However, when intruders breach an organization well equipped with proper policies, training, and technology, it will be much more difficult for plaintiffs to claim that the organization should be held legally liable.