[workplacetechlaw]
Emerging
Technology &
Employment
Law
Steve Sheinberg
Contents
Introduction
Cybersecurity and Cyber-Resiliency For Employment Lawyers
    Cyber-Resilience: The Next Really Big Thing
    Cybersecurity?
    Thinking Cybersecurity
    Talking Cybersecurity
    8 Cyber Risk Assessment Frameworks
Legal Analysis of Current Trends
    Do Data Breaches Hurt? Future Injuries in Data Breach Litigation
    Checklist: Employee Technology Policies (Part 1 of 2)
    Checklist: Employee Technology Policies (Part 2 of 2)
    ADA and Emerging Tech: Key Considerations for Employers
    Electronic Privacy in the New Restatement of Employment Law
    NLRB: Employer Email Systems OK for Union Use
Lessons Learned
    What the Lenovo Malware Debacle Means
    Lessons from the Anthem Breach
    APWU v USPS @ NLRB: Some Best Practices
Employment Law Consideration Concerning Future Tech
    Disruptive Technologies in the Workplace
    New Tech & Employment Agreements
    8 Trends in the Transformation of Tech-Related Employment Law
    Employment Law Risk and Early-Round Investors
    An Employer’s Guide to the President’s Cybersecurity Recommendations
    Quick Take: FTC’s New Report on the Internet of Things
Big Data
    EEOC, Big Data and Why Disparate Impact is Not the Right Direction
    10 Questions: Confronting Allegations Based on Big Data
    Big Data: Critical Concerns for Employment Lawyers
    ESI and Data Governance: Part 1
    ESI and Data Governance: Part 2
Introduction
These short essays are from my blog, workplacetechlaw.com, which focuses on the intersection of emerging technology and employment law.
Technologically savvy employment lawyers are essential to modern risk management because:
• Employees are often causally involved in security breaches (even unintentionally).
• Employees have their private information stolen during breaches – and sue their employers after it happens.
• Employees will have their data tracked by wearables and their actions monitored by the internet of things, and they (often unintentionally) provide a great deal of data about themselves and their employer to third parties.
• Employers have to develop and enforce policies that are strong enough to protect the organization, fluid enough to keep up with evolving technologies, and realistic enough to respect the cognitive and psychological limits of their employees.
• Employment litigators will increasingly need to respond to claims predicated on the statistical analysis of massive data pools, including those handed over during discovery.
• Employment agreements and exit strategies will include a technology-related untangling of employer from employee.
To deal with these issues, employment lawyers will not only have to stay abreast of developing
technology but will need to be able to get deep under the hood of the technology itself. That will
require interacting with folks like CISOs and coders – new ground for many employment lawyers.
Thank you for reading. I am happy to answer any questions or welcome any comments. My contact
information is at the back of this document.
Steve Sheinberg
Cybersecurity and Cyber-Resiliency For Employment Lawyers
Cyber-Resilience: The Next Really Big Thing
January 8, 2015
This is not the year of cybersecurity, as some might suggest. It is the year of cyber-resilience. This is a
matter for employment lawyers to understand as a core part of their employment litigation risk
management role.
In this somewhat longer piece, I will cover:
Cyber-Resiliency: Definition
The Cyber-Resiliency Imperative In General
The Cyber-Resiliency Imperative For Employers
Cyber-Resiliency in Practice
Cyber-Resiliency: Definition
Cyber-resilience is the ability of an organization to continue doing its work – serving customers, keeping colleagues working – in the face of ongoing or even successful cyberattacks/data breaches.
Consultants Booz Allen Hamilton describe resiliency as follows:
Traditional cyber defense strategies, such as firewalls and intrusion-detection systems, are no
longer enough. Cyber attacks are now so numerous and sophisticated that many will inevitably
get through. That means organizations must have cyber resilience – the ability to operate in the
face of persistent attacks. Resilience enables the government to continue to provide services to
the public, and industry to continue to serve employees and customers while fending off or
reacting to cyber attacks.
Simply, responsible employers are realizing that their systems will be subject to attack, and, despite best
(even reasonable) efforts, many employers will find their systems penetrated. Every business is
susceptible, especially from the activities of insiders (intentional or not).
The Cyber-Resiliency Imperative (In General)
Admiral Michael Rogers, who has the twin titles of National Security Agency Director and U.S. Cyber
Command Commander, argued for the centrality of cyber-resilience to any adequate cybersecurity
plan. Enterprises should
…not only focus on trying to ensure that no one gets into [y]our systems, but quite frankly…
assume that someone will. And the question becomes, how are you going to operate and
remediate at the same time? That’s resiliency to me, the ability to do both simultaneously.
[It is really worth the time to read Admiral Rogers’ speech, which can be found here.]
The Cyber-Resiliency Imperative (For Employers)
The recent Sony attack demonstrates the significant concern that an enterprise as an employer must have in this area. According to the New York Times and others:
1. At the very start of the attacks, Sony’s technicians seriously discussed taking the company
offline.
2. When employees arrived at work, they were confronted with disturbing images on their
screens, placed by the hackers.
3. In response to the employees’ discovery, “Sony shut down all computer systems shortly
thereafter, including those in overseas offices, leaving the company in the digital dark ages: no
voice mail, no corporate email, no production systems.”
4. “A handful of old BlackBerrys, located in a storage room in the…basement, were given to
executives.”
5. Using “hastily arranged phone trees,” text messaging became a key mode of communication.
6. “Administrators hauled out old machines that allowed them to cut physical payroll checks in lieu
of electronic direct deposit.”
According to KrebsOnSecurity:
According to multiple sources, the intruders … stole more than 25 gigabytes of sensitive data on
tens of thousands of Sony employees, including Social Security numbers, medical and salary
information.
Indeed, sometimes employee data is the primary subject of an attack. Back to Krebs:
The scammers in charge of [a] scheme [Krebs uncovered] have hacked more than a half-dozen
U.S. companies, filing fake tax returns on nearly every employee. At last count, this particular
scam appears to stretch back to the beginning of this year’s tax filing season, and includes
fraudulent returns filed on behalf of thousands of people — totaling more than $1 million in
bogus returns.
Sony is facing a number of suits from employees relating to what they believe is Sony’s failure to protect their private information. And the litigation exposure that Sony appears to have narrowly avoided – by cutting payroll checks manually, for instance – is a great lesson.
Accordingly, just as employment lawyers need to have an active role in cybersecurity planning, they
need to have an active role in cyber-resiliency planning as well.
Cyber-Resiliency in Practice
While I will write a more complete piece on this topic, here are some initial thoughts on building a
resiliency plan. The one overarching point is that this is a matter that management has to take
seriously. Accordingly, cyber-resiliency has a significant organizational politics component to it (as do all
risk management and mitigation efforts).
According to Symantec [.pdf]:
The process can be best thought of as a framework with five pillars: prepare/identify, protect,
detect, respond, and recover. Using this framework, you can evaluate each pillar of your
organization’s cyber security strategy.
Symantec further notes that
When a breach occurs, the only way to proactively and effectively minimize its damage is to
have the necessary detection and response policies, processes, and technologies in place.
Prepare/Identify
Know what data you have, where it is, what is important, who its stakeholders are, and what the consequences of loss would be.
This requires that management (not just IT) has situational awareness about the state of the
network, its equipment, its vulnerabilities and about the various consequences of particular data
sets being stolen. This is part-and-parcel of a solid data governance plan.
Create a system for monitoring the evolving threat, data and stakeholder environment. Threats,
people and data pools change rapidly, and so your plan needs the ability to adapt and grow with
the organization.
Know your redundancies. Can your organization cut payroll checks manually? Can your
organization reach key decision makers in a crisis if mobile devices are off-line? Cyber-resiliency
should be seen as a core part of your whole business continuity program (and if you do not have
a BC program, it is time to ask why).
To the extent possible, resiliency and the ability to remediate damage should be a part of the
procurement and development system. It should also be a part of your network development
strategy.
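The inventory points above can be sketched as a minimal data-asset register. The schema and the sample entries below are invented for illustration, not a prescribed model:

```python
from dataclasses import dataclass

@dataclass
class DataAsset:
    """One entry in a hypothetical data-asset register."""
    name: str
    location: str       # system or vendor holding the data
    stakeholders: list  # who is accountable for it
    sensitivity: str    # e.g. "pii", "trade-secret", "public"
    loss_impact: str    # consequence if the data is stolen

def unowned_assets(register):
    # Flag assets with no accountable stakeholder, a common gap a
    # situational-awareness review should surface.
    return [a.name for a in register if not a.stakeholders]

register = [
    DataAsset("payroll", "hris-cloud", ["HR", "Legal"], "pii",
              "litigation and regulatory exposure"),
    DataAsset("build-server-logs", "on-prem", [], "internal",
              "attacker reconnaissance"),
]
print(unowned_assets(register))  # -> ['build-server-logs']
```

Even a register this crude forces the questions the bullets pose: what the data is, where it sits, who owns it, and what its loss would cost.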
Protect
Prioritize key vulnerabilities and highest-risk areas and structure a security plan around them first. Again, this is not a decision for IT alone – it is a management problem at the highest level.
Security is as much about process as it is about anything else. Simple safeguards often go unimplemented because there is no process to ensure they happen:
Work to segregate networks to prevent widespread damage once an attacker is inside (in my opinion, there is no reason why an exploit that attacked Sony’s development system should also have been a gateway to its Human Resources systems).
Ensure anti-virus software is installed, used and up-to-date (it cannot stop everything, but it is certainly a start).
Apply available patches to your vulnerabilities. There is no excuse not to.
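The patching point can be illustrated with a toy version check. The package names, version strings, and data sources are invented, and real patch management parses version schemes properly rather than comparing strings:

```python
# Hypothetical snapshot of installed software vs. the latest
# patched versions published by vendors.
installed = {"openssl": "1.0.1f", "nginx": "1.24.0", "log4j": "2.14.1"}
latest_patched = {"openssl": "1.0.1g", "nginx": "1.24.0", "log4j": "2.17.2"}

def unpatched(installed, latest_patched):
    # Anything whose installed version differs from the latest
    # patched version goes on the remediation list.
    return sorted(
        name for name, version in installed.items()
        if latest_patched.get(name, version) != version
    )

print(unpatched(installed, latest_patched))  # -> ['log4j', 'openssl']
```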
Detect
Ensure systems are in place for early detection of intrusions and for detecting their progress or operation. Says expert Jon-Louis Heimerl:
Some of these breaches had been in process for quite some time. Initial system
compromises sometimes occurring months before the breach became a known threat.
Some of these breaches had been reported by malware and IDS systems but ignored.
Simply, early detection means significantly lower costs in the event a breach does happen.
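Early detection can be sketched as a toy rule over a hypothetical authentication log: flag any account with a burst of failed logins. The event format and threshold are assumptions; production detection relies on IDS/SIEM tooling:

```python
from collections import Counter

# Hypothetical auth-log events: (username, outcome).
events = [
    ("alice", "fail"), ("alice", "fail"), ("alice", "fail"),
    ("alice", "fail"), ("alice", "fail"), ("bob", "ok"),
]

def accounts_over_threshold(events, max_failures=4):
    # Count failed logins per account and flag any account that
    # exceeds the threshold -- the kind of signal that was
    # "reported ... but ignored" in the breaches described above.
    failures = Counter(user for user, outcome in events if outcome == "fail")
    return [user for user, count in failures.items() if count > max_failures]

print(accounts_over_threshold(events))  # -> ['alice']
```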
Respond
Have a response plan. While not every security-related possibility can be anticipated, a detailed
playbook with management buy-in can make a critical difference. At the very least, it keeps the
initial decision-making in management’s hands, rather than at the technical level. Such a plan is
a key part of any data security effort.
In forming those plans, determine who needs to be in the decisional loop, when they are brought in, and by whom. This involves analyzing the whole range of stakeholders and deciding where and when each is brought into the process.
Have a plan to call law enforcement. It sounds simple, but deciding when to bring in the
authorities can be quite complicated in an organization, especially when the organization is
highly stressed.
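The decisional-loop questions can be answered in advance and written down. A sketch, assuming a hypothetical severity scheme and stakeholder names; the point is that the mapping is settled before an incident, not during one:

```python
# Hypothetical escalation map: which stakeholders enter the
# decisional loop at each breach severity.
ESCALATION = {
    "low": ["it-security"],
    "medium": ["it-security", "general-counsel"],
    "high": ["it-security", "general-counsel", "ceo",
             "law-enforcement-liaison"],
}

def notify_list(severity):
    # Fail safe: an unknown severity escalates to the fullest list.
    return ESCALATION.get(severity, ESCALATION["high"])

print(notify_list("medium"))  # -> ['it-security', 'general-counsel']
```

Note that the decision to involve law enforcement appears in the table itself, so it is made by policy rather than under stress.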
Recover
With plans in place, the pathway to recovery is underway before an attack begins or matures.
Have agreements with relevant vendors to ensure you can take immediate steps in a cost-effective manner.
Build relationships with people in the remediation space. Knowing how to reach your insurer,
outside legal counsel, equipment providers, etc. and having them know who you are is a key to
rapid response. The same holds true with law enforcement. Knowing who to call – and having
them know you – may be critical.
Back to Admiral Rogers on security breaches: “That is not a discussion that I want to wait until game day,
as it were, to suddenly start to have. Hey, I’m here to help fill in the blanks.”
Cybersecurity?
November 30, 2014
Prologue
Cybersecurity must be on an employment lawyer’s radar.
When discussing cybersecurity audits, PricewaterhouseCoopers notes that “[o]ften, when companies get
a glimpse into what really is going on, they are surprised. They discover that the biggest problems may
be caused by their employees.” In short, one of the largest holes in cybersecurity – and thus a key
pathway to incredible financial loss – is undoubtedly through an employee’s keyboard.
Businesses are spending billions on cybersecurity and digital forensics. However, cybersecurity expert
Gary Warner argues that
[t]he weakest component of your cyber security is your humans. If a crook can get that email
in…then your last hope is that your humans are smart enough not to click on it…. But, guess
what? They do. We call it ‘the inevitable click.’
While employers take steps to implement security measures ranging from better firewalls to multi-step
authentication, the reality is, as Nicole Perlroth recently suggested in the New York Times, “even those
who are layering on as many defenses as possible are still getting crushed.”
In advising their clients, lawyers in general, and employment lawyers in particular, will have to
understand, and if necessary lead, the drive to protect data. Some thoughts:
1. Cybersecurity policies and procedures will need to conform to the idea that employee-based threats represent some of the largest vulnerabilities to an employer’s data. One place to make a
change is in employment policies and their enforcement — ensure that good information
security policies are in place and are backed by serious sanctions. The risks and costs associated
with employee-undermined (such as using weak passwords) and employee-thwarted (such
as stealing sensitive data using USB drives) security will only get worse. Employees also ought
to know what is expected of them regarding security on corporate networks — especially when
it comes to that ‘inevitable click.’ Indeed, employers ought to restrict media such as USB drives
from being attached to the network.
2. Employees must be treated as outsiders when it comes to systems they have no business need to access. In such a context — even if achieved through virtualization and/or
encryption — the highest levels of security ought to interpose a wall between employees and
prohibited data. In addition, rigorous inspection and detailed audit trails are especially
important in the employment context: insist on being able to know which employees
have accessed which system, when, and what they did once there.
3. Consider employment agreements that deal with the entire range of employee-related
risk. Educate employees about how to manage the risk occasioned by third-party apps.
4. Educate employees about what their devices (especially, but not exclusively, in a BYOD
environment) are doing data-wise — privacy is still valued and may influence behavior. They
need to know they are sharing many things on that device with their employer and far beyond.
5. Have firm rules about who makes decisions that can have an impact on cybersecurity. In this
emerging environment, any decision relating to the public posting of any data or information
that discusses the company or its systems, no matter how seemingly innocuous, needs to be
escalated to the highest levels of decision-making. One cybersecurity website suggests that Target’s recent data woes may have had roots in vendor-related information left on its public-facing website – allegedly leading hackers to an air conditioning vendor whose systems touched Target’s systems.
6. Employer-deployed systems (including wearables and apps) should be built with rigorous
security, configuration and data encryption as a primary concern. Developers should be
meticulously screened, and outside vendor agreements must make security a central focus.
Non-employer-deployed devices that seek to connect with employer systems or that may be
used to carry corporate information (including BYOD devices) should be required to conform to
a device-independent corporate security protocol and screening procedure while the data they
capture should be open to employer scrutiny. Moreover, such devices should only be permitted
to connect with systems that are segregated from the most sensitive data pools, or, where
impossible, should be backed by strict policies, auditing, security protocols and user education.
7. Employment lawyers representing employers of all sizes will need to know how to handle data
breaches either caused by, or that have an impact on, employees. Planning for such an
eventuality needs to be concrete and immediate – breaches should be treated as inevitable.
8. Employers should ensure they have adequate cyber risk insurance.
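The audit-trail recommendation in item 2 can be sketched as a minimal access log. The field names and the in-memory list are illustrative only; a real trail would be centralized, access-controlled, and tamper-evident:

```python
from datetime import datetime, timezone

audit_log = []  # append-only in spirit

def record_access(user, system, action):
    # Capture which employee accessed which system, when,
    # and what they did once there.
    audit_log.append({
        "user": user,
        "system": system,
        "action": action,
        "at": datetime.now(timezone.utc).isoformat(),
    })

def accesses_by(user):
    return [f"{e['system']}:{e['action']}"
            for e in audit_log if e["user"] == user]

record_access("dthomas", "hr-db", "export-report")
record_access("dthomas", "payroll", "view")
print(accesses_by("dthomas"))  # -> ['hr-db:export-report', 'payroll:view']
```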
The bottom line: tech-savvy lawyers need to be an integral part of the security puzzle. While lawyers
need not be security technicians, they need to understand the technical elements of security.
Thinking Cybersecurity
December 3, 2014
Some Basics
As I suggested in a previous post, understanding cybersecurity is essential to effectively managing an
employer’s risk. The upshot: employment lawyers must talk to CIOs. To do that, we’ll need to know a
thing or two about the subject matter at hand.
The Basics of Insider-Related Cybersecurity
The Department of Homeland Security identifies six core elements for preventing insider-related
cyberthreats:
(1) Collect and Analyze (understanding and auditing your network)
(2) Detect (monitoring network traffic and data usage for signs of attack)
(3) Deter (raise the cost of initiating an attack)
(4) Protect (repel an attack)
(5) Predict (anticipate threats)
(6) React (reduce opportunity, capability, and motivation for the insider)
I would add two more core elements to this list: Plan (in order to improve reaction times to breaches)
and Re-Assess (continually update all of the core elements).
As DHS notes: “[e]xisting security tools for detecting cyber attacks focus on protecting the boundary between the organization and the outside world…. they are less suitable if the data is being transmitted
from inside the organization to the outside by an insider who has the proper credentials to access,
retrieve, and transmit data.”
This can be intentional or unintentional (as even non-malicious insiders can do a great deal of damage
should they succumb to any number of scams designed to get them to punch holes in an employer’s
cybersecurity).
Critical Security Controls
Turning from the theoretical to the practical, these eight elements need to be combined with an
understanding of the cybersecurity techniques that have the greatest impact upon improving an entity’s
risk posture against real-world threats.
Examples of such practices can be found in the Council on CyberSecurity’s Critical Controls for Effective
Cyber Defense and are worth reviewing closely.
1: Inventory of Authorized and Unauthorized Devices
2: Inventory of Authorized and Unauthorized Software
3: Secure Configurations for Hardware and Software on Mobile Devices, Laptops, Workstations,
and Servers
4: Continuous Vulnerability Assessment and Remediation
5: Malware Defenses
6: Application Software Security
7: Wireless Access Control
8: Data Recovery Capability
9: Security Skills Assessment and Appropriate Training to Fill Gaps
10: Secure Configurations for Network Devices such as Firewalls, Routers, and Switches
11: Limitation and Control of Network Ports, Protocols, and Services
12: Controlled Use of Administrative Privileges
13: Boundary Defense
14: Maintenance, Monitoring, and Analysis of Audit Logs
15: Controlled Access Based on the Need to Know
16: Account Monitoring and Control
17: Data Protection
18: Incident Response and Management
19: Secure Network Engineering
20: Penetration Tests and Red Team Exercises
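Control 1 lends itself to a small illustration: diffing the devices observed on the network against an authorized inventory. The device names and data sources are hypothetical; in practice the observed set would come from a network scan or NAC system:

```python
# Devices the asset register authorizes vs. devices actually
# observed on the network (both sets invented for the example).
authorized = {"laptop-042", "printer-07", "hr-server"}
observed = {"laptop-042", "hr-server", "unknown-ap-3"}

# Anything observed but not authorized should be investigated;
# anything authorized but unseen may be lost, stolen, or offline.
rogue = sorted(observed - authorized)
missing = sorted(authorized - observed)
print(rogue)    # -> ['unknown-ap-3']
print(missing)  # -> ['printer-07']
```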
Talking Cybersecurity
December 3, 2014
Questions for Talking with the CIO
As argued elsewhere, employment lawyers need to talk to the CIO in order to more fully manage the risk
employers face from employees. Here is how to get the conversation started.
Data Governance
Does the organization have a data governance plan, that is, “a system of decision rights and
accountabilities for information-related processes, executed according to agreed-upon models
which describe who can take what actions with what information, and when, under what
circumstances, using what methods”? (“Data Governance” will be the subject of a specific post
in the coming days).
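The quoted definition (who can take what actions with what information, and under what circumstances) can be sketched as a default-deny decision-rights matrix; the roles, actions, and data classes below are invented for illustration:

```python
# Hypothetical decision-rights matrix: (role, action, data class).
RIGHTS = {
    ("hr-manager", "read", "personnel-file"): True,
    ("hr-manager", "export", "personnel-file"): False,  # needs Legal sign-off
    ("legal", "export", "personnel-file"): True,
}

def may(role, action, data_class):
    # Default-deny: anything not explicitly granted is refused.
    return RIGHTS.get((role, action, data_class), False)

print(may("hr-manager", "read", "personnel-file"))    # -> True
print(may("hr-manager", "export", "personnel-file"))  # -> False
print(may("intern", "read", "personnel-file"))        # -> False (not listed)
```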
Data Collection and Retention
What devices are connected to the network and/or gather data from employees? How does IT
intend to keep up with developing technologies so that all devices that connect with the
network (including employee-owned devices) or gather information on employees are properly
configured?
What software is loaded into the network or employee devices that might gather or store data
related to employees? Is the data collection purposefully done and/or necessary?
Where is employee-related data stored (all of it, including information returned from smart
devices, etc., and activity log information)?
Is all employee-related data subject to document retention policies? Can you identify each pool
of data collected by the company and determine whether it ought to be retained?
Are data transfer, storage and retention policies in compliance with all privacy and data security
regulations? This is especially important in regulated industries and where data crosses
borders. This should also apply to vendors, especially those with direct access to your network
and its data.
Do IT and counsel have a plan in place to deal with the evolving privacy regulatory framework,
especially in international cross-border contexts? In other words, who is tasked with staying on
top of the law with regard to data transfer, breaches and notifications?
Security of Data
Are employees adequately trained in security protocols? Can the IT department proactively
identify who is and who is not following those protocols? Are there adequate and appropriate
security policies in place? Are there clear policies and procedures concerning the identification,
and exfiltration, of trade secrets and other confidential information?
Are mobile devices properly configured to ensure the security of employee data, including such
data that is generated by the employee on behalf of the employer and which is transmitted back
to the employer?
Is the network properly configured to segregate data on a “need to know” basis?
Is the network itself properly configured for optimal security? Are firewalls, routers, and
switches using the latest software and patches? Is there a patch/update program in place? Are
appropriate limits placed on, and does IT have control over, all network ports, protocols, and
services? Are administrative privileges routinely audited? Are wireless networks properly
secured? Are adequate malware defenses in place? Is all software up-to-date and are all
security patches applied?
Is all information posted on company and vendor websites reviewed for its compliance with
information security? For instance, some suggest that the Target breach began with public-facing documents that were exploited for the purpose of understanding Target’s system.
Do all technology-related agreements signed by the company require the appropriate level of
security, particularly for any vendor that connects with your systems? This article, Exposed
Corporate Credentials on the Open Web, a Real Security Risk, gives some interesting insight into
the risks associated with this issue.
Does any technology-related agreement get signed without IT reviewing it for security purposes?
Data Release/Breach
Is employee data identified and mapped in order to be producible in an e-discovery context?
How are individual accounts and/or employee activity monitored and logged? Who can access
that information? Is employee data useable to help assess employee performance?
Are there protocols in place to manage the end of the employment relationship? This may
include plans to image drives or monitor attempts at employee data exfiltration. At the very
least, IT should maintain a checklist of procedures for the end of the employment relationship.
Are plans in place for a data breach? Are there industry-compliant incident response,
management and auditing plans in place? How are those plans designed to manage breaches
concerning employee data, including PII or other private information? Are all members of the incident response team identified, and is their 24/7 contact information on hand?
Obviously, not everything here is employment law-specific, and not every element is covered — but
everything here is relevant to the conversation.
8 Cyber Risk Assessment Frameworks
February 26, 2015
Understanding how to assess cyber risk is essential for a lawyer leading or participating in an enterprise-level cyber risk management team. One or more of these eight analytical frameworks should help.
An important caveat: no one methodology is going to map directly onto your organization. Taking the
time to review each of these is a great first step to finding the right methodology and framework for
your enterprise.
Here are the frameworks:
1. National Institute of Standards and Technology (NIST): NIST SP 800-30. NIST seeks to develop objective and auditable information security standards and guidelines; its process works to separate assets into distinct and integrated tiers that help rationalize the risk assessment process and better focus on the most vulnerable assets. Free training/overview resource: here.
2. International Organization for Standardization: ISO/IEC 27001 (with ISO/IEC 27002 as implementation guidance). This is a series of standards against which information security processes and procedures can be measured and audited.
3. Software Engineering Institute (SEI) at Carnegie Mellon University (CMU): OCTAVE (and OCTAVE Allegro). OCTAVE Allegro “is a methodology to streamline and optimize the process of assessing
information security risks so that an organization can obtain sufficient results with a small
investment in time, people, and other limited resources. It leads the organization to consider
people, technology, and facilities in the context of their relationship to information and the
business processes and services they support.”
4. Factor Analysis of Information Risk (FAIR): FAIR. FAIR seeks to help users measure risk, especially
where quantification is difficult.
5. Department of Homeland Security (DHS): Cyber Security Evaluation Tool (CSET®). More for those
running an automated, industrial control, or business system, CSET is a software-based evaluation
tool that uses industry standards to analyze your particular situation.
6. DHS/CMU/SEI: Cyber Resilience Review (CRR). This is a comprehensive, questionnaire-based
assessment of an organization’s cybersecurity management program. It is designed to measure current resilience and provide a gap analysis against best-practice standards. The tool is derived from the CERT Resilience Management Model (CERT-RMM), a process-improvement approach to building resilience. A NIST Crosswalk seeks to map NIST 800 standards onto the RMM model.
7. MITRE Corporation: Cyber Resiliency Assessment: Enabling Architectural Improvement. This
process is broad-based but focuses on the ways in which architectural resilience practices contribute to the overall resilience of an organization.
8. Symantec: The Cyber Resilience Blueprint: A New Perspective on Security. A senior manager’s
guide to approaching cyber-resilience.
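To make the quantification idea in item 4 concrete, here is a toy Monte Carlo estimate of annualized loss, very loosely in the spirit of FAIR’s frequency-times-magnitude model. The ranges, seed, and trial count are invented; real FAIR analyses use a richer taxonomy and calibrated estimates:

```python
import random

def simulate_annual_loss(freq_low, freq_high, loss_low, loss_high,
                         trials=10_000, seed=1):
    # Annualized loss is modeled as (loss-event frequency) times
    # (loss magnitude), each drawn from a range an analyst estimated.
    rng = random.Random(seed)
    totals = sorted(
        rng.uniform(freq_low, freq_high) * rng.uniform(loss_low, loss_high)
        for _ in range(trials)
    )
    return {"median": totals[trials // 2], "p90": totals[int(trials * 0.9)]}

est = simulate_annual_loss(0.1, 2.0, 50_000, 500_000)
print(f"median ~${est['median']:,.0f}; 90th percentile ~${est['p90']:,.0f}")
```

Even a crude distribution like this gives management concrete numbers to debate, which is the point of quantification where it is otherwise difficult.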
Legal Analysis of Current Trends
Do Data Breaches Hurt? Future Injuries in Data Breach Litigation
February 17, 2015
Protecting a company from data breach lawsuits may get substantially harder.
Once hit by a data breach, companies face suits from consumers and employees whose personally identifiable information has been compromised. A relatively new line of cases has made life very difficult for
these plaintiffs by holding that increased risk of identity theft is not sufficient grounds for a lawsuit.
However, some holdout courts may just have it right.
Understanding this requires a deep dive into the world of standing to sue.
To establish Article III standing to sue, a plaintiff must show that they were actually injured. That is,
they must show that their injury is “concrete, particularized, and actual or imminent; fairly traceable to
the challenged action; and redressable by a favorable ruling.” Clapper v. Amnesty Int’l USA, 133 S. Ct.
1138, 1147 (2013).
Most plaintiffs in a data breach case have not yet been injured by the loss of the data (nor will they be);
rather, they fear such injury and are seeking standing on the basis of the potential future
injury. Accordingly, the key question is: when is there enough of a probability of a future harm so as to
confer standing on a plaintiff?
The Supreme Court appeared to set a very high bar for answering that probabilistic question: the
“threatened injury must be certainly impending to constitute injury in fact,” and that “allegations of
possible future injury” are not sufficient. Clapper, 133 S. Ct. at 1147 (emphasis supplied). In short, it is
not enough to show a probable injury; nothing short of near certainty of a future injury confers
standing.
Not quite so fast. In Susan B. Anthony List v. Driehaus, the Court clarified that Clapper’s “certainly
impending” standard did not supplant the line of cases holding that a “substantial risk” that a harm will
occur can confer standing. 134 S. Ct. 2334 (June 16, 2014). Specifically, Justice Thomas wrote for a
unanimous Court:
An allegation of future injury may suffice if the threatened injury is “certainly impending,” or there is a
“‘substantial risk’ that the harm will occur.”
What of all this?
A surprising number of data breach cases have been dismissed on standing grounds
under Clapper’s “certainly impending” standard.
For instance, in Galaria v. Nationwide Mut. Ins. Co. (a case in which PII was stolen from an insurer), the
district court found
In this case, an increased risk of identity theft, identity fraud, medical fraud or phishing is not
itself an injury-in-fact because Named Plaintiffs did not allege—or offer facts to make
plausible—an allegation that such harm is “certainly impending.” Even though Plaintiffs alleged
they are 9.5 times more likely than the general public to become victims of theft or fraud, that
factual allegation sheds no light as to whether theft or fraud meets the “certainly impending”
standard. That is, a factual allegation as to how much more likely they are to become victims
than the general public is not the same as a factual allegation showing how likely they are to
become victims.
Other allegations in the Complaint show such harm is not certainly impending. For example,
Named Plaintiffs state that consumers who receive a data breach notification had a fraud
incidence rate of 19% in 2011. … An injury can hardly be said to be “certainly impending” if there
is less than a 20% chance of it occurring….
That speculative nature of the injury is further evidenced by the fact that its occurrence will
depend on the decisions of independent actors … If they do nothing, there will be no
injury…. See 998 F. Supp. 2d 646, 654-655 (S.D. Ohio 2014)
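The court's distinction between a relative-risk multiplier and an absolute probability is easy to illustrate numerically. In this sketch, only the 9.5x figure comes from the complaint; the base rate is a hypothetical assumption chosen for illustration:

```python
# The complaint alleged plaintiffs were 9.5x more likely than the
# general public to suffer fraud -- but a multiplier alone says nothing
# about absolute likelihood without a base rate.
base_rate = 0.02          # assumed general-public fraud rate (hypothetical)
relative_risk = 9.5       # multiplier alleged in Galaria

absolute_risk = base_rate * relative_risk
print(f"{absolute_risk:.0%}")  # 19% under these assumptions
```

Under a different assumed base rate (say, 0.1%), the same 9.5x multiplier yields an absolute risk under 1%, which is why the court found the relative figure uninformative on its own.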
Several cases follow suit. I won’t bore you with a string cite, but you may look here if you would like
one.
These cases uniformly reject standing for breach victims who can show no other injury. However
uniform those cases may be, there are some courts not toeing the line.
First, a red herring. Some have pointed to the litigation arising out of the Target credit card breach as an
exception to the Clapper cases. It isn’t an outlier; it is irrelevant. Standing was found because of actual
injury as pled on a motion to dismiss. The court didn’t need to reach Clapper because the complaint was
carefully drafted and alleged actual injury:
Indeed, many of the 114 named Plaintiffs allege that they actually incurred unauthorized
charges; lost access to their accounts; and/or were forced to pay sums such as late fees, card-replacement fees, and credit monitoring costs because the hackers misused their personal
financial information. In re Target Corp. Customer Data Sec. Breach Litig. (D. Minn. Dec. 18,
2014).
The only thing interesting the Target court did was to push data breach standing litigation to summary
judgment.
Turning to the real outliers, consider first In re Adobe Privacy Litigation (N.D. Cal. Sept.
4, 2014), where the court relied on Ninth Circuit precedent and held that plaintiffs could suffer a
cognizable injury in fact because there was “substantial risk” of harm. The court declined to
apply Clapper (as overruling the precedent) because it thought that the harm in the Clapper case was
significantly different and more attenuated — and that Clapper didn’t appear to explicitly try to
fundamentally re-order the doctrine of standing. Likewise, in Moyer v. Michaels Stores (N.D. Ill. July 14,
2014), the Northern District of Illinois found standing in a credit card-related data breach because,
like Adobe, it distinguished Clapper from the data breach context and found that the “risk of identity theft
stemming from the data breach at Michaels is sufficiently imminent to give Plaintiffs standing.” See
also In re Sony Gaming Networks & Customer Data Sec. Breach Litig. (S.D. Cal. Jan. 21, 2014).
So, there are three camps:
1. Clapper: Data breach litigants must show actual or impending harm for standing.
2. Adobe: Data breach litigants must show a substantial risk of harm.
3. Target: Good pleadings can defeat a standing-based motion to dismiss; the real burden of
showing injury can take place at summary judgment.
Which camp is right?
I suspect Adobe is. While the Clapper data breach cases present an intriguing line of reasoning, it appears that
the news of Susan B. Anthony List has not yet reached this area of litigation. Susan B. Anthony
List makes clear that Clapper did not supplant the substantial-risk-of-harm test. That breathes new
life into Plaintiffs’ claims.
In short, probabilistic standing lives on to fight another day.
Checklist: Employee Technology Policies (Part 1 of 2)
January 29, 2015
I have argued elsewhere that employment lawyers should be an integral part of the creation of
workplace policies that prioritize cybersecurity and recognize its centrality to the well-being of the
organization.
What follows is a two-part checklist for employment counsel to use when designing employment-related technology policies.
Contents: Part One
Acceptable Use of Corporate Technology
Security Compliance
Electronic Privacy
Data Governance
Trade Secrets/Confidential Information
Social media
NLRB Compliance (and Caution)
Contents: Part Two
Cellphone/Mobile Device
ADA Policy/Accommodations
Data Retention Across Devices and Apps
Technology Acquisition
Notes
Acceptable Use of Corporate Technology
Limit use to business uses/Describe limits of personal use.
Prohibit unlawful uses; include specifically a prohibition on uses that violate sexual
harassment/EEO policy or laws.
Prohibit use of corporate systems to violate the company’s or others’ intellectual property,
including copyrights, trademarks, confidentiality, and trade secrets.
Prohibit off-hours use by non-exempt personnel; require reporting of off-hours use.
Prohibit uses that yield unlawful exports of technology, violate anti-competition laws and/or
violate any law of any jurisdiction (see Electronic Privacy, below).
Prohibit any use that exceeds the authorization given by the company.
Include a general savings clause.
See NLRB Compliance, below.
Security Compliance
Prohibit any use that is designed to effect a security breach or other malicious use of the
system.
Prohibit circumventing any user authorization requirement.
Prohibit uses that exceed a user’s authorized access to network, software, data or files.
Prohibit any access and, without permission, any use, misuse, abuse, damage, contamination,
disruption or destruction of any corporate computer, computer system, computer network,
computer service, computer data or computer program.
Prohibit any activity that seeks to hide the user’s identity, except in conformity with the
company’s ethics, whistleblower and harassment reporting policies.
Prohibit interfering with another’s rightful use of the system.
Ensure all postings to public-facing websites are vetted for cybersecurity compliance.
Comply with password policy.
Prohibit installation of non-approved software.
Prohibit installation of non-approved hardware.
Prohibit exfiltration of software, data or files, including by USB drive.
Require reporting of known security issues, including:
o incidents that result in misuse of confidential information of any form,
o incidents that may impair the functionality of the network,
o activities that seek unauthorized access to the network or access which exceeds
authority, and/or
o any violation of any information technology policies.
See NLRB Compliance, below.
Electronic Privacy
Clearly explain that data on the network and all electronic equipment are owned by, and accessible to,
the company; therefore, employees should have no expectation of privacy in the
workplace.
Seek explicit consent for employer monitoring of electronic communications.
Prohibit any use that infringes on privacy rights in violation of state, federal, or foreign
laws.
Company may preserve, access, or monitor data, accounts and equipment if required by law or
an internal investigation into misconduct.
Company may routinely audit use and traffic.
Identity theft: Prohibit uses that violate the FTC Red Flags Rule (if applicable).
Prohibit storage of private employee, consumer or patient information on mobile devices and
drives without explicit authorization.
Require use of encryption for all private employee, consumer or patient information.
Incorporate by reference, and update the policy based upon, industry specific regulations, such
as HIPAA and SEC rules.
Data Governance
See detailed discussion, here.
Require that use of data is limited to permitted/authorized uses and complies with the data
governance plan.
Trade Secrets/Confidential Information
Prohibit unauthorized access to trade secrets.
Prohibit disclosure (or soliciting disclosure).
Prohibit use of unapproved file-hosting services.
Prohibit storage of trade secrets on mobile devices (if possible); else, require use of encryption.
Consider including choice of law provision given the disparity among states.
(The Court’s recent decision in Department of Homeland Security v. MacLean, No. 13-894 (U.S.
Jan. 21, 2015), doesn’t change this analysis.)
Social media
Distinguish between personal accounts and corporate accounts.
Use of personal accounts at work
o Prohibit, or specify limits to use (see here).
o Establish a policy regarding corporate access to private social media passwords.
o See NLRB Compliance, below.
Personal accounts at home
o See NLRB Compliance, below.
Social media – corporate accounts
o Define clear ownership
o Require passwords and account recovery information be given to managers
o Define editorial/approval/brand protection policies
o Prohibit uses that are outside of editorial/approval/brand protection policies.
Strive for platform neutrality (e.g., should apply across all social media platforms).
Account for progressing technology.
NLRB Compliance
All policies must be NLRB compliant, even if you do not have a union.
Explicitly state that the rules do not limit any activity protected under Section 7 of the National Labor
Relations Act.
Do not use language that could be construed to limit such protected activity, including any
policy designed to protect confidential information pertaining to an employee’s terms and
conditions of employment or working conditions.
Ensure that any policy regarding language, tone, media, etc., is consistent with protected activity.
Do not issue a rule curtailing use of social media in response to such protected activity.
You may prohibit disclosure of privileged, trade secrets or other confidential information.
You may prohibit discriminatory remarks, harassment, and threats of violence or similar inappropriate
or unlawful conduct (including disclosures in violation of any financial disclosure law).
Do not retaliate for reports concerning such protected activity.
Use examples that help employees clearly understand that a particular policy does not reach
protected communications.
Recall that the NLRB has determined that employees may have a right to use corporate email to
engage in statutorily-protected discussions about their terms and conditions of employment.
Prohibit employee use of employer social media for personal purposes (see note below).
For more, and a sample policy, see the NLRB’s Acting General Counsel’s Operations Management
Memo of 2014.
Note: The NLRB does not clearly distinguish between policies that relate to employees using corporate
accounts on behalf of the corporation, employees commenting using employer public-facing social
media sites and employees using their own personal accounts as used at home or in the workplace. For
instance, in its model policy the NLRB approves the following:
Refrain from using social media while on work time or on equipment we provide, unless it is work-related as authorized by your manager or consistent with the Company Equipment Policy.
The Board’s decision in Purple Communications [.pdf] sheds some light on this by rejecting the
applicability of the so-called “equipment cases” to e-mail. Despite the Board’s protestations, this putatively
technology-specific holding can easily be extended to other technologies.
The Board in Purple gave one important caveat: “nor do we require an employer to grant employees
access to its email system, where it has not chosen to do so.” Accordingly, an explicit ban on the use of
employer-owned social media may be warranted.
Checklist: Employee Technology Policies (Part 2 of 2)
February 1, 2015
Earlier, I published the first part of a checklist for employment lawyers developing technology policies.
Part one covered:
Acceptable Use of Corporate Technology
Security Compliance
Electronic Privacy
Data Governance
Trade Secrets/Confidential Information
Social media
NLRB Compliance (and Caution)
This part covers:
Cellphone/Mobile Device
ADA Policy/Accommodations
Data Retention Across Devices and Apps
Technology and Data Services Acquisition
Notes
Cellphone/Mobile Device
All
o Require adherence to all company policies, including the acceptable use of corporate
technology policy.
o Establish a use-in-vehicles/Safe Use policy.
o Ensure that users understand that personal data is the sole responsibility of the user.
o Restrict the use of those apps for company-related communications that cannot be
adequately brought into compliance with your document retention policy.
o Ensure compliance with NLRB use policies.
o See ADA, below.
Company owned
o Affirmatively recite ownership of device.
o Outline permissible personal use (including that it may not interfere with company
business).
o Specify ownership of the wireless number upon termination.
BYOD
o Specify permissible devices or procedures for having a device approved.
o Specify security policy for cell phones, including reporting lost phones with company
data.
o Set out expectations regarding any management software installed on a device.
o Establish your level of support for the device, including help desk limits and
responsibility for data backup.
o Allocate ownership of apps and data.
o Determine whether any third-party apps will be banned from BYOD devices.
o Disclaim liability for destruction of non-employer data due to employer-installed
software or intentional wipes to protect corporate information.
ADA Policy/Accommodations
Establish a process for review of requests for IT-related accommodations that ensures
that requests for accommodation are reviewed by a competent authority.
Ensure that users of devices that connect to the network (smartphones in a company-owned
or BYOD context, or desktop computers) understand that any stored information, including an
employee’s PHI, may be viewable by the company.
See Acquisition, below.
Data Retention Across Devices and Apps
Require employee to maintain business records consistent with the company’s record retention
policy on whatever device or app that record is created.
Prohibit the business use of devices and apps that cannot behave consistently
with the company’s record retention policy (e.g., Snapchat).
Acquisition of Hardware and Software
Ensure that acquisitions are free of any code that could materially harm your networks,
devices and data. This should include a specific discussion of the programming languages being
used and/or a specification that any programming be done in a memory- and type-safe
language, or adhere to the strictest best coding practices in a language that is neither (such
as C++).
Ensure that hardware and software security are baked in at the development stage and not as
an afterthought.
Acquired software should use and implement any required or recommended security patches or
upgrades.
Allocate risk through indemnity and insurance requirements.
If your data is being held by the vendor, ensure that your agreement:
o Sets out definition of confidential information
o Sets out a definition of a security breach
o Requires that the vendor provide reasonable hardware, software and physical security.
o Where necessary, sets out the level of security to be met, such as remaining in full
compliance with Payment Card Industry Data Security Standard (“PCI DSS”)
o Ensures that notification of an incident is prompt (reasonably proximate to the incident)
and persistent (trying multiple avenues to reach your team).
o Establishes the level of service the vendor will provide you during an incident (24/7).
o Ensures compliance with all applicable federal and state, and foreign privacy and data
protection laws, as well as all other applicable regulations and directives.
o Ensures notification in the case of a subpoena of your data
Establish data ownership and ensure that data can be retrieved by you in a commercially usable
manner regardless of any breach by either party, and that this right survives the term of the agreement.
Ensure that software and hardware are compliant with all EEO and ADA requirements.
Notes
The centrality of data and system integrity to a company’s reputation, brand, equity and even
survival dictate a new, higher level of response for violations of these policies.
Ensure that certain post-promulgation communications constitute an update of the policy and
are included by reference in the policy.
Design policies and procedures that are sensitive to human cognitive capabilities and limits. For
instance, password policies should be rationally designed to ensure usability and effectiveness.
(More on this in a future post).
This list bears a stubborn resemblance to that found at the University of Iowa.
ADA and Emerging Tech: Key Considerations for Employers
December 28, 2014
The Americans with Disabilities Act (ADA) is often overlooked in employment-related technology
development, acquisition and deployment. As new and emerging technologies enter the workplace (see
my earlier post, here) there are several key considerations related to the ADA to bear in mind.
By way of very brief background, the ADA (and state analogues) typically require employers to make
reasonable accommodations for their employees and applicants with disabilities.
Key considerations:
All technology designated for use by employees or applicants should be reviewed for usability by
individuals with disabilities, including those using Adaptive or Assistive Technologies (AT).*
Security policies and procedures (especially those in a mobile environment that seek to isolate
workplace apps from non-workplace apps) must be designed to work with — and certainly not
block — AT. Other security policies, such as equipment related to two-factor authentication,
should be designed to work with AT, especially those users with a visual impairment. For
instance, RSA security tokens feature interoperability with Windows screen readers for visually
impaired users.
Ensure that in-house social networking or collaboration sites, video meeting technologies, and in-house learning technologies consider employees with disabilities. Similarly, your websites,
intranets and employment-related documents and forms must be accessible.
Online job application platforms and/or any application-related apps must be usable by
individuals with disabilities and should be compatible with a wide range of AT.
Any inventory of apps on a device (especially in a BYOD environment) should be done with
sensitivity to the fact that the very presence of some apps on a device (for instance, an App that
monitors diabetes) may be protected health or disability-related information.
New tech, such as wearable devices, should be designed with individuals with disabilities in
mind — it is far easier to consider this at the design phase than at the deployment
phase. Similarly, RFPs/agreements for new workplace technology ought to insist on compliance
with the ADA. In short, accessibility must be a fundamental part of your technology
acquisition and development program.
International employers must be aware of the various disability-related regulations that cover
their non-domestic workers.
Employees and applicants should have a way of discussing the need for technological
accommodation — and not necessarily solely with someone in an IT department. From a pure
risk-management point of view, the ability to quickly flag, report and remedy barriers for
employees and applicants with disability could be very important to the employer.
The good news is that the number of available AT resources is increasing. From apps that assist those
with speech-related conditions to those that help the visually impaired read, the
universe of AT-related apps is expanding.
Two resources:
Partnership on Employment & Accessible Technology (PEAT): http://peatworks.org/
Princeton’s Center for Information Technology Policy recently held a conference called
Designing an Inclusive Digital World, the proceedings of which can be viewed here.
* A good definition of AT can be found here:
Assistive or Adaptive Technology (AT) commonly refers to products, devices or equipment,
whether acquired commercially, modified or customized, that are used to maintain, increase or
improve the functional capabilities of individuals with disabilities.
Electronic Privacy in the New Restatement of Employment Law
December 16, 2014
The inaugural Restatement of Employment Law sets out ALI’s carefully considered (and what will likely be
extremely influential) views concerning the law of employee privacy.* There is a lot to unpack here (and
the structure of this section of the Restatement is not as clear as it could be), so this post will provide a
practical overview.
The basic idea: the new Restatement protects employee electronic privacy interests against “wrongful
employer intrusions.” §7.01 Such interests include “…electronic locations, including work locations
provided by the employer, in which the employee has a reasonable expectation of privacy.” § 7.03 In
addition, employees have a right to have information of a “personal nature” protected from employer
view. § 7.04. Finally, employees also have a right to the “non-disclosure to third parties of the
employee’s information of a personal nature disclosed in confidence to the employer.” § 7.05.
Some observations:
(1) Whether there has been an employer intrusion upon an employee’s privacy depends on whether the
employee had a reasonable expectation of privacy (the “threshold question of whether a privacy interest
is implicated”). Reporter’s Note to Comment A of § 7.02.
(2) As noted, there are three distinct privacy interests; accordingly, there are three different measures of
“reasonable expectations” of privacy:
privacy expectations as to the employee’s person and locations, including virtual electronic
locations;
privacy expectations as to information disclosure to the employer; and
privacy expectations as to information disclosure to other employees or third parties. Id.
(3) An employee’s expectation of privacy in electronic locations can be defeated by notice to the
employee. Comment F to §7.03 says that a “clearly communicated employer notice that a location is not
private, or uncoerced employee consent to an employer intrusion into a certain location, will generally
defeat expectations of privacy.” Without notice, an employer should not rely on its ownership of an
electronic location to be dispositive or even helpful: “[t]he employer’s ownership or control of the
premises or particular equipment does not necessarily preclude an employee from having a reasonable
expectation of privacy in particular work locations.” Comment E to § 7.03. Notably, the Restatement
extends the “reasonable expectation” framework into nonworkplace (that is, home) electronic locations
as well.
(4) The definition of an electronic location is very expansive. It includes all electronic “places” such as
laptops, desktops, cell phones, the cloud, password protected blogs/social media. Comment C to
7.03. Given the rapidly transforming landscape of technology in the workplace, it may be hard for
employers to keep up: an expectation of privacy in any new data pool could create an employer-free
zone (there are some exceptions). As the technology advances, employers should be careful to ensure
that notices concerning privacy are up-to-date and to identify with particularity those locations that are
not private (if any are).
(5) Section 7.05 provides that “[a]n employer intrudes upon the privacy interest [of an employee] by
providing or allowing third parties access to [confidential] employee information without the
employee’s consent.” There is a duty not to release the information voluntarily (with a few
exceptions) and, although not expressly set out, there appears to be a duty to use reasonable means to
secure that information. Comment D to § 7.05. The Reporter’s Note to § 7.04 provides an insight when
discussing PII: “… the employer and/or its plan retain a duty to keep such information private, both
under § 7.05 (infra) and the Health Insurance Portability and Accountability Act (HIPAA).” So the real
question that is unanswered: does this Restatement believe there is a tort for failure to protect PII from
a data breach? Surprisingly (and I fess up to my quick read), the Restatement does not appear to
seriously wade into employer duties concerning data breaches as they relate to employee data.
More to come…
* I am working from ALI’s proposed final draft, about which it says: “This draft contains all sections of
this project and was approved by the membership at the 2014 Annual Meeting, subject to the discussion
at the Meeting and to editorial prerogative. This material may be cited as representing the Institute’s
position until the official text is published.”
If you wish to buy the restatement from ALI, please click here.
Addendum 3/3/2015: A thought from Justice Scalia that is interesting in light of my suggestion that the
Restatement of Employment Law’s take on privacy will become “extremely influential”:
…Over time, the Restatements’ authors have abandoned the mission of describing the law, and
have chosen instead to set forth their aspirations for what the law ought to be. … Restatement
sections such as that should be given no weight whatever as to the current state of the law, and
no more weight regarding what the law ought to be than the recommendations of any
respected lawyer or scholar. And it cannot safely be assumed, without further inquiry, that a
Restatement provision describes rather than revises current law. Kansas v. Nebraska (Feb. 24,
2015) (Scalia J., Dissenting)(Internal quotations and marks omitted).
NLRB: Employer Email Systems OK for Union Use
December 12, 2014
Reversing a longstanding ruling, the NLRB yesterday held in Purple Communications, Inc. v.
Communications Workers of America that employees could, under certain conditions, use employer
email systems for their Section 7 communications. This decision, which will likely have significant
workplace ramifications, is based upon erroneous reasoning and raises some very important questions.
As a key part of its justification for this ruling, the Board noted that “the increased use of email has been
paralleled by dramatic increases in transmission speed and server capacity, along with similarly dramatic
decreases in email’s costs.”
The heart of the argument is that since the cost of email is so low, its use as part of Section 7 activity
should be permitted.
However, while the cost of any one email is de minimis, in the aggregate emails are not as inexpensive
as the Board presumes. From extrinsic costs such as productivity losses related to email inbox overload
to intrinsic costs related to storage, management, retrieval and production of millions of emails, email is
not at all cost-free or even inexpensive to employers. At the same time, numerous low-cost, non-employer-provided avenues of communication exist for employees to use to engage in concerted
activity, including social media. (How can one argue both that emails are not so low cost to employers
and suggest that employees have a plethora of low-cost avenues for their Section 7 communications?
Because unlike the social media giants, most employers do not monetize the data in their email servers.)
Dissenting members argued that the decision creates uncertainty as to which technologies are, as a
matter of law, permitted to be used for Section 7 purposes and that the ruling usurps longstanding
employer property rights. Put differently, when is an employer to conclude that a technology it deploys
(say, a video conferencing system) is sufficiently inexpensive per use to require that the employer’s
access and use rules are abrogated by Section 7? Furthermore, if the employer “opens” a technology too
early, will it run afoul of Section 8(a)(2), which makes it unlawful for an employer “to … contribute
financial or other support to [any labor organization]”?
My editorializing aside, employers should carefully review all of their current technology use policies for
compliance with this ruling. Toward that end, it may be helpful for employers to find an objective
measure of cost-per-use for its communications systems. Last, as always, a mechanism for continually
reviewing and updating all technology policies should be in place.
Lessons Learned
What the Lenovo Malware Debacle Means
February 23, 2015
As has been widely reported, Lenovo shipped consumer laptops with software that made them
vulnerable to a so-called man-in-the-middle attack: the software intercepted inbound web data,
decrypted it, inserted advertising, re-encrypted it, issued a new security certificate (based on a pre-installed
“root” certificate) and then sent it along to the browser, which accepted the data as trusted. It works
with outbound data, too. Equally bad, because the makers of the malware were sloppy (and why
wouldn’t they be?), they used the same certificate on many machines, used a weakly encrypted
certificate and used a password that was easily guessable (not cracked, guessed). This created a ready-made entrance for other attackers to insert themselves as a person-in-the-middle and compromise
every transaction.
The upshot? The user of an infected machine can have zero confidence that the websites they go to
(banks, corporate, etc) are real. Security experts warn that no web-based transaction from an infected
machine can be deemed to be secure.
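One technical mitigation worth knowing about is certificate pinning: an application carries its own fingerprint of the server's expected certificate, so a certificate minted under a rogue pre-installed root CA fails the check even though the operating system's trust store accepts it. A minimal sketch of the comparison step, with illustrative names (a real client would obtain the DER bytes from its TLS connection):

```python
import hashlib

def fingerprint_matches(der_cert: bytes, pinned_sha256_hex: str) -> bool:
    """Compare a server certificate (DER bytes) to a pinned SHA-256 value.

    A man-in-the-middle using a rogue root CA can present a certificate
    the OS trust store accepts, but it cannot forge one whose hash equals
    a pin the application carries itself.
    """
    actual = hashlib.sha256(der_cert).hexdigest()
    return actual == pinned_sha256_hex.lower().replace(":", "")

# In practice, the DER bytes would come from, e.g.,
# ssl.SSLSocket.getpeercert(binary_form=True).
```

Pinning has its own maintenance costs (certificates rotate), but it illustrates why an attacker who controls the trust store still cannot satisfy a check the application performs itself.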
What if the user of such a compromised machine was working with your data, on your network? How
can you even assess whether data has been stolen? Simply, you cannot.
Does this mean that BYOD is a bad idea? That is open to debate. On the one hand, inviting any
computer, any smartphone, any device onto your network invites a host of security problems across
multiple platforms. In short, you get what you pay for. On the other hand, even the purchase of
company-owned equipment is no guarantee that an employee will not download bad software or connect
bad hardware (see, for instance, this piece on malware in firmware) to your network.
Whatever the answer, I do think that this attack urges employers to think “defense-in-depth,” that is,
layering defenses and even anticipating that an attacker will succeed (perhaps even without your
knowledge). That layering has to include limiting the impact of an attacker once your network is
penetrated. As I have argued elsewhere (here, here and elsewhere) cyber-resiliency includes physically
segregating unrelated data and having excellent detection and response plans.
Lessons from the Anthem Breach
February 9, 2015
The Anthem breach reported late last week provides a number of insights:
1. While Anthem’s transmissions were encrypted, their stored data was not. It is worth
examining whether the efficiencies gleaned from non-encrypted storage are outweighed by the
costs of breach recovery, notification and damage to reputation.
2. The majority of plans run by insurers such as Anthem are likely employer-sponsored. On the
one hand, one might counsel an employer that this is a matter between the insurer and the
insured (and not between company and insurer or company and insured). On the other,
employees do not necessarily allocate responsibility so precisely. Clearly, the security measures
used by vendors should be of concern to employers. In addition, employers should include third-party breaches in their response planning.
3. The attack vector was likely through an employee’s stolen password. Based on some reporting,
it may be that the employee was the victim of social engineering (typically, using deception to
cause someone to divulge confidential information like passwords). Once again, we see the
importance of strong policies and training in the overall security scheme.
4. The attack demonstrates threat transference, something that will occur more and more. Threat
transference is a situation where, once thwarted by enhanced security measures, criminals will
move on to more vulnerable targets even if those targets are lower value. As security improves
in the banking and retail sectors, criminals will move to “easier” targets. Because the Anthem
data contains a wealth of information that can be used in identity theft, it remains very valuable
despite the fact that it doesn’t contain direct financial account information. As the threat is
rapidly transferred, there is little doubt that employer data — which contains information as
sensitive as that held by Anthem — will become a specific target.
5. The Anthem attack demonstrates the difficulty inherent in breach detection and
response. Some reporting indicates that the discovery of the breach may have been a lucky
break and that, days into the event, the scope of the stolen data is likely still not known. Indeed,
most cyber-attacks last somewhere between three and six months before discovery. Often, an
attack is discovered only when some of the stolen data is found for sale online. Cybersecurity
planning should include these realities.
6. In light of massive breaches of employee data that have and will occur, I am wondering if
mitigation efforts should begin in anticipation of a breach. Along with cyber-risk insurance,
perhaps credit monitoring and negotiating with unions about these issues may be worth putting
in place ahead of time.
APWU v USPS @ NLRB: Some Best Practices
December 9, 2014
The American Postal Workers Union (APWU) recently filed a NLRB charge against the United States
Postal Service (USPS) in connection with a data breach that led to the release of 800,000 employee and
retiree medical records, social security numbers and bank account and routing information. This has
been widely reported and discussed elsewhere.
One overlooked element of this case is the detailed testimony about how the data breach itself was
handled. While many breaches make news, the USPS' testimony before Congress is quite rare and
gives us very interesting insight into the breach. So, what can we learn?
Background
First, another word on the case. The mere failure of the USPS to notify the union is not at issue; rather,
the complaint focuses on the fact that the USPS made “unilateral changes in wages, hours and working
conditions by, among other things, providing free credit monitoring services to employees." In
short, APWU would have liked to have been notified earlier in order to be in a position to bargain over
USPS' employee-related response measures.
Whether the USPS should have held back notifications for negotiations or should have notified-then-negotiated is, of course, up to the NLRB, Congress and the courts. I won't comment on this aspect of the
APWU’s charge except to say that it appears to me from the testimony of Randy Miskanic, vice president
of secure digital solutions at the USPS, that the earliest anyone outside of the crisis response team could
have been notified was after a "brownout" period of November 8 and 9 (during which the USPS severely
limited its access to the web and instituted major, system-wide changes to its security systems). At the
very least, the scope of the breach was not known until October 16, 2014 and the theft was
only confirmed on November 4, 2014. For the same reasons of investigation and confirmation, the
suggestion by some members of Congress that employees be notified as soon as a social security
number is known to be compromised may produce legislative proposals but likely very little lawmaking.
Best Practices
Some thoughts while reading Miskanic’s testimony:
1. How would your own systems and procedures hold up under such pressure? Would you (the
employment lawyer) have been called in? In fact, you might wish to run a “table top”
exercise based on this scenario. When would your counsel have been sought?
2. Had this been a private entity dealing with a breach, would the Postal Service have been in full
compliance with most state laws concerning breach notification (here, to employees)? Arguably,
yes, as such statutes typically require notification after law enforcement and data security
needs are handled. Here is New York’s statute:
The disclosure shall be made in the most expedient time possible and
without unreasonable delay, consistent with the legitimate needs of law enforcement
or any measures necessary to determine the scope of the breach and restore
the reasonable integrity of the system.
Based on the testimony, several federal agencies were very reluctant to permit notification until
the full scope of the breach was known and the perpetrators identified (to the extent possible). As
for "restor[ing] the reasonable integrity of the system," the two-day brownout period likely
satisfies that prong. Finally, the national security implications of this matter would likely weigh
heavily on a court’s decision-making. After all, the suspects in this matter are Chinese
government hackers. (This is very relevant to private enterprises because the United
States indicted Chinese government officials who were said to be aggressively launching
cyberattacks against US business).
3. The testimony provides a good look at a serious remediation effort (and what employment
lawyers must understand during remediation discussions):
The new network security safeguards put into place over this two-day period included
removing workstation administrator rights and enhancing network monitoring. We also
upgraded and segmented Administrative Domain Controllers, removed compromised
systems and accounts, and implemented two-factor authentication for administrative
accounts.
To further reduce the likelihood of phishing or spear-phishing emails—common and
increasingly sophisticated ways of compromising computer users and systems—
impacting the Postal Service network, access to personal email sites such as Gmail or
Yahoo was, and continues to be, blocked. In addition, direct database access is now only
enabled to technology support staff and a number of business applications have been
retired.
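For readers unfamiliar with the mechanics behind "two-factor authentication for administrative accounts," most authenticator apps implement the time-based one-time password (TOTP) algorithm of RFC 6238. The sketch below is purely illustrative (it is not the USPS's implementation) and uses the RFC's published test secret:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at_time=None, step=30, digits=6):
    """Compute an RFC 6238 time-based one-time password (SHA-1 variant)."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if at_time is None else at_time) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = int.from_bytes(digest[offset:offset + 4], "big") & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238's published test secret is the ASCII string "12345678901234567890":
TEST_SECRET = base64.b32encode(b"12345678901234567890").decode()
print(totp(TEST_SECRET, at_time=59))  # prints "287082" (RFC test vector)
```

The server and the authenticator app share the secret once (usually via a QR code) and thereafter independently compute the same short-lived code, which is why a stolen password alone no longer suffices.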
Interesting insight and hopefully, some lessons learned.
Employment Law Considerations Concerning Future Tech
Disruptive Technologies in the Workplace
November 19, 2014
A recent McKinsey report on twelve “disruptive” technologies included four that will fundamentally
transform how employers relate to their employees: mobile Internet, automation of knowledge work,
the Internet of Things and cloud computing. I would add to the list three results of these technologies:
big data, cybercrime and privacy.
From an employment law perspective, the common element here is data – data that flows to, is stored
by, and is used (or misused) by employers, third parties and employees.
Employers
As new devices and technologies are deployed, employers will likely inadvertently gather information
they probably do not want – for instance, protected health information (perhaps by detecting a disease-related app on a phone) or detailed records of employee movements (which can be very harmful in
wage and hour litigation).
As employers look at these (and other) large pools of data (including applicant data), some will wish to
“mine” this data using increasingly low-cost “intelligent” automated systems. Such work has to be
carefully done – both algorithmic errors and poor statistical methodology can easily lead to significant
errors in the information derived from the raw data. The results, from at least an EEO point of view, can
be quite disastrous.
This data will likely be stored on third-party "cloud" storage systems – an arrangement that will raise
new risks for employers.
Third Parties
Employers need to be concerned about data in third party hands — whether it is there intentionally or not.
For instance, employers ask employees to use devices that are loaded with third party apps – and
sometimes they even ask employees to use these apps. These apps routinely collect significant amounts
of data, including location and unique device identifier information. Such data can be combined to
create a very detailed profile on users. This data – owned, protected and even sold by these third
parties – can create a new window into an employer’s operations that litigants and corporate spies alike
would love to see.
Next, data will inevitably end up in third party hands through litigation and discovery. As the cost of
sophisticated analytics concerning that data is falling, there will be a sea-change in how employment
cases are litigated – especially class actions. And in the regulatory and EEO context, as a recent White
House panel on so-called “big data” concluded, “the federal government should build the technical
expertise to be able to identify practices and outcomes facilitated by big data analytics that have a
discriminatory impact on protected classes.” The use or misuse of this information by the government
or litigants will require a very sophisticated legal response – one that will likely involve the world of
statistical analysis and coding.
Information will also end up in third party hands through crime. Whether inadvertent or not,
the primary vector for data breaches is an employee's keyboard. As the breaches continue and
the costs rise, employers will have to take radical approaches to data protection, including new levels of
data segregation, radically shoring up security-related policies and treating mobile phones, whether
company-owned or not, as on par with laptops.
Employees
As data is produced by more and different devices, there will be serious questions about who owns the
data those devices store and generate. Will an employee-owned, GPS-enabled app used on a “BYOD”
device contain data that is owned by the employee (say, concerning their fitness activity) or, because it
was worn during work, will it contain proprietary information (such as a record of where the employee
visited)? Employers must understand what data their employees are gathering – and update policies
and executive employment agreements to deal with it.
In the social media context, employers will be forced to grapple with always-on devices, including those
that constantly stream video. It is unclear whether a simple workplace ban on such recording (as
recently permitted under the NLRA) will survive video streaming's convergence with social media – the
latter of which the NLRB maintains can be a form of protected concerted activity.
Last, employers need to have action plans in place for data breaches caused by or impacting
employees. Employers should also ensure that insurance policies cover employee-caused data breaches
and incidents involving employee information.
Concerns about privacy cover all three areas, but this is well covered elsewhere.
Summary
This short survey illustrates that the world of the employer will more and more involve data-driven risk –
placing their lawyers deep in the world of statistics, system design and security management.
New Tech & Employment Agreements
November 26, 2014
From the extraordinarily sensitive data they possess to the social media following they may develop, C-suite executives can do a great deal of damage to a company and its brands – intentionally or not. This
risk begins with the start of employment but continues long after the executive leaves the company.
Executive employment agreements should cover four key topics: social media, security, cooperation,
and developing technologies.
1. Social media during and after employment.
Specify the employer’s social media expectations, including frequency of posting, content/legal
review, and whether accounts are company- or executive-owned. For employer-owned
accounts, the executive should (obviously) be required to provide password and recovery
information regularly and upon exiting the company. For the trickier cases of executive-owned
accounts, consider specifying the language for the executive to send out from the account when
s/he leaves employment, such as one recommending that followers migrate to a successor
account (after all, an executive’s Twitter following may be a significant brand asset). Employers
should also plan for the possibility that their executives may die while employed, leaving a string
of difficult-to-recapture corporate digital assets. Finally, traditional non-solicitation, non-compete and confidentiality clauses should be updated to specifically address post-employment
social media.
2. Security training and policies for work and home.
Senior executives will likely be the direct target of specific, directed attacks seeking employer data
held by the executive (for instance, spear phishing or targeted social engineering). In addition to
requiring security training and compliance on company equipment, to the extent executives use
private accounts and equipment for company business, agreements should set out data security
measures expected. An executive agreement may also be the ideal place to set out an employer-data-only-on-employer-equipment clause, thereby ensuring that company data remains only on equipment the employer controls.
3. Cooperation and data retention.
As company data stored on personal laptops and in private accounts will be subject to discovery, the
executive must be required to adhere to data retention policies. The executive should be required
to notify the company of third party inquiries during and after employment and, upon termination,
should provide the company with a complete list of devices on which company data is or was
held. Executives should also agree to post-employment cooperation with company data requests
during all investigations, including administrative proceedings. This may include an obligation to
turn over personal account passwords to the company if company data is intermingled with private
data. Finally, costs associated with such company demands and compliance should be allocated in
the agreement – and insurance policies should reflect this.
4. Developing technologies.
Whatever the longevity of C-suite executives may be, the technology development cycle is even
shorter. New devices – which bring new and often unexpected ways to capture, stream and store
data — will inevitably lead to questions about who owns the data (rather than the device). For
instance, an “always on” fitness device or app may capture very detailed business-sensitive
information, such as a sales route or evidence of travels to negotiate a new acquisition. These
devices and apps may transmit that data to third parties, many of whom are willing to sell data or
who maintain lower than desired safeguards. Another area of concern is self-erasing
communications (such as Snapchat or Wickr) – executives likely should be prohibited from using such
apps to transact company business. Simply, executive agreements ought to be written with
flexibility to adapt to changing technologies, perhaps best with a permission-before-use clause.
8 Trends in the Transformation of Tech-Related Employment Law
December 31, 2014
2015 will see the broadening and deepening of the transformation of tech-related employment law.
Here are eight reasons why:
1. More data breach-related lawsuits by employees (or classes of employees).
2. Increasing attention to inside roles (intentional or otherwise) in corporate security breaches.
3. Emergence of ultra-focused spear-phishing scams against c-suite executives/high-level
employees.
These are sophisticated, highly-targeted campaigns designed to get top executives to click on
links or download files that will expose highly sensitive corporate data. Such
campaigns will target these executives at the office, on mobile devices and at home.
4. Increasingly sophisticated ransomware targeting core employer systems.
This relates to self-replicating malware that encrypts corporate networks — and demands a fee
for decryption. Beyond the obvious threat to an employer’s ability to function on many levels,
such attacks may hamper core employee-related functions such as wage and hour compliance.
5. Changing standards of care/obligations for protecting employee data.
Continuing the current trend toward increased employee-data regulation, expect many new
standards from industry associations, regulators (EEOC, FTC, SEC), legislatures, courts and
insurers.
6. Expanding the fiduciary-driven management of cyber risk.
Boards will continue to look more and more carefully at managing cyber risk (and relatedly,
cyber risk insurance). In addition, cyber risk will increasingly involve professional risk and
reputation managers, such as GCs, and will move away from being an exclusively IT
role. Moreover, fiduciaries will drive employers toward more sophisticated breach response
plans.
I’ve argued elsewhere about the specific need for the involvement of employment lawyers in all
of these matters.
7. More sophisticated role for big data in so-called “routine” employment litigation matters.
8. Exponentially increasing pools of employee data.
As new technologies increasingly enter the workplace, especially the so-called Internet of Things
(IoT), employment lawyers will need to be active participants in dealing with the massive amounts
of data that is produced. Privacy and security concerns related to IoT will be of particular concern, as
will e-discovery given both the quantities of the data that will be produced and the fact that there is
not yet a standard format for that data.
Bonus trends: cyber resiliency… terabyte phones…
Employment Law Risk and Early-Round Investors
January 23, 2015
A recent program sponsored by Epstein Becker & Green — Moving to the Next Level: Valuation &
Financing Considerations and Employment Strategies for Start-Ups and Emerging
Technology Companies — tackled finance and employment law (and their interplay) as they relate to
tech start-ups. The program covered a range of interesting and important issues.
I took away three key questions I would ask early-round investors.
Why back a company that has ignored its employment-related risk? While the natural instinct
of a founder is to move the company aggressively forward (say, in product development,
revenue generation, etc.) to attract the right investors at the right time, that rush can be
undermined by employment legal issues. There doesn’t need to be a disconnect between
founder-passion and company-legal-health: employment counseling is risk management which
has to be properly dialed to match the risk tolerances of the start-up. Some preventative
measures are worth doing, others are not and choosing your battles – rather than having them
chosen for you by a litigant – is the only rational way to proceed. This is not speculative (for
instance see here, here and here) and as investment money hits a start-up, founders’ pockets
will seem that much deeper.
Why back a company where employment-related risk is unknown? At the same time, the
program led me to consider the downside pressure on valuation that pending employment
litigation (or just serious exposure) should entail and how to assess that risk. It’s notable that
employment-related considerations in due diligence are often lumped into an overall legal
health review or ignored. The only way to properly quantify risk is to get under the hood of a
company’s current and past employment practices.
Why back a company which doesn’t understand the workplace implications
of its product? My last point arises from the conversation and an experience a colleague of
mine had when she was buying a piece of HR-related software. During a final review of the
product, a product manager from the company simply could not understand why my friend
wanted employees, and not their managers, to identify their own gender. That employer-imposed gender default is a serious stumbling block for a company that wants to treat
transgender employees (and here) lawfully and with respect (and these kinds of examples
abound). The point here is that workplace compliance and legal ramifications of products
destined for the workplace cannot be ignored until the last moment – they have to be baked
into the product organically, as do considerations of how personal equipment and data interface
with a company’s networks. In short, if the product is entering the workplace, then workplace
tech law demands employers’ (and their lawyers’) attention.
An Employer’s Guide to the President’s Cybersecurity Recommendations
January 15, 2015
President Obama has released key provisions of his new cybersecurity plan (which he will discuss during
his State of the Union address). As discussion about this plan unfolds, employers should be aware of
several important elements. Please note: a number of commentators have taken a political position on
this subject. I am not.
According to the White House, these proposals are designed to:
Enhance cyber-threat information sharing within the private sector and between the private
sector and the Federal Government;
Protect individuals by requiring businesses to notify consumers if personal information is
compromised; and
Strengthen and clarify law enforcement’s ability to investigate and prosecute cyber crimes.
I will look at why each matters to employers.
CFAA Amendments
As is well known, some employers have sought to use the civil claims section of the Computer Fraud and
Abuse Act (CFAA), 18 U.S.C. § 1030(g), against former employees who steal ESI or damage a computer
system.
This proposed amendment to the CFAA would
Enable the CFAA to reach any person who intentionally accesses a protected computer without
authorization and who obtains information from that computer.
Create separate liability for an individual who intentionally exceeds authorized access to a
protected computer and who obtains a material amount of information from that computer. For
this materiality provision, the value of the information obtained must exceed $5,000. This is a
new requirement and is different from other provisions related to materiality, which provide
simply for a loss in general in excess of $5,000.
In essence, according to the White House, the amendments would ensure "that insignificant conduct
does not fall within the scope of the statute," and would clarify that the CFAA should reach "insiders who
abuse their ability to access information to use it for their own purposes."
Finally, the proposal would
Amend the definition of the “exceeds authorized access” of CFAA to include accessing a
computer with authorization and then using such access to obtain or alter information in such
computer for a purpose that the accesser knows is not authorized by the computer owner.
Critically for employers, this latter amendment to the CFAA would resolve a circuit split (with a "yes") on the
question of whether violating a written restriction – including an employer’s written policy — falls under
the ambit of the CFAA. See Orin Kerr, “Obama’s proposed changes to the computer hacking statute: A
deep dive,” The Volokh Conspiracy.
Cyber-Threat Information Sharing
This proposed statute will create a mechanism for sharing cyber threat information, notwithstanding
any privacy laws to the contrary. The stated purpose of the statute is
[t]o codify mechanisms for enabling cybersecurity information sharing between private and
government entities, as well as among private entities, to better protect information systems
and more effectively respond to cybersecurity incidents.
According to the White House:
…the proposal encourages the private sector to share appropriate cyber threat information with
the Department of Homeland Security’s National Cybersecurity and Communications Integration
Center (NCCIC) which will then share it (in as close to real-time as practicable) with relevant
federal agencies and with private sector-developed and -operated Information Sharing and
Analysis Organizations (ISAOs).
Key among the provisions is that this new information sharing structure will come with liability
protection for companies who share cyber threat information under the legislation.
Breach Notification
This proposed statute would create a new breach notification framework and charge the Federal Trade
Commission with regulating and enforcing that notification. While directed at consumers, as I’ve
written many times before, employers are also concerned with breach notification. The key here is
that this statute nearly completely preempts state breach laws (with very few exceptions) bringing much
more uniformity and certainty to the process. (Typically, notification would occur within a month).
Interestingly, the proposals allow the United States Secret Service or Federal Bureau of Investigation to
declare that there is a law enforcement-based or national security reason to exempt a disclosure from
notice (many states permit state authorities to make the determination that an investigation would be
hampered by disclosure). Furthermore, this is a signal of the growing role of the FTC in the employee-employer relationship (especially in the FCRA context).
Summary
The key idea behind these proposals is preemption and thus national uniformity. Whether or not these
proposals survive the legislative process, the likelihood is that federal law will ultimately be made in this
area — and the patchwork of state privacy and breach laws will not survive in any meaningful way.
Quick Take: FTC’s New Report on the Internet of Things
January 30, 2015
From the FTC’s new report on the Internet of Things [pdf]:
IoT presents a variety of potential security risks that could be exploited to harm consumers by:
(1) enabling unauthorized access and misuse of personal information; (2) facilitating attacks on
other systems; and (3) creating risks to personal safety. … [P]rivacy risks may flow from the
collection of personal information, habits, locations, and physical conditions over time…
companies might use this data to make credit, insurance, and employment decisions.
Overall recommendations:
build security into devices at the outset, rather than as an afterthought in the design process;
train employees about the importance of security, and ensure that security is managed at an
appropriate level in the organization;
ensure that when outside service providers are hired, that those providers are capable of
maintaining reasonable security, and provide reasonable oversight of the providers;
when a security risk is identified, consider a “defense-in-depth” strategy whereby multiple
layers of security may be used to defend against a particular risk;
consider measures to keep unauthorized users from accessing a consumer’s device, data, or
personal information stored on the network;
monitor connected devices throughout their expected life cycle, and where feasible, provide
security patches to cover known risks.
The report and FTC's press release are here.
As usual, Jules Polonetsky and Christopher Wolf at the Future of Privacy Forum have thoughtful
comments, which can be found here.
Big Data
EEOC, Big Data and Why Disparate Impact is Not the Right Direction
March 8, 2015
It has been widely reported that EEOC Assistant Legal Counsel Carol Miaskoff, when addressing a
conference on big data, shared her belief that employers should be concerned with the disparate impact
of their employment-related data mining:
It’s been interesting to me because everyone’s been talking about disparate impact and adverse
impact a lot. In the employment space, those are very precise legal terms. And there’s a cause of
action for disparate impact. And I would say that that’s the one, frankly, that’s most suited to
big data. Because what that’s about is taking a neutral, i.e. like race neutral, gender neutral, et
cetera, term that nonetheless disproportionately excludes members of the protected group
and – this is the critical piece here – and is not job related consistent with business necessity.
I am not convinced that she is right.
First, we should dive into disparate impact. It is well understood that “Title VII prohibits employers from
using neutral tests or selection procedures that have the effect of disproportionately excluding persons
based on race, color, religion, sex, or national origin, where the tests or selection procedures are not
job-related and consistent with business necessity.” An employer can use one of several statistical
models to show that a testing or selection procedure is job-related and consistent with business
necessity, that is, necessary to the safe and/or efficient performance of the job. At that point, the
burden shifts to a plaintiff to show that there are other tests or selection procedures that are equally
effective but have less of a discriminatory impact.
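As a practical matter, the threshold disparate-impact showing is often screened with the "four-fifths rule" from the EEOC's Uniform Guidelines on Employee Selection Procedures: if the selection rate for a protected group is less than 80% of the rate for the most-favored group, adverse impact is generally inferred. A minimal sketch, with hypothetical applicant numbers:

```python
def selection_rate(selected, applicants):
    """Fraction of applicants who passed the selection procedure."""
    return selected / applicants

def four_fifths_ratio(protected_rate, highest_rate):
    """Impact ratio; below 0.8 conventionally suggests adverse impact."""
    return protected_rate / highest_rate

# Hypothetical outcomes from an automated resume-screening tool:
rate_a = selection_rate(48, 100)  # most-favored group: 48% selected
rate_b = selection_rate(30, 100)  # protected group: 30% selected
ratio = four_fifths_ratio(rate_b, rate_a)
print(round(ratio, 3))  # prints 0.625 -- below 0.8, so the tool draws scrutiny
```

The four-fifths rule is only a first-pass screen; courts and the agencies also look to statistical significance, which is where the burden-shifting described above begins in earnest.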
The problem is: any employer using solid data science should be able to show that the variables it relies
on correlate highly with job safety and efficiency and are statistically defensible. After all, isn't
that the whole purpose of data science — to tease out and extrapolate interesting and highly
correlative information? If done right, the data scientist should be able to show a defensible
correlation and an exclusion of other factors as being equally or even more explanatory (the
definition of "necessary"). In short, data science seems built precisely to resist the disparate impact
framework.
If not disparate impact, then, absent a new statutory scheme, the remaining avenue for plaintiffs is
the intentional discrimination framework – specifically, showing intentional discrimination through
proof of a discriminatory "pattern and practice."
Briefly, intentional "pattern and practice" cases are generally evaluated by the courts using the burden-shifting paradigm adopted by the U.S. Supreme Court in International Brotherhood of Teamsters v.
United States, 431 U.S. 324, 336 (1977). The Teamsters framework charges the plaintiff with the higher
initial burden of establishing “that unlawful discrimination has been a regular procedure or policy
followed by an employer…” Teamsters, 431 U.S. at 360. Upon that showing, it is assumed “that any
particular employment decision, during the period in which the discriminatory policy was in force, was
made in pursuit of that policy" and, therefore, "[t]he [plaintiff] need only show that an alleged individual
discriminatee unsuccessfully applied for a job.” Id. at 362.
I don’t think it would be difficult for a plaintiff to show that an employment practice built on data mining
is a regular procedure or policy. The burden then shifts to “the employer to demonstrate that the
individual applicant was denied an employment opportunity for lawful reasons.” Id. According to the
Second Circuit:
Teamsters sets a high bar for the prima facie case the Government or a class must present in a
pattern-or-practice case: evidence supporting a rebuttable presumption that an employer acted
with the deliberate purpose and intent of discrimination against an entire class. An employer
facing that serious accusation must have a broad opportunity to present in rebuttal any relevant
evidence that shows that it lacked such an intent. United States v. City of New York, 717 F.3d 72,
87 (2d Cir. 2013).
This includes demonstrating that the Government’s proof is either “inaccurate or [statistically]
insignificant.” Teamsters, 431 U.S. at 360.
Plaintiffs will try to show that the model used is so fraught with perilous assumptions that it evidences
discriminatory intent. Defendants will try to show that their business model was founded on purely
neutral data mining techniques. In short, the fight will likely focus on whether the assumptions of the
big data model in use were reasonable ones. As I discuss in my article, 10 Questions: Confronting
Allegations Based on Big Data, the key fight over allegations predicated on big data is about the choices
that go into selecting the data and methodology to test in the first place.
I think we are a long way from a jurisprudence for handling big data claims. In the meantime, employers
would do well to keep pattern-and-practice claims as top of mind as disparate impact claims.
Back to Ms. Miaskoff. The next two (wholly unreported upon) paragraphs of her remarks are
telling. Ms. Miaskoff goes on to say:
Now, in terms of big data, I think this is the rub. This is really what’s very fascinating, that the
first step is to show and look at what is the tool. Now, this would apply to recruitment or to
selection, but probably perhaps more to selection. What is the tool? Does it cause a disparate
impact? And once you get there, just because it causes a disparate impact doesn’t make it illegal
discrimination under the employment laws. It’s only illegal if it does not predict, accurately
predict, success in the job.
So this raises all kinds of fascinating issues with big data analytics because, indeed, if you do
possibly have prejudices built into the data, something might be validated as predicting success
on the job. But it might just be predicting that white guys who went to Yale do well in this job.
So there’s going to be a lot of interesting thought that needs to be done and technology work,
really, around understanding how to validate these kind of concerns.
I think Ms. Miaskoff has put her finger on the very issue I raise in this article: the point of attack is not at
the results, it is at the underlying predictive model and the prejudices built into the data.
10 Questions: Confronting Allegations Based on Big Data
December 21, 2014 Steve Sheinberg Big Data
The big data revolution will require employment lawyers who can get “under the hood” of claims driven
by big data analytics. Here are 10 questions that can help uncover error and bias in the work of data
scientists.
Some Background
Solon Barocas & Andrew D. Selbst argue (in a draft paper called Big Data’s Disparate Impact):
Where data is used predictively to assist decision-making, it can affect the fortunes of whole
classes of people in consistently unfavorable ways. Sorting and selecting for the best or most
profitable candidates means generating a model with winners and losers. If data miners are not
careful, that sorting might create disproportionately adverse results concentrated within
historically disadvantaged groups in ways that look a lot like discrimination.
Put differently by James Grimmelmann in “Discrimination by Database” (a summary of the Barocas &
Selbst piece), “data mining requires human craftwork at every step.” And where there is human
craftwork, bias and bigotry — whether intentional or not — can creep in.
The Ten Questions
(1) What question was asked? Finding out the exact question that was asked of the data can help
understand how the answer was derived from the data. What was being sought and was it the best way
to pose the question?
(2) Who was asking the questions? Was the data scientist tenacious (or creative) enough to pursue and
craft the right questions? Josh Sullivan: “Fundamentally, what sets a great data scientist apart is fierce
curiosity – it’s the X factor. You can teach the math and the analytical tools, but not the tenacity to
experiment and keep working to arrive at the best question – which is virtually never the one you
started out with.” (HBR’s Get the Right Data Scientists Asking the “Wrong” Questions). Put another way
(in Wired): “researchers have the ability to pick whatever statistics confirm their beliefs (or show good
results) … and then ditch the rest. Big-data researchers have the option to stop doing their research
once they have the right result.”
(3) How were the dataset(s) originally created and for what purpose? What omissions, errors
(including discriminatory ideas) might have crept into the initial dataset that might alter the outcome of
the data mining process? Grimmelman: “almost every interesting dataset is tainted by the effects of
past discrimination”
(4) How were the datasets selected? Who selected them? On what basis? What sets were
ignored? What was the initial hypothesis being tested that drove the assembly of the particular
datasets?
(5) Do the datasets accurately represent the population being studied? Barocas and Selbst: ”If a sample
includes a disproportionate representation of a particular class (more or less than its actual incidence in
the overall population), the results of an analysis of that sample may skew in favor or against the over
or under-represented class.”
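The representativeness concern in question (5) lends itself to a simple numeric check: compare each class's share of the sample against its share of a reference population. A minimal sketch follows; the class names, counts, and population shares are hypothetical.

```python
# Sketch of question (5): measure how far each class's share of the sample
# departs from its share of the reference population. All data hypothetical.

def representation_gaps(sample_counts, population_shares):
    """Return, per class, (sample share - population share); positive means
    over-represented in the sample, negative means under-represented."""
    total = sum(sample_counts.values())
    return {
        cls: sample_counts[cls] / total - population_shares[cls]
        for cls in sample_counts
    }

sample = {"class_x": 700, "class_y": 300}
population = {"class_x": 0.5, "class_y": 0.5}
print(representation_gaps(sample, population))
# class_x is over-represented by 0.2; class_y under-represented by 0.2
```

A skewed sample does not by itself condemn the analysis, but a data scientist should be able to explain any gap and how it was handled.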
(6) What data was missing? What data was missing from datasets, either through loss or non-collection, and how was its absence handled by the data scientist? Put differently: “It is important to
understand what variables are more or less likely to be missing, to define a priori an acceptable percent
of missing data for key data elements required for analysis, and to be aware of the efforts an
organization takes to minimize the amount of missing information.” Consider also Ray
Poynter’s perspective:
One of the issues about the scale of Big Data is that it can blind people to what is not being
measured. For example, a project might collect a respondent’s location through every moment
of the day, their online connections, their purchases, and their exposure to advertising. Surely
that is enough to estimate their behavior? Not if their behavior is dependent on things such as
their childhood experiences, their genes, conversations overheard, behavior seen etc.
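The "a priori acceptable percent of missing data" quoted above can be operationalized directly, which gives counsel a concrete question to ask: what ceiling was set, and which fields blew through it? A minimal sketch; the field names, records, and the 5% ceiling are hypothetical.

```python
# Sketch of question (6): set a missingness ceiling in advance and flag any
# key field that exceeds it. Fields, records, and threshold are hypothetical.

MAX_MISSING = 0.05  # acceptable fraction of missing values, fixed a priori

records = [
    {"tenure": 4, "score": 88},
    {"tenure": None, "score": 91},
    {"tenure": 7, "score": None},
    {"tenure": 2, "score": 85},
]

def missingness(records, field):
    """Fraction of records in which `field` is absent or None."""
    return sum(1 for r in records if r.get(field) is None) / len(records)

flags = {f: missingness(records, f) > MAX_MISSING for f in ("tenure", "score")}
print(flags)  # both fields are 25% missing, well over a 5% ceiling
```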
(7) What data was discarded? Managing outlying data, ignoring discrepancies in the data and
discarding data points is a common issue: data scientists should be queried as to why any data points
were discarded to see if there were preconceived notions about the patterns in data and whether the
scientist ignored an important divergence in data values. See Dan Power’s analysis, here.
(8) Were proper proxies used? For instance, we know that zip code is a poor proxy for ability to pay a
mortgage and that a timed run is a bad proxy for being a good firefighter (and that being required to
drag a 125-pound dummy approximately 30 feet along a zigzag course to a designated area in 36 seconds
while crawling probably is).
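Question (8) can also be probed numerically: a candidate feature that correlates strongly with a protected attribute may be acting as a proxy for it even if the attribute itself was excluded from the model. A minimal sketch using a hand-rolled Pearson coefficient; all of the data is hypothetical.

```python
# Sketch of question (8): measure how strongly a candidate feature tracks a
# protected-class indicator. All data below is hypothetical.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

protected = [0, 0, 0, 1, 1, 1]            # hypothetical class indicator
feature = [0.2, 0.3, 0.1, 0.8, 0.9, 0.7]  # hypothetical candidate feature
print(round(pearson(protected, feature), 3))  # near 1.0: a likely proxy
```

A high coefficient is not proof of discrimination, but it is exactly the kind of divergence a lawyer should ask the data scientist to explain.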
(9) Was the model trained properly?
(a) Was the model used to examine the data trained on the correct data
points? Grimmelmann: “to learn who is a good employee, an algorithm needs to train on a
dataset in which a human has flagged employees as ‘good’ or ‘bad,’ but that flagging process in
a very real sense defines what it means to be a ‘good’ employee.”
(b) Was the tested data sufficiently close to the training data? Microsoft: “By using similar
data for training and testing, you can minimize the effects of data discrepancies and better
understand the characteristics of the model.”
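Question (9)(b) can be checked, at least roughly, by comparing summary statistics of the training and test splits; large gaps between the profiles suggest the test data drifted from what the model learned on. A sketch under hypothetical data:

```python
# Sketch of question (9)(b): profile the training and test splits with simple
# summary statistics and compare. The splits below are hypothetical.

from statistics import mean, stdev

train = [3.1, 2.8, 3.3, 3.0, 2.9, 3.2]
test = [3.0, 3.4, 2.7, 3.1]

def profile(xs):
    """Basic distributional summary of one split."""
    return {"mean": mean(xs), "stdev": stdev(xs), "min": min(xs), "max": max(xs)}

for name, split in (("train", train), ("test", test)):
    print(name, profile(split))
# large divergence between the two profiles is a red flag worth probing
```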
(10) What alternative techniques could have been applied to the datasets? Dan Power: “There are
many data mining tools and each tool serves a somewhat different but often complementary
purpose.” Given the availability of multiple products, tools and techniques, the data scientist should
explain what she used and what other results were derived.
Bonus 11th Question: Is the data quality high enough? Is it relevant, complete, correct,
well-structured, sound, and timely?
Relevant: Is the data well-targeted to answer the question? Were there improper
extrapolations (that is, using data collected for one purpose for another purpose)?
Complete: Is all the needed data there? Are there any key gaps?
Correct/Accurate: Is the data accurate in that it captures the real world values it is supposed to
represent (see proxies below)? Is the data granular enough, that is, is it sufficiently detailed to
make the point asserted? Are other factors or details missing that could provide an alternative
explanation of the results? Is there any internally inconsistent data and what explains it?
Well-Structured: Are there any structural ambiguities in the data?
Sound: Is the data trustworthy? Was it rigorously assembled?
Timely: Is the data up-to-date? Does it adequately represent the history of the matter studied
(is there adequate retention)? Are there any gaps in the record?
See Doug Vucevic, Wayne Yaddow, Testing the Data Warehouse Practicum: Assuring Data Content, Data
Structures and Quality.
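The quality dimensions above can be expressed as simple predicate checks, which turns an abstract checklist into questions with testable answers. A sketch; the field names, validity rules, and the one-year timeliness window are hypothetical illustrations, not any standard.

```python
# Sketch: three of the quality dimensions above (complete, correct, timely)
# as predicate checks on a single record. Fields and rules are hypothetical.

from datetime import date

def quality_report(record, today=date(2015, 1, 1)):
    return {
        # Complete: are all the key fields present?
        "complete": all(record.get(k) is not None
                        for k in ("id", "hire_date", "score")),
        # Correct: does the value fall in its expected real-world range?
        "correct": record.get("score") is None or 0 <= record["score"] <= 100,
        # Timely: was the record refreshed within a stated window (1 year)?
        "timely": record.get("as_of") is not None
                  and (today - record["as_of"]).days <= 365,
    }

rec = {"id": 1, "hire_date": date(2010, 6, 1), "score": 88,
       "as_of": date(2014, 9, 1)}
print(quality_report(rec))  # all three checks pass for this record
```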
These eleven items are clearly not the only questions one can or should ask. But they are a starting
point. I welcome suggestions for additional questions.
Big Data: Critical Concerns for Employment Lawyers
December 14, 2014
The prices of data storage and sophisticated analytics are dropping. This will herald a sea-change in how
employment cases are litigated, including both class actions (initially) and individual cases
(eventually). Employment lawyers will need to:
be involved in data governance;
apply basic statistical methods to the results of data mining; and
get deep “under the hood” of the data scientist’s work product.
Some credible estimates are that ninety percent of the world’s data was generated in the last two years
alone. Much of this data is in the hands of employers.
While these pools of data contain incredible opportunities for increasing knowledge and enhancing the
management of organizations, this also means that plaintiffs and defendants, as well as employment-related regulatory agencies, will have access to unprecedented amounts of data that can be mined,
used, interpreted and exploited in increasingly less expensive ways.
What do employment lawyers need to do and know?
First, where possible, lawyers must be involved in data governance. I’ve discussed this extensively in
two related posts: ESI and Data Governance Part 1 and Part 2. As data mining enters employment
litigation, data governance decisions will have an enormous impact on defense costs.
Second, lawyers will need to know enough statistics to unpack claims predicated on data analysis. All
serious data mining is, at base, a combination of statistical and programming work. Accordingly, the old
saying “lies, damned lies and statistics” should be a lawyer’s mantra in this area. Being able to ask the
right questions about a Big Data result will go a long way to dealing with adverse claims, especially as
the defense of such claims potentially involves the expense of experts.
Last, lawyers will need to know enough about how data miners work, including programming, to be able
to be active participants in examining their results, both before and after experts get involved. As we all
know, a data miner (nowadays called a data scientist) doesn’t just plug in questions to get
answers. They pick datasets, design algorithms for statistical pattern recognition and use
those algorithms via software (such as Hadoop) on the selected pools of data. Along the way they make
assumptions about the data sets, impose structure on that data, deal with missing data, make judgment
calls about the questions to ask of the data and interpret results. This is a process ripe for errors. Put
differently, “data mining requires human craftwork at every step.” (See James
Grimmelmann at “Discrimination by Database.”) And where there is human craftwork, bias and
errors — whether intentional or not — can creep in.
In another post, I discuss this third element in detail, outlining the precise places where error and bias
creep into the data mining process. [Ed. Note: It is here]. In the meantime, please take a look
at Solon Barocas’ presentation at an FTC conference “Big Data: A Tool for Inclusion or Exclusion.”
Barocas’ engaging comments start on page 15 of this transcript (.pdf) or at the 15-minute mark of
this video.
ESI and Data Governance: Part 1
December 7, 2014
The Basics*
Enterprises of all sizes store troves of information in many different forms and formats about their
employees — and these data pools will undoubtedly multiply over the coming years. Employment
lawyers will be required to interact with technical personnel to assess preservation and, if necessary,
supervise production.
So, what’s an employment lawyer to do? Turn to data governance!
Let’s look at some basics first, and then, in my next post, we will turn to the heart of this area: data
governance for employment lawyers.
What counts as Electronically Stored Information? Information about your employees (along
with all the other data your company holds) is called Electronically Stored Information (ESI) by
the F.R.Civ.P and is broadly discoverable. I won’t waste your time describing ESI any further or
defending the proposition that courts interpret ESI very liberally. They do (see e.g., Small v.
Univ. Med. Ctr. of S. Nev., Case No.: 2:13-cv-00298-APG-PAL, 2014 U.S. Dist. LEXIS 114406 (D.
Nev. Aug. 18, 2014)).
Variations on a theme: ESI. While it is well known that ESI can be found in many forms, it’s
important to recognize that those forms are multiplying with serious frequency: beyond
documents, databases, and emails, ESI clearly includes work-related texts on a personal phone,
in metadata, on apps, created by apps and on your or third-party servers, in logs, stored on a personal
computer at home or on a local drive at work. Simply put, such data stores will only be increasing as
new technologies enter the workplace.
Discovery and ESI. It is important to point out that Rule 26(b)(2)(B) limits discovery on ESI from
sources not “reasonably accessible,” and provides a court discretion to order discovery and
specify cost-shifting to obtain that discovery. For much more on discovery of ESI, take a look at
the materials prepared by (the oft-cited) Sedona Conference. See here and here.
Data Governance. Data governance is the system of rules and procedures (or lack thereof)
concerning how an enterprise’s ESI is stored, managed, used, merged, erased, and
retrieved. According to the Data Governance Institute, data governance is a “system of decision
rights and accountabilities for information-related processes, executed according to agreed-upon models which describe who can take what actions with what information, and when,
under what circumstances, using what methods.” While data governance is a term not usually
directly associated with employment lawyers, it should be. (After all, one of the core cases in
the e-discovery world was an employment discrimination case.)
In short, when one considers the breadth of ESI and its discoverability, solid data governance is
essential to employment-related risk management.
So, what is an employment lawyer to do about data governance?
The first step, which we will address here, is a big one: fully assessing the situation. Some key questions
that can help frame your investigation:
1. Do you have a data “governor?” In other words, is someone in charge of data governance? Is
there/can there/should there be a cross-functional team that deals with data issues? Are you or
other lawyers on that team or connected to the process?
2. Have you surveyed your situation? Do you know what data is where, what retention policies are
in place and who is in charge of deciding upon and executing those policies?
3. Do you have a data governance strategy, even an informal one?
4. Do you know the probability of risk? How has that data been used (or abused) in the
past? What is your litigation and discovery risk?
5. How are data emergencies (breaches, discovery requests) and data security managed? Who
decides? Are litigation holds designed to work across technologies, such as ensuring data
preservation of text messages?
6. How will you assess the true costs of e-discovery? What is, in good faith, “reasonably accessible”
and what is not?
7. Are employees properly educated in good data stewardship and practices, including how and
when to mark documents as privileged?
8. Are you monitoring the efficacy of your controls?
9. Is someone able to report on data governance and its efficacy as an F.R.Civ.P 30(b)(6) witness?
10. How will you ensure that your data governance plan is up-to-date and encompasses new and
emerging technologies? For instance, as encryption becomes more “popular” (Google and
Apple have released phones with “full-device” encryption), what rules will you issue to employees?
11. How would you measure a “successful” data governance plan?
In short, you need to Assess, Plan and regularly Reassess.
ESI and Data Governance: Part 2
December 7, 2014
Data Governance: Controls and Culture
As discussed, ESI is an increasingly understood and litigated issue. What requires much more thought by
employment lawyers is data governance. Data governance is the system of rules and procedure that
relate to how an enterprise’s ESI is stored, managed, used, merged, deleted, and transferred.
According to the Data Governance Institute, data governance is a “system of decision rights and
accountabilities for information-related processes, executed according to agreed-upon models which
describe who can take what actions with what information, and when, under what circumstances, using
what methods.”
While data governance is a term not usually directly associated with employment lawyers, it should
be. (Recall that one of the core cases in the e-discovery world was an employment discrimination case).
Moreover, data governance should be both a set of controls and a cultural norm at
organizations. Organizations are too data-driven and too dependent on technology to allow the
storage, management, use, retrieval, elimination and exfiltration of data to be left to anything short of a
fundamental shift in the way that lawyers, technology personnel and executives interact with each other
and with the organization’s data.
Which brings me back to employment lawyers: since that data is often rife with employee-related
information and will most likely be used in employment-related litigation, employment counsel must
ensure that good data governance practices are in place.
The first step in data governance for employment lawyers is a big one: fully assessing the situation.
Some key questions* can help frame your investigation:
1. Do you have a data “governor?” In other words, is someone in charge of data governance? Is
there/can there/should there be a cross-functional team that deals with data issues? Are you or
other lawyers on that team or connected to the process?
2. Have you surveyed your situation? Do you know what data is where, what retention policies are
in place and who is in charge of deciding upon and executing those policies?
3. Do you have a data governance strategy, even an informal one?
4. Do you know the probability of risk? How has that data been used (or abused) in the
past? What is your litigation and discovery risk?
5. How are data emergencies (breaches, discovery requests) and data security managed? Who
decides? Are litigation holds designed to work across technologies, such as ensuring data
preservation of text messages?
6. How will you assess the true costs of e-discovery? What is, in good faith, “reasonably accessible”
and what is not?
7. How is information usage/access logged, by whom and where are the logs kept?
8. Are employees properly educated in good data stewardship and practices, including how and
when to mark documents as privileged?
9. Are you monitoring the efficacy of your controls?
10. Is someone able to report on data governance and its efficacy as an F.R.Civ.P 30(b)(6) witness?
11. How will you ensure that your data governance plan is up-to-date and encompasses new and
emerging technologies? For instance, as encryption becomes more “popular” (Google and
Apple have released phones with “full-device” encryption), what rules will you issue to
employees?
12. How would you measure a “successful” data governance plan?
13. Is there a way to ensure that data governance becomes a cultural rather than a mere rule-based
set of norms for the enterprise? Are leaders willing to lead on the issue of the importance of
data governance? Will the company create a culture of data awareness and data integrity?
In short, you need to Assess, Plan and regularly Reassess. And finally, you need to build: build a
compliance scheme and build a leadership culture and environment where data awareness and integrity
are at the heart of the enterprise’s culture.
*- Citations and very interesting reads: these questions are adapted from, but do not
exactly mirror, IBM’s Data Governance Blueprint: Leveraging Best Practices and Proven
Technologies (pdf) and Gartner’s How to Measure Success with Information Governance? See also the
American Health Information Management Association’s E- Discovery Litigation and Regulatory
Investigation Response Planning: Crucial Components of Your Organization’s Information and Data
Governance Processes.
Steve Sheinberg is the General Counsel of the Anti-Defamation League and its
supporting foundation. He is also author of workplacetechlaw.com, a blog that
explored the intersection of emerging technology and employment law.
It is very important to note that all of the opinions and analysis in this blog are
my own and may not reflect the views of my employer.
Steve can be reached at [email protected].
© 2015 Steve Sheinberg