
Cybersecurity and Data Privacy review and update: Looking back on 2019 and planning ahead for 2020


The reality of cybersecurity and data privacy threats and regulations proved a sobering one for many companies in 2019. The cost of data privacy began to hit home in January, with French regulators imposing a record fine for General Data Protection Regulation (GDPR) violations, and with subsequent global data privacy fines increasing in amount and frequency throughout the year. As the year progressed, regulators did not relent, presenting a flurry of comprehensive and demanding data privacy and cybersecurity regulations, including the California Consumer Privacy Act (CCPA).

The year also saw companies and governments alike confronting escalating cybersecurity threats, especially as geopolitical tensions continued to rise.

As with the start of 2019, the start of 2020 is proving to be no respite. Quite the opposite. The CCPA went into effect on New Year’s Day, and geopolitical tensions increased dramatically in the first week of January. Travelex, CES, and even the Texas Department of Agriculture fielded some of the first cyberattacks of the year, while regulators such as the New York Department of Financial Services (NYDFS) are directing companies to urgently address outstanding security issues and test updated disaster recovery plans in light of growing cyber threats.

Amidst the escalating cybersecurity threats of 2020, companies still wait expectantly for clarity on state privacy regulations, including the CCPA, and for news about additional privacy laws that states may pass, particularly in New York and Washington. Companies are also girding themselves for a dramatic rise in corresponding litigation, especially with the CCPA’s new private right of action. Furthermore, as the pace and capacity of technology continues to demand deeper integration of third-party supporting technologies and data sets, and as consumers increasingly access and leverage tools for visibility into the management of their private data, companies will increasingly face challenges to data integrity and data management. Data manipulation attacks, including deepfake photographs and videos, are among the next-generation threats to data integrity that we expect to see more of in 2020. Finally, untangling data to make it available, or to delete it, in light of new privacy laws will prove among the most challenging issues companies face this year and in years to come.

The past year brought some good news, however: companies are proving themselves far more resilient against cyberattacks, and they are realizing that prior privacy compliance efforts can be leveraged to reduce the burden of complying with new and forthcoming privacy laws.

Ultimately, as we embark on a new year and a new decade, it has never been more important to ensure that companies have a comprehensive, proactive, risk-based and well-practiced cyber and privacy strategy. Staying on top of all the rapid changes in this field is a challenge, but one that must be met. We hope this compilation helps. We also publish Updata, a global quarterly round-up of cyber and privacy developments from around the world, and we maintain a cybersecurity and privacy insights blog, as well as a separate website devoted to the CCPA.

Most importantly, we wish you all the best in 2020!

Did you know?

– $6 trillion: estimated global annual cybersecurity damages by 2021

– 1/2 of cyberattacks: geared toward small businesses, which on average spend less than $500 annually on cybersecurity

– $1 trillion: estimated global five-year cybersecurity spending through 2021

– $2 trillion: estimated global cybersecurity losses in 2019

– $300 billion: estimated worth of the cybersecurity market by 2024

– 146 billion: estimated number of records stolen by 2023

To best prepare for the future, one must understand the past. So here we present our year-end compilation of the most pressing 2019 cybersecurity and data privacy updates, alerts and analyses.


2019 Key highlights

Data Privacy and Cybersecurity

International

– Article: In the spotlight: Cyber resilience and risks around outsourcing Published by The Lawyer

– Article: Lawyers at the Vanguard: The Wisdom of Involving Lawyers at the Innovative Design Phases, and the Obligations on Those Lawyers Published by FinTech Law Report

– Legal Alert: January’s Privacy Blizzard

United States (general)

– Article: The state of US data privacy and cybersecurity laws in 2019

– Article: Legislative heat wave: A mid-year review of upcoming cybersecurity laws and enforcement activity Published by Cybersecurity Law & Strategy

– Article: Know Your Tech Published by Cybersecurity Law & Strategy

– Legal Alert: Newfound consumer privacy focus could transform debate over expiring US surveillance authorities

– Legal Alert: FTC effectively shuts down Utah company’s operations pending compliance with mandated data security plan

– Legal Alert: Decoding Regulation S-P – What noncompliance looks like and what it will cost you

– Article: Securing retirement: 401(k) plan cybersecurity Published by SHRM

– Article: Insuring Against a Data Breach Published by Construction Executive

United States (state specific)

– Legal Alert: The California Attorney General’s proposed regulations on the California Consumer Privacy Act – a helpful roadmap on how to comply?

– Legal Alert: Dangerous seas ahead - the California Consumer Privacy Act and litigation risk

– Legal Alert: New CCPA amendments bring clarity prior to the January 1, 2020 deadline

– Article: The New Vendor Management World Under NYDFS’ New Cyber Regulation Published by LegalTech News

Cryptocurrency

– Article: Federal enforcement trends in the cryptocurrency sphere

– Legal Alert: Cryptocurrency enforcement on the upswing—Texas cryptocurrency issuers agree to pay over $10 million to the SEC

– Legal Alert: IRS provides long-awaited cryptocurrency guidance

– Legal Alert: The SEC marches on - Cryptocurrency startup Block.one pays civil penalty of $24 million over unregistered initial coin offering

– Article: New IRS Tax Guidance Targets Crypto, and US Persons Who Use It Published by Cointelegraph

Cross-border data transfers

– Legal Alert: The CLOUD Act – A cross-border data access agreement rises from the fog

Biometrics

– Legal Alert: Passing the eye test—Defense strategies and the Biometric Information Privacy Act

– Legal Alert: Biometrics beware – Compliance and the Biometric Information Privacy Act

– Legal Alert: The floodgates open – Illinois Supreme Court issues landmark ruling in biometrics case

– Article: Illinois Courts Continue To Interpret BIPA Broadly Published by Law 360


Data Privacy and Cybersecurity

Article: In the spotlight: Cyber resilience and risks around outsourcing

November 2019

The operational resilience of financial institutions has come under increased scrutiny following a number of recent high-profile IT failures and cyberattacks. Operational resilience itself is much broader than merely IT and cyber: it covers events ranging from natural disasters to civil unrest, and those impacting critical national and market infrastructure.

In this article for The Lawyer, Eversheds Sutherland attorneys Michael Bahar, Jake McQuitty, Craig Rogers and David Cook focus on cyber resilience – and in particular, the risks associated with outsourcing business-critical IT operations and middle and back-office services.

Full article

Authors

Michael Bahar | +1 202 383 0882 | Email

Jake McQuitty | +44 207 919 0600 | Email

Craig Rogers | +44 20 7919 0707 | Email

David Cook | +44 161 831 8144 | Email


Article: Lawyers at the Vanguard: The Wisdom of Involving Lawyers at the Innovative Design Phases, and the Obligations on Those Lawyers

March/April 2019

In the gold rush environment of fintech, only fools rush in without their tech-savvy lawyers.

While the financial services landscape is already heavily dotted with regulation, privacy and cybersecurity regulations are adding significant navigational hazards, putting a premium on taking the time to understand and accommodate these regulations in the design and adoption phases, as opposed to in the far more costly remediation phases. Done right, fully understanding the tech and its interplay with regulations can help with “future-proofing” as well, more than making up for any “lost” time spent taking a deliberate approach to fintech.

Given this, in-house and external counsel should increasingly be in the room where it happens, as it’s happening—rather than being asked to conduct a review right before rollout, when most of the significant decisions have been set and are hard to unwind.

In this article for FinTech Law Report, Eversheds Sutherland attorney Michael Bahar discusses why lawyers need to be sufficiently comfortable talking tech and working with their clients to help them navigate their fintech innovations to operational, commercial, and legal success.

Full article

Author

Michael Bahar | +1 202 383 0882 | Email


Legal Alert: January’s Privacy Blizzard

February 4, 2019

In collaboration with Eversheds Sutherland partner Paula Barrett.

As predicted, the start of 2019 provided scant respite from the frenetic pace of privacy and cybersecurity developments during 2018. This past month alone, in a blizzard of activity, regulators amended regulations and enforced substantial fines under existing regulations; courts issued significant interpretations of current law; and legislatures proposed new laws aimed at increasing privacy obligations and potential liability. The common theme is that from all corners—hackers, regulators, legislators and plaintiffs—the burden on companies to protect sensitive information is only growing, as are the obligations on companies to provide greater transparency and greater rights to customers and employees on their data collection and sharing practices.

This Alert highlights the most pressing cybersecurity and data privacy updates from the month of January, including the French fine on Google; the US federal court decisions rejecting the Yahoo! data breach settlement and largely granting CareFirst’s motion to dismiss; the Illinois Supreme Court’s decision allowing a class action alleging technical violations of a biometric statute to go forward; Massachusetts’ revised breach law; the National Futures Association’s revised cybersecurity guidance; and Washington State’s introduction of a General Data Protection Regulation (GDPR)-like privacy bill.

Google Faces First Major Fine Under the GDPR and a US Judge Rejects Yahoo! Settlement

The CNIL, France’s data protection regulator, opened 2019 with a bang by handing down a $57 million fine against Google, by far the largest penalty issued under the GDPR thus far. Previous fines under the law had not crossed the $1 million mark. The CNIL found that Google violated the GDPR because the tech giant failed to provide enough information to consumers regarding its data collection practices and failed to obtain valid consent for personalized advertisements.

While Google plans on appealing the decision, the CNIL’s action reinforces some cautionary points. First, transparency and coherence are vital. For all companies, it will be important to ensure that privacy disclosures are aggregated in one place and are clear on what the company is doing with an individual’s data, how it is collecting and sharing that data, and how long it is retaining it, and that the privacy policy is presented in a way that is easy to read and readily accessible to the individual. US courts are also willing to take strong action against companies that they deem insufficiently transparent. This month, US District Court Judge Lucy Koh rejected a proposed $50 million settlement over Yahoo!’s data breach, citing in part the egregious nature of the company’s “history of nondisclosure and lack of transparency related to the data breaches.”

Returning to the GDPR, the emphasis on transparency can play out in privacy policies. For example, companies often create policies that make blanket declarations of their lawful bases for gathering personal data (e.g., “we collect your information on the bases of our legitimate interest and the necessity of fulfilling a contract”). That approach may not be sufficiently transparent, as opposed to listing the specific categories of data collected and stating the specific lawful bases.

This clarity theme is also relevant to consent. The Google decision emphasized the need for the individual to take a positive step to evidence consent (no opt-outs or implied consents), and for the consent to have clarity on what consent is actually being requested.

Second, the CNIL decision indicates that it will be important for those companies with EU establishments to make clear their “main establishment” in the EU. In addition, for this designation to be recognized, there needs to be decision making about the relevant data processing occurring in that location. If all the decision making still happens outside the EU, or in a different EU country, the supervisory authorities can take the view that it is not the main establishment and lead supervisory authority, and the company could therefore lose the benefit of the “one stop shop.” With many international groups having small presences in the EU—perhaps driven by tax considerations— and all the decision making still happening back at corporate HQ outside the EU, that will cause concern. The CNIL felt it could take its action because Google had not sufficiently established that Ireland was its main establishment for the particular processing concerned. This also raises the point that some organizations may have different main establishments for particular types of processing.

Third, this decision shows that regulators are willing to back up the GDPR with substantial fines. The Yahoo! decision also indicates the long-term costs of non-compliance. Put another way, if companies think the costs of compliance are expensive, regulators and courts are basically saying: try non-compliance.

US District Court Dismisses Vast Majority of Claims against CareFirst

US courts in January have shown that there are limits to private suits against companies that have suffered a breach. On January 29, DC District Court Judge Christopher R. Cooper, in the CareFirst class action, agreed with Eversheds Sutherland attorneys Matt Gatewood and Robert Owen when he dismissed the vast majority of claims against the health insurer. Judge Cooper originally dismissed the suit on constitutional standing grounds in 2016, but the US Court of Appeals for the DC Circuit reversed, ruling that the policyholders had “cleared the low bar to establish their standing at the pleading stage” by asserting there was a substantial risk that their stolen personal information could be used “for ill” purposes, such as identity theft, even though it had yet to be misused. The US Supreme Court declined to take the case. On remand, Judge Cooper held that while plaintiffs’ “alleged injuries may be enough to establish standing at the pleading stage of the case, they are largely insufficient to satisfy the ‘actual damages’ element of nine of their state-law causes of action.” Judge Cooper concluded that the Complaint’s allegations of future risk of identity theft, loss of the benefit of the bargain, prophylactic purchase of credit monitoring, and emotional distress were not enough to clear the requirement that actual damages be stated.

Illinois Supreme Court Finds that Plaintiffs Need Not Show Actual Harm in Biometrics Cases

In Illinois, January was plaintiffs’ month. In a unanimous decision on January 25, 2019, the Illinois Supreme Court found that a plaintiff need not show actual harm to seek relief under the state’s Biometric Information Privacy Act (BIPA). Instead, the court held that a procedural harm is sufficient to bring a claim under the law. Eversheds Sutherland has analyzed the decision and its impact.

Businesses across the spectrum of sectors are seeking to adopt biometrics, both for the consumer-facing parts of their businesses and for their employees. Ironically, many are doing so to further protect data. However, biometrics are also being adopted in physical location access controls, time and performance management systems, and myriad other ways to verify the identity of staff for purposes other than security. Such uses are becoming more challenging to adopt lawfully in a growing number of countries, and not just where the jurisdictional reach of the GDPR comes into play. This case is interesting not just as it pertains to BIPA, but also because it is not clear how Article 79 of the GDPR and its right to a judicial remedy for a breach will play out in the courts, alongside the Article 82 right to compensation.

Massachusetts Updates its Data Breach Notification Law

On January 10, Massachusetts Governor Charlie Baker signed a bill amending the state’s data breach notification requirement. Set to go into effect on April 11, 2019, the amendments will require:

– Companies that suffer data breaches involving Social Security numbers to provide credit monitoring services to affected consumers free of charge for 18 months (42 months if the company that suffered the breach is a credit monitoring agency);

– Post-breach notice sent to the Massachusetts Attorney General and the state’s Office of Consumer Affairs and Business Regulation (OCABR) to include whether the company has a written information security program in place and the type of information compromised; and

– Corporations providing post-breach notification to consumers to identify any parent or affiliated corporations.

1 https://ncdoj.gov/CMSPages/GetFile.aspx?nodeguid=89988b8d-2bbe-4854-bc7f-a77cfc4b38b2&lang=en-US.

The newly amended law will also require the OCABR to post information about the breach on its website, including a copy of the notice sent to consumers regarding the breach and instructions on how consumers could get access to the notice sent to the OCABR and the state attorney general.

These amendments will make Massachusetts’ data breach notification law one of the most onerous in the country. Massachusetts will join California, Delaware and Connecticut as only the fourth state to require companies to provide credit monitoring services to consumers after a data breach. As seen in the past with data breach laws, other states may follow Massachusetts’ lead and pass laws with similar or more stringent requirements. North Carolina’s Attorney General, for example, has proposed an amendment to the state’s data breach notification requirement that would expand the definition of a security breach to include ransomware attacks and require credit reporting agencies that suffer a data breach to provide five years of free credit monitoring to affected consumers.1

For the latest state and foreign breach jurisdiction requirements, please visit our app.

NFA Amends Interpretive Notice Regarding Cybersecurity Information Systems Security Programs

In addition to the numerous legislatures amending or considering amending their data breach notification requirements, the National Futures Association (NFA)—the self-regulatory organization for the US derivatives industry—recently amended its interpretive notice regarding cybersecurity. The amendments, set to go into effect on April 1, 2019, will update the current Information Systems Security Program (ISSP) requirements that NFA members have to comply with in three important ways. First, the amendments require members to provide cybersecurity training to employees at least annually and more frequently if warranted, in addition to training upon hiring as is currently required. The amendment also requires that NFA members identify specific topic areas that they will cover in their cybersecurity trainings.

Second, the amendment clarifies the appropriate officers that may approve the ISSPs. While currently the NFA member’s CEO, CTO or “other executive level official” has the authority to approve ISSPs, the amendment deletes the term “executive official” and replaces it with “senior level officer with primary responsibility for information security or other senior official who is a listed principal and has the authority to supervise the NFA Member’s execution of its ISSP.”


Finally, and perhaps most importantly, the amendment to the interpretive notice requires members to notify the NFA of a security incident in certain circumstances. Currently, NFA members are not required to report incidents to the NFA. But once the amendment goes into effect on April 1, 2019, if an NFA member (other than futures commission merchants for which the NFA is not the designated self-regulatory organization) suffers a cybersecurity incident that results in a loss of customer or counterparty funds or of a member firm’s capital, or if the member otherwise has to notify its customers or counterparties of a cybersecurity breach pursuant to state or federal law, then that member must also notify the NFA. Before the amendment’s April 1 implementation date, the NFA plans to release guidelines on the manner in which members must notify the NFA after a cybersecurity incident.

Read more about the latest NFA amendment.

Washington State Proposes its own Version of the CCPA

With the California Attorney General currently hosting public hearings regarding rulemaking required under the California Consumer Privacy Act (CCPA), the state of Washington is considering similar comprehensive data privacy legislation. A bill has been introduced that mirrors the CCPA in many ways, which in turn mirrors the GDPR, and provides individual data privacy rights to consumers—such as the right to be notified that their personal information is being collected, the right to opt out of the sale of their personal information for marketing purposes, and the right to be forgotten.

Penalties under Washington’s bill would be capped at $7,500, like the CCPA, though the former is only enforceable by the state attorney general, while the CCPA also has a limited private right of action. The state legislature recently held a hearing on the bill. Should it gain traction and pass into law in its current form, it would go into effect on December 31, 2020.

While the Washington law may change significantly as it goes through the legislative process, and may not even pass, the message is nonetheless clear. More jurisdictions, both in the US and abroad, are continuing the trend sparked by Europe’s GDPR to require companies to provide enhanced transparency into their data collection, retention and sharing practices and to enhance the data privacy rights of individuals.

Authors

Michael Bahar | +1 202 383 0882 | Email

Paula Barrett | +44 207 919 4634 | Email

Matt Gatewood | +1 202 383 0122 | Email

Robert Owen | +1 212 389 5090 | Email

Mary Jane Wilson-Bilik | +1 202 383 0660 | Email

Related attorney

Michael Steinig | +1 202 383 0804 | Email


The state of US data privacy and cybersecurity laws in 2019

December 2019

In collaboration with Eversheds Sutherland attorney Paula Barrett.

US data privacy and cybersecurity laws developed rapidly in 2019, especially in California, New York, Nevada and Massachusetts. While there are ongoing efforts to pass a federal privacy law, states are leading the way.

– US legislators and regulators have passed—or are looking to pass—privacy standards with enhanced disclosure and rights obligations, as well as stepped-up cybersecurity requirements and breach reporting timelines.

– The definition of personal information also continues to expand beyond traditional US notions of personally identifiable information.

– Many of these laws have extraterritorial reach, and compliance with one set of rules, including the EU’s GDPR, is no guarantee of compliance with another.

Authors

Michael Bahar | +1 202 383 0882 | Email

Sarah Paul | +1 212 301 6587 | Email

Paula Barrett | +44 207 919 4634 | Email

Mary Jane Wilson-Bilik | +1 202 383 0660 | Email


Article: Legislative heat wave: A mid-year review of upcoming cybersecurity laws and enforcement activity

August 2, 2019

While legislation to enhance data privacy rights and obligations continues to make headlines, regulators and legislators are also stepping up their cybersecurity expectations.

In their article for Cybersecurity Law & Strategy, Eversheds Sutherland attorneys Michael Bahar, Mary Jane Wilson-Bilik and Sarah Paul review a number of states that have updated their existing data breach notification laws and passed new cybersecurity requirements in the first half of 2019.

Full article

Authors

Michael Bahar | +1 202 383 0882 | Email

Sarah Paul | +1 212 301 6587 | Email

Mary Jane Wilson-Bilik | +1 202 383 0660 | Email


Article: Know Your Tech

March 2019

Technologies are often pitched as solutions, if not game-changing solutions. Indeed, many times they are, but no solution comes without the seeds of its own costs and challenges. For pragmatic and regulatory compliance reasons, it is increasingly important for boards, senior executives and general counsel to sufficiently understand technologies such as blockchain, artificial intelligence (AI) and integrated “smart” components to recognize their potential risks and costs, not just their potential promise.

In this article for Cybersecurity Law & Strategy, Eversheds Sutherland attorney Michael Bahar discusses why it is more important than ever to know your tech.

Full article

Author

Michael Bahar | +1 202 383 0882 | Email


Legal Alert: FTC effectively shuts down Utah company’s operations pending compliance with mandated data security plan

November 19, 2019

1 All reviews with a mandated frequency must also take place following any incident that puts personal information at risk.

With companies increasingly worried about what the California Attorney General, and private litigants, will do once the California Consumer Privacy Act comes into effect, they should not lose sight of what the Federal Trade Commission (FTC) is already doing.

On November 12, 2019, the FTC announced that InfoTrax Systems L.C. (InfoTrax), a Utah technology company, and Mark Rawlins, its founder, agreed to implement a comprehensive data security program as part of a 20-year consent agreement to settle an FTC complaint in the wake of a two-year breach of the company’s network. The FTC alleged that InfoTrax, whose clients are primarily multilevel marketers, failed to use reasonable, low-cost and readily available security protections to safeguard the personal information that it maintained on its clients’ behalf. As part of the proposed settlement, InfoTrax and Rawlins are prohibited from collecting, selling, sharing, or storing personal information pending the overhaul of the company’s security operations—in effect, temporarily shutting down the company’s operations.

The settlement re-emphasizes the FTC’s determination to enforce cybersecurity standards, regardless of what state regulators are doing, and the settlement sheds further light on what regulators and courts consider “reasonable” cybersecurity practices.

Furthermore, it puts companies on notice that enforcement actions may target the data-protection practices of companies whose clients are other businesses, not just companies who deal with individual consumers; business operations may be severely interrupted until operations are compliant with an FTC settlement; and compliance with an FTC-mandated information security plan and accompanying certification will likely be more costly than voluntary safeguards to sensitive data.

InfoTrax provides back-end operations including compensation, accounting, data security, and operation of its clients’ web portals. The FTC alleges that from 2014 to 2016, a hacker infiltrated InfoTrax’s servers more than 20 times, and in March 2016 the hacker accessed about one million consumers’ personal information. The FTC defines personal information as individually identifiable information from or about an individual consumer including, but not limited to, names, addresses, Social Security numbers, and credit/debit card information. The FTC’s complaint against InfoTrax and Rawlins alleged violations of § 5(a) of the Federal Trade Commission Act, 15 U.S.C. § 45(a).

Under the terms of the proposed settlement, InfoTrax and Rawlins are required to document and implement a comprehensive data security program (the Information Security Program) to protect the “security, confidentiality, and integrity” of stored personal information. The proposed settlement mandates that the Information Security Program safeguards must, in addition to other requirements, include:

– Policies, procedures, and technical measures to inventory and delete personal data that is no longer needed;

– Annual code review and software penetration testing;

– Technical measures to detect and limit unknown file uploads and anomalous activity that may attempt to exfiltrate personal information outside the company’s network boundaries;

– Annual assessment, testing, and monitoring of the safeguards’ effectiveness;

– Network vulnerability testing every four months and annual penetration testing;

– Annual review of the Information Security Program and modification based on the results of that review; and

– Annual assessment and documentation of internal and external risks to personal information.1

In addition to requiring InfoTrax to conduct its own internal monitoring, the proposed settlement requires InfoTrax to obtain independent assessments from a third party, selection of which is subject to the FTC’s approval, for the initial 180-day period after the FTC order and then biennially for 20 years. The third-party report must: (1) determine whether the plan has been implemented and maintained as required; and (2) identify gaps or weaknesses in the plan.

Furthering the trend toward senior officer accountability for cybersecurity, the proposed settlement requires annual certification by a senior corporate manager that the requirements of the proposed settlement have been established, implemented and maintained, and that there is no known, material noncompliance that has not been corrected or disclosed to the FTC.

Finally, the FTC’s settlement acknowledges what privacy laws like the California Consumer Privacy Act imply, and what both the New York State Department of Financial Services’ 23 NYCRR Part 500.11 (which requires third-party service providers to comply with cybersecurity practices and to certify compliance by March 2020) and Europe’s General Data Protection Regulation (GDPR) recognize: that companies are responsible for personal information they pass to other companies. The settlement requires InfoTrax to retain service providers capable of safeguarding personal information, underscoring for all companies the importance of vendor due diligence and diligent contracting.

The FTC voted 5-0 in favor of the settlement, with one commissioner writing a concurring opinion to question whether the order’s 20-year timeline is too long, given both the burden to the company and the dynamic nature of the technology industry.

Authors

Michael Bahar | +1 202 383 0882 | Email

Sarah Paul | +1 212 301 6587 | Email

Mary Jane Wilson-Bilik | +1 202 383 0660 | Email

Andrew Weiner* | +1 212 301 6602 | Email

*Andrew is not admitted to practice. Application submitted to the New York State Bar.


Legal Alert: Decoding Regulation S-P – What noncompliance looks like and what it will cost you

April 22, 2019

1 https://www.sec.gov/files/OCIE-Risk-Alert-Regulation_S-P.pdf.

As every investment adviser, broker-dealer, and fund (and their lawyer) knows, noncompliance with Regulation S-P, the SEC’s primary rule on privacy notices and safeguard policies, can land a registrant in hot and expensive water. What noncompliance looks like, however, has not always been clear. On April 16, 2019, the staff of the SEC’s Office of Compliance Inspections and Examinations (OCIE) released a Risk Alert setting forth common Regulation S-P compliance issues observed over the last two years of exams.1

Deficiencies

– Privacy and Opt-Out Notice Deficiencies:

Common privacy and opt-out notice deficiencies observed by OCIE included the failure to provide timely and accurate privacy notices to customers, including privacy notices that failed to inform customers that they could opt out of the sharing of their nonpublic personal information.

– Policy and Procedure Deficiencies:

OCIE observed failures to design comprehensive policies and procedures related to Rule 30(a) of Regulation S-P (the Safeguards Rule), which requires firms to adopt written policies and procedures reasonably designed to safeguard customer records and information. OCIE noted that merely restating the Safeguards Rule is insufficient; the policies and procedures must include actual administrative, technical, and physical safeguards. OCIE went on to provide specific observations relating to failures to implement policies, or to design those policies to safeguard customer information. From this list, OCIE has provided firms with tangible steps to safeguard customer information:

» Design and implement policies and procedures to safeguard customer information on personal laptops, prevent employees from sending unencrypted emails containing personally identifiable information (PII), and prohibit employees from sending customer PII to unsecure outside networks. Relatedly, design and implement procedures to safeguard hard copy customer information (e.g., lock file cabinets in open offices).

» Train employees on the methods used to protect customer information and monitor whether these safeguards are being followed.

» Review third parties’ handling of customer information and ensure adequate protection. If the firm’s policies and procedures require outside vendors to contractually agree to keep customer PII confidential, firms should require the outside vendors to sign those contracts.

» Inventory customer PII and where it is kept. Firms can’t keep it safe if they don’t know where it is.

» Cut off departing employees’ access to customer information before they depart.

» Limit employee access to required customer information; only share customer login credentials as needed and permitted by policy.

» Design and implement an incident response plan. This plan needs to address: (1) the assignment of roles for implementing the plan; (2) how to address a cybersecurity incident; and (3) an assessment of system vulnerabilities.

– Other Issues That Firms May Want to Address:

Although not discussed in the Risk Alert, we have observed that the SEC has been looking at two issues relating to the Broker Protocol and compliance with Regulation S-P: (1) how firms that are members of the Broker Protocol disclose in their privacy notices that firms or departing representatives may provide PII to the departing representatives’ new firms; and (2) how firms track whether customers opt out of the disclosure of such information.

Enforcement Implications

The Risk Alert does not provide information on what noncompliance with Regulation S-P can cost a firm. However, the below list provides examples of enforcement actions against firms that failed to comply with Regulation S-P:

– The SEC charged a dually registered broker-dealer and investment adviser with violations of Regulation S-P, specifically the Safeguards Rule and the Identity Theft Red Flags Rule. The firm paid a $1,000,000 penalty to settle charges relating to cybersecurity failures that resulted in cyber-intruders gaining access to customer information. Although the firm had policies and procedures to address cybersecurity threats, the SEC found that the firm did not adequately enforce them.

– The SEC settled an action against a dually registered firm for a $1,000,000 penalty related to the firm’s failure to adopt written policies and procedures to protect customer data, allowing a then-employee to transfer customer data to his personal server, which was subsequently hacked by a third party. Specifically, the firm did not restrict employee access to customer information based on an employee’s legitimate business need.


– The SEC settled an action for $75,000 against an investment adviser that failed to adopt written policies and procedures to protect customer information when it stored client information on a third-party hosted web server that was ultimately victim of a cybersecurity attack.

– FINRA fined a broker-dealer $225,000 for Regulation S-P violations after an unencrypted laptop with customer information was lost. Despite the fact that there was no evidence the customer information was accessed, FINRA focused on the firm’s violation of its own policies and procedures to encrypt laptop computers that contain confidential customer information.

– FINRA fined one broker-dealer $175,000 for Regulation S-P violations after finding that it failed to safeguard customer information when it failed to configure firewall protections and used ineffective username and password systems.

As regulators’ interest in Regulation S-P increases, firms should consider reviewing their privacy notice process to confirm that they provide timely, accurate, and comprehensive privacy notices to their customers. In addition, firms may want to review and update existing data security and compliance policies and procedures to avoid the missteps described above. If firms fail to heed the messages contained in OCIE’s Risk Alert, they may find themselves receiving a “notice” that they are the subject of an enforcement action.

Authors

Brian Rubin | +1 202 383 0124 | Email

Larry Polk | +1 404 853 8225 | Email

Amanda Giffin | +1 404 853 8061 | Email

Related Attorneys

Eric Arnold | +1 202 383 0741 | Email

Olga Greenberg | +1 404 853 8274 | Email

Cliff Kirsch | +1 212 389 5052 | Email

Mary Jane Wilson-Bilik | +1 202 383 0660 | Email

Al Sand | +1 212 287 7019 | Email


Article: Securing retirement: 401(k) plan cybersecurity

August 27, 2019

Retirement plan participants are becoming increasingly reliant on online platforms, including mobile phone apps, to access and monitor their 401(k) plan accounts. At the same time, these types of online platforms are increasingly susceptible to data breaches and sophisticated fraud schemes.

In their article for SHRM, Eversheds Sutherland attorneys Brenna Clark and Brittany Edwards-Franklin discuss how, given the vast amounts of money in US 401(k) plan accounts, a successful, large-scale attack on retirement plans seems almost inevitable.

Authors

Brenna Clark | +1 404 853 8027 | Email

Brittany Edwards-Franklin | +1 404 853 8130 | Email


Article: Insuring Against a Data Breach

July 20, 2019

Although hacking attempts may be most commonly directed at financial and health institutions housing troves of financial and personal data, the construction industry is not immune from the risk of a data breach. Like most industries, the construction industry continues to advance in technological innovation—projects are becoming increasingly dependent on mobile connectivity and there is a growing reliance on cloud-based storage and sharing services.

In their article for Construction Executive, Eversheds Sutherland attorneys Jesse Lincoln and Margaret Flatt discuss how such innovation comes with an increase in possible cyber-attacks and data breaches.


Authors

Jesse Lincoln | +1 404 853 8211 | Email

Margaret Flatt O’Brien | +1 404 853 8070 | Email


Legal Alert: The California Attorney General’s proposed regulations on the California Consumer Privacy Act – a helpful roadmap on how to comply?

October 17, 2019

In collaboration with Eversheds Sutherland attorney Paula Barrett

On October 11, 2019, the California Attorney General issued long-awaited draft Regulations to the California Consumer Privacy Act (CCPA). The draft Regulations provide helpful clarity on some core aspects of California’s sweeping new privacy law, while also adding significantly to the complexity of, and the list of, requirements on a business to protect consumer personal data. In some instances, these regulations go beyond the requirements of the European Union’s General Data Protection Regulation (GDPR) and, in some cases, beyond the likely contemplation of even the well prepared.

This alert does not summarize the entirety of the draft Regulations; rather, it points out and analyzes some of the key highlights. The Regulations are still a draft, so they may change; but with the CCPA set to go into effect in only a few months, businesses should look to them as an invaluable compliance roadmap.

First, the draft Regulations clarify that a company can be both a business and a service provider, depending on its relation to the consumer and the data collected (999.314). This clarification is highly significant, and it helps bring the CCPA more in line with the European GDPR’s critical distinction between controllers and processors. In essence, with the draft Regulations, the CCPA would recognize that service providers need not assume the responsibility of businesses for adjudicating consumer access and deletion requests. If the service provider receives an access or deletion request from a consumer regarding personal information that the service provider collects, maintains, or sells on behalf of a business it services, it need not comply with the request, but shall explain the basis for the denial, and refer the consumer to the relevant business (999.314(d)). Therefore, a business can now be less concerned that its service providers would be making unilateral decisions on responding to consumer requests regarding personal information that the business collected or processed through the service provider. As with the GDPR, the key will be to ensure a contract is in place with service providers that includes clear direction on how to handle consumer access or deletion requests.

Second, the draft Regulations acknowledge that when it comes to consumer access or deletion requests, there is an inherent tension between privacy and cybersecurity. For example, if an individual asks for a detailed accounting of the personal information a business maintains on them, should the business simply email that information, or would doing so present undue cybersecurity risks? Should personal information be masked when presented to a requesting consumer, akin to what is done with all but the last four digits of credit card numbers on receipts? The draft Regulations state that a business “shall not” provide a consumer with specific pieces of personal information if the disclosure creates a “substantial, articulable, and unreasonable risk to the security of that personal information, the consumer’s account with the business, or the security of the business’s systems or networks” (999.313(c)(3)). In fact, the draft Regulations even exclude Social Security numbers, driver’s license numbers, financial account numbers and other sensitive information from the disclosure requirement (999.313(c)(4)).

Similar privacy vs. security questions apply to verification, which the draft Regulations also helpfully address. While the draft Regulations require a business to disclose its verification procedures (999.308(b)(1)(c)), they require only that those procedures be reasonable and tiered to the sensitivity of the personal information at stake (999.323).

Third, the draft Regulations help avoid confusion over whether the CCPA actually requires a business to collect more personal information in order to comply with consumer requests. Take, for example, IP addresses, which the CCPA includes within its definition of personal information, but which many companies do not necessarily collect or maintain in a way that reasonably identifies a consumer or a household. The draft Regulations confirm that the business would not have to, say, correlate a dynamic IP address with an internet service provider if a consumer asks for all personal information that the business collected on them. Instead, the business may require the consumer to demonstrate that they are the “sole consumer associated with the non-name identifying information” (999.325(e)(2)).

In addition, for verification purposes, a business need only match the identifying information provided by the consumer to the personal information of the consumer already maintained by the business. A business may collect additional personal information to verify a request, according to the draft Regulations, but the business “shall delete any new personal information collected for the purposes of verification as soon as practical after processing the consumer’s request…” (999.323(c)).

Fourth, in an additional nod to cybersecurity, not to mention practicality, the draft Regulations provide that where personal information is stored on archived or backup systems, deletion may be “delayed until the archived or backup system is next accessed or used” (999.313(d)(3)). While the CCPA, unlike the GDPR, does not have an explicit data minimization requirement, one way to facilitate compliance when it comes to deleting personal information on backups is to institute a retention schedule whereby unaccessed archives are aged off after a certain period of time (assuming, of course, that newer backups are sufficient).

Fifth, the draft Regulations help answer the question: what does it mean to delete personal information? In their current form, the draft Regulations clarify that a business can comply with a deletion request by “[p]ermanently and completely erasing” the personal information on its existing systems “with the exception of archived or backup systems,” de-identifying the personal information, or aggregating the personal information (999.313(d)(2)).

Sixth, the draft Regulations help resolve the logical conundrum inherent in deletion requests: how do you prove that you have deleted someone’s information upon their verified request? Now, the draft Regulations clarify that a business may—and in fact shall—maintain records of consumer requests made pursuant to the CCPA, including requests for deletion, and of how the business responded to those requests, for at least 24 months (999.317(b)). The Regulations explain that the records may be maintained in a “ticket or log format provided that the ticket or log includes the date of the request, the nature of the request, manner in which the request was made, the date of the business’ response, the nature of the response, and the basis for the denial of the request if the request is denied in whole or in part” (999.317(c)). For further clarity, the Regulations exempt these records from the CCPA (999.317(d)).

Seventh, the draft Regulations make clear that the CCPA applies to a business that does not collect information directly from consumers if the business sells a consumer’s personal information (999.305(d) and 999.312(e)). A business that does not interact directly with consumers must either contact the consumer directly and provide notice that the business sells that consumer’s personal information and provide them with notice of the right to opt out, or contact the source of the personal information and obtain a signed attestation from the source that it gave the consumer the notice of collection, and provide an example of the notice. Attestations must be kept for two years (999.305(d)). A business that does interact directly with consumers must also provide at least one method by which the consumer may submit requests to know or requests to delete online (999.312(e)). This provision has particular significance for consumer data resellers and advertising networks.

As helpful as these draft Regulations are, it is important to note that they also add some obligations on businesses, beyond the current CCPA requirements, including the following:

– Any time a business wants to use a consumer’s personal information for any purpose other than those disclosed in the collection notice, the business would have to “directly” notify the consumer of this new use and “obtain explicit consent” from the consumer for the new use (999.305(a)(3)). So far, the regulations do not distinguish between material and non-material changes.

– To avoid including a “Do Not Sell” notice, a business would have to affirmatively state that “[i]t does not, and will not” sell personal information (999.306(d)(2)). Furthermore, a consumer whose personal information is collected while an opt-out notice is not posted shall be deemed to have validly submitted a request to opt out. If this provision survives, it can be read as retroactively importing an opt-in requirement regarding sale into every legacy collection.

– While the current text of the CCPA states that a business “may offer financial incentives, including payments to consumers as compensation, for the collection of personal information, the sale of personal information, or the deletion of personal information,” (Cal. Civ. Code 1798.125(b)(1)), the draft Regulations would require that the mandatory financial incentive notice include a “good-faith estimate” of the value of the consumer’s data and a description of the method the business used to calculate that value (999.307(b)(5)). Interestingly, this may generate a new publicly available data set around the real financial value of personal data.

– A business would need to confirm receipt of any requests to know or delete within 10 days of receipt of the request, and the business would have to provide information about how it will process the request (999.313). Unverified requests to delete would need to be treated as requests to opt out of sales (999.313(d)(1)).

– Any business that, alone or in combination, annually buys, receives for the business’s commercial purposes, sells, or shares for commercial purposes the personal information of 4,000,000 or more consumers must compile specific metrics for each calendar year. As indicated above, the regulations clarify that the maintenance of that information alone would not violate the CCPA or the regulations, as long as it is not used for any other purpose (999.317(g)).

These additional obligations take the CCPA beyond the GDPR in several instances, reinforcing that those relying on their GDPR program documentation and processes to carry them through the CCPA will need to revisit that approach, and to do so swiftly, given the potential operational impact of implementing these requirements against the impending deadline.


The regulations are subject to a notice-and-comment period and are likely to be developed further. The deadline to submit written comments is December 16, 2019.

Authors

Michael Bahar | +1 202 383 0882 | Email

Sarah Paul | +1 212 301 6587 | Email

Mary Jane Wilson-Bilik | +1 202 383 0660 | Email

Paula Barrett | +44 207 919 4634 | Email

Al Sand | +1 512 721 2721 | Email

John Allen Zumpetta | +1 212 389 5078 | Email

Pooja Kohli, Litigation Specialist | +1 212 389 5037 | Email


Legal Alert: Dangerous seas ahead - the California Consumer Privacy Act and litigation risk

September 5, 2019

1 https://www.eversheds-sutherland.com/global/en/what/practices/commercial-it-law/whitepapers-application-of-cloud-services.page?

2 https://us.eversheds-sutherland.com/NewsCommentary/Legal-Alerts/203044/Legal-Alert-Stand-Up-Sit-Down-Stand-Up-Ninth-Circuit-Revives-Spokeo-No-injury-Suit

While many breathed a sigh of relief when the California legislature provided only a limited private right of action for data breaches under its sweeping new privacy law—the California Consumer Privacy Act (CCPA)—companies that collect the personal information of California residents are nonetheless facing a potential storm of litigation starting January 1, 2020.

This alert discusses the new litigation risks under the CCPA, other avenues that plaintiffs can use to sue companies for alleged privacy violations under the CCPA, and what companies can do to reduce their risk.

a. The Private Right of Action Under the CCPA

In its current form, the CCPA provides a limited private right of action for certain data breaches, putting a premium on ensuring “reasonable” security measures. In particular, under California Civil Code § 1798.150(a)(1), any consumer may bring suit if their “nonencrypted or nonredacted personal information is subject to an unauthorized access and exfiltration, theft, or disclosure” as a result of the business’s violation of the duty to “implement and maintain reasonable security procedures and practices.”

Therefore, it is critical for businesses to maintain—and be able to readily demonstrate through written security plans and policies—reasonable security standards to protect a California resident’s personal information from unauthorized use, access and disclosure. Consistent with other cybersecurity and privacy standards like the New York Department of Financial Services Cybersecurity Regulation or Europe’s General Data Protection Regulation (GDPR), the CCPA does not define what “reasonable” means. Instead, reasonable security procedures and practices are those that are “appropriate to the nature of the information.” In other words, California—like New York and the EU—requires companies to undertake risk-based assessments of their own cybersecurity needs, to take care not to fall below what other similarly situated companies are doing, and to monitor regulatory guidance and enforcement actions to ensure they live up to what regulators expect.1

i. CCPA Statutory Damages

Importantly, the law’s provision of statutory damages not only could result in substantial fines for noncompliance, but it also better ensures that plaintiffs have standing to bring suit. Under US Supreme Court jurisprudence, victims of data breaches have often found it very difficult to establish standing. According to the seminal case, Spokeo v. Robins, 136 S. Ct. 1540 (2016), an injury-in-fact must not only be particularized, i.e., affecting the plaintiff in a personal and individual way, but it also must be concrete.2

The CCPA’s private right of action, however, potentially removes the hurdle by providing for statutory damages “in an amount not less than one hundred dollars ($100) and not greater than seven hundred and fifty dollars ($750) per consumer per incident or actual damages, whichever is greater.” Cal. Civ. Code § 1798.150(a)(1)(A) (emphasis added).

In addition, in assessing the amount of statutory damages, courts are directed to assess the state of a company’s compliance posture and, in essence, evaluate how seriously it takes consumer privacy. Specifically, courts must look to: the nature and seriousness of the misconduct, the number of violations, the persistence of the misconduct, the length of time over which the misconduct occurred, the willfulness of the defendant’s misconduct, and the defendant’s assets, liabilities, and net worth. Id. at 1798.150(a)(2).

Given that data breaches can often implicate the records of tens of millions of customers, the potential for statutory damages of up to $750 “per consumer per incident” raises the specter of substantial exposure for large-scale breaches. Assuming, for instance, one million California consumers were affected by a breach, the law could allow up to $750 million in statutory damages to be assessed against the offending business.

ii. CCPA Cure Provision

As much as the statutory damages provision lowers the standing bar and elevates the risk of a huge payout, the CCPA’s private right of action does contain a statutory cure provision, which provides that before filing suit on an individual or class-wide basis, a plaintiff must provide “a business 30 days’ written notice identifying the specific provisions of this title the consumer alleges have been or are being violated. In the event a cure is possible, if within the 30 days the business actually cures the noticed violation and provides the consumer an express written statement that the violations have been cured and that no further violations shall occur, no action for individual statutory damages or class-wide statutory damages may be initiated against the business.” Id. at 1798.150(b).

But a cure might not be possible in certain breach situations. For example, when addressing statutory “cure” provisions in connection with other statutes, California courts have held that future compliance is an insufficient “cure” if the defendant cannot undo the harm to the plaintiff that its alleged violation already caused. Romero v. Dep’t Stores Nat’l Bank, 725 F. App’x 537, 540 (9th Cir. 2018) (collecting cases).

b. Private Right of Action Under California’s Unfair Competition Law?

Because the private right of action provided by the CCPA is narrow, plaintiffs are likely to look to other California laws to bring suit for CCPA violations that do not fall within the scope of § 1798.150(a)(1), even those unrelated to a data breach. California’s Unfair Competition Law (UCL), for example, broadly prohibits, and provides civil remedies for, unfair competition, which it defines as “any unlawful, unfair or fraudulent business act or practice.” Bus. & Prof. Code § 17200 et seq. Violating the requirements of the CCPA—whether the disclosure requirements, the requirements to afford certain rights, or the obligation to maintain reasonable security measures—could constitute an unlawful, unfair or fraudulent business act or practice.3

Defendants, however, have a strong defense to such claims. California Civil Code § 1798.150(c) explicitly states that “[n]othing in this title shall be interpreted to serve as the basis for a private right of action under any other law.” Legislative history, the CCPA’s vesting of enforcement authority with the California Attorney General, and subsequent statements from the Attorney General, also support the California legislature’s intent to narrowly circumscribe the private right of action to the CCPA only. Defendants faced with class action litigation can therefore argue that the reference to “any other law” applies to the UCL and forbids reliance on the privacy provisions of the CCPA as a basis for UCL liability under the “unlawful” prong.

Absent legislative amendment or further development in the case law, plaintiffs are nonetheless likely to argue that any limited nature of the private right of action would apply only to data breaches. In other words, § 1798.150 at most bars UCL liability predicated on violation of § 1798.150 and does not bar UCL actions predicated on violations of other sections of the CCPA. In addition, plaintiffs may argue that California law permits plaintiffs to bring UCL claims based on violations of laws that do not explicitly provide for private rights of action, which is the case for the CCPA’s non-breach and privacy protections.

3 The “unlawful” prong of the UCL “borrows” violations of other laws and makes them independently actionable under the UCL. Smith v. State Farm Mutual Automobile Ins. Co., 93 Cal.App.4th 700, 718 (2001). This raises the possibility that plaintiffs’ lawyers might attempt to enforce the privacy provisions of the CCPA by using the UCL.

4 AT&T Mobility LLC v. Concepcion, 563 U.S. 333 (2011); DirecTV Inc. v. Imburgia, 36 S. Ct. 463 (2015); Kindred Nursing Centers LP v. Clark, 137 S. Ct. 1421 (2017).

c. Arbitration?

One way to defend against CCPA litigation is to proactively keep those disputes out of court. Civil Code § 1798.192 would invalidate waivers of rights under the CCPA; but it does not expressly prohibit arbitration. Therefore, defendants facing class action litigation arising from data breaches will have a reasonable argument, in light of recent US Supreme Court authority, that the Federal Arbitration Act authorizes mandatory arbitration of CCPA disputes and allows waiver of class action treatment.4

d. Conclusion

Ultimately, the best defense is compliance. Having CCPA-compliant policies, being able to respond to consumer exercises of rights, and having holistic, risk-based and well-practiced security measures effective no later than January 1, 2020, are the most effective ways to avoid or weather the incoming storm of privacy litigation in California.

Authors

Michael Bahar | +1 202 383 0882 | Email

Sarah Paul | +1 212 301 6587 | Email

Jennifer Van Dale | +1 852 2186 4945 | Email

Mary Jane Wilson-Bilik | +1 202 383 0660 | Email

Ian S. Shelton | +1 512 721 2714 | Email

Rhys Patrick John McWhirter, Consultant | +1 852 2186 4969 | Email

Related attorneys

Matt Gatewood | +1 202 383 0122 | Email

Kymberly Kochis | +1 212 389 5068 | Email

Robert D. Owen | +1 212 389 5090 | Email

Phillip E. Stano | +1 202 383 0261 | Email

Lewis S. Wiener | +1 202 383 0140 | Email

Ronald W. Zdrojeski | +1 212 389 5076 | Email

Al Sand | +1 512 721 2721 | Email


Legal Alert: New CCPA amendments bring clarity prior to the January 1, 2020 deadline

November 15, 2019

In the run-up to January 1, 2020, the California legislature and Attorney General are rushing to provide clarity to the California Consumer Privacy Act of 2018 (CCPA)—and businesses are rushing to interpret and implement these new changes and guidelines.

With less than two months to go, we wanted to provide an additional overview and analysis of the key amendments the California legislature passed, in addition to our analysis of the draft California Attorney General Proposed Regulations.

On October 11, 2019, the Governor signed seven assembly bills, five of which amend the CCPA and two of which have indirect effects on the CCPA. These bills provide: (a) limited reprieves from the January 1, 2020 deadline; (b) further clarity on what businesses must do to verify or authenticate a consumer request; and (c) further requirements on what businesses must include in their privacy notices—and what they can exclude.

A Quick CCPA Refresher

At a high level, the CCPA provides California consumers the following rights:

1. Data Transparency – Businesses have to provide detailed information in their privacy policies about what they do with personal information, including the collection, processing, and transfer or sale of that data;

2. Right of Access and Deletion – Businesses have to provide the consumer with a right to request access to their personal information and to request the business delete that information (even if businesses don’t always have to grant the request). Importantly, businesses must have the means to operationalize these rights and verify the identity of the person making the request;

3. Right to Opt-Out – Businesses that sell personal information have to inform customers of that practice and provide them with an opt-out; and

4. Data Security & Breach Disclosure – Businesses have to implement and maintain reasonable security procedures and practices to safeguard personal information, and they face enforcement by the California Attorney General, as well as private litigation in the courts, if they fail to do so.

The CCPA applies to all for-profit businesses that do business in California, whether located in California or not, and that meet any of the following conditions: a) have annual gross revenues in excess of $25 million; b) collect, sell or share for commercial purposes the personal information of at least 50,000 consumers, households or devices; or c) derive at least 50 percent of their annual revenues from selling consumers’ personal information.
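The three thresholds above lend themselves to a simple check. The sketch below is purely illustrative: the function and parameter names are our own, not terms from the statute, and real applicability determinations require counsel.

```python
def ccpa_applies(annual_gross_revenue: float,
                 ca_consumers_households_devices: int,
                 share_of_revenue_from_selling_pi: float) -> bool:
    """Illustrative sketch of the CCPA's applicability thresholds for a
    for-profit business doing business in California, reflecting the
    statute as described above (post-October 2019 amendments)."""
    return (
        annual_gross_revenue > 25_000_000             # (a) gross revenues over $25 million
        or ca_consumers_households_devices >= 50_000  # (b) PI of 50,000+ consumers/households/devices
        or share_of_revenue_from_selling_pi >= 0.50   # (c) 50%+ of revenue from selling PI
    )

# A small retailer under every threshold:
print(ccpa_applies(1_000_000, 10_000, 0.0))   # False
# A business crossing only the consumer-count threshold:
print(ccpa_applies(1_000_000, 60_000, 0.0))   # True
```

Because the conditions are disjunctive, satisfying any single threshold brings a business within the CCPA's scope.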

The October 2019 CCPA Amendments

The amendments have added some additional clarity to the CCPA and afforded businesses some temporary relief.

– Procedures for verifying a consumer request: While the CCPA still prohibits businesses from requiring consumers to create an account with the business to make a verifiable consumer request, AB-25 amends 1798.130(a)(2) to allow businesses to require a consumer who maintains an account with the business to submit the request through the account. It also clarifies that businesses may require authentication of the consumer that is “reasonable in light of the nature of the personal information requested.” The CA Attorney General also provided guidance in the proposed regulations on how businesses can verify an individual; but, at the end of the day, there is no check-the-box approach. Businesses will be challenged to make reasonable, risk-based judgments on how to authenticate individuals relative to the sensitivity of the data requested to ensure appropriate privacy and cybersecurity, without creating an undue burden on the consumer.

– Limited exemption for employees: AB-25 adds new 1798.145(g) to exempt businesses, until January 2021, from having to grant access and deletion rights to employees, job applicants, medical staff members or contractors, as well as business owners, directors or officers, including with respect to personal information retained to administer benefits for another natural person related to such individuals. Not exempted are (a) the right of employees, job applicants and contractors to receive a “Notice of Collection” at or before January 1, 2020; and (b) their right to sue the business for breaches of such data. In other words, companies still need to provide detailed employee privacy policies by the end of the year, and protect that information with reasonable safeguards upon penalty of potential class action, even if they don’t have to grant other rights—as a matter of law—until the start of 2021.

– “Personal information” redefined: AB-874 explicitly excludes “deidentified or aggregate consumer information” from the definition of “personal information.” This clarification will reduce the strain of CCPA compliance on the secondary data market that leverages deidentified or aggregate data for business purposes. For example, a business that purchases IP and geolocation data will not need to comply with the CCPA so long as that data is deidentified or aggregated within the meaning of the CCPA. The amendment also removed the requirement that “publicly available information” must be “compatible with the purpose for which the data is maintained and made available in public records.”


– Fair Credit Reporting Act (FCRA) exemption expanded. AB-1355 broadens the exemption in 1798.145(d) for FCRA information so that the exemption applies to a broad range of activities performed by consumer reporting agencies, by furnishers of information used in consumer reports and by users of a consumer report, to the extent such activity is subject to regulation by the FCRA and the information is not used, communicated, disclosed or sold, except as authorized by the FCRA. As with the other exemptions in 1798.145, FCRA data is subject to the private right of action for data breaches.

– Data breach liability narrowed to data that is not encrypted AND not redacted. AB-1355 modifies 1798.150 to permit a private civil action to be instituted only for personal data that is “nonencrypted and nonredacted.” Many companies will interpret this requirement as an incentive to incorporate encryption and redactions into their “reasonable” safeguards.

– Limited Business-to-Business (B2B) exemption that expires in one year. AB-1355 excludes from the CCPA until January 2021 all personal information collected by a business where it is communicating or transacting with a consumer who is acting for another organization and the communication or transaction occurs solely within the context of the business providing or receiving a product or service to or from such organization. Like job applicants and employees, a person whose personal information is gathered in the B2B context has the right to sue if there is a security breach involving his/her personal information.

– Anti-discrimination. AB-1355 retains the prohibition on a business from discriminating against the consumer for exercising any of the consumer’s rights under the CCPA. Yet, where the CCPA makes an exception if the differential treatment is reasonably related to the value provided to the consumer by the consumer’s data, AB-1355 modifies the exception to read the value provided to the business by the consumer’s data. The California Attorney General Regulations provide some helpful examples:

1. “A music streaming business offers a free service and a premium service that costs $5 a month. If only the consumers who pay for the music streaming service are allowed to opt-out of the sale of their personal information, then the practice is discriminatory, unless the $5 per month is reasonably related to the value of the consumer’s data to the business.”

2. “A retail store offers discounted prices to consumers who sign up to be on their mailing list. If the consumer on the mailing list can continue to receive discounted prices even after they have made a request to know, request to delete, and/or request to opt-out, the differing price level is not discriminatory.”

The key is: businesses that offer variable pricing, premiums or coverage have to make sure that those differences are made for the right reasons, and not for the wrong reasons. To the list of wrong reasons, the CCPA now adds consumer choices with respect to their personal information.

– Vehicle warranties: AB-1146 carves out a specific exception from the “right to deletion” for vehicle information or ownership information between a manufacturer and a dealer, for purposes of a vehicle repair relating to a warranty or recall. This amendment paves the way for other industries to carve out exceptions to “personal information” for data that may be required to meet continuing contractual obligations, document retention requirements, or may otherwise be in the public interest.

– Online businesses: AB-1564 exempts businesses that operate exclusively online and have a direct relationship with a consumer from the requirement to provide consumers two or more methods for submitting requests. Online-only businesses are only required to provide an email address for submitting requests, and, if the business maintains an internet website, the business must also make the internet website address available to consumers to submit requests.

– Data broker registration: AB-1202 requires “data brokers” to register with the California Attorney General and to be listed on the Attorney General’s website. The bill defines a data broker as a business that knowingly collects and sells to third parties the personal information of a consumer with whom the business does not have a direct relationship. This amendment, and the corresponding list that will be created, will likely have a significant effect on the data brokerage market by providing transparency, and the additional scrutiny that comes with it. Consumer reporting agencies covered by the FCRA, financial institutions subject to the Gramm-Leach-Bliley Act, and entities subject to the Insurance Information and Privacy Protection Act are exempted from the data broker registration requirements. The bill does not require data brokers to provide information on how consumers may exercise their CCPA right to opt-out of the sale of their personal information.

– Expanded data breach notification requirements: AB-1130 amends California’s Data Breach Notification Law (Civ. Code § 1798.29) to expand the definition of personal information—which is the definition used for the Private Right of Action in the CCPA—to include: (a) unique biometric data; and (b) government-issued identification numbers, including passport numbers. AB-1130 also provides that, when breached biometric data was used as an authenticator, companies may instruct other entities that use the same biometric data to no longer rely on it for authentication purposes.

Authors

Michael Bahar | +1 202 383 0882 | Email

Mary Jane Wilson-Bilik | +1 202 383 0660 | Email

Paul McCulloch-Otero | +1 212 301 6604 | Email

Al Sand | +1 512 721 2721 | Email


Article: The New Vendor Management World Under NYDFS’ New Cyber Regulation

March 29, 2019

As of March 1, 2019, the New York State Department of Financial Services’ (NYDFS) cybersecurity regulation, 23 NYCRR Part 500, requires financial services institutions regulated by NYDFS to implement policies and procedures to address the cybersecurity risks posed by third-party service providers to the institutions’ nonpublic information (NPI).

In their article for LegalTech News, Eversheds Sutherland attorneys Michael Steinig and Alexander Sand discuss challenges entities must overcome to maintain compliance after the deadline.

Full article

Authors

Michael Steinig | +1 202 383 0804 | Email

Al Sand | +1 512 721 2721 | Email


Article: Federal enforcement trends in the cryptocurrency sphere

Fall 2019

Although virtual currencies have proliferated across the market, there remains much confusion about what exactly constitutes a cryptocurrency, how its value is determined, and how cryptocurrencies can be used.

In their article for the Fall 2019 edition of Partnering Perspectives, Eversheds Sutherland attorneys Sarah Paul and Sarah Chaudhry provide a brief overview of cryptocurrencies and discuss how policing cryptocurrencies is becoming a priority for a number of federal agencies.

Full article

Authors

Sarah Paul | +1 212 301 6587 | Email

Sarah Chaudhry | +1 212 389 5071 | Email

Cryptocurrency


Legal Alert: Cryptocurrency enforcement on the upswing—Texas cryptocurrency issuers agree to pay over $10 million to the SEC

September 11, 2019

Along with other regulators, the Securities and Exchange Commission (the “SEC”) has been signaling its intention to pursue those in the cryptocurrency sphere that it believes are capitalizing on the excitement and novelty of cryptocurrencies by engaging in illicit conduct. Recently, the SEC took action against one such entity, a Dallas-based cryptocurrency dealer called Bitqyck Inc. (“Bitqyck”). On August 29, 2019, the SEC announced that it had reached a settlement with Bitqyck and its founders for approximately $10.1 million. The SEC’s actions demonstrate that institutional investors may be best served by exercising reasonable due diligence, instead of solely relying on the representations made by cryptocurrency issuers, when determining whether to invest in a venture.

The SEC’s complaint against Bitqyck, which was filed in the United States District Court for the Northern District of Texas, alleged that Bitqyck and its founders, Bruce Bise and Sam Mendez, defrauded investors and operated an unregistered exchange. It charged Bitqyck, Bise and Mendez with violations of Sections 5(a), 5(c), and 17(a) of the Securities Act, as well as Section 10(b) of the Exchange Act and Rule 10b-5 thereunder. It also charged Bitqyck with violating Section 5 of the Exchange Act and Bise and Mendez with aiding and abetting that violation.

The SEC’s complaint painted a picture of deception that infused nearly every aspect of Bitqyck’s business model. The complaint alleged that Bitqyck, which is owned and operated by Bise and Mendez, mass-marketed two digital tokens, Bitqy and BitqyM, to prospective investors in 45 US states, two US territories, and 20 countries through multiple unregistered digital asset securities offerings. According to the SEC, the company raised more than $13 million from more than 13,000 investors who collectively lost over two-thirds of their investments. The defendants allegedly falsely represented to investors that if they purchased a Bitqy token, they would automatically receive one-tenth of one share of Bitqyck common stock through the operation of a “smart contract” associated with the token. In truth, however, there were no such “smart contracts” and the investors never received any Bitqyck common stock. Further, the defendants allegedly touted a global marketplace called QyckDeals, which they billed as a daily deals site like Groupon, but which did not really exist; they falsely claimed that Bitqyck owned a cryptocurrency mining facility in the state of Washington from which investors could profit; and they created an online trading platform called TradeBQ, which they failed to register with the SEC.

The complaint sought permanent injunctions, the disgorgement of ill-gotten gains with interest, and civil monetary penalties. Without admitting or denying any wrongdoing, Bitqyck, Bise and Mendez consented to the injunctive relief and Bitqyck also consented to an order requiring that it pay disgorgement, prejudgment interest and a civil penalty of $8,375,617. Bise and Mendez consented to the entry of an order that they each pay disgorgement, prejudgment interest and a civil penalty of $890,254 and $850,022, respectively.

The SEC’s action against Bitqyck and its founders, as well as the hefty amount of the penalties imposed, is another sign of the SEC’s increased focus on regulating the developing cryptocurrency market. It is also in alignment with statements made by the SEC making clear that it is a company’s—not the SEC’s—burden to demonstrate that the relevant cryptocurrency is not a security and, thus, not subject to applicable securities laws and regulations. It appears that investors or potential investors may have had reservations about Bitqyck even before the SEC’s action, as evidenced by online articles pondering the legitimacy of the company. Individuals and institutions seeking to invest in the cryptocurrency realm may choose to approach it with a healthy dose of apprehension and remember that if it sounds too good to be true, it probably is.

Authors

Sarah Paul | +1 212 301 6587 | Email

Brian Rubin | +1 202 383 0124 | Email

Michael Bahar | +1 202 383 0882 | Email

Sarah Chaudhry | +1 212 389 5071 | Email

Related attorneys

Greg Kaufman | +1 202 383 0325 | Email

Al Sand | +1 512 721 2721 | Email


Legal Alert: IRS provides long-awaited cryptocurrency guidance

October 17, 2019

On October 9, 2019, the Internal Revenue Service (IRS) issued guidance addressing select issues involving the tax treatment of virtual currency transactions and reminding taxpayers of their reporting obligations. The guidance, provided in the form of a revenue ruling (the Revenue Ruling) and a set of frequently asked questions (FAQ), specifically addresses the tax treatment of cryptocurrency received for services, the calculation of gains and losses and tax basis in cryptocurrencies, and the method of reporting transactions for virtual currency holders. See Rev. Rul. 2019-24.

The Revenue Ruling and the FAQ are intended to supplement an IRS notice issued five years prior in March 2014, which provides that for purposes of federal income tax, virtual currencies should be treated as property, and general tax principles applicable to property transactions apply to virtual currency transactions. See Notice 2014-21.

The IRS has stated that the purpose of the new guidance is to help taxpayers better understand their reporting obligations for specific transactions involving virtual currency.

Background and Definitions

The Revenue Ruling and the FAQ define virtual currency as a digital representation of value that functions as a medium of exchange, a unit of account, and a store of value, other than a representation of the US dollar or a foreign currency (“real currency”). Cryptocurrency is defined as a type of virtual currency that uses cryptography to secure transactions that are digitally recorded on a distributed ledger.

Within this context, the IRS explains that a “hard fork” occurs when cryptocurrency on a distributed ledger undergoes a protocol change or shift, resulting in a permanent divergence from the existing distributed ledger and potentially resulting in the creation of new cryptocurrency. Following a hard fork in which a new cryptocurrency is issued, the new cryptocurrency is recorded on the new distributed ledger and transactions involving the prior existing cryptocurrency continue to be recorded in the prior distributed ledger.

In some instances, a hard fork will be followed by an “airdrop.” An airdrop is a method of distributing units of a cryptocurrency to the addresses of taxpayers included on the distributed ledger. In the case where an airdrop follows the occurrence of a hard fork, units of the new cryptocurrency are distributed to addresses included on the prior distributed ledger.

Though hard forks are not always followed by an airdrop, the possibility of receiving units of new cryptocurrency has raised several questions for taxpayers regarding the tax implications of such distributions.

The New Guidance

Intended to provide taxpayers with a better understanding of how to apply well established tax principles to a quickly developing and changing technological environment, the Revenue Ruling addresses two specific questions:

– Whether a taxpayer has gross income under § 61 of the Internal Revenue Code (Code) as a result of a hard fork of a cryptocurrency the taxpayer owns if the taxpayer does not receive units of a new cryptocurrency; and

– Whether a taxpayer has gross income under § 61 of the Code as a result of an airdrop of a new cryptocurrency following a hard fork if the taxpayer receives units of new cryptocurrency.

The Revenue Ruling holds that in the first instance, where a taxpayer does not receive units of a new cryptocurrency as a result of a hard fork, the taxpayer also does not have gross income under § 61.

Where a taxpayer does receive units of new cryptocurrency as the result of an airdrop following a hard fork, the Revenue Ruling holds that the taxpayer has an accession to wealth and therefore has ordinary income under § 61. The amount included in gross income is equal to the fair market value of the cryptocurrency when the airdrop is recorded on the distributed ledger. This analysis relies on the ability of the taxpayer to exercise dominion and control over the units of new cryptocurrency at the time of the airdrop.

The FAQ, additionally, provides further insight into the application of long-standing tax principles to virtual currency transactions. The FAQ reiterates that when taxpayers sell virtual currency, they must recognize any capital gain or loss on the sale, subject to applicable capital loss deductibility limitations. Gain or loss on the sale of virtual currency is measured by the difference between the taxpayer’s adjusted basis in the virtual currency and the amount the taxpayer actually received in exchange for the virtual currency. The FAQ allows the taxpayer to use specific identification for determining basis, with FIFO being the deemed method if currency is not specifically identified. LIFO is not a recognized method to account for basis.
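The default FIFO basis rule described above can be sketched mechanically. The example below is a hypothetical illustration of the arithmetic only; the lot amounts and prices are invented, and actual tax reporting should follow the IRS guidance and professional advice.

```python
from collections import deque

def fifo_gain(lots, units_sold, proceeds):
    """Compute gain or loss on a sale of virtual currency using FIFO lot
    identification, the deemed method under the IRS FAQ when units are not
    specifically identified. `lots` is a chronological list of
    (units, cost_per_unit) purchases; all figures are hypothetical."""
    queue = deque(lots)
    basis = 0.0
    remaining = units_sold
    while remaining > 0:
        units, cost = queue[0]
        used = min(units, remaining)   # consume the oldest lot first
        basis += used * cost
        remaining -= used
        if used == units:
            queue.popleft()            # oldest lot fully consumed
        else:
            queue[0] = (units - used, cost)  # partial lot remains
    return proceeds - basis  # positive = gain, negative = loss

# Bought 2 units at $100, then 3 units at $200; sell 4 units for $1,000.
# FIFO basis = 2*$100 + 2*$200 = $600, so gain = $400.
print(fifo_gain([(2, 100.0), (3, 200.0)], 4, 1000.0))  # 400.0
```

Under specific identification, a taxpayer could instead designate the $200 lots first, yielding a smaller gain on the same sale; the FAQ permits this, while LIFO is not a recognized method.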

The FAQ also touches on a number of other issues, such as the tax treatment of virtual currency received in exchange for services, the tax treatment of virtual currency received as a bona fide gift, and the method of determining the fair market value of cryptocurrency received through a cryptocurrency exchange.


Eversheds Sutherland Observation: The IRS’s new virtual currency guidance marks the latest step in a series of recent efforts by the agency to increase compliance and enforcement in this space. In July 2019, the IRS announced through a news release that it had begun sending letters to taxpayers with virtual currency transactions that have either potentially failed to report income or did not accurately report their transactions. The IRS anticipated that by the end of August, over 10,000 taxpayers would receive these letters. The letters followed the IRS’s issuance of a “John Doe” summons to Coinbase, one of the largest platforms for exchanging bitcoin and other forms of virtual currency, and Coinbase’s production of approximately 13,000 customer records to the IRS. With the new guidance, the IRS continues to demonstrate that it is focused on addressing reporting issues related to virtual currency.

The new guidance, however, has already received criticism in the industry for creating rules that may result in unfair or unexpected consequences. In the case of hard forks, for instance, some observers have suggested that, under the guidance, third parties can now create tax reporting obligations on taxpayers by foisting cryptocurrency on them through an unwanted airdrop. Others have noted that the new guidance does not create any “de minimis” tax exemptions for small cryptocurrency transactions, such as using bitcoin to purchase a cup of coffee. Future guidance may be forthcoming from the IRS in these and other areas. In the meantime, taxpayers should take heed of the IRS’s reminder, in the FAQ, to maintain records sufficient to establish the positions taken on their tax returns regarding virtual currency, such as records documenting receipts, sales, exchanges, or other dispositions of virtual currency and the fair market value of the virtual currency in question.

For more information, see our Federal Tax group and related contacts.

Authors

Jim Mastracchio | +1 202 383 0210 | Email

Sarah Paul | +1 212 301 6587 | Email

Karl Zeswitz | +1 202 383 0518 | Email

Katie Sint | +1 202 383 0977 | Email


Legal Alert: The SEC marches on – Cryptocurrency startup Block.one pays civil penalty of $24 million over unregistered initial coin offering

October 17, 2019

In a somewhat surprising move, the Securities and Exchange Commission (SEC) entered into a $24 million settlement with Block.one on September 30, 2019, over its unregistered initial coin offering (ICO) that raised upwards of $4 billion. While the SEC’s pursuit of Block.one is consistent with the agency’s focus on regulating the cryptocurrency sphere, the SEC’s decision to settle with Block.one for only $24 million, a fine amounting to less than 1% of the token sale proceeds, has raised eyebrows in the cryptocurrency world. The reason for the relatively small penalty may be the absence of intentionally deceptive conduct in connection with representations made to investors.

Block.one is a blockchain technology company based in Hong Kong and Virginia that developed EOSIO, which is a software operating system structured to underlie one or more EOSIO-based blockchains. Over the course of about one year, between June 26, 2017 and June 1, 2018, Block.one conducted an allegedly unregistered ICO of digital tokens and raised billions of dollars in the process. It publicly offered and sold 900 million digital assets (ERC-20 Tokens) in exchange for Ether, a digital asset, to raise funds to develop the EOSIO software.

The SEC’s settlement, entered into by way of a cease and desist order (the “Order”), asserts that Block.one violated Sections 5(a) and 5(c) of the Securities Act of 1933 by offering and selling securities without the appropriate registration filed with the SEC. In the Order, however, the SEC does not allege that Block.one engaged in fraudulent conduct or otherwise lied to investors regarding the ERC-20 Tokens. Accordingly, the SEC’s imposition of a small penalty as well as its choice not to avail itself of other mechanisms, such as disgorgement or rescission, may reflect a conclusion by the agency that Block.one’s conduct did not amount to deliberate investor harm.

Block.one consented to the Order, but did so without admitting or denying the SEC’s findings. The company further noted in its press release that it received a waiver from the SEC “so that Block.one will not be subject to certain ongoing restrictions that would usually apply with settlements of this type.”

While the settlement has fueled speculation that the SEC’s regulation of the cryptocurrency sphere is inconsistent, or that the SEC is taking a softer approach to regulation, a review of the underlying allegations here suggests that is not so. Companies and individuals under SEC scrutiny in this space may take comfort from the fact that the SEC seems to be evaluating the facts on a case-by-case basis, rather than applying a one-size-fits-all approach.

Authors

Sarah Paul | +1 212 301 6587 | Email

Sarah Chaudhry | +1 212 389 5071 | Email

Related attorneys

Greg Kaufman | +1 202 383 0325 | Email

Brian Rubin | +1 202 383 0124 | Email


Article: New IRS Tax Guidance Targets Crypto, and US Persons Who Use It

October 31, 2019

On October 9, 2019, the United States Internal Revenue Service issued Revenue Ruling 2019-24 and a series of frequently asked questions, identifying rules governing US taxation of digital currencies. Taxation in the US is unbelievably complex, but the new IRS guidance takes a step-by-step approach to address some of the most common issues facing holders of digital currency.

In their article for Cointelegraph, Eversheds Sutherland attorneys James Mastracchio, Sarah Paul and Katie Sint discuss the guidance and how individuals holding digital currency are impacted.

Full article

Authors

Jim Mastracchio | +1 202 383 0210 | Email

Sarah Paul | +1 212 301 6587 | Email

Katie Sint | +1 202 383 0977 | Email


Legal Alert: The CLOUD Act – A cross-border data access agreement rises from the fog

October 23, 2019

In collaboration with Eversheds Sutherland attorneys Emma Gordon and Laura Dunseath.

On October 3, 2019, the United States and the United Kingdom entered into the world’s first ever agreement (the Agreement) under the Clarifying Lawful Overseas Use of Data Act (the CLOUD Act), the text of which has now become public. The Agreement will make it easier for American and British law enforcement agencies to obtain certain electronic data from technology companies based in each country by removing traditional legal barriers to access. And there may soon be more to come—the US Department of Justice recently announced that it is in formal negotiations with Australia to enter into a similar CLOUD Act agreement.

The CLOUD Act was passed by Congress in March 2018. It clarified that companies subject to US jurisdiction served with court orders must turn over data they control, regardless of where the data is stored. It also authorized the United States to enter into executive agreements with foreign governments regarding cross-border data requests. The Agreement between the US and the UK is the first such executive agreement since the CLOUD Act’s enactment. The Agreement will undergo a six-month Congressional review period mandated by the CLOUD Act, as well as review by the UK’s Parliament.

The Agreement allows law enforcement agencies, when armed with an appropriate court order from their home country, to request electronic data by going directly to “covered providers” based in the other country, rather than requesting the data through the other country’s government process. Covered provider, under the Agreement, means any private entity that provides to the public the ability to communicate, or to process or store computer data, by means of a computer system or a telecommunications system, or any private entity that processes or stores data on behalf of such an entity. If approved, the Agreement is expected to largely supplant the Mutual Legal Assistance Treaty (MLAT) process that the US and UK currently use to request electronic data from technology companies based in the other country, which can take years. The new process under the Agreement is estimated to take a matter of weeks, or even days.

The CLOUD Act has been controversial, due in part to concerns over data privacy rights. Some of the same types of data privacy concerns have been raised about the Agreement. Nevertheless, covered providers that may be subject to data requests under the Agreement can take at least some comfort from the safeguards that it provides. For instance, the Agreement contains a mechanism by which a covered provider can object to a data request order, and through which unresolved objections are ultimately decided by the covered provider’s home country. The Agreement requires the UK to adopt and implement appropriate procedures to minimize the acquisition, retention, and dissemination of information concerning US persons that is acquired pursuant to an order. The Agreement, moreover, can only be used to obtain information about “serious crimes,” punishable by a maximum penalty of three years or more of incarceration, and contains certain use limitations, including giving the United States the power to veto the use of evidence obtained through the Agreement in cases that raise free speech concerns.

Questions remain about how the Agreement will work in practice, and whether the Agreement risks sacrificing data privacy rights for the sake of expediency. But the Agreement does have safeguards, and becoming familiar with those safeguards will be important for covered providers that fall within its reach.

Authors

Sarah Paul | +1 212 301 6587 | Email

Giselle Guerra | +1 713 470 6115 | Email

Michael Bahar | +1 202 383 0882 | Email

Mary Jane Wilson-Bilik | +1 202 383 0660 | Email

Emma Gordon | +44 20 7919 4931 | Email

Al Sand | +1 512 721 2721 | Email

Laura Dunseath | +44 20 7919 0712 | Email

Cybersecurity and Data Privacy review and update: Looking back on 2019 and planning ahead for 2020

Legal Alert: Passing the eye test—Defense strategies and the Biometric Information Privacy Act

September 10, 2019

1 BIPA exempts some private entities, such as financial institutions or their affiliates that are subject to the Gramm-Leach-Bliley Act of 1999.

2 See No. 18-15982, 2019 WL 3727424 (9th Cir. Aug. 8, 2019).

3 U.S. Const., art. III, § 2.

4 Spokeo, Inc. v. Robins, 136 S. Ct. 1540, 1547-48 (2016); Lujan v. Defenders of Wildlife, 504 U.S. 555, 560 (1992).

5 Spokeo, 136 S. Ct. at 1548 (quoting Lujan, 504 U.S. at 560 n.1).

6 Id.

7 740 Ill. Comp. Stat. Ann. 14/20.

8 See Aguilar v. Rexnord LLC, No. 17-CV-9019, 2018 WL 3239715, at *3-4 (N.D. Ill. July 3, 2018) (finding lack of standing due to absence of concrete harm where employee knew his biometric information was being collected to clock in and out); McCullough v. Smarte Carte, Inc., No. 16-C-03777, 2016 WL 4077108, at *4 (N.D. Ill. Aug. 1, 2016) (distinguishing BIPA from Article III).

As the use of biometric data grows more prevalent across industries of all types and sizes, complying with data security and privacy laws has never been more critical or challenging. This is particularly true for businesses subject to the Biometric Information Privacy Act (BIPA), an Illinois law widely acknowledged as the leading law governing biometric security. BIPA imposes strict requirements on private, non-governmental entities that collect, store, use, or profit from biometric data belonging to Illinois residents.1 As discussed in our prior alert regarding BIPA compliance, BIPA has driven hundreds of class action lawsuits over the past several years.

The BIPA avalanche has only grown stronger following the Rosenbach v. Six Flags Entertainment Corp. decision by the Illinois Supreme Court in January 2019, in which the court held that plaintiffs need not have suffered, or even allege, actual harm beyond a procedural violation to bring a claim and seek relief under BIPA. In August 2019, the Ninth Circuit weighed in on this issue, affirming the lower court’s finding that plaintiffs who did not consent to a company’s use of facial recognition technology had standing sufficient to assert a BIPA claim, thereby allowing a class to continue to pursue its claims in federal court.2 In light of these and other decisions, companies subject to BIPA have begun to explore new defenses to these claims.

While adherence to BIPA’s requirements and robust compliance efforts are the most obvious ways to mitigate exposure under BIPA, below is an overview of some potential defenses for parties already in litigation.

BIPA Defenses

Standing

Article III of the US Constitution grants federal courts the power to hear cases and controversies and limits the matters over which federal courts may preside.3 The US Supreme Court has interpreted Article III to require that plaintiffs suffer an actual or concrete injury in fact in order to seek redress in federal court.4 As a constitutional requirement, plaintiffs must therefore demonstrate an injury that is particularized and that affected them personally in order to bring suit in federal court.5 The injury must also be real rather than merely abstract, hypothetical, or conjectural.6

Most, but not all, federal courts in Illinois have dismissed complaints where plaintiffs have not demonstrated an injury in fact sufficient to confer standing to litigate their BIPA claims. As a statutory matter, BIPA gives anyone who is aggrieved by a privacy violation under the act the opportunity to bring a claim.7 What constitutes an “aggrieved” person is not further defined under BIPA. Federal courts in Illinois have held that even if a person is “aggrieved” for purposes of satisfying the statutory requirement under BIPA, that allegation alone is not sufficient to satisfy the Article III standing requirement. Article III standing is therefore a separate issue from statutory standing under BIPA: a plaintiff could conceivably be aggrieved under BIPA but still fail to satisfy the constitutional standing requirement under Article III.8 That said, at least one federal court in Illinois has held that a defendant’s alleged violation of a plaintiff’s right to privacy was enough to satisfy standing.9 Further chipping away at the standing defense is a decision out of the Northern District of California, which held that BIPA codifies an individual right of privacy in one’s biometric information, the violation of which constitutes a concrete injury; that decision was affirmed by the Ninth Circuit in August 2019.10

Following the Rosenbach decision from the Illinois Supreme Court in January 2019,11 Illinois state court plaintiffs may not need to allege actual harm to establish standing necessary to pursue a BIPA claim. In March 2019, an Illinois appellate court, applying Rosenbach, reversed a lower court’s dismissal of a complaint that alleged no actual damages, finding that alleging a statutory violation of BIPA was sufficient for standing.12 It is unlikely that defendants in BIPA actions will give up the standing argument entirely, however, in part because allegations of harm can differ from complaint to complaint.

Moreover, although a motion to dismiss is the most common vehicle for raising a Spokeo/standing challenge, it is not the only one. Challenges to a plaintiff’s injury, or lack thereof, may also be raised by other means later in the litigation, such as at the class certification phase by arguing that not all class members have been injured or at the summary judgment phase by arguing a lack of evidence (and thus lack of triable fact) of injury.

Extraterritoriality

BIPA’s geographical reach remains somewhat of an open question. According to at least one court in the Northern District of Illinois, in the 2017 Monroy v. Shutterfly decision, BIPA does not apply outside of Illinois, as nothing in the statute indicates that the Illinois legislature intended it to have extraterritorial effect.13 The Monroy court held that the threshold question in determining whether the case even involves a potential extraterritorial application of BIPA is whether the circumstances giving rise to the case took place “primarily and substantially” within Illinois.14 If yes, then an extraterritoriality defense may not be viable. If no, then extraterritoriality, or the application of BIPA outside of its prescribed bounds, may be a potential issue to further explore. This is a case-by-case, fact-dependent exercise, and the strength of an extraterritoriality defense may not be fully known until the parties have engaged in some discovery.15

9 Monroy v. Shutterfly, Inc., No. 16-C-10984, 2017 WL 4099846, at *8 n.5 (N.D. Ill. Sept. 15, 2017) (“[p]utting aside the question of whether a merely procedural or technical violation of the statute alone is sufficient to confer standing . . .” but finding that the plaintiff’s allegation of a privacy violation was sufficient).

10 290 F. Supp. 3d 948, 953-54 (N.D. Cal. 2018), aff’d, 2019 WL 3727424 (9th Cir. Aug. 8, 2019).

11 Rosenbach v. Six Flags Entm’t Corp., No. 123186, 2019 IL 123186 (Ill. Jan. 25, 2019).

12 Rottner v. Palm Beach Tan, Inc., No. 1-18-0691, 2019 WL 1049107, at *1-2 (Ill. App. Ct. Mar. 4, 2019).

13 Monroy, 2017 WL 4099846, at *5.

14 Id. at *6.

15 Id. (allowing extraterritoriality defense to be raised at a later time “if and when the record affords a clearer picture of the circumstances relating to [the plaintiff’s] claim”).

16 28 U.S.C. § 1446(b).

17 Duron v. UNIFOCUS (Texas), L.P., No. 1:18-cv-06479 (N.D. Ill. July 29, 2019), ECF No. 64.

As a separate, but related issue, defendants faced with a BIPA claim should consider whether the issue of geography can be used as a defense tactic. For example, a defendant sued in a state court should evaluate whether the case may be removed to federal court and/or transferred to a more favorable or closer-to-home venue. Successfully removing a state court case to federal court within 30 days after service of the complaint16 would allow the defendant, once in federal court, to file a motion to transfer the case to a location of its choosing, provided the new venue is proper and satisfies all applicable venue requirements. But this tactic must be weighed in light of its potential costs. In removing to federal court and transferring to a new venue, defendants must rely in large part on the luck of the draw when it comes to the assignment of the judge in the new venue. Removal and transfer, if not carefully considered, can lead to unanticipated consequences later in litigation.

Constitutional and Statutory Defenses

As BIPA litigation continues to develop, defendants have raised various defenses grounded in both the US Constitution and BIPA itself, with varying degrees of success. For example, some BIPA defendants have asserted a lack of personal jurisdiction. Personal jurisdiction is ripe for a defense when the defendant is an out-of-state defendant that conducts limited, or even no, business within the state where the case is pending. In July 2019, for instance, a company alleged to have provided Illinois employers with biometric timekeeping equipment filed a motion to dismiss in the Northern District of Illinois, in part, on personal jurisdiction grounds, arguing that it does not have offices, employees, or a registered agent in Illinois.17 The company additionally argued that the plaintiffs failed to allege that the company was the plaintiffs’ employer or that it could ensure that Illinois employers complied with BIPA. The company’s personal jurisdiction argument essentially rested on its position that holding it liable for alleged BIPA violations of other third parties would violate the constitutional underpinnings of personal jurisdiction.


Personal jurisdiction is both a statutory and constitutional requirement. States have their own rules regarding what level of activity within the state is sufficient for the court to have personal jurisdiction over the defendant while the Due Process Clause of the Fourteenth Amendment also requires that the defendant have sufficient contacts18 with the forum state.

The strength of a personal jurisdiction defense in BIPA cases is arguably questionable, as courts in the Northern District of Illinois have both granted and denied motions to dismiss for lack of personal jurisdiction in BIPA claims.19 In one case, the plaintiff alleged that there was personal jurisdiction over the defendant in Illinois because the defendant was registered in Illinois, had an office there, and offered its face recognition technology to millions of users, including Illinois residents.20 But the court granted the defendant’s motion to dismiss for lack of personal jurisdiction because the defendant did not specifically target Illinois residents with its conduct. In another case in the same district, Norberg v. Shutterfly, the defendants similarly argued that they offered their online services nationwide and did not specifically target Illinois customers.21 In Norberg, however, the court denied the motion to dismiss, holding that BIPA is an Illinois statute with strong interests in allowing in-state plaintiffs to litigate at home in Illinois.22 In other words, the success of a defense may hinge largely on the judge hearing the case.

The dormant Commerce Clause presents another potential constitutional defense to BIPA, insofar as BIPA’s geographical reach may regulate activity outside of Illinois. The dormant Commerce Clause limits states’ authority to pass legislation that impacts interstate commerce.23 As with the other constitutional challenges to BIPA, the strength of this argument remains largely unknown. At least one defendant has raised this defense in conjunction with a challenge to personal jurisdiction, but a court in the Northern District of Illinois declined to address the issue because it was raised at too early a stage in the litigation without sufficient facts regarding how BIPA could affect the defendant’s business in other states.24

Finally, a defendant may also raise a constitutional due process challenge by arguing that the high statutory damages available under BIPA (up to $5,000 per violation, plus attorneys’ fees) bear no relation to the harm alleged by the plaintiff (often minimal at best).

Statute of Limitations

Defendants may also argue, where applicable, that plaintiffs have not brought their claims within the required time period, violating the applicable statute of limitations. BIPA does not specify a statute of limitations, so the strength of this defense remains yet another open issue in BIPA litigation.

18 See Int’l Shoe Co. v. Washington, 326 U.S. 310, 316-17 (1945).

19 No. 15-C-7681, 2016 WL 245910 (N.D. Ill. Jan. 21, 2016) (granting motion to dismiss for lack of personal jurisdiction); Norberg v. Shutterfly, Inc., 152 F. Supp. 3d 1103 (N.D. Ill. 2015) (denying motion to dismiss for lack of personal jurisdiction).

20 2016 WL 245910, at *2.

21 See Norberg, 152 F. Supp. 3d at 1105 (explaining defendant’s business).

22 Id.

23 Healy v. Beer Inst., 491 U.S. 324, 326 n.1 (1989).

24 Monroy, 2017 WL 4099846, at *7-8.

Class Certification

Lastly, challenges to the certification, or court approval, of a class are a vital component of any defense of class action litigation. Under Rule 23 of the Federal Rules of Civil Procedure, a claim may proceed as a class action only if the class is so numerous that it would be impracticable to join each individual class member to the case separately; there are issues common to the class members; the claims or defenses of the class representatives are typical of those of the class; and the representatives will protect the class interests.

BIPA fact patterns may present grounds ripe for class certification challenges, for example, if issues such as consent or extraterritoriality require courts to evaluate them on an individualized, rather than class-wide basis.

Conclusion

BIPA defendants have an array of potential defenses at their disposal. The viability of these defenses will become clearer as BIPA jurisprudence continues to develop in the coming months and years. Defendants faced with allegations of BIPA violations should consider what defense tactics may best serve their business and desired outcomes in litigation while also considering what changes, if any, may be needed to their compliance programs to mitigate the risk of similar suits in the future.

Authors

Lew Wiener | +1 202 383 0140 | Email

Frank Nolan | +1 212 389 5083 | Email

Emily Bork | +1 202 383 0870 | Email

Related attorneys

Michael Bahar | +1 202 383 0882 | Email

Bill Dudzinsky | +1 202 383 0106 | Email

Mark Herlach | +1 202 383 0172 | Email

Bob Owen | +1 312 724 8268 | Email

Sarah Paul | +1 212 301 6587 | Email

Mary Jane Wilson-Bilik | +1 202 383 0660 | Email

Sarah Chaudhry | +1 212 389 5071 | Email


Legal Alert: Biometrics beware – Compliance and the Biometric Information Privacy Act

April 11, 2019

1 Tex. Bus. & Com. Code Ann. § 503.001 (2009).

2 Wash. Rev. Code § 19.375 (2017).

3 See, e.g., Alaska, Delaware, Massachusetts, Michigan, and New York.

4 Over the last few years, a few bills have been proposed. For example, on March 18, 2019, members of the Senate Committee on Commerce, Science, & Transportation, introduced the Commercial Facial Recognition Privacy Act of 2019. As written, the bill only applies to facial recognition technology and would not supersede state laws. The bill has not yet moved out of committee.

5 See, e.g., the Health Insurance Portability and Accountability Act (HIPAA); Gramm-Leach-Bliley; and the Children’s Online Privacy Protection Act (COPPA).

6 740 ILCS 14/10 (2008).

7 Rivera v. Google, 238 F. Supp. 3d 1088, 1095 (N.D. Ill. 2017); see also Norberg v. Shutterfly, Inc., 152 F. Supp. 3d 1103, 1106 (N.D. Ill. 2015); In re Facebook Biometric Info. Privacy Litig., 185 F. Supp. 3d 1155, 1172 (N.D. Cal. 2016).

8 “Biometric identifiers do not include writing samples, written signatures, photographs, human biological samples used for valid scientific testing or screening, demographic data, tattoo descriptions, or physical descriptions such as height, weight, hair color, or eye color.” 740 ILCS 14/10.

9 Id.

Companies in all industries and of all sizes are increasingly using biometric data—fingerprints, voiceprints, and facial structure, to name three—as a faster, more reliable, and more economical alternative to passwords and other forms of security. Biometric data is not typically stored in the form in which it is captured, so the risks associated with a breach of biometric data are substantially reduced compared to other forms of stored sensitive data. The downside is that, unlike a Social Security number, for example, a person’s biometric data generally cannot be altered, creating a much longer tail on the risk that does remain, and one with a greater potential for harm. As a result, states have begun enacting laws specifically addressing the collection and safekeeping of biometric data, with more states expected to follow suit in the coming years.

By far the most prominent of these laws is the Illinois Biometric Information Privacy Act (BIPA), which has been the subject of hundreds of class action lawsuits in the last few years alone. Companies that handle biometric data—especially but not only biometric data belonging to Illinois residents—should be aware of the numerous requirements that BIPA imposes. Texas1 and Washington2 have also enacted statutes governing their residents’ biometric data. Although neither provides a private right of action, both states’ laws do impose certain notice and consent requirements, along with biometric data retention limits. Other states continue to consider legislation akin to the Illinois, Washington, and Texas laws.3 There is not yet a single, overarching federal law governing biometrics,4 although some industry-specific laws incorporate biometrics protections in limited fashion.5

In addition to being the only statute to provide a private right of action, BIPA imposes the strictest requirements on private entities that collect, store, or use biometric data. The law also contains important ambiguities, including as to the extent of its extraterritorial reach, whether and how it applies to photographs, and the scope of its applicability to service providers that are not customer-facing. Companies may want to consider complying with BIPA, even if not necessarily subject to its reach, to mitigate the risk of falling within BIPA’s strictures, to help deter costly litigation, and to provide a degree of insurance against future biometrics laws.

What is covered by BIPA?

BIPA encompasses what it defines as “biometric identifiers” and “biometric information,” to which we will refer collectively as “biometric data.” Biometric identifiers include “retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry.”6 Biometric information, in turn, is defined as “any information” based on a biometric identifier that can be used to identify that individual. According to one court, “whatever a private entity does in manipulating a biometric identifier into a piece of information, the resulting information is still covered by [BIPA] if that information can be used to identify the person,” even if the resulting information is a “mathematical representation or, even simpler, a unique number assigned to a person’s biometric identifier.”7

It is also essential to understand what kind of information is not covered by BIPA. Writing samples, demographic information, and physical descriptions are excluded, as are biological materials covered by the Genetic Information Privacy Act and information collected “in a health care setting.”

Importantly, photographs are excluded from the definition of a biometric identifier under BIPA,8 and the definition of biometric information explicitly states that it does not include information derived from items excluded under the definition of biometric identifiers.9 Nonetheless, a few courts have held that a scanned photograph can be subject to the requirements of BIPA in certain circumstances. As one court put it, the law “does not specify how the biometric measurements must be obtained,” and “particular biometric identifiers can, in fact, be collected in various ways without altering the fact that the measurements still are biometric identifiers” subject to BIPA’s protections.10 According to that same court, the “bottom line is that a ‘biometric identifier’ is not the underlying medium itself, or a way of taking measurements, but instead is a set of measurements of a specified physical component (eye, finger, voice, hand, face) used to identify a person.”11 Therefore, if a private entity uses an individual’s biometric measurements contained in a photograph to ultimately identify that individual, it could constitute a “scan of face geometry,” one of the biometric identifiers defined in BIPA.

To presume that BIPA’s “scan of face geometry” biometric identifier would only apply to an in-person scan of an individual’s face, as opposed to any scan from which biometric measurements could be taken, would be a higher risk interpretation, or, in the words of the Northern District of Illinois, a “narrow” and “problematic” reading of BIPA.12 Indeed, Facebook, Google, and Shutterfly have been named in class action lawsuits arising from allegations that individuals’ biometric identifiers were gathered from photographs uploaded to the defendants’ websites.13

Therefore, to mitigate risk, a company using photographs for facial identification purposes may be wise to follow BIPA’s requirements.

Who is subject to BIPA?

Illinois residents asserting BIPA violations have mostly been employees or consumers whose biometric data was collected in the course of their employment or use of a defendant’s commercial services. BIPA regulates any private (non-governmental) entity that collects, stores, uses, or profits from biometric data belonging to Illinois residents.14 Some private entities, however, are exempted. Most notably excepted are financial institutions or their affiliates subject to the privacy notice provisions of the Gramm-Leach-Bliley Act of 1999.15 Furthermore, by its own terms, BIPA cannot conflict with HIPAA, the X-Ray Retention Act, or the Private Detective, Private Alarm, Private Security, Fingerprint Vendor, and Locksmith Act of 2004.

Whether BIPA applies extraterritorially to non-Illinois companies remains somewhat of an open question. At least two BIPA defendants located outside the state raised the issue at the motion to dismiss stage, and although the Northern District of Illinois found that there are “legitimate extraterritoriality concerns,” and that BIPA does not apply extraterritorially “as a matter of law,” the defense has not yet been sufficient to warrant dismissal of a BIPA claim.16 Similarly, the viability of constitutional defenses, including under the dormant Commerce Clause or Due Process Clause, is not entirely clear.

10 Rivera, 238 F. Supp. 3d at 1095 (emphasis in original).

11 Id. at 1096.

12 Monroy v. Shutterfly, No. 16 C 10984, 2017 WL 4099846, at *3 (N.D. Ill. Sept. 15, 2017); see also In re Facebook, 185 F. Supp. 3d at 1172.

13 See Rivera, 238 F. Supp. 3d at 1097 (“if Google simply captured and stored the photographs and did not measure and generate scans of face geometry, then there would be no violation”) (emphasis in original).

14 The exemption for governmental entities extends to contractors and subcontractors.

15 740 ILCS 14/25.

16 Rivera, 238 F. Supp. 3d at 1100; see also Monroy, 2017 WL 4099846, at *6.

17 740 ILCS 14/20.

18 740 ILCS 14/

What’s at stake?

For several years following its enactment in 2008, BIPA sat relatively dormant. Then, beginning in about 2015, several plaintiffs’ firms began filing putative class action complaints against some of the most well-recognized companies in America. The proliferation of BIPA class action lawsuits is no surprise in light of two facts: first, the use of biometrics is growing rapidly in all industries and for a variety of purposes; and second, BIPA provides an exceptionally rich incentive to plaintiffs’ attorneys filing these lawsuits. Indeed, the statute provides for recovery of liquidated damages of $1,000 per negligent violation and $5,000 per intentional or reckless violation, plus recovery of fees and costs, including legal and expert expenses.17
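Given those per-violation amounts, aggregate exposure in a class action compounds quickly. A minimal sketch of the arithmetic (the class size and per-person violation counts below are hypothetical; only the per-violation figures come from the statute):

```python
# Hypothetical illustration of aggregate BIPA liquidated-damages exposure.
# Per-violation amounts from 740 ILCS 14/20; the class size and violation
# counts below are invented for this example.
NEGLIGENT_PER_VIOLATION = 1_000   # liquidated damages, negligent violation
RECKLESS_PER_VIOLATION = 5_000    # intentional or reckless violation

def aggregate_exposure(class_size: int, violations_each: int,
                       per_violation: int) -> int:
    """Liquidated damages across the class, before fees and costs."""
    return class_size * violations_each * per_violation

# A 1,000-member class, one reckless violation each:
print(aggregate_exposure(1_000, 1, RECKLESS_PER_VIOLATION))  # prints 5000000
```

Even a modest class alleging only negligent violations reaches seven figures, which is the economics behind the filing wave described above.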

In January 2019, the Illinois Supreme Court found that a plaintiff need not allege actual harm, separate from a violation of BIPA, to satisfy the “aggrieved” person standard under the law. See Legal Alert: The floodgates open – Illinois Supreme Court issues landmark ruling in biometrics case. This is particularly important because there have been no BIPA complaints arising from a breach of biometric data, so plaintiffs have generally been unable to plead actual harm in connection with a violation of the law. In contrast to the Illinois Supreme Court, most federal courts have dismissed complaints lacking allegations of the actual harm necessary to satisfy the injury-in-fact requirement of Article III of the US Constitution.

Given the outsized damages at stake, let us turn to what BIPA requires of companies that handle biometric data.

What does BIPA require?

BIPA imposes five general requirements on companies that use biometric data in one form or another.18

1. Consent: Collection, Use, Storage

The subject of the vast majority of BIPA lawsuits thus far is Section 15(b), which imposes written consent requirements on private entities that “collect, capture, purchase, receive through trade, or otherwise obtain” an individual’s biometric data. The obtaining entity must explain why and for how long the biometric data is being collected, stored, or used, and the individual (or that person’s legally authorized representative) must execute a written release.

A critical point of ambiguity in the law is whether, as a practical matter, its notice and consent requirements can apply equally to those that collect biometric information and to those that receive it. The provisions of Section 15 are not phrased in the passive voice, such that a processor or service provider would merely have to ensure, via contract with the collector, that appropriate consent has been obtained. Instead, the company itself appears to have to inform the individual and obtain the individual’s written consent. Plaintiffs have seized on the broad provision to assert BIPA violations against third-party vendors that may store or use biometric data despite not necessarily interfacing directly with the individuals from whom the data is collected.

As a risk mitigation measure, therefore, to the extent a service provider or processor does not have the opportunity to collect consent directly, it may try to ensure, via contract or through software modifications, that the collectors name it in the consent requests. For the same reasons, service providers may consider including indemnity provisions specifically tied to BIPA compliance.

2. Consent: Disclosure & Dissemination

The statute includes a separate consent requirement for private entities that intend to disclose an individual’s biometric data. Entities responding to warrants or subpoenas are not bound by this requirement, nor are entities using the biometric data to complete a financial transaction requested or authorized by the individual.

Both consent requirements can be satisfied in the employment context by obtaining a written release as a condition of employment.

3. Prohibition against profiting

In addition to its consent requirements, BIPA explicitly prohibits private entities from selling, leasing, trading, or “otherwise profit[ing] from” an individual’s biometric data. This prohibition has not generally been the subject of BIPA class actions, leaving open the question of how broadly courts will interpret the phrase “otherwise profit.”

4. Retention Policy

Companies subject to BIPA must also develop, publish, and abide by a retention schedule for biometric data they collect. Biometric data must be destroyed by the earlier of the time at which the purpose of the initial collection has been satisfied or three years from the last interaction between the entity and the individual. Companies must make this retention schedule publicly available.

5. Reasonable Standard of Care

Finally, entities that possess biometric data governed by BIPA must “store, transmit, and protect” biometric data: (1) using the reasonable standard of care in the entity’s industry; and (2) in a manner consistent with how the entity handles other sensitive information. This two-prong requirement highlights the need for companies to incorporate biometrics into their data compliance programs and to stay abreast of both security threats and prevention and response best practices. In other words, the requirement for reasonable security features means a risk-based approach and a dedicated—and documented—commitment to a continuous culture of cybersecurity.

Conclusion

States recognize that biometric data presents not only unique opportunities but also unique risks. To quote the Illinois legislature: “Biometrics, however, are biologically unique to the individual; therefore, once compromised, the individual has no recourse, is at heightened risk for identity theft, and is likely to withdraw from biometric-facilitated transactions.” With that as a backdrop, states have stepped in to regulate the space, in ways that carry the potential for substantial costs for even the appearance of noncompliance. Given the doubt surrounding BIPA’s extraterritorial reach, its implications for photographs, its applicability to service providers, and the practical difficulties in parsing data by state residency, companies unsure of whether and when BIPA applies may want to consider a proactive and prudential compliance program. Even those companies that do fall definitively under BIPA may want to make sure they have a mechanism in place to ensure continuous compliance with BIPA’s security requirements.

Authors

Lew Wiener | +1 202 383 0140 | Email

Michael Bahar | +1 202 383 0882 | Email

Mary Jane Wilson-Bilik | +1 202 383 0660 | Email

Frank Nolan | +1 212 389 5083 | Email

Related attorneys

Mark Herlach | +1 202 383 0172 | Email

Emily Bork | +1 202 383 0870 | Email

Sarah Chaudhry | +1 212 389 5071 | Email


Legal Alert: The floodgates open – Illinois Supreme Court issues landmark ruling in biometrics case

January 28, 2019

In a unanimous decision handed down on January 25, 2019, the Illinois Supreme Court reversed a lower court opinion and held that a plaintiff need not show actual harm to seek relief under the Biometric Information Privacy Act (BIPA). Instead, a procedural violation is sufficient to bring a claim under the law. The decision in Rosenbach v. Six Flags Entertainment Corp. is welcome news to plaintiffs’ attorneys, who will now have fewer impediments to pursuing no-injury class action lawsuits under this biometric information protection statute, which allows for recovery of statutory damages of up to $5,000 per violation and attorneys’ fees, with no cap on aggregate damages. Although the Rosenbach decision is at odds with most, but not all, federal case law on this issue, companies that collect, store, and use biometric information should be aware of its implications. The law applies broadly to companies and vendors that collect biometric information, though it entirely exempts certain entities, such as financial institutions covered under the Gramm-Leach-Bliley Act. Thus far, companies that collect biometric data from their employees and customers have been the primary targets of BIPA class actions.

The Rosenbach complaint was filed on behalf of a minor whose fingerprints were collected and retained by defendant Six Flags in connection with a season pass to the amusement park. The plaintiff alleged that Six Flags violated BIPA by collecting the biometric fingerprint information without obtaining prior written consent, and by failing to inform the plaintiff and other class members of its data storage, use, and collection practices. Six Flags moved to dismiss the complaint, arguing that Rosenbach did not qualify as an “aggrieved person” under BIPA because he did not suffer an actual injury as a result of Six Flags’ failure to comply with the law’s requirements. An Illinois appellate court ruled in favor of Six Flags in December of 2017 and dismissed the case. Thereafter, in a different case, another Illinois appellate court found that a plaintiff does not need to allege actual harm to pursue a BIPA claim. The Illinois Supreme Court was thus poised to resolve the lower court split.

In reversing the lower court’s dismissal, the Illinois Supreme Court pointed out that the legislature has enacted other statutes that explicitly require a plaintiff to suffer actual harm to bring a claim but had not done so under BIPA. The court also rejected the notion that the harm in this case was merely technical. According to the court (and the state legislature), there is an inherent right to privacy associated with biometric data.

Enacted in 2008, BIPA is the oldest and most punitive of the biometric protection statutes in the United States. It remains the only biometrics law (for now) that provides for a private right of action, statutory damages, and attorneys’ fees. In each of the last few years, plaintiffs’ lawyers have filed dozens of putative class actions against companies of all sizes alleging violations of BIPA. Following Rosenbach, the number of lawsuits against companies for BIPA violations will likely increase substantially, because businesses will no longer be able to argue that plaintiffs must show actual harm before bringing claims under the law, at least in cases brought in Illinois state courts.

On the other hand, there is still uncertainty surrounding whether procedural harm alone is sufficient for plaintiffs to meet Article III’s standing requirement in federal court. In March 2018, a federal district court in California allowed plaintiffs to move forward in a BIPA lawsuit against Facebook—even though the plaintiffs did not allege any actual injury—because the court found that the Illinois legislature had “codified a right of privacy in personal biometric information” and that right “is quintessentially an intangible harm that constitutes an injury in fact.” In December 2018, however, a federal district court in Illinois reached an opposite conclusion, finding that a mere procedural violation of BIPA cannot confer Article III standing to plaintiffs.

How the BIPA standing issue gets resolved at the federal court level remains to be seen. Nevertheless, there will be an uptick of BIPA litigation in Illinois state court. Companies that have not paid particularly close attention to the law’s disclosure and written consent requirements now face increased pressure to do so or risk litigation exposure.

Authors

Lew Wiener | +1 202 383 0140 | Email

Michael Bahar | +1 202 383 0882 | Email

Frank Nolan | +1 212 389 5083 | Email

Emily Bork | +1 202 383 0870 | Email

Related attorneys

Mark Herlach | +1 202 383 0172 | Email

Mike Nelson | +1 212 389 5061 | Email

Mary Jane Wilson-Bilik | +1 202 383 0660 | Email


Article: Illinois Courts Continue To Interpret BIPA Broadly

April 24, 2019

The Illinois Biometric Information Privacy Act (BIPA) has been the focus of a significant amount of litigation and is one of the most protective state statutes regarding individuals’ biometric data. Specifically, BIPA regulates the collection, use, and storage of biometric data by private entities, including employers. The plaintiffs’ bar has increasingly brought BIPA class actions against companies, including employers, that use, collect, and store individuals’ biometric data.

The potential damages for such lawsuits can be quite high. While there have not been many publicly disclosed settlements, one of the earliest and most prominent settlements resulted in an entity paying out approximately $1.5 million to settle BIPA claims based on its alleged improper storage and use of customers’ fingerprint data.

In their article for Law360, Eversheds Sutherland attorneys Lewis Wiener, Frank Nolan and Sarah Chaudhry analyze recent Illinois court decisions regarding BIPA, Rosenbach v. Six Flags Entertainment Corp. and Liu et al. v. Four Seasons Hotel, and discuss how companies can best comply with BIPA to withstand the scrutiny of an Illinois court.

Authors

Lew Wiener | +1 202 383 0140 | Email

Frank Nolan | +1 212 389 5083 | Email

Sarah Chaudhry | +1 212 389 5071 | Email


eversheds-sutherland.com

© Eversheds Sutherland (US) LLP 2020. All rights are reserved to their respective owners. Eversheds Sutherland (International) LLP and Eversheds Sutherland (US) LLP are part of a global legal practice, operating through various separate and distinct legal entities, under Eversheds Sutherland. For a full description of the structure and a list of offices, visit eversheds-sutherland.com. US20035_020420