Cyber Security, Information Governance & Privacy

We are pleased to announce that Troutman Sanders partner Ronald Raether will present “Incident Response Plans: Avoiding Common Mistakes through a Table Top Exercise” at the Fraud & Breach Prevention Summit at the Hyatt Hotel in Dallas, Texas on April 24, 2018 at 10:50 a.m. Ronald will also join a panel discussion, “Know Your Attacker: Lessons Learned from Cybercrime Investigations,” on April 24 at 4:00 p.m.

Ronald’s presentation will give attendees insight on:

  • Incident response plans (IRPs) and why they matter
  • Privilege over breach response efforts
  • Table top exercises and their goals
  • Purpose of the incident response team
  • Incident response team membership and training
  • A walkthrough of several common incident scenarios

Ronald’s panel discussion will give attendees insight on:

  • Today’s most prevalent cybercrime schemes
  • Traits of the most common threat actors
  • Lessons learned from actual crime investigations

ISMG hosts the Fraud & Breach Prevention Summit yearly in different locations across the United States. The conference brings together industry leaders from across the globe to speak on specialties ranging from IoT and the use of deception technology to ongoing business email compromise trends, DDoS extortion, and ransomware attacks.

ISMG has designed its sessions to address the needs of CISOs, fraud and risk teams, security and IT professionals, and many others by providing hands-on tools and real-world problems and solutions that attendees can take back to their offices.

To register or obtain additional information, visit the ISMG Website. For a 10% discount on registration, use code: SAVE10%

Please join us on Tuesday, April 24th from 3:00 – 4:00 PM ET for a complimentary webinar with speakers Ronald Raether and Sheila Pham.

Effective cyber risk management requires resolving natural conflicts between functionality, security, and privacy. Yet most companies fail to acknowledge, let alone address, key differences among these groups; even the term “standard” is understood differently by each. Through real-world examples, attendees will learn how to ensure that all involved parties communicate effectively, how to develop and manage a defensible cyber risk management program that will stand up under scrutiny, and which common pitfalls to avoid.

Covered Topics

  • Legal, regulatory, and business issues to consider in creating a data governance and cybersecurity program
  • Determining what data privacy and cybersecurity standards to follow
  • Steps to take in creating a data governance and cybersecurity program
  • Importance of a tailored data governance and cybersecurity program
  • Industry best practices

Key Takeaways

  • Identify tensions and consider resolutions among various stakeholders
  • Identify various frameworks
  • Understand the necessary steps to create a tailored data governance and cybersecurity plan

Register here.

The Clarifying Lawful Overseas Use of Data Act, commonly referred to as the “CLOUD Act,” a last-minute addition to the $1.3 trillion federal spending bill, has been signed into law by President Donald Trump. The Act grants the United States government greater access to Americans’ overseas data for law enforcement purposes and helps foreign governments access their own citizens’ data stored in the United States.

The Act was added to the 2,232-page omnibus spending bill one day ahead of its vote. The bill passed 256-167 in the House, and 65-23 in the Senate.

The law is essentially an update to the Electronic Communications Privacy Act (“ECPA”), a series of laws that regulate how U.S. law enforcement officials can access stored electronic communications. Congress passed the ECPA in 1986, and due to the obvious advances in technology over the past 30 years, it is ill-equipped to handle today’s variety of electronic communications and related data.

Prior to the CLOUD Act, the United States could access data stored overseas only through mutual legal assistance treaties (“MLATs”). Under an MLAT, two or more nations set out in writing how they are willing to accommodate each other’s legal investigations. Each proposed MLAT must be approved by a two-thirds vote of the Senate.

Through the Act, law enforcement officials at any level, including local police, can require companies to turn over user data regardless of where the data is stored.

The Act also gives the executive branch the ability to enter into “executive agreements” with foreign nations. These agreements, which do not require congressional approval, could allow each nation to acquire stored personal data from other nations, regardless of the hosting nation’s privacy laws.

Through the CLOUD Act, electronic data stored overseas is now far more accessible to law enforcement officials. According to a 2013 report by the President’s Review Group, the average MLAT request took ten months to fulfill. Under the new regime, this turnaround time undoubtedly will be shorter.

Troutman Sanders LLP will continue to monitor developments regarding implementation of the CLOUD Act.

Last month, the North American Electric Reliability Corporation (“NERC”) approved a settlement agreement between the Western Electricity Coordinating Council (“WECC”) and an unnamed power company, imposing a $2.7 million penalty on the power company for improper cybersecurity oversight after the company inadvertently allowed critical cybersecurity data to be exposed online for 70 days.

According to NERC’s Notice of Penalty, the online data exposure was attributed to a third-party contractor doing work for the unnamed company. The contractor improperly accessed data from the company’s network and copied it onto the contractor’s own network, where it was accessible online to anyone, without password protection. The exposure included records of over 30,000 assets, including records associated with Critical Cyber Assets (“CCAs”) such as IP addresses and server host names.

The breach was discovered when a white hat security researcher found the information on the internet. The unnamed power company then notified WECC, its regulator, of the breach. The data incident ultimately revealed that the power company was in violation of NERC’s Critical Infrastructure Protection (“CIP”) Reliability Standards. The WECC found that the power company “failed to implement adequately its program to identify, classify, and protect information associated with CCAs [Critical Cyber Assets]” and failed to “implement adequately a program for managing access to protected information related to CCAs.”

When determining the penalty to assess on the power company, WECC took into consideration several factors, a few of which worked to the power company’s advantage: (1) that the violations constituted the power company’s first occurrence of violations of the subject NERC Reliability Standards; and (2) that the power company had an internal compliance program at the time of the violations.

The unnamed power company did not admit or deny the allegations, but agreed to the penalty and agreed to take corrective action to mitigate the violation and facilitate future compliance under the terms of the settlement. The penalty will be effective upon expiration of the 30-day period following the filing of NERC’s notice, or upon final determination by the Federal Energy Regulatory Commission.


Going slow and steady may work out for you if you’re a tortoise competing against an overly confident hare. However, if you’re in the mobile device industry and have been lagging on sending out security updates, it’s time to pick up the pace. A new Federal Trade Commission report issued last month found that while the industry has taken steps to expedite the security update process, more can be done to streamline the process and make it easier for consumers to ensure their devices are secure.

As noted in the FTC’s report, “[s]ecurity researchers and government agencies have consistently maintained that the best way to secure consumer information is to take reasonable steps to design secure products and maintain their security with updates that patch vulnerabilities in device software. Despite this consensus, security researchers and industry observers have reported that many mobile devices’ operating systems (the software that powers the devices’ basic functions) are not receiving the security patches they need to protect them from critical vulnerabilities.”

While the FTC commended the mobile device industry for its efforts to expedite the security update process, it set forth the following five recommendations as ways to continue and improve such efforts:

  1. Educating Consumers: Government, industry, and advocacy groups should work together to educate consumers about their role in the operating system update process and the significance of security update support. The FTC notes that the more consumers understand the importance of updates, the more likely they are to install available updates and consider security updates when making decisions to purchase, use, and upgrade devices.
  2. Start with Security: Businesses should consider security as a foundational aspect of their practices and procedures. As such, manufacturers, carriers, and operating system developers should ensure that reasonable security update support is a shared priority, reflected in each company’s policies, practices, and contracts.
  3. Learning from the Past: Companies should evaluate current practices by studying past practices. This requires keeping more consistent records on security support topics such as update decisions, support length, update frequency, customized patch development time, carrier testing time, and uptake rate. The FTC notes that an analysis of this data may provide an empirical basis for improving mobile device security.
  4. Security-Only Updates: The industry should consider how best to package security updates to encourage consumers to accept them. This may require offering security-only updates that do not include general software updates, which some users may be hesitant to accept due to feature changes or potential impact on memory, battery life, bandwidth, or the operating system.
  5. Providing Consumers With Better Information About the Security Update Process: Device manufacturers should consider adopting and stating minimum guaranteed support periods for their devices and should clearly explain the date on which updates will end.

The report was based primarily on the FTC’s findings from information it requested in May 2016 from eight mobile device manufacturers about how they issue security updates. The FTC also took into consideration information it received from wireless carriers about their security update practices. The FTC noted that while the data provided by these companies was not sufficiently representative to permit definitive conclusions about industry practices as a whole, it did provide remarkable insight into the security update practices that affect a large proportion of the devices on the U.S. market.

In November, we identified an emerging trend involving Article III standing in cases brought under Illinois’ Biometric Information Privacy Act (“BIPA”). The Northern District of California’s recent decision in Patel v. Facebook Inc., No. 3:15-cv-03747-JD, 2018 U.S. Dist. LEXIS 30727 (N.D. Cal. Feb. 26, 2018), denying Facebook’s motion to dismiss for lack of subject matter jurisdiction, demonstrates that the trend is persisting.

BIPA’s Requirements

BIPA requires entities collecting, using, and storing biometric data (such as face, retina, and fingerprint scans) to, among other things, inform and obtain consent from the owners of the data. BIPA also requires the entity to establish a retention schedule such that the biometric data is destroyed when the purpose for its use has been satisfied or within three years, whichever occurs first. Users must be informed of this retention schedule.

Facebook Plaintiffs Have Standing

Patel originated as three separate cases filed in Illinois. After the parties stipulated to transfer of the cases to California, they were consolidated into a single class action. The named plaintiffs allege that Facebook, through its “tag suggestions” feature, which employs “state-of-the-art facial recognition technology,” secretly collected plaintiffs’ biometric data without their consent.

Facebook moved to dismiss the class action complaint for lack of subject matter jurisdiction, arguing that plaintiffs’ alleged injury was not sufficiently concrete under Spokeo, Inc. v. Robins, 136 S. Ct. 1540 (2016). Under Spokeo, a plaintiff must demonstrate standing to sue by alleging the “irreducible constitutional minimum” of (1) an “injury in fact” (2) that is “fairly traceable to the challenged conduct of the defendants” and (3) “likely to be redressed by a favorable judicial decision.”

After clarifying that Spokeo “did not announce new standing requirements,” but instead “sharpened the focus on when an intangible harm such as the violation of a statutory right is sufficiently concrete to rise to the level of an injury in fact,” the Northern District of California in Patel found that Illinois’ legislature clearly desired to vest in Illinois residents “the right to control their biometric information by requiring notice before collection and giving residents the power to say no by withholding consent.” According to the court, that right “vanishes into thin air” when an online company like Facebook disregards the procedures—such as requiring consent and establishing a retention schedule—set forth in BIPA. The Court wrote, “The precise harm the Illinois legislature sought to prevent is then realized,” and that harm, though intangible, “quintessentially” constitutes a concrete injury, satisfying Article III standing.

Comparing Patel With Previous BIPA Decisions 

At first blush, the Patel decision may appear inconsistent with recent decisions on Article III standing in BIPA cases, most notably the Second Circuit’s decision in Santana v. Take-Two Interactive Software, No. 17-303, 2017 U.S. App. LEXIS 23446 (2d Cir. Nov. 21, 2017). In Take-Two, the plaintiffs alleged that the video game developer violated BIPA when it failed to inform them of (1) the collection of their biometric data, and (2) its biometric data retention schedule, when the plaintiffs provided facial scans to create 3-D renditions of their faces to generate personal avatars. Unlike the Patel court, the Take-Two court affirmed the lower court’s grant of Take-Two’s motion to dismiss for lack of Article III standing, explaining that the alleged violations of BIPA, as stated by the plaintiffs, “fail[ed] to raise a material risk of harm.”

Essential to the Take-Two decision was the court’s rejection of allegations that Take-Two collected consumers’ biometric data without authorization. To create the avatar, the plaintiffs were required to agree to terms and conditions explaining that their avatar would be visible to other users and may be recorded. Then the plaintiffs had to place their faces within 6 to 12 inches of the camera and slowly turn their heads to the left and right for approximately 15 minutes. The court held that “no reasonable person” would believe that Take-Two was conducting anything other than a facial scan. Also notable was that the plaintiffs did not allege Take-Two lacked sufficient protocols, that its policies were inadequate, or that it was unlikely to abide by its internal procedures. The Second Circuit found that the plaintiffs’ allegations amounted to “only a bare procedural violation.”

Similar reasoning was applied in McCollough v. Smarte Carte, Inc., 2016 U.S. Dist. LEXIS 100404 (N.D. Ill. Aug. 1, 2016). In Smarte Carte, plaintiff Adina McCollough used her fingerprint to rent a locker at Chicago’s Union Station. To rent the locker, she was prompted to place her finger on a fingerprint scanner; thereafter, a rendition of her fingerprint appeared on a touchscreen. To unlock the locker, McCollough again had to place her finger on the scanner, and her print was displayed on the screen. A matched fingerprint unlocked the locker. McCollough claimed Smarte Carte violated BIPA by not informing her that her biometric data was being collected and by not obtaining her consent. As in Take-Two, the Smarte Carte court held that McCollough “undoubtedly understood” that her fingerprint was being retained so that she could retrieve her belongings from the locker, and thus there was no concrete injury and no Article III standing.

The Patel court distinguished these cases, explaining that the harm alleged in Take-Two and Smarte Carte was a “far cry” from plaintiffs’ allegations in Patel, reasoning that, in both Take-Two and Smarte Carte, “plaintiffs had sufficient notice to make a meaningful decision about whether to permit the data collection.” In Patel, however, the plaintiffs alleged they were afforded “no notice and no opportunity to say no.” The court’s decision to deny Facebook’s motion to dismiss seemed to turn on that critical distinction.

The Emerging (Persistent) Trend

Comparing Patel, Smarte Carte, and Take-Two, the emerging trend we identified last year is holding up. Where a plaintiff can show she had no meaningful opportunity to withhold consent for the collection of her biometric data, she will likely have Article III standing. Conversely, where a plaintiff gives her biometric information knowingly and willingly, a technical failure to obtain consent or otherwise inform her will likely be insufficient to support Article III standing, unless she can show her biometric data was stolen or at risk of being stolen.

It is well known that secrets don’t make friends, and if you’re a public operating company, this is especially true for disclosures related to material cybersecurity issues. Last week, the Securities and Exchange Commission issued guidance that serves as a reminder to public companies of their cybersecurity disclosure requirements under federal securities laws. The publication reinforces and expands on similar guidance issued in 2011 by the Division of Corporation Finance, but also addresses two topics not previously covered: the importance of cybersecurity policies and procedures, and applicable insider trading prohibitions in the cybersecurity context.

The 24-page publication provides direction in essentially three areas. It provides an overview of the rules requiring disclosure of cybersecurity issues, it discusses the need for adequate controls and procedures that would enable a company to make timely disclosures, and it briefly reminds companies of their duty to comply with laws related to insider trading in connection with information about cybersecurity issues.

With a goal of “assist[ing] public companies in preparing disclosures about cybersecurity risks and incidents,” the guidance not only attempts to describe what cyber-related information must be disclosed, but also identifies information that companies are not required to disclose, namely “specific, technical information about their security systems, the related networks and devices, or potential system vulnerabilities in such details as would make such systems, networks, and devices more susceptible to a cybersecurity incident.” While this seems to provide some useful guidance, companies may struggle to apply it: the publication also indicates that, “nevertheless, we [the SEC] expect companies to disclose cybersecurity risks and incidents that are material to investors …,” and that such disclosures should “avoid generic cybersecurity-related disclosure and provide specific information that is useful to investors.” How companies can provide information specific enough to be useful to investors, yet not so specific that it helps nefarious individuals penetrate their systems, is left to the companies to determine, and doing so requires expertise at the convergence of information security and the law.

Although the SEC unanimously approved the guidance, it did so with reservations. For example, on February 21, SEC Commissioner Kara M. Stein issued a statement explaining that the SEC could have done more to help companies formulate more meaningful disclosures for investors, especially given the seven years of experience and insight the SEC had gained since the 2011 guidance was released. Rather than issuing guidance that Stein believes “provide[d] only modest changes to the 2011 staff guidance,” she proposed that the SEC could have, among other things, examined what the staff had learned since 2011 and capitalized on those findings, or discussed disclosures that investors find useful, such as information on whether a particular cybersecurity incident is likely to occur, or how a company internally prioritizes cybersecurity risks and incidents. Even so, Stein noted that “it is hard to disagree with the [SEC] emphasizing the importance of the disclosure of cybersecurity risks and incidents.”

In the last few years, the right to privacy has been hotly debated in the United States. What critics do not understand or appreciate is that the next technological paradigm is completely dependent on improvements both to the quality and quantity of data.

As connected devices (the Internet of Things, or IoT) explode in popularity, they make technologies such as augmented reality (AR) and autonomous vehicles possible. And as interconnectivity grows, so do the opportunities. Companies that fail to leverage those opportunities may find themselves falling behind their competitors.

In venturing into these emerging paradigms, companies should stay informed of recent enforcement actions, cases, and laws to determine how their role within new ecosystems may be impacted.

This publication covers the ongoing evolution of the legal landscape for data-based products, so that organizations can continue to succeed in developing them.

Click here to download the report

While no one thinks it’s a good idea to talk about breakups in the month of February, the deadline is approaching for certain federal agencies to comply with the digital identity requirements outlined in the National Institute of Standards and Technology’s Special Publication (SP) 800-63-3, and agencies should prepare to say goodbye to outdated, first-generation technologies that are no longer viewed as effective digital authentication methods.

NIST 800-63-3 was published in June 2017 and represents the first major overhaul of the digital identity guidelines, which were originally published over eleven years ago and last revised in 2013.  The new revision retires the old model, which required agencies to select appropriate digital identity technology based on a single level of assurance.  The new model, set forth in SP 800-63-3, recognizes that in today’s market there are at least three separate components to digital identity services: identity proofing, authentication, and federation.  Using these components, SP 800-63-3 introduces three different areas of assurance (collectively referred to as “xALs”), each of which has three distinct assurance levels—Levels 1 to 3.  As with the prior version of SP 800-63, the assurance level of each xAL dictates the digital identity technology that agencies should use to meet the technical requirements for the selected level.

Aside from the most obvious change noted above, SP 800-63-3 also makes some notable changes relating to authentication.  For example:

  1. One-time passwords (“OTPs”) delivered via email are no longer viewed as a multi-factor authentication method, as passwords and email addresses are both considered “something you know” factors.
  2. “Restricted Authenticators” (e.g., sending a code to a known phone number) have been widely used and were previously viewed as a reliable method of authentication, but have become less reliable as criminal conduct and technology have evolved.  While Restricted Authenticators are not completely prohibited, their use imposes additional obligations on agencies.  Currently, authenticators using the public switched telephone network, including phone- and Short Message Service (SMS)-based OTPs, are restricted.
  3. New password recommendations that, rather than requiring users to reset their passwords periodically, urge verifiers to support long passwords (referred to in the publication as memorized secrets) of up to at least 64 characters, so that users can choose phrases they can easily memorize.
  4. Pre-registered knowledge tokens—for example, questions like “what is the name of your favorite law firm” (obviously, Troutman Sanders)—can no longer be used to authenticate users or to recover a lost, stolen, or forgotten credential.
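
To make the password changes above concrete, the following is a minimal, hypothetical sketch of a memorized-secret check in the spirit of SP 800-63B: enforce a minimum length, accept long passphrases rather than forcing periodic resets, and screen candidates against a list of commonly used values. The function name, limits, and the tiny blocklist are illustrative assumptions, not part of the NIST publication.

```python
# Hypothetical sketch of a memorized-secret (password) check in the spirit
# of NIST SP 800-63B. The blocklist below is a stand-in; a real verifier
# would screen against a large corpus of breached/common passwords.
COMMON_PASSWORDS = {"password", "123456789", "qwertyuiop", "letmein1"}

MIN_LENGTH = 8    # minimum length for user-chosen memorized secrets
MAX_LENGTH = 64   # verifiers should accept passphrases at least this long

def check_memorized_secret(secret: str) -> tuple[bool, str]:
    """Return (accepted, reason) for a candidate memorized secret."""
    if len(secret) < MIN_LENGTH:
        return False, f"must be at least {MIN_LENGTH} characters"
    if len(secret) > MAX_LENGTH:
        return False, f"exceeds the {MAX_LENGTH}-character limit supported here"
    if secret.lower() in COMMON_PASSWORDS:
        return False, "appears on a list of commonly used passwords"
    # Deliberately absent: composition rules ("one digit, one symbol") and
    # expiration checks -- the publication advises against both.
    return True, "ok"
```

Note that the sketch imposes no character-composition rules and no reset clock; under this model, a long, easily memorized phrase passes while a short or commonly used string does not.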

Agencies have until June 2018 to meet the requirements of SP 800-63-3 and to say farewell to their outdated digital identity technologies.  We assume that NIST understood that the most painful goodbyes are the ones that are never explained, which is why NIST has actively tried to explain the reasoning behind the numerous updates and changes imposed by SP 800-63-3 through various communications, including the FAQ page that can be accessed here.  NIST has made it clear that although the new guidelines have changed substantially from past versions, SP 800-63-3 is not the be-all-end-all for digital identity guidelines.  With the market and technology constantly changing, and new threats continuously emerging, it’s only a matter of time before NIST tells us it’s time to move on again to healthier circumstances.

As we previously reported, last year the United States District Court for the Middle District of North Carolina trebled a jury verdict against DISH Network L.L.C., resulting in a $61 million damages award.  Following the jury verdict, the Court asked class counsel to devise a means for identifying class members.

DISH’s records, referred to as the “Five9/SSN records,” were used as a means of identifying the approximately 180,000 class members whose phone numbers were on the National Do Not Call Registry.  In November 2017, plaintiff Thomas Krakauer moved for entry of judgment as to 11,471 persons, “contending their class membership cannot reasonably be disputed based on the existing data.”  DISH argued in opposition that only 221 class members existed for which there were no evidentiary conflicts as to their identity or entitlement to judgment.

On January 25, the Court granted Krakauer’s motion, with two small exceptions, and held that the identities of the class members are not reasonably subject to dispute: “Dish has been found liable for willfully violating the TCPA 51,000 times, and the Five9/SSN records identify in overwhelming numbers the persons Dish’s agent attempted to solicit on Dish’s behalf.”

Throughout its opinion, the Court repeatedly admonished DISH for its alleged “lack of respect” to the Court and “its continuing repetition of long-rejected arguments, and its attempt to obfuscate the issues, confuse the record, and shift arguments and facts.”  The Court concluded that “[r]esolving uncertainties as to the remaining 7000 or so class members need not consume an irrational amount of resources by the Court, the parties, and the Claims Administrator in order to make reasonable decisions.”

We will continue to monitor the case.