Internet of Things (IoT)

With the May 25, 2018 deadline quickly approaching, many businesses are scrambling to prepare for compliance with the EU’s General Data Protection Regulation (GDPR), and questions and conversations are heating up. Others are still trying to get their arms around what the GDPR is and what it means for U.S. businesses. For those still getting up to speed, below are a few basics to help you familiarize yourself with the regulation and its relevance to you.

  1. I’m a U.S. business. Why does GDPR matter to me?

The reach of the GDPR extends not only to European-based businesses, but also to any company that does business with, has customers in, or collects data from people in the EU. If you even have a website that could collect data from someone visiting the site from the EU, your business could be affected. No matter where your business resides, if you intentionally offer goods or services to people in the European Union, or monitor the behavior of individuals within the EU, the GDPR could apply to you.

  2. What’s the risk?

In addition to the PR or brand risk of being associated with noncompliance, the GDPR provides for some significant monetary penalties. Some violations are subject to fines of up to 10 million euros or up to 2% of global annual turnover, whichever is greater. For other violations, the ceiling doubles: up to 20 million euros or 4% of global annual turnover, whichever is greater. For large businesses, this could be a substantial amount.
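
To make the “whichever is greater” arithmetic concrete, here is a minimal sketch using an assumed, hypothetical company with 2 billion euros in global annual turnover (an illustration only, not a measure of any actual exposure):

```typescript
// Hypothetical illustration of the two GDPR fine ceilings ("whichever is greater").
// The turnover figure is an assumption for the example, not guidance on actual exposure.

const globalAnnualTurnoverEur = 2_000_000_000; // assumed 2 billion EUR turnover

// Lower tier: up to 10 million EUR or 2% of global annual turnover, whichever is greater.
const lowerTierCap = Math.max(10_000_000, 0.02 * globalAnnualTurnoverEur); // 40,000,000 EUR

// Upper tier: up to 20 million EUR or 4% of global annual turnover, whichever is greater.
const upperTierCap = Math.max(20_000_000, 0.04 * globalAnnualTurnoverEur); // 80,000,000 EUR

console.log({ lowerTierCap, upperTierCap });
```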

  3. What should I be doing?

First, talk with your general counsel or outside law firm.  They can help you interpret the law, review contractual obligations and assess the company’s overall privacy policies to help guide your compliance strategy going forward.  They can also help create defensible interpretations of certain ambiguous language in the regulation (e.g., what is “personal data” for purposes of the GDPR?).  The Article 29 Working Party, made up of the data protection authorities (DPAs) from all EU member states, has published guidance to clarify certain provisions, which can be helpful during this process.

Second, create a cross-functional team spanning areas including (but not limited to) communications/PR, IT, customer experience, digital, legal and operations.  This may look a lot like any cross-functional team you may have (and hopefully have) already established to prepare for data breaches.  This team can begin designing and implementing a compliance strategy.  Under certain conditions, your business may also need to appoint a Data Protection Officer (DPO) (see Articles 37-39).

  4. What are some key points of the GDPR?

The GDPR is a data privacy regulation in the EU aimed at protecting users’ rights and privacy online.  It requires businesses to assess what kinds of data they are collecting and to make that data accessible to users.  The regulation is long and complex with several moving parts, but several key points are worth noting.

Key Definitions:  You will see several references to controllers, processors, data subjects, personal data, and processing.  This vocabulary may be unfamiliar in relation to U.S. law, but here is how these key terms are defined.  As a business subject to the GDPR, you may be a “controller” or a “processor”; the individual is the “data subject”:

  • “Controller” = “the natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data; where the purposes and means of such processing are determined by Union or Member State law, the controller or the specific criteria for its nomination may be provided for by Union or Member State law.”
  • “Processor” = “a natural or legal person, public authority, agency or other body which processes personal data on behalf of the controller”
  • “Data subject” = “an identified or identifiable natural person” (see the definition of “personal data” below).
  • “Personal data” = “any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.”
  • “Processing” = “any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organization, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction”

  5. Some Key Articles/Provisions:

Article 12. Transparent information, communication and modalities for the exercise of the rights of the data subject.

This article creates rules around how users give consent to the recording of their data.  The data subject must be provided with accurate information on all relevant issues, such as the kind of data to be collected or processed, and for what purposes.  For some particularly sensitive data (e.g., political opinions, religion, biometric data (including photographs), health data, etc.), consent must be “explicit”.  Consent must be “freely given”, meaning that the user has a “genuine” choice and must be able to withdraw consent “without detriment”.  The data subject cannot be obliged to consent to data processing that is not necessary to provide the service he or she has requested.

For these reasons, the traditional “notice and consent” approach may not be sufficient, and actionable forms or buttons may be necessary.  Recital 32 of the GDPR notes that an affirmative action signaling consent may include ticking a box on a website, “choosing technical settings for information society services”, or “another statement or conduct” that clearly indicates assent to the processing.  “Silence, pre-ticked boxes or inactivity,” however, is presumed inadequate to confer consent.  For those reaching European citizens digitally, working with IT or UX experts may prove important to create a seamless, but compliant, experience.
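
As an illustration of the affirmative-action idea only (the element IDs and wiring below are hypothetical, and whether any particular design actually satisfies the GDPR is a question for your counsel), a consent control on a web form might look something like this:

```typescript
// Minimal sketch: the consent checkbox starts unchecked (no pre-ticked boxes),
// and the form cannot be submitted until the user actively ticks it.
const consentBox = document.querySelector<HTMLInputElement>("#marketing-consent")!;
const submitButton = document.querySelector<HTMLButtonElement>("#submit")!;

consentBox.checked = false;   // never pre-ticked
submitButton.disabled = true; // silence/inactivity does not confer consent

consentBox.addEventListener("change", () => {
  submitButton.disabled = !consentBox.checked;
});

// A withdrawal control should be as easy to reach as the consent control,
// since consent must be revocable "without detriment".
```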

Article 17. Right to erasure.

The “right to be forgotten” means that businesses must be able to remove data on a user at the user’s request “without undue delay”.  Further, businesses have an obligation to erase personal data “without undue delay” under certain additional circumstances.

Article 20. Right to data portability.

Users have the right to receive any data that a business may have on them, and the business must provide such data in a “structured, commonly used and machine-readable format”.  Further, the data subject has the right to transmit such data to another business without being hindered by the business that provided the data, where the processing is (a) based on certain consents or on a contract and (b) carried out by automated means.  Where technically feasible, the data subject also has the right to have the personal data transmitted directly from one controller to another.
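
By way of illustration only (the record shape and field names below are hypothetical, and JSON is simply one common example rather than a format mandated by the regulation), an export in a “structured, commonly used and machine-readable format” might look like this:

```typescript
// Minimal sketch of a data-portability export. The schema is an assumption for
// illustration; the GDPR does not prescribe a specific record shape or format.
interface DataSubjectExport {
  name: string;
  email: string;
  viewingHistory: { title: string; watchedAt: string }[];
}

function exportAsJson(record: DataSubjectExport): string {
  // JSON is one commonly used, machine-readable option; CSV or XML could also qualify.
  return JSON.stringify(record, null, 2);
}

const example: DataSubjectExport = {
  name: "Jane Doe",
  email: "jane@example.com",
  viewingHistory: [{ title: "Some Show", watchedAt: "2018-01-15T20:00:00Z" }],
};

console.log(exportAsJson(example));
```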

Article 8. Conditions applicable to child’s consent in relation to information society services.

Article 8 limits the ability of children to consent to data processing without parental authorization.  Previous drafts of the GDPR had set the age of consent at 13 years old, which would have been consistent with the age set by the United States’ Children’s Online Privacy Protection Act (“COPPA”).  A last-minute proposal aimed to raise the age of consent to 16 years old.  In the final draft, the age of consent is set at 16 unless a member state sets a lower age, which may be no lower than 13 years.  Thus, unless otherwise provided by member state law, controllers must obtain parental consent when processing the personal data of a child under the age of 16.  With the U.S. age of consent under COPPA set at 13 and the European age of consent under the GDPR set at 16 (unless lowered by a member state), this could present some challenges for U.S. businesses offering international services.
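
To illustrate the threshold logic only (a sketch of the rule as described above, not a compliance determination; the function name and inputs are hypothetical), a simple check might look like this:

```typescript
// Sketch of the Article 8 age rule: the default threshold is 16, and a member
// state may set a lower one, but no lower than 13. Below the applicable
// threshold, parental consent is required. Values are assumptions for illustration.
function parentalConsentRequired(childAge: number, memberStateThreshold = 16): boolean {
  const threshold = Math.max(13, Math.min(16, memberStateThreshold));
  return childAge < threshold;
}

console.log(parentalConsentRequired(15));     // true  (default threshold of 16)
console.log(parentalConsentRequired(15, 13)); // false (member state lowered the age to 13)
```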

Article 32.  Security of Processing.

Firms must follow security best practices across the board when collecting and protecting data. This may include, but isn’t limited to, specific password policies, information security frameworks (e.g., NIST, ISO, COBIT/ISACA, SSAE, etc.), and data encryption.
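
For example, one small piece of a broader security program might be encrypting personal data at rest. A minimal Node.js sketch using the built-in crypto module (AES-256-GCM) follows; the key handling here is deliberately simplified and is an assumption for illustration, since key management, rotation, and access controls matter at least as much as the encryption step itself.

```typescript
// Minimal sketch of encrypting a personal-data field at rest with AES-256-GCM.
// Real deployments would load the key from a key management service, not generate
// it inline; this only illustrates the encryption and decryption round trip.
import { randomBytes, createCipheriv, createDecipheriv } from "node:crypto";

const key = randomBytes(32); // assumption: in practice, retrieved from a KMS/HSM

function encrypt(plaintext: string): { iv: string; tag: string; data: string } {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return {
    iv: iv.toString("hex"),
    tag: cipher.getAuthTag().toString("hex"),
    data: data.toString("hex"),
  };
}

function decrypt(payload: { iv: string; tag: string; data: string }): string {
  const decipher = createDecipheriv("aes-256-gcm", key, Buffer.from(payload.iv, "hex"));
  decipher.setAuthTag(Buffer.from(payload.tag, "hex"));
  return Buffer.concat([
    decipher.update(Buffer.from(payload.data, "hex")),
    decipher.final(),
  ]).toString("utf8");
}

console.log(decrypt(encrypt("jane@example.com"))); // round-trips the personal data
```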

  6. What Else Should I Know?

If you believe your business might be affected, you should already be familiarizing yourself with the GDPR and be well into your compliance plan.  The above summary is a sampling of key points, not the comprehensive analysis that should be undertaken to better understand your compliance obligations.  You should also be aware of the ePrivacy Regulation, which will be following on the heels of the GDPR.

The GDPR covers the right to protection of personal data, while the ePrivacy Regulation encompasses a person’s right to a private life, including confidentiality.  There is some obvious overlap, but the ePrivacy Regulation is intended to particularize the GDPR for electronic communications: devices, processing techniques, storage, browsers, etc.  The two laws are intended to be in sync, but the ePrivacy Regulation is still up in the air; it is optimistically forecast to be approved by the end of 2018, although the implementation date remains to be seen.  In sum, GDPR compliance is what you can focus on right now, and it should position your business well for any additional compliance obligations that arise once the ePrivacy Regulation is finalized.

On August 1, 2017, a bipartisan group of Senators introduced the “Internet of Things (IoT) Cybersecurity Improvement Act of 2017”, which aims to bolster the security of government-acquired IoT devices.  Sponsored by Sens. Mark Warner (D-VA), Cory Gardner (R-CO), Ron Wyden (D-OR), and Steve Daines (R-MT), the bill would require connected devices purchased by government agencies to be patchable, rely on industry-standard protocols, not use hard-coded passwords, and not contain any known security vulnerabilities.

The bill would also require each executive-level agency head to inventory all connected devices used by the agency.  OMB and DHS would establish guidelines for the agencies based on DHS’s Continuous Diagnostics and Mitigation (CDM) program.  Specifically, the bill directs OMB to develop alternative network-level security requirements for devices with limited data processing and software functionality.  It also directs DHS to issue guidelines regarding coordinated vulnerability disclosure policies to be required of contractors providing connected devices to the U.S. Government.  Finally, researchers would be exempted from liability under the Computer Fraud and Abuse Act and the Digital Millennium Copyright Act when engaging in good-faith research pursuant to adopted coordinated vulnerability disclosure guidelines.

This legislation follows calls for more security measures and standards addressing IoT devices to further safeguard information from potential attacks.  For example, the Government Accountability Office (GAO) recently recommended that the Department of Defense update its policies to address IoT risks that leave its systems vulnerable to attack.  In addition, President Trump’s executive order on cybersecurity called for reports with recommendations to reduce the threat of botnets and other automated, distributed attacks.

In a press release, Senator Warner, co-chair of the Senate Cybersecurity Caucus (SCC), stated that the bill would provide “thorough, yet flexible guidelines for Federal Government procurements of connected devices.”  In the same statement, the SCC’s other co-chair, Sen. Gardner, said the bill would “ensure the federal government leads by example and purchases devices that meet basic requirements to prevent hackers from penetrating our government systems.”

To view the introduced legislation, click here.

To view the public statement, click here.

To view the fact sheet summary, click here.

Today, Vizio, Inc. agreed to pay $2.2 million to settle charges by the FTC and the New Jersey Attorney General that it installed software on its smart TVs to collect viewing data from 11 million consumer televisions without the consumers’ knowledge or consent.  The $2.2 million payment includes a $1.5 million payment to the FTC and a $1 million payment to the New Jersey Division of Consumer Affairs, although $300,000 will be suspended and vacated after five years upon compliance with the order.  In a concurring statement, Commissioner Ohlhausen supported the order but questioned the FTC’s allegation that individualized television viewing activity falls within the definition of sensitive information.

The complaint alleged that, beginning in 2014, Vizio and an affiliate company manufactured smart TVs that capture second-by-second information about video displayed on the TV, including video from consumer cable, broadband, set-top boxes, DVDs, over-the-air broadcasts, and streaming devices.  In addition, Vizio facilitated the appending of specific demographic information (e.g., sex, age, income, marital status, household size, educational level, home ownership, household value, etc.) to the viewing data.  Vizio then sold the information to third parties, who used it for various purposes, including targeted advertising to consumers across devices.

According to the complaint, Vizio touted its “Smart Interactivity” feature as one that “enables program offers and suggestions”, but failed to inform consumers that the setting also enabled the collection of consumers’ viewing data.  The complaint alleges that Vizio’s data tracking, which occurred without viewers’ informed consent, was unfair and deceptive.  It charges the defendants with deceptive and unfair acts in violation of Section 5 of the FTC Act, and with similar violations of the New Jersey Consumer Fraud Act, in connection with the unfair collection and sharing of consumers’ viewing data and deception concerning the “Smart Interactivity” feature.

As part of the settlement, Vizio stipulated to a federal court order that:

  • requires Vizio to prominently disclose its data collection and sharing practices and to obtain consumers’ affirmative express consent;
  • prohibits misrepresentations about the privacy, security, or confidentiality of the consumer information it collects;
  • requires Vizio to delete data collected before March 1, 2016; and
  • requires Vizio to implement (and review biennially) a comprehensive data privacy program.

In a concurring statement, Commissioner Ohlhausen supported Count II of the complaint, which alleged that Vizio deceptively omitted information about its data collection and sharing program.  However, she expressed concern about the implications of Count I, which alleged that granular (household or individual) television viewing activity is sensitive information and that sharing this viewing information without consent causes, or is likely to cause, a “substantial injury” under Section 5 of the FTC Act.  Although Commissioner Ohlhausen acknowledged that there may be good policy reasons to consider such information sensitive, she stated that the statute does not allow the FTC to find a practice unfair based primarily on public policy, and that this case demonstrates “the need for the FTC to examine more rigorously what constitutes ‘substantial injury’” in the context of information about consumers.  Ohlhausen indicated that she will launch an effort in the coming weeks to examine this issue further.

To view the stipulated order, click here.

To view Commissioner Ohlhausen’s concurring statement, click here.