Information Governance and Risk Management

A Berlin regional court recently ruled that Facebook’s use of personal data was illegal because the social media platform did not adequately secure the informed consent of its users. A German consumer rights group, the Federation of German Consumer Organisations (vzbv), said that Facebook’s default settings and some of its terms of service were in breach of consumer law, and that the court had found parts of the consent to data usage to be invalid.  One concern highlighted by the consumer rights group was that, in Facebook’s app for smartphones, a service was pre-activated that revealed the user’s location to the person they were chatting with.  Also, in the privacy settings, ticks were already placed in boxes that allowed search engines to link to the user’s timeline, meaning that anyone could quickly and easily find a user’s profile.

A week after the ruling, Facebook promised to radically overhaul its privacy settings, saying the work would prepare it for the introduction of the upcoming General Data Protection Regulation (GDPR).  Facebook has faced repeated attacks from German and other European regulators over issues ranging from perceived anti-competitive practices to alleged misuse of customer data. In October, the Article 29 Working Party (WP29) launched a task force to examine the sharing of user data between WhatsApp and Facebook, which it says does not have sufficient user consent.  “Whilst the WP29 notes there is a balance to be struck between presenting the user with too much information and not enough, the initial screen made no mention at all of the key information users needed to make an informed choice, namely that clicking the agree button would result in their personal data being shared with the Facebook family of companies,” the group told WhatsApp in October.

Similarly, a Belgian court earlier this month ordered Facebook to stop collecting data on users or face daily fines of €250,000, up to a maximum of €100 million.  The court ruled that Facebook had broken privacy laws by tracking people on third-party sites. “Facebook informs us insufficiently about gathering information about us, the kind of data it collects, what it does with that data and how long it stores it,” the court said. “It also does not gain our consent to collect and store all this information.”  The court ordered Facebook to delete all data it had gathered illegally on Belgian citizens, including people who were not users of the social network.

With regard to the German suit, Facebook said it would appeal, releasing a statement that it had already made significant changes to its terms of service and data protection guidelines since the case was first brought in 2015. In the meantime, Facebook stated it would update its data protection guidelines and terms of service so that they comply with the new EU-wide GDPR rules.

On January 28, 2018, as part of Data Privacy Day, Facebook shared its data privacy principles for the first time. In a blog post drafted by Erin Egan, Facebook’s Chief Privacy Officer, Facebook posted these principles to help users understand how data is used and managed on the site. Among other things, Facebook’s data privacy principles stress user control of privacy, the goal of protecting users’ accounts and implementing security tools (like two-factor authentication), and user ownership of information shared. Facebook also announced the launch of a new education campaign to help users understand how data privacy is handled by the company. As part of this effort, Facebook is preparing to roll out a “Privacy Center” that features important privacy settings in a single place.

This publication comes ahead of the European Union’s (EU) General Data Protection Regulation (GDPR), which will be implemented on May 25, 2018. The GDPR will set stringent data privacy requirements for companies operating in the EU.  In recent years, Facebook has faced scrutiny from EU regulators over its handling of user data. Facebook hopes to embrace a more transparent data privacy approach to meet all GDPR obligations.

To view Facebook’s Privacy Principles, click here.

With the May 25, 2018 deadline quickly approaching, many businesses are scrambling to prepare for compliance with the EU’s General Data Protection Regulation (GDPR), and questions and conversations are heating up.  Others are still trying to understand what the GDPR is and what it means for U.S. businesses.  For those readers, below are a few basics to help familiarize yourself with the regulation and its relevance to you.

  1. I’m a U.S. business. Why does GDPR matter to me?

The reach of the GDPR extends not only to European-based businesses, but also to all companies that do business, have customers, or collect data from people in the EU.  If you even have a website that could collect data from someone visiting the site from the EU, your business could be affected. No matter where your business resides, if you intentionally offer goods or services to the European Union, or monitor the behavior of individuals within the EU, the GDPR could be applicable.

  2. What’s the risk?

In addition to the PR or brand risk of being associated with noncompliance, the GDPR provides for some significant monetary penalties.  Some violations are subject to fines up to 10 million EUR or up to 2% of global annual turnover, whichever is greater.  For other violations, it is double – up to 20 million euros or 4% of your global annual turnover, whichever is greater.  For large businesses, this could be a substantial amount.

  3. What should I be doing?

First, talk with your general counsel or outside law firm.  They can help you interpret the law, review contractual obligations and assess the company’s overall privacy policies to help guide your compliance strategy going forward.  They can also help create defensible interpretations of certain ambiguous language in the regulation (e.g., what is “personal data” for purposes of the GDPR?).  The Article 29 Working Party, made up of the data protection authorities (DPAs) from all EU member states, has published guidance to clarify certain provisions, which can be helpful during this process.

Second, create a cross-functional team drawing from areas including (but not limited to) communications/PR, IT, customer experience, digital, legal and operations.  This may look similar to the cross-functional team you have (hopefully) already established to prepare for data breaches.  This team can begin designing and implementing a compliance strategy.  Under certain conditions, your business may need to appoint a Data Protection Officer (DPO) (see Articles 37–39).

  4. What are some key points of the GDPR?

The GDPR is a data privacy regulation in the EU that is aimed at protecting users’ rights and privacy online.  It requires businesses to assess what kinds of data they’re collecting and to make that data accessible to users.  The regulation is long and complex with several moving parts, but a few key points are worth noting.

Key Definitions:  You will see several references to controllers, data subjects, personal data, and processing.  This vocabulary may be unfamiliar in relation to U.S. law, but here is how these key terms are defined – as a business subject to GDPR, you may be a “controller” or you may be a “processor”.  The individual is the “data subject”:

  • “Controller” = “the natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data; where the purposes and means of such processing are determined by Union or Member State law, the controller or the specific criteria for its nomination may be provided for by Union or Member State law.”
  • “Processor” = “means a natural or legal person, public authority, agency or other body which processes personal data on behalf of the controller”
  • “Data subject” = “an identified or identifiable natural person (see definition of “personal data” below).”
  • “Personal data” = “any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.”
  • “Processing” = “any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organization, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction”
  5. Some Key Articles/Provisions:

Article 12. Transparent information, communication and modalities for the exercise of the rights of the data subject.

This article creates rules around how users give consent to the recording of their data.  The data subject must be provided with accurate information on all relevant issues, such as the kind of data to be collected or processed, and for what purposes. For some particularly sensitive data (e.g., political opinion, religion, biometric data (including photographs), health data, etc.), consent must be “explicit”.   Consent must be “freely given”, meaning that the user has a “genuine” choice and is able to withdraw consent “without detriment”.  The data subject cannot be obliged to consent to data processing that is not necessary to provide the service he or she has requested.

For these reasons, the traditional “notice and consent” may not be sufficient, and actionable forms or buttons may be necessary.  Recital 32 of the GDPR notes that an affirmative action signaling consent may include ticking a box on a website, “choosing technical settings for information society services”, or “another statement or conduct” that clearly indicates assent to the processing.  “Silence, pre-ticked boxes or inactivity,” however, is presumed inadequate to confer consent.  For those reaching European citizens digitally, working with IT or UX experts may prove important to create a seamless, but compliant, experience.
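The consent rules above lend themselves to a simple server-side check: count only an affirmative, purpose-specific, unwithdrawn action as consent, never a default.  A minimal Python sketch, where the `ConsentRecord` type and its field names are hypothetical illustrations rather than anything prescribed by the GDPR:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    purpose: str               # the specific purpose the user consented to
    affirmative_action: bool   # user actively ticked/clicked; never a default
    timestamp: datetime        # when consent was captured, for audit purposes
    withdrawn: bool = False    # consent must be withdrawable at any time

def has_valid_consent(record: ConsentRecord, purpose: str) -> bool:
    """Silence, pre-ticked boxes, or inactivity never count as consent."""
    return (
        record.purpose == purpose
        and record.affirmative_action
        and not record.withdrawn
    )

# A pre-ticked default never confers consent
default = ConsentRecord("marketing", affirmative_action=False,
                        timestamp=datetime.now(timezone.utc))
assert not has_valid_consent(default, "marketing")
```

Recording the timestamp alongside the affirmative action also helps demonstrate consent later, since the controller bears the burden of proof.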

Article 17. Right to erasure.

The “right to be forgotten” means that businesses must be able to remove data on a user at their request “without undue delay”.  Further, businesses have an obligation to erase personal data “without undue delay” under certain additional circumstances.

Article 20. Right to data portability.

Users have the right to receive any data that a business may have on them, and the business must provide such data in a “structured, commonly used and machine-readable format”.  Further, the data subject has the right to transmit such data to another business without being hindered by the business that provided the data where the processing is either (a) based on certain consents or (b) carried out by automated means.  Where technically feasible, the data subject also has the right to have the personal data transmitted directly from one controller to another.
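In practice, the “structured, commonly used and machine-readable” requirement is often satisfied with a JSON export that another controller can ingest directly.  A minimal sketch, where the record shape is purely illustrative:

```python
import json

def export_user_data(user_record: dict) -> str:
    """Serialize a user's personal data as JSON -- a structured,
    commonly used, machine-readable format suitable for portability."""
    return json.dumps(user_record, indent=2, sort_keys=True, default=str)

profile = {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "posts": [{"id": 1, "text": "hello"}],
}
exported = export_user_data(profile)

# The export round-trips, so a receiving controller can parse it directly
assert json.loads(exported) == profile
```

CSV or XML exports are commonly cited alternatives; the key design choice is a documented, non-proprietary format rather than a vendor-specific dump.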

Article 8. Conditions applicable to child’s consent in relation to information society services.

Article 8 limits the ability of children to consent to data processing without parental authorization.  Previous drafts of the GDPR had set the age of consent at 13 years old, which would have been consistent with the age set by the United States’ Children’s Online Privacy Protection Act (“COPPA”). A last-minute proposal aimed to raise the age of consent to 16 years old.  In the final draft, the age of consent is set at 16 unless a member state sets a lower age, which may be no lower than 13 years.  Thus, unless otherwise provided by member state law, controllers must obtain parental consent when processing the personal data of a child under the age of 16. With the U.S. age of consent under COPPA set at 13 and the European age of consent under the GDPR set at 16 (unless lowered by a member state), this could present some challenges for U.S. businesses offering international services.
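Because the threshold varies by member state within a fixed floor, the age logic reduces to a lookup with the GDPR’s minimum enforced.  A sketch, where the per-state values in the table are illustrative placeholders, not a statement of any member state’s actual law:

```python
# GDPR default age of consent is 16; member states may lower it,
# but never below 13.
GDPR_DEFAULT_AGE = 16
GDPR_MINIMUM_AGE = 13

# Illustrative overrides only -- real values must come from member state law
MEMBER_STATE_AGE = {"XX": 13, "YY": 14}

def parental_consent_required(age: int, member_state: str) -> bool:
    threshold = MEMBER_STATE_AGE.get(member_state, GDPR_DEFAULT_AGE)
    if threshold < GDPR_MINIMUM_AGE:
        raise ValueError("GDPR does not permit a threshold below 13")
    return age < threshold

assert parental_consent_required(15, "ZZ")        # default of 16 applies
assert not parental_consent_required(14, "XX")    # lowered to 13
```

A U.S. business already performing COPPA age-gating at 13 would need to raise the gate for EU users, which is one concrete form of the compliance friction noted above.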

Article 32. Security of processing.

Firms must follow security best practices across the board when collecting and protecting data. This may include, but isn’t limited to, specific password policies, information security frameworks (e.g., NIST, ISO, COBIT/ISACA, SSAE, etc.), and data encryption.
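As one example of the kind of best practice Article 32 contemplates, credentials can be stored as salted, iterated hashes rather than plaintext, so a leaked database does not expose usable passwords.  A minimal sketch using only Python’s standard library; the iteration count and salt size are illustrative choices, not values prescribed by the GDPR or any framework:

```python
import hashlib
import hmac
import secrets
from typing import Optional, Tuple

def hash_password(password: str, salt: Optional[bytes] = None) -> Tuple[bytes, bytes]:
    """Derive a salted, iterated hash so stored credentials are
    useless to an attacker who obtains the database."""
    salt = salt if salt is not None else secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    # Constant-time comparison avoids timing side channels
    candidate = hash_password(password, salt)[1]
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("wrong password", salt, digest)
```

The same principle (render data useless if exfiltrated) underlies the GDPR’s explicit mention of pseudonymisation and encryption of personal data at rest and in transit.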

  6. What Else Should I Know?

If you believe your business might be affected, you should already be familiarizing yourself with the GDPR and be well into your compliance plan.  The above summary is a sampling of key points, not the comprehensive analysis that should be undertaken to better understand your compliance obligations.  You should also be aware of the ePrivacy Regulation, which will be following on the heels of the GDPR.

The GDPR covers the right to protection of personal data, while the ePrivacy Regulation encompasses a person’s right to a private life, including confidentiality.  There is some obvious overlap here, but the ePrivacy Regulation is intended to particularize the GDPR for electronic communications — devices, processing techniques, storage, browsers, etc.  The laws are intended to be in sync, but the ePrivacy Regulation is still up in the air — optimistically forecast to be finally approved by the end of 2018, although the implementation date remains to be seen.  In sum, GDPR compliance should be your focus right now, and it should position your business well for any additional compliance obligations that could subsequently arise from the finalized ePrivacy Regulation.

On December 5, 2017, NIST published a revised version of the NIST Cybersecurity Framework (i.e., Draft 2 of Version 1.1) (“Framework”).  According to NIST, Version 1.1 of the Framework refines, clarifies, and enhances Version 1.0 of the Framework issued in February 2014, and the recently published Draft 2 of Version 1.1 is informed by over 120 comments on the first draft proposed on January 10, 2017, as well as comments and discussion by attendees at NIST’s workshop in May 2017.

Among other things, the revisions are intended to: (1) clarify and revise cybersecurity measurement language; (2) clarify the use of the Framework to manage cybersecurity within supply chains; (3) better account for authorization, authentication, and identity proofing; (4) better consider coordinated vulnerability disclosure, including the addition of a subcategory related to the vulnerability disclosure lifecycle; and (5) remove statements related to federal applicability in light of various intervening policies and guidance (e.g., Executive Order 13800, OMB Memorandum M-17-25, and Draft NIST Interagency Report (NISTIR) 8170) on federal use of the Framework.

NIST seeks public comment on the following questions by January 19, 2018:

  • Do the revisions in Version 1.1 Draft 2 reflect the changes in the current cybersecurity ecosystem (threats, vulnerabilities, risks, practices, technological approaches), including those developments in the Roadmap items?
  • For those using Version 1.0, would the proposed changes affect their current use of the Framework? If so, how?
  • For those not currently using Version 1.0, would the proposed changes affect their decision about using the Framework? If so, how?

Feedback and comments should be directed to cyberframework@nist.gov.

To view a markup (.pdf) of the revised draft Framework, click here.

To view a clean version (.pdf) of the revised draft Framework, click here.

To view the draft roadmap (.pdf), click here.

To view the draft Framework Core (.xls), click here.

On November 15, 2017, the Trump administration released the Vulnerabilities Equities Policy and Process. This document describes the process by which U.S. agencies and departments determine whether to disclose or restrict information on vulnerabilities in information systems and technologies. The Vulnerabilities Equities Process (VEP) balances whether to disclose vulnerability information to the vendor or supplier in the expectation that the vulnerability will be fixed or to temporarily restrict disclosure of the information so that it can be used for national security and/or law enforcement purposes.

The Equities Review Board (ERB), consisting of individuals from numerous agencies, functions as the forum for interagency deliberation and determination concerning the VEP. The National Security Agency will function as the VEP Executive Secretariat. The VEP Executive Secretariat will oversee communications, documentation and recordkeeping for the VEP. The VEP Executive Secretariat will also publish a report of unclassified information on an annual basis.

The VEP provides steps for submitting and reviewing identified vulnerabilities:

  • When an agency determines that a vulnerability reaches the threshold for entry into the VEP, it will notify the VEP Executive Secretariat and provide a recommendation for disclosure or restriction of the vulnerability.
  • The VEP Executive Secretariat will provide notice to all agencies of the ERB and request agencies to respond if they have a strong interest (i.e., “equity”) in the vulnerability. Any agencies with a strong interest in the vulnerability must concur or disagree with the recommendation.
  • The ERB will then reach a consensus on whether to disclose or restrict the vulnerability.

To view the VEP Charter, click here.

To view the fact sheet, click here.

On August 7, 2017, the U.S. Securities and Exchange Commission (SEC), through its Office of Compliance Inspections and Examinations (OCIE), published a Risk Alert summarizing observations on how broker-dealers, investment advisers, and investment companies have addressed cybersecurity issues. The OCIE examined 75 financial firms registered with the SEC. The examinations focused on the firms’ written policies regarding cybersecurity. The OCIE observed increased cybersecurity preparedness since a similar 2014 observational initiative was conducted, but also noticed areas of compliance and oversight that could be improved.

In particular, the OCIE observed that almost all firms that were examined maintain cybersecurity-related written procedures regarding protection of customer and shareholder records and information. Additionally, the examinations confirmed many of the firms are conducting cybersecurity risk assessments, penetration tests and vulnerability scans, and maintaining clearly defined cybersecurity organizational charts for workforces. However, the OCIE also observed that, in some cases, firms are administering vague or unclear cybersecurity policies, are not adequately following cybersecurity policies, or are not conducting adequate system maintenance to address system vulnerabilities. The Risk Alert concluded that, despite some improvements, cybersecurity remains one of the top compliance risks for financial firms. The OCIE noted that it will continue to monitor financial firms’ compliance in this area.

To view the Risk Alert, click here.


This month, the Federal Trade Commission (FTC) issued guidance for businesses operating websites and online services looking to comply with the Children’s Online Privacy Protection Act (“COPPA”). COPPA addresses the collection of personal information from children under 13.  Importantly, the determination of whether a business’s website is “directed to children under 13” (and thus subject to certain COPPA requirements) is based on a variety of factors – thus even websites that do not target children as their primary audience may nonetheless be subject to COPPA’s requirements based on the website’s subject matter, visual and audio content, ads on the site that may be directed to children, and other factors.

The FTC’s guidance notes that updates to the COPPA regulations were made in July 2013 to reflect changes in technology, and reminds businesses that violations can result in law enforcement actions as well as civil penalties.  The compliance guidance sets out steps to (1) determine whether your business is covered by COPPA; (2) if so, take the steps needed to ensure compliance, including privacy policy provisions and notifying and obtaining verifiable consent from parents; (3) provide methods for parents to review, delete, or revoke consent; and (4) implement reasonable security procedures. Finally, the guidance provides a chart describing limited exceptions to the parental consent requirement.

  • Step 1: Determine if Your Company is a Website or Online Service that Collects Personal Information from Kids Under 13.
  • Step 2: Post a Privacy Policy that Complies with COPPA.
  • Step 3: Notify Parents Directly Before Collecting Personal Information from Their Kids.
  • Step 4: Get Parents’ Verifiable Consent Before Collecting Personal Information from Their Kids.
  • Step 5: Honor Parents’ Ongoing Rights with Respect to Personal Information Collected from Their Kids.
  • Step 6: Implement Reasonable Procedures to Protect the Security of Kids’ Personal Information.
  • Chart: Limited Exceptions to COPPA’s Verifiable Parental Consent Requirement

The six COPPA compliance steps are described below. To view the FTC’s full guidance webpage, click here.

NOTE:  In addition to COPPA, it may be worth determining whether California’s state version of COPPA, the California Online Privacy Protection Act (“CalOPPA”), applies to your business and, if so, whether additional compliance measures may be necessary. CalOPPA broadly applies whenever a website or app collects “personally identifiable information” or PII (as defined in the state’s business code) from a California resident, and thus applies to the vast majority of online businesses, even if not based in California.


On May 31, 2017, the Federal Financial Institutions Examination Council (FFIEC) released an update to its Cybersecurity Assessment Tool.

The Cybersecurity Assessment Tool was originally released by the FFIEC in June of 2015 to help financial institutions identify their risks and assess their cybersecurity preparedness.  The Cybersecurity Assessment Tool is intended to be used by financial institutions of all sizes to perform a self-assessment and inform their risk management strategies. Upon the release of the original Cybersecurity Assessment Tool, the FFIEC noted its plan to update the Cybersecurity Assessment Tool as threats, vulnerabilities, and operational environments evolve.

According to the FFIEC’s May 31st press release, the update to the Cybersecurity Assessment Tool “addresses changes to the FFIEC IT Examination Handbook by providing a revised mapping in Appendix A to the updated Information Security and Management booklets”. The updated Cybersecurity Assessment Tool also provides “additional response options, allowing financial institution management to include supplementary or complementary behaviors, practices and processes that represent current practices of the institution in supporting its cybersecurity activity assessment.”

Financial institutions can find the updated version of the Cybersecurity Assessment Tool here.

On March 10, 2017, the White House Office of Management and Budget (“OMB”) released its 2016 Federal Information Security Modernization Act (“FISMA”) Annual Report to Congress. The FISMA Report describes the current state of Federal cybersecurity. It provides Congress with information on agencies’ progress towards meeting cybersecurity goals and identifies areas that need improvement. Additionally, the report provides information on Federal cybersecurity incidents, ongoing efforts to mitigate and prevent future incidents, and progress in implementing adequate cybersecurity programs and policies.

According to the FISMA report, agencies reported 30,899 cyber incidents that led to the compromise of information or system functionality in 2016. However, only sixteen of these incidents met the threshold for a “major incident” (which triggers a series of mandatory steps for agencies, including reporting certain information to Congress). The report categorizes the types of agency-reported incidents. The largest number of reported incidents (more than one-third) was “other,” meaning the attack method did not fit into a specific category or the cause of the attack was unidentified. The second largest was loss or theft of computer equipment. Attacks executed from websites or web-based applications were the third most common type of incident.

Despite these incidents, the report notes that there were government-wide improvements in cybersecurity, including agency implementation of:

  • Information Security Continuous Monitoring (“ISCM”) capabilities that provide situational awareness of the computers, servers, applications, and other hardware and software operating on agency networks;
  • Multi-factor authentication credentials that reduce the risk of unauthorized access to data by limiting users’ access to the resources and information required for their job functions; and
  • Anti-Phishing and Malware Defense capabilities that reduce the risk of compromise through email and malicious or compromised web sites.

Federal agencies will look to continue these cybersecurity improvements in 2017.

To view the Report, click here.


Today, acting FTC Chairman Maureen K. Ohlhausen and FCC Chairman Ajit Pai issued a joint statement on the FCC’s issuance of a temporary stay of a data security regulation for broadband providers scheduled to take effect on March 2.  In their statement, they advocate for a “comprehensive and consistent framework”, so that Americans do not have to “figure out if their information is protected differently depending on which part of the Internet holds it.”

The Chairmen stated that for this reason, they disagreed with the FCC’s 2015 unilateral decision to strip the FTC of its authority over broadband providers’ privacy and data security practices, and believed that jurisdiction over broadband providers’ privacy and data security practices should be returned to the FTC, thus subjecting “all actors in the online space” to the same rules.

Until then, the joint statement provides, the two chairmen “will work together on harmonizing the FCC’s privacy rules for broadband providers with the FTC’s standards for other companies in the digital economy.”  The statement provides that the FCC order was inconsistent with the FTC’s privacy framework. The stay will remain in place only until the FCC is able to rule on a petition for reconsideration of its privacy rules.

In response to concerns that the temporary delay of a rule not yet in effect will leave consumers unprotected, the Chairmen agree that it is vital to fill the consumer protection gap, but that “how that gap is filled matters” – it does not serve consumers’ interests to create two separate and distinct frameworks – one for Internet service providers and another for all other online companies.

Going forward, the statement says, the FTC and the FCC will work together to establish a uniform and technology-neutral privacy framework for the online world.

To view the joint FTC and FCC statement, click here.

To view the FCC Order staying the regulation, click here.