Over a dozen lawsuits have been filed by users and investors against Facebook after it was revealed last month that Cambridge Analytica, a political research firm, obtained personal information on millions of Facebook users. Cambridge Analytica obtained the data through a personality test app linked to Facebook accounts. Many of the lawsuits claim the information was used to create profiles and target audiences in order to categorize voters in the 2016 presidential election. Most of the lawsuits accuse Facebook of failing to protect users’ personal information despite stating in its privacy policy that Facebook users own and control the personal information they post on Facebook. Some of the lawsuits go beyond allegations of privacy violations and accuse Facebook of negligence, consumer fraud, unfair competition, securities fraud and racketeering. On March 16, Facebook announced that it was suspending Cambridge Analytica for violating Facebook’s policies on data gathering.

Starting April 9, Facebook will begin alerting users whose data may have been harvested by Cambridge Analytica. As part of this process, the company plans to post a link at the top of users’ news feeds that will allow them to see which apps are connected to their Facebook accounts and what information those apps are permitted to see. Additionally, Facebook CEO Mark Zuckerberg is scheduled to testify before the U.S. Congress on April 10 and April 11. Zuckerberg will appear before the Senate Judiciary and Commerce committees on April 10 and the House Energy and Commerce Committee on the morning of April 11. Zuckerberg’s testimony will hopefully shed more light on how this alleged violation occurred and its broader implications for data privacy.

On Wednesday, March 28, 2018, the Alabama Data Breach Notification Act of 2018 (SB318) was signed into law by the Governor, making Alabama the last of the 50 states to enact a data breach notification law.  (South Dakota’s data breach notification law was signed by its governor on March 21, 2018, making it the 49th state.)  The new law takes effect on June 1, 2018.  Below is a more detailed summary of the Alabama law:

Definitions.

The Alabama law defines a security breach as the “unauthorized acquisition of data in electronic form containing Sensitive Personally Identifying Information” (“Sensitive PII”).  As is typical, a breach does not include: (a) good faith acquisitions by employees or agents unless used for unrelated purposes; (b) the release of public records not otherwise subject to confidentiality or nondisclosure requirements; or (c) any lawful investigative, protection or intelligence activities by a state law enforcement or intelligence agency.

“Sensitive PII” is defined as an Alabama resident’s first name or first initial and last name in combination with one or more of the following regarding the same resident:

  • A non-truncated Social Security number or tax identification number;
  • A non-truncated driver’s license number, state ID number, passport number, military ID number, or other unique identification number issued on a government document;
  • A financial account number, including a bank account, credit card or debit card number, in combination with any security code, access code, password, expiration date, or PIN that is necessary to access the financial account or to conduct a transaction that will credit or debit the financial account;
  • Any information regarding an individual’s medical history, mental or physical condition, or medical treatment or diagnosis by a health care professional;
  • An individual’s health insurance policy number or subscriber identification number and any unique identifier used by a health insurer to identify the individual; or
  • A user name or email address, in combination with a password or security question and answer, that would permit access to an online account affiliated with the covered entity that is reasonably likely to contain or is used to obtain Sensitive PII.

Notification Requirements.

  • Notification to Individuals. If a covered entity determines that an unauthorized acquisition of Sensitive PII has occurred or is reasonably believed to have occurred, and is reasonably likely to cause substantial harm, it shall notify affected individuals as expeditiously as possible and without unreasonable delay, but no later than 45 days after the determination of both a breach and a likelihood of substantial harm. A federal or state law enforcement agency may request delayed notification if notice may interfere with an investigation.  If an entity determines that notice is not required, it shall document the determination and maintain the documentation for at least 5 years.
    • Format and Content. Written notice can be by mail or email, and must include: (1) the estimated date or date range of the breach; (2) a description of the Sensitive PII acquired; (3) a general description of actions taken to restore the security and confidentiality of the personal information; (4) steps an affected individual can take to protect himself or herself from identity theft; and (5) contact information for the covered entity in case of inquiries.
    • Substitute Notice. Substitute notice can be provided if direct notice would cause excessive cost relative to the covered entity’s resources, if the affected individuals exceed 100,000 persons, or if there is a lack of sufficient contact information for the individuals required to be notified.  Costs are automatically deemed excessive if they exceed $500,000.  Substitute notice may include both posting on the covered entity’s website for 30 days and using print or broadcast media in the major urban and rural areas where the individuals reside.  An alternative form of substitute notice may be approved by the Attorney General.
  • Notification to Attorney General. If the affected individuals exceed 1,000, the entity must notify the Attorney General as expeditiously as possible and without unreasonable delay, but no more than 45 days from receiving notice of a breach from a third party agent or upon determining that a breach has occurred and a substantial likelihood of harm exists. Notice must include: (1) an event synopsis; (2) the approximate number of affected individuals in Alabama; (3) any free services being offered by the covered entity to individuals and instructions on how to use them; and (4) contact information for additional inquiries.  The covered entity may provide supplemental or updated information at any time, and information marked as confidential is not subject to any open records or freedom of information laws.
  • Notification to Consumer Reporting Agencies. If the covered entity discovers notice is required to more than 1,000 individuals at a single time, it shall also notify, without unreasonable delay, all consumer reporting agencies.
  • Third Party Notification. Third party agents experiencing a breach of a system maintained on behalf of a covered entity shall notify the covered entity as expeditiously as possible and without unreasonable delay, but no later than 10 days following the determination (or reason to believe) a breach has occurred.

Enforcement.

  • Enforcement Authority. Violating the notification provisions is an unlawful trade practice under the Alabama Deceptive Trade Practices Act (ADTPA), and the Attorney General has exclusive authority to bring an action for penalties. There is no private cause of action.  The Attorney General also has exclusive authority to bring a class action for damages, but recovery is limited to actual damages plus reasonable attorney’s fees and costs.  The Attorney General must submit an annual report.
  • Penalties. Any entity knowingly violating the notification provisions is subject to ADTPA penalties, which can be up to $2,000/day, up to a cap of $500,000 per breach.   (“Knowing” means willfully or with reckless disregard.)  In addition to these penalties, a covered entity violating the notification provisions shall be liable for a penalty of up to $5,000/day for each day it fails to take reasonable action to comply with the notice provisions. Government entities are subject to the notice requirements, but exempt from penalties, although the Attorney General may bring an action to compel performance or enjoin certain acts.

Other Requirements.

  • While enforcement authority is limited to notification violations, the statute also instructs entities to take “reasonable security measures”, provides guidance on conducting a “good faith and prompt investigation” of a breach, and requires covered entities to take reasonable measures to dispose of Sensitive PII. It is unclear how these provisions might be enforced, except potentially to determine if a notification violation was willful or with reckless disregard.
    • “Reasonable Security Measures.” Covered entities and third party agents must implement and maintain reasonable security measures to protect Sensitive PII, and the law provides guidance on what elements to include.  It also provides guidance on what an assessment of a covered entity’s security measures might consider and emphasize.
    • Breach Investigation. A covered entity shall conduct a “good faith and prompt investigation”, and the law lists considerations to include in the investigation.
    • Records Disposal. A covered entity or third-party agent must take reasonable measures to dispose of or arrange for the disposal of records containing Sensitive PII when they are no longer to be retained, and the law includes examples of such disposal methods.

On February 21, 2018, the Securities and Exchange Commission (SEC) published a release entitled “Commission Statement and Guidance on Public Company Cybersecurity Disclosures” (“Release”).  Designed to assist public companies in preparing disclosures concerning cybersecurity risks and incidents, the Release expands upon the SEC’s previous guidance from 2011 to emphasize particular areas, including board oversight, disclosure controls and procedures, insider trading and Regulation FD. In addition, the Release addresses two topics not developed in the 2011 guidance: (1) the importance of cybersecurity policies and procedures, and (2) the application of insider trading prohibitions in the cybersecurity context.

The SEC’s Release covers the following major points:

Disclosure Guidance

  • Scope of Risk Disclosure Includes Potential Incidents. The SEC stated it is critical that public companies take all required actions to inform investors about material cybersecurity risks and incidents in a timely fashion, including companies that are subject to material cybersecurity risks but may not yet have been the target of a cyber-attack.
  • Weighing Materiality in Disclosure Obligations. In determining disclosure obligations, “companies generally weigh, among other things, the potential materiality of any identified risk, and, in the case of incidents, the importance of any compromised information and of the impact of the incident on the company’s operations.” Materiality may depend on the “nature, extent, and potential magnitude, particularly as they relate to any compromised information or the business and scope of company operations” and the range of harm incidents could cause, including harm to a company’s reputation, financial performance, customer and vendor relationships, and potential litigation or regulatory investigations or enforcement actions.  This includes regulatory actions by state and federal authorities as well as non-US authorities.
  • Timing and Content. The Release acknowledges the challenge of determining the appropriate timing for disclosures, as companies must have time to understand the incident’s scope and determine how much to disclose.
  • Risk Factors. The Release cites Regulations S-K and Form 20-F as requiring companies to disclose the most significant factors that make investments in the company’s securities speculative or risky. Companies should disclose cybersecurity-related risks if they are among such factors, including risks that arise in connection with acquisitions.  The Release states it would be helpful for companies to consider the following issues, among others, in evaluating cybersecurity risk factor disclosure:
    • The occurrence of prior cybersecurity incidents, including their severity and frequency;
    • The probability of the occurrence and potential magnitude of cybersecurity incidents;
    • The adequacy of preventative actions taken to reduce cybersecurity risks and the associated costs, including, if appropriate, discussing the limits of the company’s ability to prevent or mitigate certain cybersecurity risks;
    • The aspects of the company’s business and operations that give rise to material cybersecurity risk and the potential cost and consequences of such risks, including industry-specific risks and third party suppliers and service provider risks;
    • The costs associated with maintaining cybersecurity protections, including, if applicable, insurance coverage relating to cybersecurity incidents or payments to service providers;
    • The potential for reputational harm;
    • Existing or pending laws and regulations that may affect the cybersecurity requirements to which companies are subject and the associated costs to companies; and
    • Litigation, regulatory investigation, and remediation costs associated with cybersecurity incidents.
  • Content of Disclosure. Companies are not required to include disclosure that would provide a “roadmap” for how to breach a company’s security protections — such as technical information about systems, networks or devices — or other potential vulnerabilities in such detail as would make such assets more susceptible to an incident. However, the Commission does expect companies to disclose cybersecurity risks and incidents that are material to investors, to make appropriate disclosures timely and sufficiently prior to the offer and sale of securities, and to take steps to prevent officers, directors, and other insiders from trading securities until investors have been appropriately informed.  Companies should watch for situations in which they need to correct or update prior disclosures as additional information is learned.  In meeting their disclosure obligations, companies may need to disclose previous or ongoing incidents in order to place discussion of risks in the appropriate context.  For instance, if a company previously experienced a material denial-of-service attack, it likely would not be sufficient to merely disclose that there is a risk that a denial-of-service incident may occur.  Instead, the company may need to discuss the occurrence of that incident and its consequences as part of a broader discussion of the types of incidents that pose particular risks to the company’s business and operations.

Policies and Procedures

  • Board Oversight. Under current Item 407(h) of Regulation S-K, companies must disclose the board of directors’ role in the risk oversight of the company, and the Release suggests specific discussion of the nature of its role in cyber risk management, especially if cyber risks are material to the company’s business.  The Release indicates that disclosing a company’s cybersecurity risk management program and how its board engages with management on cybersecurity issues “allows investors to assess how a board is discharging its responsibilities in this increasingly important area.”  As a response to this release, companies may wish to consider broadening or deepening their board’s engagement with these issues.
  • Disclosure Controls and Procedures. The SEC stated that it is “crucial” for a public company to have disclosure controls and procedures that provide an appropriate method of discerning the impact of such matters on the company and its business, financial conditions, and results of operations, as well as a protocol to determine the potential materiality of such risks and incidents.  These controls and procedures must ensure senior management is promptly made aware of important cybersecurity issues to enable informed disclosure decisions regarding the substance of any issues and to facilitate appropriate officer certifications and disclosures regarding the effectiveness of the controls and procedures.  Companies should “identify cybersecurity risks and incidents, assess and analyze their impact on a company’s business, evaluate the significance associated with such risks and incidents, provide for open communications between technical experts and disclosure advisors and make timely disclosure regarding such risks and incidents.”  In addition, a company should not limit its disclosure controls and procedures to only what is specifically required, but should also ensure timely collection and evaluation of information potentially subject to disclosure.
  • Insider Trading and Regulation FD. The Commission reminds companies that, in some circumstances,  cybersecurity risks and incidents may constitute material nonpublic information, and that existing insider trading and Regulation FD policies should already include any type of material nonpublic information.  However, the Commission states, companies should consider highlighting this possibility through training, or adding cybersecurity incidents to lists within these policies of examples of potentially material information. Policy administrators should establish processes to ensure they are aware of developing cybersecurity incidents when determining whether to close certain trading windows or approve specific trades.  Even when there has been no insider trading violation, companies may be subject to scrutiny if executives trade prior to disclosure of cyber incidents that develop into significant events.  Companies must be mindful of making selective disclosures of cybersecurity events to the persons enumerated under Regulation FD (namely, persons reasonably expected to trade on the basis of such information) before that information is publicly announced.  Policies and procedures for addressing a cybersecurity event should inform those handling the situation of the need to maintain appropriate confidentiality until a public announcement is ready to be made.
  • Conclusion.  The Release highlights the SEC’s increased attention to disclosures related to cybersecurity and its concern that investors may not be fully informed about the growing risks associated with cybersecurity. In response to this Release, publicly traded companies should:
    • Review and consider refreshing the disclosures in their periodic reports and registration statements, taking into account the detailed criteria contained in the Release and how the impact of incidents (and the risks of potential incidents) may be material to the information that must be presented.
    • Evaluate policies, procedures, and practices related to disclosure to consider whether they need to be updated or refreshed in light of this Release, and to ensure that their boards’ oversight is in line with the risks faced by the company;
    • Consider whether information regarding cybersecurity- and privacy-related risks and incidents is appropriately developed and communicated to result in accurate and timely disclosures and to avoid inadvertent insider trading and Regulation FD violations.

A Berlin regional court recently ruled that Facebook’s use of personal data was illegal because the social media platform did not adequately secure the informed consent of its users. A German consumer rights group, the Federation of German Consumer Organisations (vzbv), said that Facebook’s default settings and some of its terms of service were in breach of consumer law, and that the court had found parts of the consent to data usage to be invalid. One concern highlighted by the consumer rights group was that, in Facebook’s app for smartphones, a service was pre-activated that revealed the user’s location to the person they were chatting with.  Also, in the privacy settings, ticks were already placed in boxes that allowed search engines to link to the user’s timeline, meaning that anyone would be able to quickly and easily find a user’s profile.

A week after the ruling, Facebook promised to radically overhaul its privacy settings, saying the work would prepare it for the introduction of the upcoming General Data Protection Regulation (GDPR).  Facebook has faced repeated attacks from Germany and other European regulators over issues ranging from perceived anti-competitive practices to alleged misuse of customer data. In October, the Article 29 Working Party (WP29) launched a task force to examine the sharing of user data between WhatsApp and Facebook, which it says does not have sufficient user consent.  “Whilst the WP29 notes there is a balance to be struck between presenting the user with too much information and not enough, the initial screen made no mention at all of the key information users needed to make an informed choice, namely that clicking the agree button would result in their personal data being shared with the Facebook family of companies,” the group told WhatsApp in October.

Similarly, a Belgian court earlier this month ordered Facebook to stop collecting data on users or face fines of €250,000 a day, up to a maximum of €100 million.  The court ruled that Facebook had broken privacy laws by tracking people on third-party sites. “Facebook informs us insufficiently about gathering information about us, the kind of data it collects, what it does with that data and how long it stores it,” the court said. “It also does not gain our consent to collect and store all this information.”  The court ordered Facebook to delete all data it had gathered illegally on Belgian citizens, including people who were not users of the social network.

With regard to the German suit, Facebook said it would appeal, releasing a statement that it had already made significant changes to its terms of service and data protection guidelines since the case was first brought in 2015. In the meantime, Facebook stated it would update its data protection guidelines and terms of service so that they comply with the new EU-wide GDPR rules.

On January 28, 2018, as part of Data Privacy Day, Facebook shared its data privacy principles for the first time. In a blog post drafted by Erin Egan, Facebook’s Chief Privacy Officer, Facebook posted these principles to help users understand how data is used and managed on the site. Among other things, Facebook’s data privacy principles stress user control of privacy, the goal of protecting users’ accounts and implementing security tools (like two-factor authentication), and user ownership of shared information. Facebook also announced the launch of a new education campaign to help users understand how data privacy is handled by the company. As part of this effort, Facebook is preparing to roll out a “Privacy Center” that features important privacy settings in a single place.

This publication comes ahead of the European Union’s (EU) General Data Protection Regulation (GDPR), which will be implemented on May 25, 2018. The GDPR will set stringent data privacy requirements for companies operating in the EU.  In recent years, Facebook has faced scrutiny from EU regulators over its handling of user data. Facebook hopes to embrace a more transparent data privacy approach to meet all GDPR obligations.

To view Facebook’s Privacy Principles, click here.

With the May 25, 2018 deadline quickly approaching, many businesses are scrambling to prepare for compliance with the EU’s General Data Protection Regulation (GDPR), and questions and conversations are heating up.  Others are still trying to wrap their arms around what the GDPR is and what it means for U.S. businesses.  For those still getting up to speed, below are a few basics to help familiarize yourself with the regulation and its relevance to you.

  1. I’m a U.S. business. Why does GDPR matter to me?

The reach of the GDPR extends not only to European-based businesses, but also to all companies that do business, have customers, or collect data from people in the EU.  If you even have a website that could collect data from someone visiting the site from the EU, your business could be affected. No matter where your business resides, if you intentionally offer goods or services to individuals in the European Union, or monitor the behavior of individuals within the EU, the GDPR could be applicable.

  2. What’s the risk?

In addition to the PR or brand risk of being associated with noncompliance, the GDPR provides for some significant monetary penalties.  Some violations are subject to fines of up to €10 million or up to 2% of global annual turnover, whichever is greater.  For other violations, the cap doubles – up to €20 million or 4% of global annual turnover, whichever is greater.  For large businesses, this could be a substantial amount.
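The two-tier, “whichever is greater” structure means the cap acts as a flat floor for small firms and scales with revenue for large ones. A quick illustrative sketch in Python (the turnover figures are hypothetical):

```python
def gdpr_fine_cap(annual_turnover_eur, severe):
    """Maximum possible GDPR fine for a given global annual turnover.

    Higher-tier violations: up to EUR 20M or 4% of turnover, whichever
    is greater; lower-tier violations: up to EUR 10M or 2%.
    """
    if severe:
        return max(20_000_000, 0.04 * annual_turnover_eur)
    return max(10_000_000, 0.02 * annual_turnover_eur)

# For a small firm the flat amount dominates; for a large one, the percentage.
print(gdpr_fine_cap(5_000_000, severe=True))      # 20000000
print(gdpr_fine_cap(2_000_000_000, severe=True))  # 80000000.0
```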

  3. What should I be doing?

First, talk with your general counsel or outside law firm.  They can help you interpret the law, review contractual obligations and assess the company’s overall privacy policies to help guide your compliance strategy going forward.  They can also help create defensible interpretations of certain ambiguous language in the regulation (e.g., what is “personal data” for purposes of the GDPR?).  The Article 29 Working Party, made up of the data protection authorities (DPAs) from all EU member states, has published guidance to clarify certain provisions, which can be helpful during this process.

Second, create a cross-functional team including (but not limited to) representatives from communications/PR, IT, customer experience, digital, legal and operations.  This may be fairly similar to the cross-functional teams you may have (and hopefully have) already established to prepare for data breaches.  This team can begin designing and implementing a compliance strategy.  Under certain conditions, your business may need to appoint a Data Protection Officer (DPO) (see Articles 37–39).

  4. What are some key points of the GDPR?

The GDPR is a data privacy regulation in the EU aimed at protecting users’ rights and privacy online.  It requires businesses to assess what kinds of data they’re collecting and to make that data accessible to users.  The regulation is long and complex with several moving parts, but a few key points may be worth noting.

Key Definitions:  You will see several references to controllers, data subjects, personal data, and processing.  This vocabulary may be unfamiliar in relation to U.S. law, but here is how these key terms are defined – as a business subject to the GDPR, you may be a “controller” or you may be a “processor”; the individual is the “data subject”:

  • “Controller” = “the natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data; where the purposes and means of such processing are determined by Union or Member State law, the controller or the specific criteria for its nomination may be provided for by Union or Member State law.”
  • “Processor” = “a natural or legal person, public authority, agency or other body which processes personal data on behalf of the controller”
  • “Data subject” = “an identified or identifiable natural person” (see the definition of “personal data” below).
  • “Personal data” = “any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.”
  • “Processing” = “any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organization, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction”
  5. Some Key Articles/Provisions:

Article 12. Transparent information, communication and modalities for the exercise of the rights of the data subject.

This article creates rules around how users give consent to record their data.  The data subject must be provided with accurate information on all relevant issues, such as the kind of data to be collected or processed, and for what purposes. For some particularly sensitive data (e.g., political opinions, religion, biometric data (including photographs), health data, etc.), consent must be “explicit”.  Consent must be “freely given”, meaning that the user has a “genuine” choice and is able to withdraw consent “without detriment”.  The data subject cannot be obliged to consent to data processing that is not necessary to provide the service he or she has requested.

For these reasons, the traditional “notice and consent” may not be sufficient, and actionable forms or buttons may be necessary.  Recital 32 of the GDPR notes that an affirmative action signaling consent may include ticking a box on a website, “choosing technical settings for information society services”, or “another statement or conduct” that clearly indicates assent to the processing.  “Silence, pre-ticked boxes or inactivity,” however, is presumed inadequate to confer consent.  For those reaching European citizens digitally, working with IT or UX experts may prove important to create a seamless, but compliant, experience.

Article 17. Right to erasure.

The “right to be forgotten” means that businesses must be able to erase a user’s data, at the user’s request, “without undue delay”.  Further, businesses have an obligation to erase personal data “without undue delay” under certain additional circumstances.

Article 20. Right to data portability.

Users have the right to receive any data that a business may have on them, and the business must provide such data in a “structured, commonly used and machine-readable format”.  Further, the data subject has the right to transmit such data to another business without being hindered by the business that provided the data where the processing is either (a) based on certain consents or (b) carried out by automated means.  Where technically feasible, the data subject also has the right to have the personal data transmitted directly from one controller to another.
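In practice, the “structured, commonly used and machine-readable format” requirement is often satisfied with formats such as JSON or CSV. A minimal sketch in Python (the record fields are hypothetical; the regulation does not prescribe a schema):

```python
import json

def export_subject_data(record):
    """Serialize a data subject's record to JSON -- a structured,
    commonly used, machine-readable format suitable for portability."""
    return json.dumps(record, indent=2, sort_keys=True)

profile = {
    "name": "Jane Doe",                  # hypothetical data subject
    "email": "jane@example.com",
    "preferences": {"newsletter": True},
}
print(export_subject_data(profile))
```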

Article 8. Conditions applicable to child’s consent in relation to information society services.

Article 8 limits the ability of children to consent to data processing without parental authorization.  Previous drafts of the GDPR had set the age of consent at 13 years old, which would have been consistent with the age set by the United States’ Children’s Online Privacy Protection Act (“COPPA”). A last-minute proposal aimed to raise the age of consent to 16 years old.  In the final draft, the age of consent is set at 16 unless a member state sets a lower age, no lower than 13 years.  Thus, unless otherwise provided by member state law, controllers must obtain parental consent when processing the personal data of a child under the age of 16. With the U.S. age of consent under COPPA set at 13 and the European age of consent under the GDPR set at 16 (unless lowered by a member state), this difference could present challenges for U.S. businesses offering international services.

Article 32.  Security of Processing.

Firms must follow security best practices across the board when collecting and protecting data. This may include, but isn’t limited to, specific password policies, information security frameworks (e.g., NIST, ISO, COBIT/ISACA, SSAE, etc.), and data encryption.
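As one concrete illustration of such a practice, passwords should be stored only as salted, slow hashes, never in plaintext. A minimal sketch using only the Python standard library (the iteration count and parameters are illustrative, not a compliance benchmark):

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # illustrative work factor; tune to current guidance

def hash_password(password, salt=None):
    """Derive a salted PBKDF2-HMAC-SHA256 digest; store (salt, digest)."""
    salt = salt if salt is not None else os.urandom(16)  # unique salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, digest):
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```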

  1. What Else Should I Know?

If you believe your business might be affected, you should already be familiarizing yourself with the GDPR and be well into your compliance plan.  The above summary is a sampling of key points, not a comprehensive analysis, which should be undertaken to better understand your compliance obligations.  You should also be aware of the ePrivacy Regulation, which will be following on the heels of the GDPR.

The GDPR covers the right to protection of personal data, while the ePrivacy Regulation encompasses a person’s right to a private life, including confidentiality.  There is some obvious overlap here, but the ePrivacy Regulation is intended to particularize the GDPR for electronic communications: devices, processing techniques, storage, browsers, etc.  The laws are intended to be in sync, but the ePrivacy Regulation is still up in the air, optimistically forecast to be finally approved by the end of 2018, although the implementation date remains to be seen.  In sum, GDPR compliance is all you can focus on right now, and GDPR compliance should position your business well for any additional obligations that could subsequently arise from the finalized ePrivacy Regulation.

The FTC is seeking public comment on a petition by Sears to reopen and modify its 2009 consent order to restrict the broad definition of “tracking application”.

Background.  In 2009, the FTC issued an order settling charges that Sears Holdings Management Corporation (“Sears”) had failed to adequately disclose the scope of consumers’ personal information it collected via a downloadable software application.  While Sears represented to consumers that the software would track their “online browsing”, the FTC alleged that the software would also monitor consumers’ other online secure sessions – including sessions on third parties’ websites — and collect information transmitted in those sessions, “such as the contents of shopping carts, online bank statements, drug prescription records, video rental records, library borrowing histories, and the sender, recipient, subject, and size for web-based emails.”  The software would also track some computer activities unrelated to the Internet.  The proposed settlement called for Sears to stop collecting data from consumers who downloaded the software, and to destroy all data it had previously collected.

The 2009 Sears case is significant, among other reasons, because the FTC found a violation of Section 5 of the FTC Act notwithstanding Sears’ disclosure, on the ground that the disclosure was not sufficiently conspicuous.  Specifically, while Sears did disclose the full scope of the software’s specific functions, the details of such functions were contained at approximately the 75th line of the scroll box containing the privacy statement and user license agreement.  The FTC order stated that because such description was not displayed clearly and prominently, Sears was being “unfair and deceptive” under Section 5 of the FTC Act.

Petition.  On October 30, 2017, Sears petitioned the FTC to reopen and modify its final order to narrow the broad definition of “tracking application”.  Sears states that the current definition should be updated because of changing circumstances over the past eight years that have resulted in the definition unnecessarily restricting Sears’s ability to compete in the mobile app marketplace.  Sears states that the requested modification would enable the company to “keep step with current market practices” related to retail online tracking applications.

  • Definition. Paragraph 4 of the consent order defines “tracking application” as:  “any software program or application disseminated by or on behalf of respondent, its subsidiaries or affiliated companies, that is capable of being installed on consumers’ computers and used by or on behalf of respondent to monitor, record, or transmit information about activities occurring on computers on which it is installed, or about data that is stored on, created on, transmitted from, or transmitted to the computers on which it is installed.” 
  • Modification. Sears requests that the following additional language be inserted after the word “installed”: “unless the information monitored, recorded, or transmitted is limited solely to the following: (a) the configuration of the software program or application itself; (b) information regarding whether the program or application is functioning as represented; or (c) information regarding consumers’ use of the program or application itself.”
  • Rationale. Sears states that the proposed modification is necessary to carve out commonly accepted and expected behaviors from the scope of the Order without modifying the Order’s core mandate of providing notice to consumers when software applications engage in potentially invasive tracking.  Sears states subparts (a) and (b) would exclude “activities common to all modern software applications” while subpart (c) would exclude “information tracking that is commonly accepted by consumers and that does not present the type of risks to consumer privacy that the Order was intended to remedy.”  Sears further states that the proposed modification mirrors language that the FTC has used to exclude such commonly accepted practices from more recent consent orders.

Solicitation of Public Comment.  On November 8, the FTC issued a release seeking public comment on Sears’s petition requesting that it reopen and modify the 2009 order and definition.  The FTC will decide whether to approve Sears’s petition following the expiration of the 30-day public comment period.  Public comments may be submitted until December 8, 2017.

To view the 2009 FTC Order, click here.

To view Sears’s Petition, click here.

To view the FTC’s solicitation of public comment, click here.

 

On August 7, 2017, the U.S. Securities and Exchange Commission (SEC), through its Office of Compliance Inspections and Examinations (OCIE), published a Risk Alert summarizing observations on how broker-dealers, investment advisers, and investment companies have addressed cybersecurity issues. The OCIE examined 75 financial firms registered with the SEC. The examinations focused on the firms’ written policies regarding cybersecurity. The OCIE observed increased cybersecurity preparedness since a similar 2014 observational initiative, but also noticed areas of compliance and oversight that could be improved.

In particular, the OCIE observed that almost all firms that were examined maintain cybersecurity-related written procedures regarding protection of customer and shareholder records and information. Additionally, the examinations confirmed many of the firms are conducting cybersecurity risk assessments, penetration tests and vulnerability scans, and maintaining clearly defined cybersecurity organizational charts for workforces. However, the OCIE also observed that, in some cases, firms are administering vague or unclear cybersecurity policies, are not adequately following cybersecurity policies, or are not conducting adequate system maintenance to address system vulnerabilities. The Risk Alert concluded that, despite some improvements, cybersecurity remains one of the top compliance risks for financial firms. The OCIE noted that it will continue to monitor financial firms’ compliance in this area.

To view the Risk Alert, click here.

 

This month, the Federal Trade Commission (FTC) issued guidance for businesses operating websites and online services looking to comply with the Children’s Online Privacy Protection Act (“COPPA”). COPPA addresses the collection of personal information from children under 13.  Importantly, the determination of whether a business’s website is “directed to children under 13” (and thus subject to certain COPPA requirements) is based on a variety of factors; thus, even websites that do not target children as their primary audience may nonetheless be subject to COPPA’s requirements based on the website’s subject matter, visual and audio content, ads on the site that may be directed to children, and other factors.

The FTC’s guidance notes that updates to the COPPA regulations were made in July 2013 to reflect changes in technology, and reminds businesses that violations can result in law enforcement actions as well as civil penalties.  The compliance guidance sets out steps for (1) determining whether your business is covered by COPPA; (2) if so, ensuring compliance, including privacy policy provisions and notifying and obtaining verifiable consent from parents; (3) providing methods for parents to review, delete, or revoke consent; and (4) implementing reasonable security procedures. Finally, the guidance provides a chart describing limited exceptions to the parental consent requirement.

  • Step 1: Determine if Your Company is a Website or Online Service that Collects Personal Information from Kids Under 13.
  • Step 2: Post a Privacy Policy that Complies with COPPA.
  • Step 3: Notify Parents Directly Before Collecting Personal Information from Their Kids.
  • Step 4: Get Parents’ Verifiable Consent Before Collecting Personal Information from Their Kids.
  • Step 5: Honor Parents’ Ongoing Rights with Respect to Personal Information Collected from Their Kids.
  • Step 6: Implement Reasonable Procedures to Protect the Security of Kids’ Personal Information.
  • Chart: Limited Exceptions to COPPA’s Verifiable Parental Consent Requirement

The six COPPA compliance steps are described below. To view the FTC’s full guidance webpage, click here.

NOTE:  In addition to COPPA, it may be worth determining whether California’s state version of COPPA, the California Online Privacy Protection Act (“CalOPPA”), applies to your business and, if so, whether additional compliance measures may be necessary. CalOPPA broadly applies whenever a website or app collects “personally identifiable information” or PII (as defined in the state’s business code) from a California resident, and thus applies to the vast majority of online businesses, even if not based in California.

 

On May 31, 2017, the Federal Financial Institutions Examination Council (FFIEC) released an update to its Cybersecurity Assessment Tool.

The Cybersecurity Assessment Tool was originally released by the FFIEC in June of 2015 to help financial institutions identify their risks and assess their cybersecurity preparedness.  The Cybersecurity Assessment Tool is intended to be used by financial institutions of all sizes to perform a self-assessment and inform their risk management strategies. Upon the release of the original Cybersecurity Assessment Tool, the FFIEC noted its plan to update the Cybersecurity Assessment Tool as threats, vulnerabilities, and operational environments evolve.

According to the FFIEC’s May 31st press release, the update to the Cybersecurity Assessment Tool “addresses changes to the FFIEC IT Examination Handbook by providing a revised mapping in Appendix A to the updated Information Security and Management booklets”. The updated Cybersecurity Assessment Tool also provides “additional response options, allowing financial institution management to include supplementary or complementary behaviors, practices and processes that represent current practices of the institution in supporting its cybersecurity activity assessment.”

Financial institutions can find the updated version of the Cybersecurity Assessment Tool here.