Advertising & Marketing

A Berlin regional court recently ruled that Facebook’s use of personal data was illegal because the social media platform did not adequately secure the informed consent of its users. A German consumer rights group, the Federation of German Consumer Organisations (vzbv), said that Facebook’s default settings and some of its terms of service were in breach of consumer law, and that the court had found parts of the consent to data usage to be invalid.  One concern highlighted by the consumer rights group was that, in Facebook’s app for smartphones, a service was pre-activated that revealed the user’s location to the person they were chatting to.  Also, in the privacy settings, ticks were already placed in boxes that allowed search engines to link to the user’s timeline, meaning that anyone would be able to quickly and easily find a user’s profile.

A week after the ruling, Facebook promised to radically overhaul its privacy settings, saying the work would prepare it for the introduction of the upcoming General Data Protection Regulation (GDPR).  Facebook has faced repeated attacks from German and other European regulators over issues ranging from perceived anti-competitive practices to alleged misuse of customer data. In October, the Article 29 Working Party (WP29) launched a task force to examine the sharing of user data between WhatsApp and Facebook, which it says does not have sufficient user consent.  “Whilst the WP29 notes there is a balance to be struck between presenting the user with too much information and not enough, the initial screen made no mention at all of the key information users needed to make an informed choice, namely that clicking the agree button would result in their personal data being shared with the Facebook family of companies,” the group told WhatsApp in October.

Similarly, a Belgian court earlier this month ordered Facebook to stop collecting data on users or face daily fines of €250,000, up to a maximum of €100 million.  The court ruled that Facebook had broken privacy laws by tracking people on third-party sites. “Facebook informs us insufficiently about gathering information about us, the kind of data it collects, what it does with that data and how long it stores it,” the court said. “It also does not gain our consent to collect and store all this information.”  The court ordered Facebook to delete all data it had gathered illegally on Belgian citizens, including people who were not users of the social network.

With regard to the German suit, Facebook said it would appeal, releasing a statement that it had already made significant changes to its terms of service and data protection guidelines since the case was first brought in 2015. In the meantime, Facebook stated it would update its data protection guidelines and terms of service so that they comply with the new EU-wide GDPR rules.

With the May 25, 2018 deadline quickly approaching, many businesses are scrambling to prepare for compliance with the EU’s General Data Protection Regulation (GDPR), and questions and conversations are heating up.  Others are still trying to wrap their arms around what GDPR is and what it means for U.S. businesses.  For those of you still getting up to speed, below are a few basics to help familiarize yourself with the regulation and its relevance to you.

  1. I’m a U.S. business. Why does GDPR matter to me?

The reach of the GDPR extends not only to European-based businesses, but also to all companies that do business, have customers, or collect data from people in the EU.  If you even have a website that could collect data from someone visiting the site from the EU, your business could be affected. No matter where your business resides, if you intentionally offer goods or services to individuals in the European Union, or monitor the behavior of individuals within the EU, the GDPR could be applicable.

  2. What’s the risk?

In addition to the PR or brand risk of being associated with noncompliance, the GDPR provides for some pretty significant monetary penalties.  Some violations are subject to fines of up to €10 million or up to 2% of global annual turnover, whichever is greater.  For other violations, it is double: up to €20 million or 4% of global annual turnover, whichever is greater.  For large businesses, this could be a substantial amount.
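
To make the “whichever is greater” mechanics concrete, here is a minimal sketch of how the fine ceilings scale with turnover (illustrative only; the function name and the turnover figures in the usage examples are hypothetical, not drawn from the regulation):

```python
def max_gdpr_fine(annual_turnover_eur: float, severe: bool) -> float:
    """Illustrative ceiling on a GDPR fine.

    Lower tier: up to EUR 10M or 2% of global annual turnover, whichever is greater.
    Upper tier: up to EUR 20M or 4% of global annual turnover, whichever is greater.
    """
    fixed_cap = 20_000_000 if severe else 10_000_000
    pct_cap = annual_turnover_eur * (0.04 if severe else 0.02)
    return max(fixed_cap, pct_cap)

# Hypothetical examples:
print(max_gdpr_fine(2_000_000_000, severe=True))   # 80,000,000 -> 4% exceeds EUR 20M
print(max_gdpr_fine(100_000_000, severe=False))    # 10,000,000 -> fixed ceiling applies
```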

  3. What should I be doing?

First, talk with your general counsel or outside law firm.  They can help you interpret the law, review contractual obligations and assess the company’s overall privacy policies to help guide your compliance strategy going forward.  They can also help develop defensible interpretations of ambiguous language in the regulation (e.g., what is “personal data” for purposes of the GDPR?).  The Article 29 Working Party, made up of the data protection authorities (DPAs) from all EU member states, has published guidance to clarify certain provisions, which can be helpful during this process.

Second, create a cross-functional team drawing from areas including (but not limited to) communications/PR, IT, customer experience, digital, legal and operations.  This may look fairly similar to any cross-functional team you have (hopefully) already established to prepare for data breaches.  This team can begin designing and implementing a compliance strategy.  Under certain conditions, your business may need to appoint a Data Protection Officer (DPO) (see Article 37).

  4. What are some key points of the GDPR?

GDPR is a data privacy regulation in the EU that is aimed at protecting users’ rights and privacy online.  It requires businesses to assess what kinds of data they’re collecting and to make that data accessible to users.  The regulation is long and complex with several moving parts, but a few key points are worth noting.

Key Definitions:  You will see several references to controllers, data subjects, personal data, and processing.  This vocabulary may be unfamiliar in relation to U.S. law, but here is how these key terms are defined – as a business subject to GDPR, you may be a “controller” or you may be a “processor”.  The individual is the “data subject”:

  • “Controller” = “the natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data; where the purposes and means of such processing are determined by Union or Member State law, the controller or the specific criteria for its nomination may be provided for by Union or Member State law.”
  • “Processor” = “a natural or legal person, public authority, agency or other body which processes personal data on behalf of the controller”
  • “Data subject” = “an identified or identifiable natural person” (see the definition of “personal data” below)
  • “Personal data” = “any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.”
  • “Processing” = “any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organization, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction”
  5. Some Key Articles/Provisions:

Article 12. Transparent information, communication and modalities for the exercise of the rights of the data subject.

This article creates rules around how users give consent to the recording and use of their data.  The data subject must be provided with accurate information on all relevant issues, such as the kind of data to be collected or processed, and for what purposes. For some particularly sensitive data (e.g., political opinions, religion, biometric data (including photographs), health data), consent must be “explicit”.  Consent must be “freely given”, meaning that the user must have a “genuine” choice and be able to withdraw consent “without detriment”.  The data subject cannot be obliged to consent to data processing that is not necessary to provide the service he or she has requested.

For these reasons, the traditional “notice and consent” approach may not be sufficient, and actionable forms or buttons may be necessary.  Recital 32 of the GDPR notes that an affirmative action signaling consent may include ticking a box on a website, “choosing technical settings for information society services”, or “another statement or conduct” that clearly indicates assent to the processing.  “Silence, pre-ticked boxes or inactivity,” however, is presumed inadequate to confer consent.  For those reaching European citizens digitally, working with IT or UX experts may prove important to create a seamless, but compliant, experience.
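
As a rough illustration of the “affirmative act” principle for digital consent flows, the sketch below records consent only when the user actively checks an unchecked-by-default box and supports withdrawal; the class and field names are hypothetical, not prescribed by the GDPR:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    purpose: str                      # e.g., a hypothetical "marketing_email" purpose
    granted: bool
    timestamp: datetime
    withdrawn_at: Optional[datetime] = None

def record_consent(form_data: dict, purpose: str) -> ConsentRecord:
    # Consent must come from an affirmative act: the box is unchecked by default,
    # and absence of the field (silence/inactivity) is treated as "no consent".
    granted = form_data.get(f"consent_{purpose}") == "on"
    return ConsentRecord(purpose=purpose, granted=granted,
                         timestamp=datetime.now(timezone.utc))

def withdraw_consent(record: ConsentRecord) -> ConsentRecord:
    # Withdrawal should be as easy as giving consent and honored immediately.
    record.granted = False
    record.withdrawn_at = datetime.now(timezone.utc)
    return record
```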

Article 17. Right to erasure.

The “right to be forgotten” means that businesses must be able to remove a user’s data at the user’s request “without undue delay”.  Further, businesses have an obligation to erase personal data “without undue delay” under certain additional circumstances.
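
A minimal sketch of what an erasure workflow might look like, assuming a hypothetical set of data-store objects that each expose a delete_user_data() method; the GDPR does not prescribe any particular implementation:

```python
from datetime import datetime, timezone

def handle_erasure_request(user_id: str, stores: list) -> dict:
    """Erase a user's personal data from every store that holds it.

    `stores` is a hypothetical list of repository objects, each with a
    `name` attribute and a `delete_user_data(user_id)` method.
    """
    results = {}
    for store in stores:
        # Delete from each system of record (databases, caches, backups, etc.).
        results[store.name] = store.delete_user_data(user_id)
    # Keep a minimal audit entry (not the erased data itself) showing when
    # the request was fulfilled.
    results["completed_at"] = datetime.now(timezone.utc).isoformat()
    return results
```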

Article 20. Right to data portability.

Users have the right to receive any personal data that a business holds on them, and the business must provide such data in a “structured, commonly used and machine-readable format”.  Further, the data subject has the right to transmit such data to another business without being hindered by the business that provided the data where the processing is either (a) based on certain consents or (b) carried out by automated means.  Where technically feasible, the data subject also has the right to have the personal data transmitted directly from one controller to another.
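
For illustration, a “structured, commonly used and machine-readable format” can be as simple as JSON or CSV. The sketch below assumes the data subject’s records have already been gathered from your systems into plain Python structures; the function names are hypothetical:

```python
import csv
import io
import json

def export_user_data_json(user_record: dict) -> str:
    """Return the data subject's records as JSON (machine-readable)."""
    return json.dumps(user_record, indent=2, default=str)

def export_user_data_csv(rows) -> str:
    """Alternative CSV export for tabular records (e.g., order history).

    `rows` is assumed to be a non-empty list of dicts sharing the same keys.
    """
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    return buffer.getvalue()
```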

Article 8. Conditions applicable to child’s consent in relation to information society services.

Article 8 limits the ability of children to consent to data processing without parental authorization.  Previous drafts of the GDPR had set the age of consent at 13 years old, which would have been consistent with the age set by the United States’ Children’s Online Privacy Protection Act (“COPPA”). A last-minute proposal aimed to raise the age of consent to 16 years old.  In the final draft, the age of consent is set at 16 unless a member state sets a lower age, which may be no lower than 13 years.  Thus, unless otherwise provided by member state law, controllers must obtain parental consent when processing the personal data of a child under the age of 16. With the U.S. age of consent under COPPA set at 13 and the European age of consent under the GDPR set at 16 (unless lowered by a member state), this could present challenges for U.S. businesses offering international services.
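
A sketch of how the age threshold might be enforced in practice, assuming a hypothetical lookup table of member-state overrides (any real table would need to be verified against each member state’s implementing law):

```python
# Hypothetical override table; populate only after verifying each member
# state's implementing legislation.
MEMBER_STATE_AGE_OF_CONSENT = {
    "DEFAULT": 16,   # GDPR default under Article 8
    # "XX": 13,      # a member state may lower the age, but not below 13
}

def parental_consent_required(age: int, member_state: str) -> bool:
    """Return True if parental authorization is needed before processing."""
    threshold = MEMBER_STATE_AGE_OF_CONSENT.get(
        member_state, MEMBER_STATE_AGE_OF_CONSENT["DEFAULT"])
    return age < threshold
```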

Article 32. Security of processing.

Firms must follow security best practices across the board when collecting and protecting data. This may include, but isn’t limited to, specific password policies, information security frameworks (e.g., NIST, ISO, COBIT/ISACA, SSAE, etc.), and data encryption.
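
As an example of two of the baseline controls mentioned above, the sketch below shows salted password hashing with Python’s standard library and symmetric encryption of personal data at rest using the third-party cryptography package; this is a sketch of common practice, not a checklist mandated by Article 32:

```python
import hashlib
import os

from cryptography.fernet import Fernet  # third-party: pip install cryptography

def hash_password(password: str):
    """Store only a salted hash, never the plaintext password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def encrypt_at_rest(plaintext: bytes, key: bytes) -> bytes:
    """Symmetric encryption for personal data stored at rest."""
    return Fernet(key).encrypt(plaintext)

# Usage sketch:
key = Fernet.generate_key()              # in practice, keep in a key management service
token = encrypt_at_rest(b"data subject record", key)
salt, pw_hash = hash_password("correct horse battery staple")
```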

  6. What Else Should I Know?

If you believe your business might be affected, you should already be familiarizing yourself with the GDPR and be well into your compliance plan.  The above summary is a sampling of key points, not the comprehensive analysis that should be undertaken to better understand your compliance obligations.  You should also be aware of the ePrivacy Regulation, which will follow on the heels of the GDPR.

The GDPR covers the right to protection of personal data, while the ePrivacy Regulation encompasses a person’s right to a private life, including confidentiality.  There is some obvious overlap here, but the ePrivacy Regulation is intended to particularize the GDPR for electronic communications (devices, processing techniques, storage, browsers, etc.).  The laws are intended to be in sync, but the ePrivacy Regulation is still up in the air, optimistically forecast to be approved by the end of 2018, with the implementation date yet to be determined.  In sum, GDPR compliance is all you can focus on right now, and it should position your business well for any additional compliance obligations that arise from the finalized ePrivacy Regulation.

Today, the FTC issued its National Do Not Call Registry Data Book for Fiscal Year 2017 (October 1, 2016 to September 30, 2017).

The National Do Not Call Registry Data Book contains statistical data about phone numbers on the Registry, telemarketers and sellers accessing phone numbers on the Registry, and complaints consumers submit to the FTC about telemarketers allegedly violating the Do Not Call rules. Statistical data on Do Not Call (DNC) complaints is based on unverified complaints reported by consumers, not on a consumer survey. This year’s Data Book has been redesigned to provide more information on robocall complaints, new information about the types of calls consumers reported to the FTC, and a complete state-by-state analysis.  In addition, the FTC has developed a mini site to make the information in the FY 2017 Data Book more accessible to the public, including a webpage for each state. For the first time, the data behind the report will be available in (.csv) data files.  New Jersey led the nation in complaints per 100,000 population, with Puerto Rico in last place.

Here are some statistics from our firm’s geographic footprint states (rankings are based on complaints per 100,000 population).

Alabama:

  • Complaints: Ranked #22 (108,003) – up from 66,812 in 2016.
  • Active registrations: Ranked #31 (3,393,619)  — up from 3.33M in 2016
  • Complaints by call type:
  1. Robocall: 71,627
  2. Live caller: 34,896
  3. Type not reported: 1,480
  • Top 10 Complaints by topics (not all complaints report a topic):
  1. Reducing debt: 13,673
  2. Warranties and protection plans: 5,036
  3. Home security and alarms: 3,973
  4. Imposters: 3,390
  5. Medical and prescriptions: 3,305
  6. Vacation and timeshares: 1,783
  7. Computer and technical support: 1,386
  8. Lotteries, prizes and sweepstakes: 1,110
  9. Work from home: 584
  10. Home improvement and cleaning: 253

District of Columbia (Washington, D.C.):

  • Complaints: Not ranked (24,303) – up from 18,304 in 2016.
  • Active registrations: Not ranked (620,154)  — up from 605,725 in 2016
  • Complaints by call type:
  1. Robocall: 16,724
  2. Live caller: 7,383
  3. Type not reported: 196
  • Top 10 Complaints by topics (not all complaints report a topic):
  1. Reducing debt: 4,103
  2. Vacation and timeshares: 1,628
  3. Imposters: 1,262
  4. Warranties and protection plans: 665
  5. Medical and prescriptions: 459
  6. Computer and technical support: 419
  7. Lotteries, prizes and sweepstakes: 413
  8. Energy, solar, and utilities: 196
  9. Home improvement and cleaning: 188
  10. Home security and alarms: 178

Florida:

  • Complaints: Ranked #3 (588,021) – up from 385,490 in 2016
  • Active Registrations: Ranked #29 (14,605,866) – up from 14.39M in 2016
  • Complaints by call type:
  1. Robocall: 363,801
  2. Live caller: 219,301
  3. Type not reported: 4,919
  • Top 10 Complaints by topics (not all complaints report a topic):
  1. Reducing debt: 63,231
  2. Vacation and timeshare: 19,058
  3. Warranties and protection plans: 18,939
  4. Imposters: 18,262
  5. Medical and prescriptions: 16,997
  6. Home security and alarms: 10,906
  7. Computer and technical support: 8,977
  8. Lotteries, prizes and sweepstakes: 6,270
  9. Energy, solar and utilities: 5,108
  10. Work from home: 3,370

Georgia:

  • Complaints: Ranked #15 (242,242) – up from 168,478  in 2016.
  • Active registrations: Ranked #34 (7,095,159)  — up from 6.97M in 2016
  • Complaints by call type:
  1. Robocall: 153,542
  2. Live caller: 86,095
  3. Type not reported: 2,605
  • Top 10 Complaints by topics (not all complaints report a topic):
  1. Reducing debt: 28,121
  2. Vacation and timeshares: 11,711
  3. Warranties and protection plans: 10,658
  4. Medical and prescriptions: 7,913
  5. Imposters: 7,817
  6. Home security and alarms: 5,789
  7. Computer and technical support: 3,091
  8. Lotteries, prizes and sweepstakes: 2,840
  9. Work from home: 1,467
  10. Home improvement and cleaning: 1,237

Mississippi:

  • Complaints: Ranked #46 (39,969) – up from 25,221 in 2016.
  • Active registrations: Ranked #48 (1,636,395)  — up from 1.60M in 2016
  • Complaints by call type:
  1. Robocall: 25,674
  2. Live caller: 13,567
  3. Type not reported: 728
  • Top 10 Complaints by topics (not all complaints report a topic):
  1. Reducing debt: 6,111
  2. Warranties and protection plans: 2,252
  3. Imposters: 1,413
  4. Vacation and timeshares: 1,261
  5. Home security and alarms: 1,075
  6. Medical and prescriptions: 1,068
  7. Computer and technical support: 554
  8. Lotteries, prizes and sweepstakes: 528
  9. Work from home: 254
  10. Home improvement and cleaning: 59

Yesterday, the Federal Trade Commission (FTC) and Federal Communications Commission (FCC) announced their intent to coordinate online consumer protection efforts following the adoption of the Restoring Internet Freedom Order, and published a draft Memorandum of Understanding (MOU) that outlines how the two agencies will divide and coordinate those efforts.

The draft MOU outlines a number of ways in which the FCC and FTC will coordinate and collaborate, including:

  • The FCC will review informal complaints concerning the compliance of Internet service providers (ISPs) with the disclosure obligations set forth in the new transparency rule. Those obligations include publicly providing information concerning an ISP’s practices with respect to blocking, throttling, paid prioritization, and congestion management. Should an ISP fail to make the required disclosures—either in whole or in part—the FCC will take enforcement action.
  • The FTC will investigate and take enforcement action as appropriate against ISPs concerning the accuracy of those disclosures, as well as other deceptive or unfair acts or practices involving their broadband services.
  • The FCC and the FTC will broadly share legal and technical expertise, including the secure sharing of informal complaints regarding the subject matter of the Restoring Internet Freedom Order. The two agencies also will collaborate on consumer and industry outreach and education.


The FCC is expected to vote on the order at its December 14 meeting. This order would reverse the 2015 “net neutrality” order reclassifying broadband Internet access service as a Title II common carrier service.  According to the FTC’s press release, one of the impacts of this reclassification was to “strip the FTC of its authority to protect consumers and promote competition with respect to Internet service providers because the FTC does not have jurisdiction over common carrier activities.”  By reversing the order, the FCC would return jurisdiction to the FTC to police the conduct of ISPs with respect to their disclosures and privacy practices.  Once adopted, the order would require broadband Internet access service providers to disclose their network management practices, performance, and commercial terms of service.  The FTC could then police their implementation of those practices under the prohibition on “unfair or deceptive acts or practices” in Section 5 of the FTC Act.

In response to the MOU, FCC Chairman Ajit Pai stated that the MOU “will be a critical benefit for online consumers because it outlines the robust process by which the FCC and FTC will safeguard the public interest. …  This approach protected a free and open Internet for many years prior to the FCC’s 2015 Title II Order and it will once again following the adoption of the Restoring Internet Freedom Order.”  Acting FTC Chairman Maureen K. Ohlhausen stated that “[t]he FTC is committed to ensuring that Internet service providers live up to the promises they make to consumers … [and that] [t]he MOU we are developing with the FCC, in addition to the decades of FTC law enforcement experience in this area, will help us carry out this important work.”

FCC Commissioner Mignon Clyburn, who opposes the proposed order, released the following statement:  “The agreement announced today between the FCC and FTC is a confusing, lackluster,  reactionary afterthought: an attempt to paper over weaknesses in the Chairman’s draft proposal repealing the FCC’s 2015 net neutrality rules.  Two years ago, the FCC signed a much broader pro-consumer agreement with the FTC that already covers this issue. There is no reason to do this again other than as a smoke and mirrors PR stunt, distracting from the FCC’s planned destruction of net neutrality protections later this week.”

To view the MOU, click here.

The FTC is seeking public comment on a petition by Sears to reopen and modify its 2009 consent order to narrow the broad definition of “tracking application”.

Background.  In 2009, the FTC issued an order settling charges that Sears Holdings Management Corporation (“Sears”) had failed to adequately disclose the scope of consumers’ personal information it collected via a downloadable software application.  While Sears represented to consumers that the software would track their “online browsing”, the FTC alleged that the software would also monitor consumers’ other online secure sessions – including sessions on third parties’ websites — and collect information transmitted in those sessions, “such as the contents of shopping carts, online bank statements, drug prescription records, video rental records, library borrowing histories, and the sender, recipient, subject, and size for web-based emails.”  The software would also track some computer activities unrelated to the Internet.  The proposed settlement called for Sears to stop collecting data from consumers who downloaded the software, and to destroy all data it had previously collected.

The 2009 Sears case is significant, among other reasons, because the FTC found a violation of Section 5 of the FTC Act notwithstanding Sears’ disclosure, on the ground that the disclosure was not sufficiently conspicuous.  Specifically, while Sears did disclose the full scope of the software’s specific functions, the details of those functions were contained on approximately the 75th line of the scroll box containing the privacy statement and user license agreement.  The FTC stated that because the description was not displayed clearly and prominently, Sears’ practices were “unfair and deceptive” under Section 5 of the FTC Act.

Petition.  On October 30, 2017, Sears petitioned the FTC to reopen its final order and modify the broad definition of “tracking application”.  Sears states that the current definition should be updated because of changed circumstances over the past eight years that result in the definition unnecessarily restricting Sears’ ability to compete in the mobile app marketplace. Sears states that the requested modification would enable the company to “keep step with current market practices” related to retail online tracking applications.

  • Definition. Paragraph 4 of the consent order defines “tracking application” as:  “any software program or application disseminated by or on behalf of respondent, its subsidiaries or affiliated companies, that is capable of being installed on consumers’ computers and used by or on behalf of respondent to monitor, record, or transmit information about activities occurring on computers on which it is installed, or about data that is stored on, created on, transmitted from, or transmitted to the computers on which it is installed.” 
  • Modification. Sears requests that the following additional language be inserted after the word “installed”: “unless the information monitored, recorded, or transmitted is limited solely to the following: (a) the configuration of the software program or application itself; (b) information regarding whether the program or application is functioning as represented; or (c) information regarding consumers’ use of the program or application itself.”
  • Rationale. Sears states that the proposed modification is necessary to carve out commonly accepted and expected behaviors from the scope of the Order without modifying the Order’s core purpose of providing notice to consumers when software applications engage in potentially invasive tracking.  Sears states subparts (a) and (b) would exclude “activities common to all modern software applications” while subpart (c) would exclude “information tracking that is commonly accepted by consumers and that does not present the type of risks to consumer privacy that the Order was intended to remedy.” Sears further states that the proposed modification mirrors language that the FTC has used to exclude such commonly accepted practices from more recent consent orders.

Solicitation of Public Comment.  On November 8, the FTC issued a release seeking public comment on Sears’ petition requesting that it reopen and modify the 2009 order and definition.  The FTC will decide whether to approve Sears’ petition following the expiration of the 30-day public comment period.  Public comments may be submitted until December 8, 2017.

To view the 2009 FTC Order, click here.

To view Sears’ petition, click here.

To view the FTC’s solicitation of public comment, click here.


On November 11, 2016, Facebook announced that it would no longer allow advertisers to exclude specific racial and ethnic groups when placing ads related to housing, credit or employment, according to a statement to USA TODAY by Erin Egan, Facebook’s vice president of U.S. public policy.  According to the news article, Facebook will also require advertisers to affirm that they will not place discriminatory ads on Facebook, and plans to offer educational materials to help advertisers understand their obligations.
