Advertising & Marketing

On January 21, 2019, the French Data Protection Authority, the Commission Nationale de l'Informatique et des Libertés ("CNIL"), announced a sanction of 50 million euros against Google.  On May 25 and 28, 2018, the CNIL received complaints from two different associations asserting that Google did not have a valid legal basis for processing the personal data of the users of its services, particularly with respect to ad personalization.  The complaints were brought by "None of Your Business", a nonprofit organization chaired by Max Schrems (yes, that Max Schrems), and "La Quadrature du Net", a French digital rights advocacy group. The decision is significant for at least two reasons: (1) it reveals the CNIL's analysis of why it was permitted to issue the decision and sanction despite Google's European headquarters being in Ireland, and (2) it is the first time the CNIL has leveraged its new powers under the GDPR to issue a sanction greater than its €20 million pre-GDPR limits.

Coordination of Enforcement

The GDPR establishes a "one stop shop" mechanism, providing that an organization with a main establishment in the European Union shall have only one interlocutor, the Data Protection Authority ("DPA") in the country where its main establishment is located, which serves as the "lead authority".  In Google's case, its European headquarters are in Ireland.  The lead authority must coordinate cooperation among the other DPAs before taking any decision about cross-border processing carried out by the company. The CNIL cited the definition of "main establishment" in Article 4(16)(a):  "as regards a controller with establishments in more than one Member State, the place of its central administration in the Union, unless the decisions on the purposes and means of the processing of personal data are taken in another establishment of the controller in the Union and the latter establishment has the power to have such decisions implemented, in which case the establishment having taken such decisions is to be considered to be the main establishment …".  It then discussed several elements of Google's European headquarters in Ireland.

After lengthy discussion, the CNIL concluded that Google's European headquarters in Ireland could not be considered a main establishment within the meaning of Article 4(16), because it was not established that the Irish headquarters had decision-making power over the privacy policies presented to the user during the creation of an account in the course of configuring an Android mobile phone.  In the absence of a main establishment, the CNIL was therefore competent to initiate this procedure and to exercise its powers. The CNIL accordingly asserted authority to make decisions regarding Google's processing operations, applying the new European framework as interpreted by the European authorities in the EDPB's guidelines.

The CNIL's restricted committee carried out online inspections in September 2018 to verify the compliance of the processing operations implemented by Google with the French Data Protection Act and the GDPR by analyzing the browsing pattern of a user and the documents he or she can access when creating a Google account during the configuration of Android mobile equipment. On the basis of its inspections, the CNIL's restricted committee observed two types of breaches of the GDPR.

Violation of Transparency and Information.

First, the CNIL noticed that the information provided by Google was not easily accessible for users:

“Essential information, such as the data processing purposes, the data storage periods or the categories of personal data used for the ads personalization, are excessively disseminated across several documents, with buttons and links on which it is required to click to access complementary information. The relevant information is accessible after several steps only, implying sometimes up to 5 or 6 actions. For instance, this is the case when a user wants to have complete information on his or her data collected for the personalization purposes or for the geo-tracking service.”

The restricted committee also observed that some information is not always clear or comprehensive:

“Users are not able to fully understand the extent of the processing operations carried out by Google. But the processing operations are particularly massive and intrusive because of the number of services offered (about twenty), and the amount and the nature of the data processed and combined. The restricted committee observe[d] in particular that the purposes of processing are described in a too generic and vague manner, and so are the categories of data processed for these various purposes. Similarly, the information communicated is not clear enough so that the user can understand that the legal basis of processing operations for the ads personalization is the consent, and not the legitimate interest of the company. Finally, the restricted committee notices that the information about the retention period is not provided for some data.”

Violation of the Obligation to Have a Legal Basis for Ads Personalization Processing.

Although Google stated that it obtained user consent to process data for ads personalization purposes, the committee considered that the consent was not validly obtained for two reasons:

“First, the restricted committee observed that the users’ consent was not sufficiently informed.   The information on processing operations for the ads personalization is diluted in several documents and does not enable the user to be aware of their extent. For example, in the section “Ads Personalization”, it is not possible to be aware of the plurality of services, websites and applications involved in these processing operations (Google search, You tube, Google home, Google maps, Playstore, Google pictures…) and therefore of the amount of data processed and combined.”

Second, the committee observed that the consent collected by Google was neither "specific" nor "unambiguous".  Admittedly, when a user creates an account he or she can modify some account options by clicking on the button "More options", accessible above the button "Create Account".  It is notably possible to configure the display of personalized ads.  However, the user not only has to click on "More options" to access the configuration, but the display of ads personalization is also pre-checked. The GDPR requires that consent be "unambiguous", given only through a clear affirmative action from the user (e.g., opting in by ticking a box that is not pre-ticked, as opposed to opting out by clearing a pre-ticked box). Finally, before creating an account, the user is asked to tick the boxes "I agree to Google's Terms of Service" and "I agree to the processing of my information as described above and further explained in the Privacy Policy" in order to create the account.  In other words, the user gives his or her consent in full for all of the processing purposes carried out by Google on the basis of this consent (e.g., ads personalization, speech recognition, etc.). However, the GDPR provides that consent is "specific" only if it is given distinctly for each purpose.


As a result of its findings, the committee publicly imposed a financial penalty of 50 million euros against Google, representing the first time that the CNIL applied the new sanction limits provided by the GDPR.  CNIL stated that the amount and publicity of the sanction was “justified by the severity of the infringements observed regarding the essential principles of the GDPR:  transparency, information, and consent.”

Despite the measures implemented by Google (documentation and configuration tools), the CNIL stated that the infringements observed "deprive the users of essential guarantees regarding processing operations that can reveal important parts of their private life since they are based on a huge amount of data, a wide variety of services, and almost unlimited possible combinations."  The committee recalled that the extent of the processing operations in question "imposes to enable the users to control their data and therefore to sufficiently inform them and allow them to validly consent."  Moreover, the committee stated, the violations were continuous breaches of the regulation, as they were still observed to date; this was not a one-off, time-limited infringement.  The CNIL also noted the important place the Android operating system holds on the French market, with thousands of French citizens creating Google accounts every day when using their smartphones. Finally, the restricted committee pointed out that the economic model of the company is partly based on ads personalization.

Google Response.

In a statement obtained by ABC News, a Google spokesperson said the company is “studying the decision” to determine its next steps:

“People expect high standards of transparency and control from us. We’re deeply committed to meeting those expectations and the consent requirements of the GDPR. We’re studying the decision to determine our next steps.”

To view the CNIL press release, click here.

To view the CNIL decision (in French), click here.

A Berlin regional court recently ruled that Facebook’s use of personal data was illegal because the social media platform did not adequately secure the informed consent of its users. A German consumer rights group, the Federation of German Consumer Organisations (vzbv), said that Facebook’s default settings and some of its terms of service were in breach of consumer law, and that the court had found parts of the consent to data usage to be invalid. One concern highlighted by the consumer rights group was that, in Facebook’s app for smartphones, a service was pre-activated that revealed the user’s location to the person they were chatting to.  Also, in the privacy settings, ticks were already placed in boxes that allowed search engines to link to the user’s timeline, meaning that anyone would be able quickly and easily to find a user’s profile.

A week after the ruling, Facebook promised to radically overhaul its privacy settings, saying the work would prepare it for the introduction of the upcoming General Data Protection Regulation (GDPR).  Facebook has faced repeated attacks from Germany and other European regulators over issues ranging from perceived anti-competitive practices to alleged misuse of customer data. In October, the Article 29 Working Party (WP29) launched a task force to examine the sharing of user data between WhatsApp and Facebook, which it says does not have sufficient user consent.  “Whilst the WP29 notes there is a balance to be struck between presenting the user with too much information and not enough, the initial screen made no mention at all of the key information users needed to make an informed choice, namely that clicking the agree button would result in their personal data being shared with the Facebook family of companies,” the group told WhatsApp in October.

Similarly, a Belgian court earlier this month ordered Facebook to stop collecting data on users or face daily fines of €250,000, up to a maximum of €100 million.  The court ruled that Facebook had broken privacy laws by tracking people on third-party sites. “Facebook informs us insufficiently about gathering information about us, the kind of data it collects, what it does with that data and how long it stores it,” the court said. “It also does not gain our consent to collect and store all this information.”  The court ordered Facebook to delete all data it had gathered illegally on Belgian citizens, including people who were not users of the social network.

With regard to the German suit, Facebook said it would appeal, releasing a statement that it had already made significant changes to its terms of service and data protection guidelines since the case was first brought in 2015. In the meantime, Facebook stated it would update its data protection guidelines and terms of service so that they comply with the new EU-wide GDPR rules.

With the May 25, 2018 deadline quickly approaching, many businesses are scrambling to prepare for compliance with the EU’s General Data Protection Regulation (GDPR), and questions and conversations are heating up.  Others are still trying to wrap their arms around what the GDPR is and what it means for U.S. businesses.  For those of you still trying to wrap your heads around it, below are a few basics to help familiarize yourself with the regulation and its relevance to you.

  1. I’m a U.S. business. Why does GDPR matter to me?

The reach of the GDPR extends not only to European-based businesses, but also to all companies that do business with, have customers in, or collect data from people in the EU.  If you even have a website that could collect data from someone visiting the site from the EU, your business could be affected. No matter where your business resides, if you intentionally offer goods or services to individuals in the European Union, or monitor the behavior of individuals within the EU, the GDPR could be applicable.

  2. What’s the risk?

In addition to the PR or brand risk of being associated with noncompliance, the GDPR provides for some significant monetary penalties.  Some violations are subject to fines of up to 10 million euros or 2% of global annual turnover, whichever is greater.  For other violations, the ceiling doubles — up to 20 million euros or 4% of global annual turnover, whichever is greater.  For large businesses, this could be a substantial amount.
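The "whichever is greater" mechanics are worth working through, since for a large business the percentage cap, not the fixed euro amount, sets the ceiling. A minimal sketch of the Article 83 tiers (the function name and figures used in the example are illustrative only):

```python
def max_gdpr_fine(annual_turnover_eur: float, severe: bool) -> float:
    """Upper bound of a GDPR administrative fine under Article 83.

    Lower tier: up to EUR 10M or 2% of global annual turnover, whichever is greater.
    Upper tier: up to EUR 20M or 4% of global annual turnover, whichever is greater.
    """
    fixed_cap, pct = (20_000_000, 0.04) if severe else (10_000_000, 0.02)
    return float(max(fixed_cap, pct * annual_turnover_eur))

# A business with EUR 2 billion in global turnover faces an upper-tier
# ceiling of EUR 80 million, not EUR 20 million.
print(max_gdpr_fine(2_000_000_000, severe=True))   # 80000000.0
print(max_gdpr_fine(100_000_000, severe=True))     # 20000000.0 (fixed cap dominates)
```

The actual fine within that ceiling depends on the factors listed in Article 83(2) (nature, gravity, duration, intent, and so on); the calculation above only bounds it.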

  3. What should I be doing?

First, talk with your general counsel or outside law firm.  They can help you interpret the law, review contractual obligations and assess the company’s overall privacy policies to help guide your compliance strategy going forward.  They can also help create defensible interpretations within certain ambiguous language in the regulation (e.g., what is “personal data” for purposes of the GDPR?).  The Article 29 Working Party, made up of the data protection authorities (DPAs) from all EU member states, has published guidance to clarify certain provisions, which can be helpful during this process.

Second, create a cross-functional team spanning areas including (but not limited to) communications/PR, IT, customer experience, digital, legal and operations.  This may be fairly similar to any cross-functional teams you may have (and hopefully have) already established to prepare for data breaches.  This team can begin designing and implementing a compliance strategy.  Under certain conditions, your business may need to appoint a Data Protection Officer (DPO) (see Articles 37–39).

  4. What are some key points of the GDPR?

The GDPR is a data privacy regulation in the EU aimed at protecting users’ rights and privacy online.  It requires businesses to assess what kinds of data they’re collecting and to make that data accessible to users.  The regulation is long and complex with several moving parts, but a few key points are worth noting.

Key Definitions:  You will see several references to controllers, data subjects, personal data, and processing.  This vocabulary may be unfamiliar in relation to U.S. law, but here is how these key terms are defined – as a business subject to GDPR, you may be a “controller” or you may be a “processor”.  The individual is the “data subject”:

  • “Controller” = “the natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data; where the purposes and means of such processing are determined by Union or Member State law, the controller or the specific criteria for its nomination may be provided for by Union or Member State law.”
  • “Processor” = “means a natural or legal person, public authority, agency or other body which processes personal data on behalf of the controller”
  • “Data subject” = “an identified or identifiable natural person (see the definition of “personal data” below).”
  • “Personal data” = “any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.”
  • “Processing” = “any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organization, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction”
  5. Some Key Articles/Provisions:

Article 12. Transparent information, communication and modalities for the exercise of the rights of the data subject.

This article creates rules around how users give consent to record their data.  The data subject must be provided with accurate information on all relevant issues, such as the kind of data to be collected or processed, and for what purposes. For some particularly sensitive data (e.g., political opinions, religion, biometric data (including photographs), health data, etc.), consent must be “explicit”.   Consent must be “freely given”, meaning that the user has a “genuine” choice and must be able to withdraw consent “without detriment”.  The data subject cannot be obliged to consent to data processing that is not necessary to provide the service he or she has requested.

For these reasons, the traditional “notice and consent” approach may not be sufficient, and actionable forms or buttons may be necessary.  Recital 32 of the GDPR notes that an affirmative action signaling consent may include ticking a box on a website, “choosing technical settings for information society services”, or “another statement or conduct” that clearly indicates assent to the processing.  “Silence, pre-ticked boxes or inactivity,” however, is presumed inadequate to confer consent.  For those reaching European citizens digitally, working with IT or UX experts may prove important to create a seamless, but compliant, experience.
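For teams building or auditing such flows, the two recurring failure modes above (pre-ticked defaults and bundled, all-purpose consent) can be expressed as checks on a stored consent record. This is a hypothetical sketch, not a compliance tool; the record shape and field names are assumptions for illustration:

```python
def consent_is_valid(record: dict) -> bool:
    """Hypothetical check of a stored consent record against the themes above:
    consent must result from an affirmative act (no pre-ticked boxes or
    inactivity) and must be given distinctly per purpose, not in bulk."""
    if record.get("pre_ticked") or not record.get("affirmative_action"):
        return False  # silence, pre-ticked boxes, or inactivity do not confer consent
    purposes = record.get("purposes", {})
    # "specific": every purpose needs its own explicit grant or refusal
    return bool(purposes) and all(v in ("granted", "refused") for v in purposes.values())

# Bundled consent covering every purpose at once fails the "specific" check:
bundled = {"affirmative_action": True, "pre_ticked": False,
           "purposes": {"ads_personalization": "bundled", "speech_recognition": "bundled"}}
# A distinct per-purpose choice passes:
granular = {"affirmative_action": True, "pre_ticked": False,
            "purposes": {"ads_personalization": "granted", "speech_recognition": "refused"}}
print(consent_is_valid(bundled))   # False
print(consent_is_valid(granular))  # True
```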

Article 17. Right to erasure.

The “right to be forgotten” means that businesses must be able to remove data on a user at his or her request “without undue delay”.  Further, businesses have an obligation to erase personal data “without undue delay” under certain additional circumstances.

Article 20. Right to data portability.

Users have the right to receive any data that a business may have on them, and the business must provide such data in a “structured, commonly used and machine-readable format”.  Further, the data subject has the right to transmit such data to another business without being hindered by the business that provided the data where the processing is either (a) based on certain consents or (b) carried out by automated means.  Where technically feasible, the data subject also has the right to have the personal data transmitted directly from one controller to another.
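In practice, the "structured, commonly used and machine-readable format" requirement is often satisfied with something like JSON or CSV. A minimal sketch of what an Article 20 export endpoint might produce (the function and the sample record are hypothetical):

```python
import json

def export_user_data(user_record: dict) -> str:
    """Hypothetical Article 20 export: serialize the data held on a data
    subject into a structured, machine-readable format (JSON here), so it
    can be handed to the user or transmitted to another controller."""
    return json.dumps(user_record, indent=2, sort_keys=True)

record = {"name": "Jane Example", "email": "jane@example.com",
          "preferences": {"newsletter": True}}
exported = export_user_data(record)
# The export round-trips: a receiving controller can parse it directly.
assert json.loads(exported) == record
```

The key design point is the round trip: a PDF of the same data would be human-readable but not "machine-readable" in the sense the article intends.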

Article 8. Conditions applicable to child’s consent in relation to information society services.

Article 8 limits the ability of children to consent to data processing without parental authorization.  Previous drafts of the GDPR had set the age of consent at 13 years old, which would have been consistent with the age set by the United States’ Children’s Online Privacy Protection Act (“COPPA”). A last-minute proposal aimed to raise the age of consent to 16 years old.  In the final draft, the age of consent is set at 16 unless a member state sets a lower age, no lower than 13 years.  Thus, unless otherwise provided by member state law, controllers must obtain parental consent when processing the personal data of a child under the age of 16. With the U.S. age of consent under COPPA set at 13 and the European age of consent under the GDPR set at 16 (unless lowered by a member state), this could present some challenges for U.S. businesses offering international services.
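The resulting logic for a service with EU users is a per-member-state lookup with a default of 16. A sketch, using invented country codes and illustrative (not authoritative) threshold values:

```python
# Hypothetical per-member-state thresholds; Article 8 defaults to 16 but
# allows a member state to set a lower age, no lower than 13.
MEMBER_STATE_AGE = {"UK": 13, "FR": 15, "DE": 16}  # illustrative values only
DEFAULT_AGE = 16

def needs_parental_consent(age: int, member_state: str) -> bool:
    """Return True when Article 8 would require parental authorization
    before processing this child's personal data."""
    threshold = MEMBER_STATE_AGE.get(member_state, DEFAULT_AGE)
    if not 13 <= threshold <= 16:
        raise ValueError("a member-state threshold must be between 13 and 16")
    return age < threshold

# The same 14-year-old user may or may not need parental consent
# depending on the member state:
print(needs_parental_consent(14, "DE"))  # True
print(needs_parental_consent(14, "UK"))  # False
```

For a U.S. business this sits alongside, not instead of, COPPA's under-13 rules; the stricter applicable regime governs for any given user.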

Article 32.  Security of Processing.

Firms must follow security best practices across the board when collecting and protecting data. This may include, but isn’t limited to, specific password policies, information security frameworks (e.g., NIST, ISO, COBIT/ISACA, SSAE, etc.), and data encryption.
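As one concrete illustration of the kind of baseline measure Article 32 contemplates, passwords should be stored as salted, iterated hashes rather than in plaintext. A sketch using Python's standard library (the iteration count and parameters are illustrative, not a recommendation):

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # illustrative work factor; tune to current guidance

def hash_password(password: str, salt: bytes = None) -> tuple:
    """Derive a salted PBKDF2-HMAC-SHA256 digest so the plaintext password
    is never stored; a fresh random salt defeats precomputed tables."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, expected: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, expected)  # constant-time comparison

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```

Article 32 is risk-based and technology-neutral, so this is one example measure among many, not a safe harbor.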

  6. What Else Should I Know?

If you believe your business might be affected, you should already be familiarizing yourself with the GDPR and be well into your compliance plan.  The above summary is a sampling of key points, not the comprehensive analysis you should undertake to better understand your compliance obligations.  You should also be aware of the ePrivacy Regulation, which will be following on the heels of the GDPR.

The GDPR covers the right to protection of personal data, while the ePrivacy Regulation encompasses a person’s right to a private life, including confidentiality.  There is some obvious overlap here, but the ePrivacy Regulation is intended to particularize the GDPR for electronic communications (devices, processing techniques, storage, browsers, etc.).  The laws are intended to be in sync, but the ePrivacy Regulation is still up in the air; it is optimistically forecast to be approved by the end of 2018, although the implementation date remains to be seen.  In sum, GDPR compliance is all you can focus on right now, and GDPR compliance should position your business well for any additional obligations that could subsequently arise from the finalized ePrivacy Regulation.

Today, the FTC issued its National Do Not Call Registry Data Book for Fiscal Year 2017 (October 1, 2016 to September 30, 2017).

The National Do Not Call Registry Data Book contains statistical data about phone numbers on the Registry, telemarketers and sellers accessing phone numbers on the Registry, and complaints consumers submit to the FTC about telemarketers allegedly violating the Do Not Call rules. Statistical data on Do Not Call (DNC) complaints is based on unverified complaints reported by consumers, not on a consumer survey. This year’s Data Book has been redesigned to provide more information on robocall complaints and new information about the types of calls consumers reported to the FTC, and to include a complete state-by-state analysis.  In addition, the FTC has developed a mini site on its website to make the information in the FY 2017 Data Book more accessible for the public, such as providing a webpage for each state. For the first time, the data behind the report will be available in (.csv) data files.  New Jersey led the nation in complaints per 100,000 population, with Puerto Rico in last place.
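The rankings in the Data Book (and in the state summaries below) are based on a simple rate rather than raw counts, which is why a small jurisdiction can outrank a much larger one. A sketch of the computation, using made-up figures rather than actual Data Book numbers:

```python
def complaints_per_100k(complaints: int, population: int) -> float:
    """Rate used for the Data Book's state rankings:
    DNC complaints per 100,000 residents."""
    return complaints / population * 100_000

# Hypothetical state figures (complaints, population) for illustration only:
states = {"A": (108_003, 10_000_000), "B": (24_303, 700_000)}
rates = {s: complaints_per_100k(c, p) for s, (c, p) in states.items()}
ranking = sorted(rates, key=rates.get, reverse=True)
print(ranking)  # ['B', 'A'] — the smaller state outranks the larger one on rate
```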

Here are some statistics from our firm’s geographic footprint states (rankings are based on complaints per 100,000 population).


  • Complaints: Ranked #22 (108,003) – up from 66,812 in 2016.
  • Active registrations: Ranked #31 (3,393,619)  — up from 3.33M in 2016
  • Complaints by call type:
  1. Robocall: 71,627
  2. Live caller: 34,896
  3. Type not reported: 1,480
  • Top 10 Complaints by topics (not all complaints report a topic):
  1. Reducing debt: 13,673
  2. Warranties and Protection Plans: 5,036
  3. Home Security and Alarms: 3,973
  4. Imposters: 3,390
  5. Medical and Prescriptions: 3,305
  6. Vacation and timeshares: 1,783
  7. Computer & Technical Support: 1,386
  8. Lotteries, prizes and sweepstakes: 1,110
  9. Work from home: 584
  10. Home improvement and cleaning: 253

District of Columbia (Washington, D.C.)

  • Complaints: Not ranked (24,303) – up from 18,304 in 2016.
  • Active registrations: Not ranked (620,154)  — up from 605,725 in 2016
  • Complaints by call type:
  1. Robocall: 16,724
  2. Live caller: 7,383
  3. Type not reported: 196
  • Top 10 Complaints by topics (not all complaints report a topic):
  1. Reducing debt: 4,103
  2. Vacation and timeshares: 1,628
  3. Imposters: 1,262
  4. Warranties and protection plans: 665
  5. Medical and prescriptions: 459
  6. Computer and technical support: 419
  7. Lotteries, prizes and sweepstakes: 413
  8. Energy, solar, and utilities: 196
  9. Home improvement and cleaning: 188
  10. Home security and alarms: 178


  • Complaints: Ranked #3 (588,021) – up from 385,490 in 2016
  • Active Registrations: Ranked #29 (14,605,866) – up from 14.39M in 2016
  • Complaints by call type:
  1. Robocall: 363,801
  2. Live caller: 219,301
  3. Type not reported: 4,919
  • Top 10 Complaints by topics (not all complaints report a topic):
  1. Reducing debt: 63,231
  2. Vacation and timeshare: 19,058
  3. Warranties and protection plans: 18,939
  4. Imposters: 18,262
  5. Medical and prescriptions: 16,997
  6. Home security and alarms: 10,906
  7. Computer and technical support: 8,977
  8. Lotteries, prizes and sweepstakes: 6,270
  9. Energy, solar and utilities: 5,108
  10. Work from home: 3,370


  • Complaints: Ranked #15 (242,242) – up from 168,478  in 2016.
  • Active registrations: Ranked #34 (7,095,159)  — up from 6.97M in 2016
  • Complaints by call type:
  1. Robocall: 153,542
  2. Live caller: 86,095
  3. Type not reported: 2,605
  • Top 10 Complaints by topics (not all complaints report a topic):
  1. Reducing debt: 28,121
  2. Vacation and timeshares: 11,711
  3. Warranties and protection plans: 10,658
  4. Medical and prescriptions: 7,913
  5. Imposters: 7,817
  6. Home security and alarms: 5,789
  7. Computer and technical support: 3,091
  8. Lotteries, prizes and sweepstakes: 2,840
  9. Work from home: 1,467
  10. Home improvement and cleaning: 1,237


  • Complaints: Ranked #46 (39,969) – up from 25,221 in 2016.
  • Active registrations: Ranked #48 (1,636,395)  — up from 1.60M in 2016
  • Complaints by call type:
  1. Robocall: 25,674
  2. Live caller: 13,567
  3. Type not reported: 728
  • Top 10 Complaints by topics (not all complaints report a topic):
  1. Reducing debt: 6,111
  2. Warranties and protection plans: 2,252
  3. Imposters: 1,413
  4. Vacation and timeshares: 1,261
  5. Home security & alarms: 1,075
  6. Medical and prescriptions: 1,068
  7. Computer and technical support: 554
  8. Lotteries, prizes and sweepstakes: 528
  9. Work from home: 254
  10. Home improvement and cleaning: 59

Yesterday, the Federal Trade Commission (FTC) and Federal Communications Commission (FCC) announced their intent to coordinate online consumer protection efforts between the two agencies following the adoption of the Restoring Internet Freedom Order, and published a draft Memorandum of Understanding (MOU) that outlines those efforts.

The draft MOU outlines a number of ways in which the FCC and FTC will coordinate and collaborate, including:

  • The FCC will review informal complaints concerning the compliance of Internet service providers (ISPs) with the disclosure obligations set forth in the new transparency rule. Those obligations include publicly providing information concerning an ISP’s practices with respect to blocking, throttling, paid prioritization, and congestion management. Should an ISP fail to make the required disclosures—either in whole or in part—the FCC will take enforcement action.
  • The FTC will investigate and take enforcement action as appropriate against ISPs concerning the accuracy of those disclosures, as well as other deceptive or unfair acts or practices involving their broadband services.
  • The FCC and the FTC will broadly share legal and technical expertise, including the secure sharing of informal complaints regarding the subject matter of the Restoring Internet Freedom Order. The two agencies also will collaborate on consumer and industry outreach and education.


The FCC is expected to vote on the order at its December 14 meeting. This order would reverse the 2015 “net neutrality” order reclassifying broadband Internet access service as a Title II common carrier service.  According to the FTC’s press release, one of the impacts of this reclassification was to “strip the FTC of its authority to protect consumers and promote competition with respect to Internet service providers because the FTC does not have jurisdiction over common carrier activities.”  By reversing the order, the FCC would return jurisdiction to the FTC to police the conduct of ISPs with respect to their disclosures and privacy practices.  Once adopted, the order would require broadband Internet access service providers to disclose their network management practices, performance, and commercial terms of service.  The FTC could then police their implementation of those practices under the prohibition on “unfair or deceptive acts or practices” in Section 5 of the FTC Act.

In response to the MOU, FCC Chairman Ajit Pai stated that the MOU “will be a critical benefit for online consumers because it outlines the robust process by which the FCC and FTC will safeguard the public interest. …  This approach protected a free and open Internet for many years prior to the FCC’s 2015 Title II Order and it will once again following the adoption of the Restoring Internet Freedom Order.”  Acting FTC Chairman Maureen K. Ohlhausen stated that “[t]he FTC is committed to ensuring that Internet service providers live up to the promises they make to consumers … [and that] [t]he MOU we are developing with the FCC, in addition to the decades of FTC law enforcement experience in this area, will help us carry out this important work.”

FCC Commissioner Mignon Clyburn, who opposes the proposed order, released the following statement:  “The agreement announced today between the FCC and FTC is a confusing, lackluster, reactionary afterthought: an attempt to paper over weaknesses in the Chairman’s draft proposal repealing the FCC’s 2015 net neutrality rules.  Two years ago, the FCC signed a much broader pro-consumer agreement with the FTC that already covers this issue. There is no reason to do this again other than as a smoke and mirrors PR stunt, distracting from the FCC’s planned destruction of net neutrality protections later this week.”

To view the MOU, click here.

The FTC is seeking public comment on a petition by Sears to reopen and modify its 2009 consent order to restrict the broad definition of “tracking application”.

Background.  In 2009, the FTC issued an order settling charges that Sears Holdings Management Corporation (“Sears”) had failed to adequately disclose the scope of consumers’ personal information it collected via a downloadable software application.  While Sears represented to consumers that the software would track their “online browsing”, the FTC alleged that the software would also monitor consumers’ other online secure sessions – including sessions on third parties’ websites — and collect information transmitted in those sessions, “such as the contents of shopping carts, online bank statements, drug prescription records, video rental records, library borrowing histories, and the sender, recipient, subject, and size for web-based emails.”  The software would also track some computer activities unrelated to the Internet.  The proposed settlement called for Sears to stop collecting data from consumers who downloaded the software, and to destroy all data it had previously collected.

The 2009 Sears case is significant, among other reasons, because the FTC found a violation of Section 5 of the FTC Act notwithstanding Sears’ disclosure, because the disclosure was not sufficiently conspicuous.  Specifically, while Sears did disclose the full scope of the software’s specific functions, the details of those functions appeared at approximately the 75th line of the scroll box containing the privacy statement and user license agreement.  The FTC order stated that because the description was not displayed clearly and prominently, Sears’ practice was “unfair and deceptive” under Section 5 of the FTC Act.

Petition.  On October 30, 2017, Sears petitioned the FTC to reopen and modify its final order to narrow the broad definition of “tracking application”.   Sears states that the current definition should be updated because of changed circumstances over the past eight years that result in the definition unnecessarily restricting Sears’s ability to compete in the mobile app marketplace. Sears states that the requested modification would enable the company to “keep step with current market practices” related to retail online tracking applications.

  • Definition. Paragraph 4 of the consent order defines “tracking application” as:  “any software program or application disseminated by or on behalf of respondent, its subsidiaries or affiliated companies, that is capable of being installed on consumers’ computers and used by or on behalf of respondent to monitor, record, or transmit information about activities occurring on computers on which it is installed, or about data that is stored on, created on, transmitted from, or transmitted to the computers on which it is installed.” 
  • Modification. Sears requests that the following additional language be inserted after the word “installed”: “unless the information monitored, recorded, or transmitted is limited solely to the following: (a) the configuration of the software program or application itself; (b) information regarding whether the program or application is functioning as represented; or (c) information regarding consumers’ use of the program or application itself.”
  • Rationale. Sears states that the proposed modification is necessary to carve out commonly accepted and expected behaviors from the scope of the Order without undermining the Order’s core purpose of providing notice to consumers when software applications engage in potentially invasive tracking.  Sears states subparts (a) and (b) would exclude “activities common to all modern software applications” while subpart (c) would exclude “information tracking that is commonly accepted by consumers and that does not present the type of risks to consumer privacy that the Order was intended to remedy.” Sears further states that the proposed modification mirrors language that the FTC has used to exclude such commonly accepted practices from more recent consent orders.

Solicitation of Public Comment.  On November 8, the FTC issued a release seeking public comment on Sears’ petition requesting that it reopen and modify the 2009 order and definition.  The FTC will decide whether to approve Sears’ petition following the expiration of the 30-day public comment period.  Public comments may be submitted until December 8, 2017.

To view the 2009 FTC Order, click here.

To view Sears’s Petition, click here:

To view FTC’s solicitation of public comment, click here.


On November 11, 2016, Facebook announced that it would no longer allow advertisers to exclude specific racial and ethnic groups when placing ads related to housing, credit or employment, according to a statement to USA TODAY by Erin Egan, Facebook’s vice president of U.S. public policy.  According to the news article, Facebook will also require advertisers to affirm that they will not place discriminatory ads on Facebook, and plans to offer educational materials to help advertisers understand their obligations.
