On Monday, March 13, 2023, the Texas House Business & Industry Committee held a hearing on the main data privacy bill for this legislative session, filed by Representative Capriglione of Southlake, TX, a Dallas suburb. The 34-page bill filed earlier this year aims to comprehensively address how companies and consumers interact with personal data. Similar to the California and European laws and a handful of other state data privacy laws enacted last year, the bill outlines five rights that consumers have to control their data, including the right to know when personal data is being collected and the right to access their data. However, the bill is somewhat unique in that, whereas other applicability provisions filter out small businesses based on annual gross revenue or the volume of data collected or processed, the Texas bill expressly exempts a “small business” as defined by the U.S. Small Business Administration (SBA). In addition, whereas several other bills address “personal data” versus “de-identified” data, the Texas bill uniquely distinguishes between “de-identified data” (which cannot be linked to an individual) and “pseudonymous data” (which cannot be linked to an individual without additional information) and includes requirements regarding the handling of both kinds of data to prevent re-identification. The bill goes into further detail regarding the business and consumer relationship.

The committee made changes to the original language of the bill, and it is likely the bill will see more modifications as it travels to the House floor on Tuesday, April 4th, and eventually to the Senate. We will continue to monitor and engage in the crafting of this legislation.

If you conduct business in Texas or are not sure how this law will affect your data collection practices and consumer interactions, feel free to reach out to Balch’s Data Privacy and Security team.

To view information about the bill, click here

To view the committee hearing, click here (Beginning at 1:11:05)

As state legislatures around the country continue to introduce comprehensive consumer privacy bills, the states that have already enacted them continue to flesh out proposed regulations and other guidance, in some cases even after the effective dates of those laws.

California. The new CPRA amendments (including expiration of the CCPA employee and B2B exemptions) took effect on January 1, 2023. On February 14, 2023, the California Privacy Protection Agency (“CPPA”) filed its final draft of the California Privacy Rights Act of 2020 (“CPRA”) regulations with the California Office of Administrative Law (“OAL”).  This filing begins a 30-day review period, during which the OAL has until March 29, 2023 to review the regulations.  If approved, they will be submitted to the California Secretary of State for filing. Otherwise, the OAL will provide the CPPA with a written decision of its reasons for disapproving the package.  To view the latest CPRA regulations, click here.

Update: Today, the CPPA announced an upcoming public meeting on March 3, 2023. More details are available at: https://cppa.ca.gov/meetings/.

Colorado. The Colorado Privacy Act takes effect on July 1, 2023.  Meanwhile, on January 27, 2023, the Colorado Attorney General released an updated draft of its rules on the Colorado Privacy Act, based on input received through January 18, 2023. A hearing on the proposed rules took place on February 1, 2023, and the comment period closed on February 3, 2023; 137 comments were filed. As of this posting, a revised draft has not yet been released.  To review the latest (January 27, 2023) Colorado rules, click here.

Virginia.  The Virginia Consumer Data Protection Act (“VCDPA”) took effect on January 1, 2023.  The Office of the Attorney General has not indicated plans to develop implementing regulations, but it did release some summary FAQs on February 2, 2023. To view the FAQs, click here.

Connecticut.  The Connecticut Data Privacy Act (“CTDPA”) takes effect on July 1, 2023.   Although no implementing regulations have yet been proposed, the Attorney General has released a portal with FAQs. To view the FAQs, click here.

Utah. The Utah Consumer Privacy Act will not take effect until December 31, 2023.  There has been no indication of proposed regulations yet.

Whether one of these laws applies to your business depends on various factors laid out in the respective statutes, which are all similar but slightly different. They include gross annual revenue, whether products or services are targeted to the state’s residents, the volume of personal data controlled or processed, and/or the amount of revenue derived from the sale of personal data. To view a comparison chart detailing the applicability of each law, contact the Chair of Balch’s Data Security and Privacy Team at bnrobinson@balch.com.

As the FTC signals an intention of cracking down on children’s privacy, and as several comprehensive consumer privacy laws take effect in 2023 (with more on the way in legislatures across the country), some states have chosen to tackle children’s privacy more specifically at the state level. So far, only California’s children’s privacy law has been enacted; most of the others have either been introduced or referred to committee.

In addition, as further detailed below, several state legislatures continue to introduce comprehensive consumer privacy laws similar to the ones passed in California, Colorado, Utah, and Virginia.

  • Texas (HB 896). This bill would appear to prohibit companies from knowingly allowing anyone under 18 years of age to use a social media platform, and would require a mechanism for parents to request removal of accounts. The bill was introduced on December 7, 2022, but has not yet seen further movement.
  • California (AB 2273).  Beginning on July 1, 2024, the California Age-Appropriate Design Code Act – which was signed by the Governor and enacted on September 15, 2022 – will require businesses that provide an online service, product, or feature likely to be accessed by children to comply with specified requirements, including strict default privacy settings (absent a compelling reason), clear communication of privacy information, and preemptive data protection impact assessments prior to introducing any new such online services, products, or features. Such assessments must be made available to the Attorney General within 5 business days.  The bill creates the Children’s Data Protection Working Group to report to the legislature on best practices. For violations, the Attorney General may seek injunctive relief or civil penalties of not more than $2,500 per affected child for each negligent violation or not more than $7,500 per affected child for each intentional violation.
  • New Jersey (A4919/S3493).  Similar to California’s law, these two identical companion bills would require a social media platform business, before offering a new online service, product, or feature likely to be accessed by children, to: (a) complete a data protection impact assessment (to be provided to the Attorney General upon request within 3 business days); (b) document any risk of material detriment to children arising from the data management practices of the social media platform identified in the assessment and create a mitigation plan; (c) estimate the appropriate age for use of the service, product, or feature based on the risks; (d) configure default privacy settings to a high level of privacy (absent a compelling reason); (e) provide clear and prominent privacy information suited to the age of the children likely to have access; (f) if the service, product, or feature allows the children’s parent, guardian, or any other consumer to monitor the children’s online activity or track their location, provide an obvious signal to the children when they are being monitored or tracked; (g) enforce published terms, policies, and community standards established by the platform, including privacy policies and those concerning children; and (h) provide prominent, accessible, and responsive tools to help children, or their parents / guardians, to exercise privacy rights and report concerns. The bill also creates the New Jersey Children’s Data Protection Commission to take input from stakeholders and make recommendations regarding best practices to the legislature. Enforcement by the Attorney General shall include injunctive relief as well as penalties of not more than $2,500 per child for negligent violations and $7,500 per child for intentional violations. A4919 was introduced on December 5, 2022 and referred to committee. S3493 was introduced in the Senate and referred to committee on January 19, 2023.
  • Oregon (SB 196).  Oregon has introduced a bill similar to the ones in California and New Jersey, including the requirement of data protection impact assessments, identification and mitigation of risks, authorization of the Attorney General to seek injunctive relief and civil penalties, and the establishment of a task force on age-appropriate design to study effects on children and mitigation methods.  The bill’s requirements and restrictions would become operative on July 1, 2024, and the task force would sunset on January 2, 2025. In this bill, the assessment would be due within 3 business days of a request from the Attorney General.  In addition to injunctive relief and civil penalties, the Attorney General would also be able to recover attorneys’ fees and other enforcement costs and disbursements. SB 196 was introduced on January 9 and referred to the Senate Judiciary Committee on January 13, 2023.
  • Virginia (HB 1688/SB 1026). These two companion bills would amend the recently enacted Consumer Data Protection Act to add a section requiring an operator to obtain verifiable parental consent before registering a child with the operator’s product or service, or before collecting, using, or disclosing such child’s personal data.  (An “operator” is defined as any natural or legal entity that conducts business or produces products or services targeted to consumers and that collects or maintains personal data from or about such consumers.) The operator shall give the parent/guardian the option to consent to the collection and use of the child’s personal data without consenting to the disclosure of such data to third parties.  Verifiable parental consent may be obtained by: (a) providing a signed consent form; (b) using a credit/debit card or other online payment system that provides notification of any transaction with the operator to the primary account holder; or (c) providing a form of government-issued identification to the operator.  In addition, a controller shall not knowingly process personal data of a child for purposes of: (i) targeted advertising; (ii) the sale of such personal data; or (iii) profiling in furtherance of decisions that produce legal or similarly significant effects concerning a consumer. SB 1026 was prefiled and referred to the Committee on General Laws and Technology on January 7, 2023.  It passed through the committee on January 25, 2023 on a 9 to 6 vote. HB 1688 was prefiled and referred to the Committee on Communications, Technology and Innovation on January 9, 2023.
  • West Virginia (HB 2460).  This bill would make it unlawful for an operator of a website or online service directed to children – or any operator with actual knowledge that it is collecting personal information from a child – to collect personal information from a child in a manner that violates the restrictions in the bill. The bill requires the Attorney General to propose rules no later than March 1, 2023 that would require such an operator to: (i) provide notice on the website of what information is collected from children by the operator, how the operator uses such information, and the operator’s disclosure practices for such information; and (ii) obtain verifiable parental consent for the collection, use, or disclosure of personal information from children. It would require the operator to provide, upon parental request, a description of the specific types of personal information collected from the child and the opportunity at any time to refuse to permit further use or maintenance, or future collection, of personal information from that child. The operator must provide reasonable means for the parent to obtain any collected information.  Operators must not condition a child’s participation in a game, offering of a prize, or another activity on disclosing more personal information than is reasonably necessary to participate in such activity.  And they must establish and maintain reasonable procedures to protect the confidentiality, security, and integrity of personal information collected from children.

The bill also lays out certain enumerated circumstances where parental consent is not required, such as one-time responses to requests from the parent or child, collection to obtain parental consent, protection of the child, where necessary for security or legal reasons, and where such information is not maintained in retrievable form afterwards.  The bill would be enforced by the Attorney General, whose powers would be consistent with the West Virginia Consumer Credit and Protection Act, which authorizes injunctive relief as well as civil penalties of up to $5,000 for each violation in the event of repeated and willful violations. This bill was introduced and referred to the Judiciary Committee on January 11, 2023.

Comprehensive State Consumer Privacy Bills:

Indiana (SB 5).  This bill, similar to the comprehensive consumer privacy laws enacted last year in CO, VA, UT, and CT, advanced out of the Indiana Senate Committee on Commerce and Technology on an 11-0 vote on January 19, 2023.  The committee added an amendment imposing a 2028 sunset on the right to cure.   The bill would be effective January 1, 2026.

Mississippi (SB 2080).  This bill, introduced on January 9, 2023, is identical to the bill introduced in the 2022 session and attempts to create the same types of rights created in the other state bills, including the rights to access, delete, and opt out of sales. (“Sale” is undefined in the introduced version, as is “consumer.”)  It also includes an opt-in provision for minors (under 16), a notice/transparency requirement, and a prohibition on discrimination for opting out. There is a limited private right of action for both statutory and actual damages, with a right-to-cure period and other limitations for individuals seeking statutory damages.  It also authorizes the Attorney General to seek civil penalties of up to $7,500 for each violation.  The bill would take effect on July 1, 2024.

New Hampshire (SB 255).  This bill was introduced on January 19 and referred to the Judiciary Committee. It is a comprehensive consumer privacy bill much like those enacted last year in CO, VA, UT, and CT.  Unlike several of those state laws, however, this bill carries no exemption for B2B personal information. (California’s B2B exemption expired effective 1/1/2023; the other laws limit the definition of “consumer” to the individual and household context, thus exempting B2B information.) It also contains no right to cure.

Washington (HB 1616).  This bill was reintroduced and referred to the Civil Rights and Judiciary Committee on January 26, 2023. Entitled the “People’s Privacy Act”, the bill offers an opt-in model similar to Brazil’s General Data Protection Law (“LGPD”), which in turn is closely aligned with Europe’s GDPR. Importantly, this bill includes a private right of action.  Entities covered by the bill would include non-governmental entities conducting business in the state that process captured personal information and (a) have earned or received $10M or more of annual revenue through 300 or more transactions or (b) process or maintain the captured personal information of 1,000 or more unique individuals during the course of a calendar year. “Conducting business in Washington” means producing, soliciting, or offering for use or sale any information, product, or service in a manner that intentionally targets Washington residents or may reasonably be expected to contact Washington residents, whether such business is for-profit or nonprofit.

Covered entities must make both a long-form and short-form privacy policy “persistently and conspicuously available” at or prior to the point of sale of a product or service.  For continuing interactions, opt-in consent must be renewed not less than annually, or it will be deemed withdrawn. The state department of commerce must adopt regulations within six months of enactment. The bill also calls for the development of a standard of care for the security of captured personal information. It also contains individual rights similar to the others – the rights to know, access, correct, delete, and refuse consent for processing not essential to the primary transaction, as well as portability – and covered entities must comply with such requests not later than 30 days after receiving a verifiable request.

This summary may already be dated; several other bills are being introduced around the country. If you have questions about these or any other state bills, please do not hesitate to reach out and let us know.

On December 8, 2022, the Division of Corporation Finance within the Securities and Exchange Commission (“SEC”) published guidance on disclosure obligations related to recent disruptions in the crypto asset market. The Sample Letter to Companies Regarding Recent Developments in Crypto Asset Markets aims to improve compliance with disclosure obligations under SEC regulations.

Federal law requires securities-issuing companies to disclose information relevant to investments in statements or reports. Additionally, companies must supplement these required disclosures with “such further material information, if any, as may be necessary to make the required statements, in the light of the circumstances under which they are made, not misleading.” 17 C.F.R. § 240.12b-20; id. § 230.408. The Sample Letter provides an example of what the Division could issue to companies concerning crypto assets. It also provides a list of considerations for companies to determine whether they should address crypto asset market developments in their filings.

Though non-exhaustive, the list within the sample letter provides insight into what the Division considers “further material information” necessary to prevent misleading disclosures. The Sample Letter indicates that companies should give thought to the following when determining whether to supplement or update their disclosures:

  1. Whether “significant crypto asset market developments” could affect financial conditions, results, or share price, including crypto asset price volatility;
  2. Whether and how bankruptcies within the crypto asset market could impact the company;
  3. Whether the company has direct or indirect exposures to participants in the market undergoing bankruptcy, excessive withdrawals or redemptions, or compliance failures;
  4. Whether the company has safeguards in place for customers’ crypto assets and procedures to prevent self-dealing and conflicts of interest; and
  5. Whether the company holds crypto assets as collateral, experienced excessive withdrawals or redemptions, or is exposed to potential effects on the company’s financial condition and liquidity due to crypto assets.

The Sample Letter also provides a list of risk factors for companies to consider when making disclosures, including changes in regulatory developments, any reputational harm, and any gaps in risk management processes related to the crypto asset market. Companies should comprehensively evaluate the effect that the crypto asset market could have on their business and determine whether additional disclosures to the SEC are necessary to meet these reporting obligations.

This guidance, coupled with the SEC charges against former FTX CEO Samuel Bankman-Fried, signals that the SEC intends to use its authority to regulate the crypto industry.

On October 7, 2022, President Biden signed an Executive Order on Enhancing Safeguards for United States Signals Intelligence Activities.  The order aims to address concerns expressed by the Court of Justice of the European Union (CJEU) in the Schrems II case, in which it ruled the E.U.-U.S. Privacy Shield inadequate as a cross-border transfer mechanism.  The order aims to provide the European Commission with the basis to adopt a new adequacy determination.  In turn, this would restore the legal basis by which cross-border data flows can occur between the U.S. and the E.U., providing greater legal certainty for companies with respect to cross-border data transfers under GDPR.

Executive Requirements

Among other things, the Executive Order requires the following:

  • Further safeguards and consumer protections for US signals intelligence activities, specifically prioritizing targeted (over bulk) collection and restricting agencies’ processing of E.U. personal data to activities necessary and proportionate to advance a national security purpose.
  • A two-tier redress mechanism to address complaints, starting with the agency Civil Liberties Protection Officer (“CLPO”) with review by a newly created and independent Data Protection Review Court established by the Attorney General.
  • Updating of policies and procedures by various US Intelligence Community elements, to be reviewed by the Privacy and Civil Liberties Oversight Board (“PCLOB”); and
  • A multi-layer mechanism for individuals from qualifying states and regional economic integration organizations, as designated under the Executive Order, to obtain independent and binding review and redress.

European Union Response

In response to the Executive Order, the European Commission announced it will now prepare a draft adequacy decision and launch adoption procedures, which could take up to six months. The European Commission confirmed that, prior to adopting an adequacy decision, it must obtain an opinion from the European Data Protection Board and receive approval from an EU Member State committee.  In addition, the European Parliament has a right of scrutiny over adequacy decisions.  Finally, the European Commission highlighted that an adequacy decision is not the only tool for international transfers and that all previously approved safeguards in the area of national security will be available for all transfers to the U.S. under the GDPR.

To view the Executive Order, click here.

To view the White House fact sheet, click here.

To view the EU Q&A, click here.

To view the U.S. Attorney General Final rule establishing the Data Protection Review Court, click here.

Yesterday, on August 24, 2022, California Attorney General Rob Bonta (“AG”) announced a settlement with Sephora, Inc., resolving allegations that the company violated the California Consumer Privacy Act (“CCPA”).  The order includes permanent injunctive relief as well as a $1.2 million fine. This action stems from a June 2021 enforcement sweep of large retailers by the Attorney General to determine whether they continue to sell personal information when a consumer signals an opt-out via the Global Privacy Control (“GPC”), a browser-based signal used to notify businesses of a consumer’s privacy preferences, together with a mechanism that websites can use to indicate they support the specification. This action is significant not only because it is the first CCPA enforcement action from the California AG’s office, but also because it homes in on the subject of much debate regarding what constitutes a “sale” of personal information under the CCPA.
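For context on how the GPC signal works mechanically: under the proposed GPC specification, a participating browser or extension sends a `Sec-GPC: 1` HTTP request header with each request (and exposes `navigator.globalPrivacyControl` to page scripts). A minimal sketch of how a site might honor the signal server-side is below; `handleRequest` and `handleOptOut` are hypothetical application hooks for illustration, not part of the specification.

```javascript
// Sketch only: detect the GPC signal from request headers.
// Per the GPC proposal, the only valid opt-out value is the string "1".
function gpcOptOutRequested(headers) {
  return headers['sec-gpc'] === '1';
}

// Hypothetical request handler: if the GPC signal is present, treat it
// as a request to opt out of "sales" of personal information and skip
// any data-sharing step; handleOptOut is an application-defined hook.
function handleRequest(headers, userId, handleOptOut) {
  if (gpcOptOutRequested(headers)) {
    handleOptOut(userId);
    return { sold: false };
  }
  return { sold: true };
}
```

Under the CCPA regulations at issue in this action, a user-enabled global privacy control of this kind must be treated as a valid request to opt out of the sale of personal information.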

Background

According to the AG’s complaint, Sephora installed third-party companies’ tracking software on its website and in its app so that those third parties could monitor consumers as they shop. In this case, they would track data such as:

“whether a consumer is using a Macbook or Dell, the brand of eyeliner that a consumer puts in their ‘shopping cart,’ and even the precise location of the consumer. Some of these third party companies curate entire profiles of users who visit Sephora’s website, which the third parties then use for Sephora’s benefit.  The third party might provide detailed analytics information about Sephora’s customers and provide that to Sephora, or offer Sephora the opportunity to purchase online ads targeting specific consumers, such as those who left eyeliner in their shopping cart after leaving Sephora’s website. This data about consumer is frequently kept by companies and used for the benefit of other businesses, without the knowledge or consent of the consumer.”

The AG’s complaint calls the right to opt out “the hallmark of the CCPA” and states that if companies make consumer personal information available to third parties and receive a benefit from the arrangement – such as in the form of ads targeting specific consumers – they are deemed to be “selling” consumer personal information under the law, which triggers additional CCPA obligations, such as providing a right to opt out and prominently displaying a “Do Not Sell My Personal Information” link and mechanisms on the website and mobile app. In contrast, the AG complaint states, Sephora’s privacy policy told consumers that “we do not sell personal information” and failed to provide the link.

At the heart of this enforcement action is whether Sephora engaged in the “sale” of personal information, which is broadly defined in the CCPA as the sharing or exchange of data “for monetary or other valuable consideration.” Other similar state laws define sales more strictly as exchanges for “monetary consideration” only. What constitutes “valuable consideration” in this context has been the subject of much debate since the passage of CCPA, with little guidance until now. According to the AG:

“Sephora allowed the third party companies access to its customers’ online activities in exchange for advertising or analytics services. Sephora knew that these third parties would collect personal information when Sephora installed or allowed the installation of the relevant code on its website or in its app. Sephora also knew that it would receive discounted or higher-quality analytics and other services derived from the data about consumer’s online activities, including the option to target advertisements to customers that had merely browsed for products online.”

Most importantly – though buried in the middle of the AG complaint – is the statement that “Sephora also did not have valid service provider contracts in place with each third party, which is one exception to ‘sale’ under the CCPA.” Therefore, the AG complaint states, “[a]ll of these transactions were sales under the law.”

When Sephora failed to cure within 30 days, the AG entered into a tolling agreement effective September 15, 2021, which led to the filing of the complaint in California Superior Court and, ultimately, the final judgment and permanent injunction approved on August 24.

Analysis

The complaint and the final judgment charge Sephora with several categories of violations, including failure to provide notice of sale, failure to honor opt-outs of sales, failure to provide the “Do Not Sell My Personal Information” link to opt out of sales, and others. But the heart of this case is the statement that Sephora was, indeed, “selling” information as defined by the CCPA. All of the claimed violations – failing to disclose sales of information, failing to provide the “do not sell” link, failing to respond to GPC signals opting out of the sale of information – stem from the premise that Sephora was, in fact, “selling” the information as defined, for valuable consideration.  The complaint suggests that targeted advertising could be a benefit constituting “valuable consideration”, and alleges that Sephora “gave companies access to consumer personal information in exchange for free or discounted analytics and advertising benefits”.  But this benefit would be irrelevant if the companies were service providers under the CCPA. Under the CCPA (Cal. Civ. Code 1798.140(v)), a service provider is defined as “a … legal entity … that processes information on behalf of a business and to which the business discloses a consumer’s personal information for a business purpose pursuant to a written contract, provided that the contract prohibits the entity receiving the information from retaining, using, or disclosing the personal information for any purpose other than for the specific purpose of performing the services specified in the contract for the business, or as otherwise permitted by this title, including retaining, using, or disclosing the personal information for a commercial purpose other than providing the services specified in the contract with the business.” (emphasis supplied).

I suspect – although it is not clear from the documents – that Sephora may have assumed it was not selling the information because it had determined the analytics companies fell under the “service provider” exemption. Had Sephora’s assumption been correct, it would not have been required to assume all of the obligations it has now been found to have violated. However, in the AG’s words, Sephora “did not have valid service provider contracts” in place with the third parties, and thus the third parties did not fall under the service provider exemption. Sephora was therefore required to abide by the obligations associated with “sales” of personal information, which it did not do.

Lessons Learned and Open Questions

Lessons Learned

For other companies engaged in targeted advertising and analytics, a superficial reading of this settlement might lead to the conclusion that, simply by engaging third-party web analytics providers, they must be “selling” data and thus must comply with the heightened “sales” obligations.  However, I believe a more careful reading reveals the true lesson: make sure you have sufficient contracts in place with your third-party analytics providers that contain the necessary restrictions required by the CCPA (e.g., Cal. Civ. Code 1798.140(v); Cal. Code Regs. tit. 11, § 7051). The CPRA adds “contractor” as a second category of entity alongside “service provider”, with a nearly identical definition. It also includes new contractual requirements for sharing information with a service provider or contractor. By ensuring that sufficient contractual agreements are in place with third-party analytics companies, companies can more assuredly rely on the service provider exemption from the definition of “sales”.

Open Questions

However, this raises the question of what deficiencies may have existed in the arrangements between Sephora and its third-party providers to render them insufficient – and the disclosures thus “sales” under the CCPA – as determined by the AG.  The AG complaint states: “Sephora also did not have valid service provider contracts in place with each third party.”  But it is not clear whether the AG means that Sephora did not have contracts in place at all, or whether it did, but such contracts were not “valid”.

It seems doubtful that a large and sophisticated company such as Sephora would have no contract in place at all.  Thus, it may be that either: (a) there were formal contracts in place but they lacked the terms and conditions required by the CCPA and regulations; or (b) the company simply created user accounts pursuant to “clickwrap” terms and conditions that either similarly lacked such terms and conditions, or whose clickwrap nature was deemed by the AG insufficient to form valid contracts (but see, e.g., B.D. v. Blizzard Entertainment, Inc., 76 Cal. App. 5th 931 (March 29, 2022)).

Conclusion

In summary, the first CCPA enforcement action issued by the AG is significant in its own right, but also because it emphasizes the heightened obligations associated with selling personal information to third parties. It is also important because it raises questions about the much-debated topic of what constitutes “valuable consideration” and a “sale” under the CCPA. Although we shall see how many additional enforcement actions the AG takes while rulemaking and enforcement authority transitions to the CPPA under the CPRA, additional interpretations by the AG (as well as the CPPA’s consistency with or deviation from them) will be informative as to how companies can comply with requirements regarding sales of personal information.

To view the AG press release, click here.

To view the AG complaint, click here.

To view the settlement order, click here.

The Department of Justice (“DOJ”), on behalf of the Federal Trade Commission (“FTC”), filed a complaint and motion for entry of a stipulated order in the Northern District of California that would require Twitter to pay civil penalties and take other corrective actions for its violations of the FTC Act and a previous 2011 FTC Order.  The complaint states that Twitter “represented to users that it collected their telephone numbers and email addresses to secure their accounts, [but] Twitter failed to disclose that it also used user contact information to aid advertisers in reaching their preferred audiences” from at least May 2013 to September 2019.  Moreover, the complaint alleges that Twitter’s “misrepresentation” and “deceptive” actions breached the Swiss-U.S. and EU-U.S. Privacy Shield Frameworks.

The proposed order would require Twitter to:

  • Pay $150 million in civil penalties;
  • Allow users to use alternative multi-factor authentication methods (besides telephone numbers);
  • Refrain from profiting from using collected data in undisclosed manners;
  • Inform users that their information was misused;
  • Establish a privacy and information security program that addresses risks associated with new and existing products;
  • Disclose to the FTC any future data breaches; and
  • Limit employees’ access to users’ personal data.

The Twitter case is not an outlier; rather, it signals an increased focus on privacy enforcement at the state and federal levels going forward.  Companies that fail to comply with state, federal, and international data privacy laws can expect fines and penalties.  For instance, California recently updated the California Consumer Privacy Act (“CCPA”) through the passage of Proposition 24, the California Privacy Rights Act (“CPRA”), which added consumer privacy rights and created a new state agency, the California Privacy Protection Agency (“CPPA”).  The CPPA recently took over rulemaking authority from the California Attorney General and is beginning the rulemaking process.

Moreover, within the next twelve months, similarly comprehensive state privacy laws in Virginia, Colorado, Utah, and Connecticut will also become effective.  To avoid expensive penalties, companies should consider reviewing their privacy policies and their internal controls surrounding customers’ nonpublic personal information and customers’ privacy preferences.  Privacy policies should accurately and explicitly reflect current business practices and, most importantly, comply with the upcoming privacy laws.

For more information about current and upcoming privacy laws, and how your company may manage privacy compliance, please contact us.

To view a copy of the DOJ/FTC complaint, click here.

To view the motion and stipulated order, click here.

To view a joint statement issued by FTC Chair Lina Khan and Commissioner Rebecca Kelly Slaughter, click here.

To view a concurring statement issued by FTC Commissioners Christine S. Wilson and Noah Joshua Phillips, click here.

On May 19, 2022, the Federal Trade Commission voted 5-0 to adopt a policy statement announcing increased scrutiny of Children’s Online Privacy Protection Act (COPPA) violations by education technology companies.  The statement reaffirmed COPPA provisions limiting education technology providers’ collection, use, and retention of children’s data, as well as the requirements to keep that data secure. The FTC stated:

“When Congress enacted the [COPPA], it empowered the Commission with tools beyond administering compliance with notice and consent regimes. The Commission’s COPPA authority demands enforcement of meaningful substantive limitations on operators’ ability to collect, use, and retain children’s data, and requirements to keep that data secure. The Commission intends to fully enforce these requirements—including in school and learning settings where parents may feel they lack alternatives.”

The FTC states that the development and proliferation of more sophisticated technologies have raised concerns that businesses might engage in harmful conduct.  These concerns led to the FTC’s 2013 revisions to the COPPA Rule, which include provisions holding third-party advertising networks liable for collecting children’s personal information from child-directed sites in violation of the Rule and expanding the definition of personal information to include persistent identifiers used to target advertising to children.  The FTC further cited the use of online learning devices and services during the COVID-19 pandemic as making concerns about data collection in the educational context “particularly acute.”

In a separate statement, President Joe Biden applauded the FTC, stating that children and families “shouldn’t be forced to accept tracking and surveillance” to access online educational products.  Biden said the FTC “is making it clear that such requirements would violate the [COPPA], and that the agency will be cracking down on companies that persist in exploiting our children to make money.”

The FTC intends to scrutinize compliance with the “full breadth” of the COPPA statute and rules, placing particular emphasis on:

  • Prohibitions Against Mandatory Collection. Prohibitions against businesses requiring collection of personal information beyond what is reasonably needed for the child to participate in the activity.
  • Use Prohibitions. Restrictions and limitations on how COPPA-covered companies can use children’s personal information, such as for marketing, advertising, or other commercial purposes unrelated to the provision of the school-requested online service.
  • Retention Prohibitions. COPPA-covered companies must not retain personal information longer than reasonably necessary to fulfill the purpose for which it was collected (e.g., for speculative future potential uses).
  • Security Requirements. COPPA-covered companies must have procedures to maintain the confidentiality, security, and integrity of children’s personal information. A COPPA-covered company’s lack of reasonable security can violate COPPA even absent a breach.

The Policy Statement concludes that:

“Children should not have to needlessly hand over their data and forfeit their privacy in order to do their schoolwork or participate in remote learning, especially given the wide and increasing adoption of ed tech tools. Going forward, the Commission will closely scrutinize the providers of these services and will not hesitate to act where providers fail to meet their legal obligations with respect to children’s privacy.”

To view the FTC policy statement, click here.

To view President Biden’s statement, click here.

In November 2021, non-state-issued digital assets reached a combined market capitalization of $3 trillion, up from approximately $14 billion in early November 2016.  Several global monetary authorities are exploring, and in some cases introducing, central bank digital currencies (CBDCs).  On March 9, 2022, President Biden issued an executive order mandating multiple reports and studies by various agencies on digital asset policy and regulation.

Goals of Executive Order

The goals of the order emphasize:

  • Protecting U.S. consumers, investors, and businesses;
  • Protecting U.S. and global financial stability and mitigating systemic risk;
  • Mitigating illicit finance and national security risks posed by misuse of digital assets;
  • Reinforcing U.S. leadership in the global financial system and in technological and economic competitiveness;
  • Promoting access to safe and affordable financial services; and
  • Supporting technological advances that promote responsible development and use of digital assets.

Agencies Involved

A number of government agencies are given roles and responsibilities for these efforts, including 13 Cabinet departments (including Treasury, DOJ, State, and Homeland Security), all major financial services regulators, several science and technology offices, economic and policy officials, intelligence agencies, and even agencies such as the Department of Energy (DOE) and the Environmental Protection Agency (EPA).

In addition, the breadth of these reports varies widely, from considering privacy and consumer protection, to reporting on energy and climate change, to creating a framework for international engagement. Each of the more than 20 agencies tasked with reports under the order has a role and assignment.

Key Directives of Executive Order

The executive order:

  • Establishes a comprehensive federal framework to ensure the U.S. continues to play a leading role in the innovation and governance of digital assets domestically and abroad.
  • Directs relevant departments and agencies to initiate research into the merits of a U.S. CBDC. This includes agency participation in international efforts and projects; a strategic Federal Reserve plan for potential implementation; and a proposal for dollar CBDC legislation to be developed by the Attorney General in consultation with Treasury and the Federal Reserve.
  • Calls for the development of a plan to mitigate the illicit finance and national security risks posed by the misuse of digital assets. This adds to previous work to align departments and agencies to combat misuse of digital assets enabling the rise and spread of ransomware.

What Is a “Digital Asset”?

The term “digital asset” is defined in the executive order to refer to:

“all CBDCs, regardless of the technology used, and to other representations of value, financial assets and instruments, or claims that are used to make payments or investments, or to transmit or exchange funds or the equivalent thereof, that are issued or represented in digital form through the use of distributed ledger technology.  For example, digital assets include cryptocurrencies, stablecoins, and CBDCs.  Regardless of the label used, a digital asset may be, among other things, a security, a commodity, a derivative, or other financial product.  Digital assets may be exchanged across digital asset trading platforms, including centralized and decentralized finance platforms, or through peer-to-peer technologies.”

While this definition appears to be focused on financial services applications, it is unclear whether so broad a definition is intended to cover other aspects of the cryptoverse, such as non-fungible tokens (NFTs).

Timing and Deadlines

The executive order requires an array of reports from various agencies with differing deadlines, ranging from 90 days (e.g., the report on international law enforcement) to 210 days (e.g., the CBDC legislative proposal and the financial stability report), with other deadlines in between. Still other reports have deadlines keyed to the submission of earlier reports.

To view the executive order, click here.

If you have questions about the executive order or a particular task, please contact a member of our Data Privacy and Security Team.

On December 20, 2021, the National Institute of Standards and Technology (NIST) released its draft interagency report 8403, “Blockchain for Access Control Systems.”  As the report’s abstract states:

“Protecting system resources against unauthorized access is the primary objective of an access control system. As information systems rapidly evolve, the need for advanced access control mechanisms that support decentralization, scalability, and trust – all major challenges for traditional mechanisms – has grown.

Blockchain technology offers high confidence and tamper resistance implemented in a distributed fashion without a central authority, which means that it can be a trustable alternative for enforcing access control policies. This document presents analyses of blockchain access control systems from the perspectives of properties, components, architectures, and model supports, as well as discussions on considerations for implementation.”

This public review also includes a call for information on essential patent claims (see page iv of the draft report). For more information regarding the inclusion of patents in Information Technology Laboratory (ITL) publications, click here.

The draft NISTIR discusses, among other things: (a) blockchain system components and their advantages for access control systems; (b) access control functions of blockchain access control systems; (c) access control model support; and (d) other considerations.

The public comment period lasts from December 20, 2021 through February 7, 2022.

To view the draft report, click here.

Comments, including patent statements from patent holders or their agents, should be emailed to ir8403-comments@nist.gov.