On July 21, 2020, the New York State Department of Financial Services (NYDFS) filed charges against First American Title Insurance Company (First American) for breach of state cybersecurity regulations. Specifically, NYDFS alleges that First American exposed tens of millions of documents containing consumers’ sensitive personal information, including bank account numbers and statements, mortgage and tax records, Social Security numbers, wire transaction receipts, and driver’s license images. The statement of charges against First American states that from October 2014 through May 2019, a vulnerability in First American’s website exposed customers’ personal data. The statement of charges also claims that First American failed to adequately remedy the vulnerability when it was eventually discovered.

First American is charged with violating multiple provisions of the NYDFS’s cybersecurity regulations. These regulations require regulated entities, like insurance providers, to establish and maintain an adequate cybersecurity program and procedures. First American is the first entity to be charged under these regulations, which came into effect in 2017.

To read the Notice of Charges, click here.

We previously posted on yesterday’s Schrems II decision issued by the Court of Justice of the European Union (CJEU). Today (July 17, 2020), the Berlin data protection authority (Berlin DPA) went even further than the CJEU opinion, issuing a statement on the Schrems II case that calls for Berlin-based data controllers storing personal data in the US to move that data back to Europe.  The DPA stated that data should not be transferred to the US until the US legal framework is reformed.  In addition, regarding the SCCs that were cautiously validated by the CJEU, the Berlin DPA stated that European data exporters and third-country data importers must check, prior to transferring data, whether the third country grants its authorities access to the data that exceeds what is permitted under European law. If such access rights exist, the Berlin DPA stated, the SCCs cannot justify the data transfer to that third country. The Berlin DPA thus requested all data controllers to observe and comply with the CJEU’s judgment. In practice, the Berlin Commissioner stated that data controllers transferring data to the US, especially when using cloud service providers, are now required to use service providers based in the EU or in a country with an adequate level of protection.

This could impact the ability of Berlin-based companies to transfer personal data to their US subsidiaries or other US-based vendors or business partners.

To read the press release (currently available only in German), click here.

On July 16, 2020, the Court of Justice of the European Union (“CJEU” or “Court”) issued a significant judgment in Case C-311/18 (“Schrems II decision”) on the adequacy of the protection provided by the E.U.-U.S. Privacy Shield. The Court concluded that the Standard Contractual Clauses (“SCCs”) issued by the European Commission for the transfer of personal data to processors outside of the EU continue to be valid. However, the Court also invalidated the E.U.-U.S. Privacy Shield framework. In our post below, we: (I) provide some background on the events leading up to today’s decision; (II) summarize today’s decision; and (III) provide some reflection on what it means for U.S. organizations that transfer personal data from Europe.

I. Context/Background.

The Schrems II decision is the latest in a series of decisions regarding privacy advocate Maximilian Schrems (“Max Schrems”), who filed a complaint in 2015 with the Irish Data Protection Commissioner challenging Facebook Ireland’s reliance on standard contractual clauses (“SCCs”) as a legal basis for transferring personal data to Facebook Inc. in the United States. Facebook turned to the SCCs after the CJEU had invalidated the US-EU Safe Harbor framework in 2015 upon Max Schrems’ earlier complaint.

The General Data Protection Regulation (‘the GDPR’) provides that the transfer of personal data to a third country may, in principle, take place only if the third country in question ensures an “adequate level of data protection”. According to the GDPR, the Commission may find that a third country ensures, by reason of its domestic law or its international commitments, an adequate level of protection. In the absence of an adequacy decision, such transfer may take place only if the personal data exporter established in the EU has provided appropriate safeguards, which may arise, in particular, from standard contractual clauses adopted by the Commission, and if data subjects have enforceable rights and effective legal remedies. Furthermore, the GDPR details the conditions under which such a transfer may take place in the absence of an adequacy decision or appropriate safeguards.

Max Schrems, an Austrian national residing in Austria, has used Facebook since 2008. The personal data of Mr. Schrems (and other European nationals) is transferred by Facebook Ireland to Facebook servers located in the United States, where it is processed. Mr. Schrems lodged a complaint with the Irish data protection authority (“DPA”) seeking to prohibit those transfers. He claimed that U.S. laws and practices do not sufficiently protect against access to the transferred data by U.S. public authorities. Specifically, he was concerned that EU personal data might be at risk of being accessed and processed by the U.S. government once transferred, in a manner inconsistent with privacy rights guaranteed in the EU, and that there is no remedy available to EU citizens to ensure protection of their personal data after it is transferred to the U.S. Mr. Schrems’ complaint was rejected at the time on the basis, among others, that the Commission had already found in Decision 2000/520 (“the Safe Harbour Decision”) that the U.S. Safe Harbor Framework ensured an adequate level of protection.

Mr. Schrems challenged the rejection of his complaint before the Irish High Court, which referred questions to the CJEU for a preliminary ruling. On October 6, 2015, the CJEU declared the Safe Harbour Decision invalid (“the Schrems I judgment”), thus invalidating the EU-US Safe Harbor Framework and annulling the rejection of Mr. Schrems’ complaint.

In light of the Schrems I judgment, the Irish DPA then asked Mr. Schrems to amend his complaint. In his amended complaint, Mr. Schrems claimed that the U.S. still does not sufficiently protect data transferred to that country, and sought to suspend or prohibit future transfers of his personal data from the EU to the United States. Meanwhile, Facebook Ireland had begun carrying out data transfers pursuant to the alternative mechanism of the standard contractual clauses set out in the Annex to Decision 2010/87 (“SCC Decision”), which provides standard clauses that can be used for data transfers to countries that have not been deemed adequate.

Since the outcome of Mr. Schrems’ amended complaint hinged upon the validity of the SCC Decision, the Irish DPA brought proceedings before the High Court, which in turn referred eleven questions to the Court of Justice for a preliminary ruling. These questions primarily addressed the validity of the SCCs, but also raised concerns about the E.U.-U.S. Privacy Shield framework. After the initiation of those proceedings, the Commission adopted Decision 2016/1250 on the adequacy of the protection provided by the E.U.-U.S. Privacy Shield (“the Privacy Shield Decision”).

In the referral underlying today’s decision, the Irish High Court asked the CJEU: (1) whether the GDPR applies to transfers of personal data pursuant to the SCCs from Decision 2010/87, and what level of protection the GDPR requires in connection with such a transfer; and (2) what obligations are incumbent on supervisory authorities in those circumstances. The High Court also raised the question of the validity of both (3) the 2010 SCC Decision and (4) the 2016 Privacy Shield Decision.

II. Summary of Today’s CJEU Decision

In today’s decision, the Court stated that:

(1) GDPR Applies to Data Transfers. EU law, and the GDPR in particular, applies to the transfer of personal data for commercial purposes by an economic operator established in a Member State to another economic operator established in a third country, even if, at the time of that transfer or thereafter, that data may be processed by the authorities of the third country in question for the purposes of public security, defense, and state security. The Court added that this type of data processing by the authorities of a third country cannot remove such a transfer from the scope of the GDPR. The requirements of the GDPR concerning appropriate safeguards, enforceable rights, and effective legal remedies must be interpreted as meaning that data subjects whose personal data are transferred to a third country pursuant to SCCs must be afforded a level of protection essentially equivalent to that guaranteed within the EU by the GDPR, read in the light of the Charter. The assessment of that level of protection must take into consideration both: (a) the contractual clauses agreed between the data exporter established in the EU and the data importer established in the third country concerned, and (b) the relevant aspects of the third country’s legal system regarding access to the transferred data by public authorities of that third country.

(2) Obligations of Supervisory Authorities. Regarding the obligations of supervisory authorities (such as the Irish DPA) in connection with such a transfer, the CJEU held that, unless there is a valid Commission adequacy decision, a competent supervisory authority is required to suspend or prohibit a transfer of personal data to a third country where it takes the view, in light of all the circumstances of the transfer, that the SCCs are not or cannot be complied with in that country and that the protection of the transferred data required by EU law cannot be ensured by other means, and where the data exporter established in the EU has not itself suspended or put an end to the transfer.

(3) Validation of SCC Decision. The Court found that Decision 2010/87 (SCC Decision) sufficiently establishes effective mechanisms that make it possible, in practice, to ensure compliance with the level of protection required by EU law and to ensure that transfers of personal data pursuant to such clauses are suspended or prohibited in the event of a breach of such clauses or where it is impossible to honor them. Specifically, the Court pointed out that the decision imposes an obligation on the data exporter and the recipient of the data to verify, prior to any transfer, whether that level of protection is respected in the third country concerned, and that the decision requires the recipient to inform the data exporter of any inability to comply with the standard data protection clauses, the latter then being obliged, in turn, to suspend the transfer of data and/or to terminate the contract. The Court also emphasized that EU organizations relying on SCCs must take a proactive role in evaluating, prior to any transfer, whether there is in fact an “adequate level of protection” for personal data in the data importer’s jurisdiction. The Court stated that many organizations may implement additional safeguards to ensure an “adequate level of protection” for personal data transfers, although it was not specific on what those additional safeguards might be. Further, non-EU organizations importing data from the EU based on SCCs must inform data exporters in the EU of any inability to comply with the SCCs. When non-EU data importers are unable to comply with the SCCs, and there are no additional safeguards in place to ensure an “adequate level of protection”, the EU data exporter must suspend the transfer of data and/or terminate the contract.

(4) Invalidity of Privacy Shield Decision. Finally, the CJEU decided, unexpectedly, to examine and rule on the validity of the EU-U.S. Privacy Shield framework. In invalidating the Privacy Shield, the Court took the view that “the limitations on the protection of personal data arising from the domestic law of the United States on the access and use by U.S. public authorities of such data transferred from the European Union to the United States, which the Commission assessed in the Privacy Shield Decision, are not circumscribed in a way that satisfies requirements that are essentially equivalent to those required under EU law…” Specifically, the CJEU found that the Privacy Shield’s Ombudsperson mechanism “does not provide data subjects with any cause of action before a body which offers guarantees substantially equivalent to those required by EU law, such as to ensure both the independence of the Ombudsperson provided for by that mechanism and the existence of rules empowering the Ombudsperson to adopt decisions that are binding on the US intelligence services.” For these reasons, the Court declared the Privacy Shield Decision to be invalid.

III. What this Means for U.S. Organizations

Therefore, while the SCCs remain valid under today’s decision, organizations that currently rely on SCCs will need to consider whether there is still an “adequate level of protection” for the personal data as required by EU law, taking into account the nature of the personal data, the purposes and context of the processing, and the country of destination. Where that is not the case, organizations should consider what additional safeguards may be implemented to ensure there is in fact an “adequate level of protection.”

Further, organizations that currently rely on the E.U.-U.S. Privacy Shield framework will need to urgently identify an alternative data transfer mechanism to continue transfers of personal data to the U.S. These may include the SCCs, which remain valid (along with any additional safeguards as necessary). Alternatives may also include derogations provided in the GDPR for certain transfers (such as when the transfer is necessary to perform a contract), or Binding Corporate Rules (“BCRs”) as set forth in the GDPR.

To read the CJEU decision, click here.

To read the CJEU press release, click here.

On June 1, 2020, California Attorney General Xavier Becerra submitted a finalized package of CCPA regulations to the California Office of Administrative Law (OAL).   The package included not only the final text of the regulations, but also the final statement of reasons for amendments to the previous drafts. There have been multiple rounds of drafts of the regulations, along with corresponding comment periods and workshops.  The first comment period received over 1,700 comments, leading to modifications published on February 7, 2020.  A second set of modifications released on March 11, 2020 eliminated the opt-out button and clarified procedures for consumer requests.  This finalized version of the regulations appears nearly identical to the March version.

Attorney General Becerra has stood firm in his insistence on the scheduled enforcement deadline of July 1, 2020, notwithstanding pleas to delay enforcement due to the COVID-19 pandemic.  However, the regulations must first be approved by the OAL, which has 90 days to make its decision.  Specifically, the OAL has 30 working days, plus an additional 60 calendar days under Executive Order N-40-20 related to the COVID-19 pandemic, to review the package for procedural compliance. However, the AG submitted a written justification requesting an expedited review by the OAL, to be completed within 30 business days, and requesting that the final regulations become effective upon filing with the Secretary of State.  Notably, the statute has been in effect since January 1, 2020, and it includes a 12-month “look-back” requirement allowing consumers to request their records dating back one year from when the request was made.

While these CCPA regulations move toward finalization, we have previously written about a new ballot initiative scheduled for November 2020 for the California Privacy Rights Act (CPRA), commonly referred to as CCPA 2.0, which would significantly strengthen the CCPA’s requirements and enforcement tools, would weaken defenses to private rights of action under the CCPA, and would establish the California Privacy Protection Agency to enforce privacy rights in California.

To view the final CCPA regulations, click here.

To view the California Attorney General’s press release, click here.

To view the California Attorney General’s supporting documentation, click here.

1. Details about Apple/Google Launch

Yesterday (May 20, 2020), Apple and Google launched software that will allow public health authorities to create mobile applications that notify people when they may have come in contact with people who have confirmed cases of COVID-19, while purportedly preserving privacy around identifying information and location data. People who have updated their phones with the latest software will be able to share their Bluetooth signal, logging when the radio recognizes other people who have downloaded an app that uses the software.

The public launch means that health agencies can now use the API in applications released to the general public. To date, Apple and Google had only released beta versions to help with the development process.  (To be clear, Apple and Google are not themselves creating an exposure notification or contact tracing application, but the launch means that developers working on behalf of public health agencies can do so.)  This “exposure notification” tool uses the Bluetooth radios within smartphones and is part of a new software update the companies are pushing out. State and federal governments can use it to create contact tracing applications that citizens can download via the Apple App Store or Google Play store.

Many U.S. states and 22 countries across five continents have already asked for, and been provided access to, the API to support their development efforts, and the companies anticipate more being added going forward. So far, Apple and Google say they have conducted more than 24 briefings and tech talks for public health officials, epidemiologists, and app developers working on their behalf.

The exposure notification API uses a decentralized identifier system with randomly generated temporary keys created on a user’s device (but not specifically tied to personally identifiable information). Public health agencies can define parameters around exposure time and distance, and can tweak transmission risk and other factors according to their own standards.
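For readers curious about the mechanics, the sketch below illustrates, in simplified Python, the key schedule described in Apple and Google’s draft cryptography specification: a random temporary key generated on the device, from which short-lived rolling identifiers are derived for Bluetooth broadcast. The constants and derivation steps (the “EN-RPIK” info string, the “EN-RPI” padding layout, the 10-minute interval) are paraphrased from the public draft, and the code is an illustration under those assumptions, not a reference implementation.

```python
# Simplified sketch of the Exposure Notification key schedule from the draft
# cryptography specification. Constants and layout are paraphrased from the
# public draft and may differ from shipped implementations.
import os
import struct
import time

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF


def en_interval_number(ts: float) -> int:
    """10-minute interval index used to rotate the broadcast identifier."""
    return int(ts) // 600


def new_temporary_exposure_key() -> bytes:
    """A fresh 16-byte key, generated on-device (the draft rolls it daily)."""
    return os.urandom(16)


def rolling_proximity_identifier(tek: bytes, interval: int) -> bytes:
    """Derive the identifier actually advertised over Bluetooth.

    The temporary exposure key never leaves the device; only these
    short-lived, unlinkable identifiers are broadcast, which is what makes
    the system "decentralized" and not tied to personally identifiable info.
    """
    # Per the draft spec: RPIK = HKDF-SHA256(TEK, info="EN-RPIK"), 16 bytes.
    rpik = HKDF(
        algorithm=hashes.SHA256(), length=16, salt=None, info=b"EN-RPIK"
    ).derive(tek)
    # RPI = AES-128(RPIK, "EN-RPI" || zero padding || interval as LE uint32).
    padded = b"EN-RPI" + b"\x00" * 6 + struct.pack("<I", interval)
    encryptor = Cipher(algorithms.AES(rpik), modes.ECB()).encryptor()
    return encryptor.update(padded) + encryptor.finalize()


tek = new_temporary_exposure_key()
rpi = rolling_proximity_identifier(tek, en_interval_number(time.time()))
print(rpi.hex())  # changes every 10 minutes; not linkable back to a person
```

Because only users who test positive upload their temporary keys, other devices can re-derive the corresponding identifiers locally and check them against what they overheard, without any central database of contacts.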

The applications are allowed to combine the API with voluntarily submitted user data provided through individual apps, enabling public health authorities to contact exposed users directly and make them aware of what steps they should take.

Apple and Google have incorporated various privacy protections, including: (a) encryption of all device-specific Bluetooth metadata (e.g., signal strength, specific transmitting power), and (b) explicitly barring use of the API in any apps that also seek geolocation information permission from users.  Because many public health authorities developing contact tracing were considering using geolocation data, this privacy measure has prompted some to reconsider their approach.

Apple and Google provided the following joint statement about the API and how it will support contact-tracing efforts undertaken by public health officials and agencies:

One of the most effective techniques that public health officials have used during outbreaks is called contact tracing. Through this approach, public health officials contact, test, treat and advise people who may have been exposed to an affected person. One new element of contact tracing is Exposure Notifications: using privacy-preserving digital technology to tell someone they may have been exposed to the virus. Exposure Notification has the specific goal of rapid notification, which is especially important to slowing the spread of the disease with a virus that can be spread asymptomatically.

To help, Apple and Google cooperated to build Exposure Notifications technology that will enable apps created by public health agencies to work more accurately, reliably and effectively across both Android phones and iPhones. Over the last several weeks, our two companies have worked together, reaching out to public health officials, scientists, privacy groups and government leaders all over the world to get their input and guidance.

Starting today, our Exposure Notifications technology is available to public health agencies on both iOS and Android. What we’ve built is not an app — rather public health agencies will incorporate the API into their own apps that people install. Our technology is designed to make these apps work better. Each user gets to decide whether or not to opt-in to Exposure Notifications; the system does not collect or use location from the device; and if a person is diagnosed with COVID-19, it is up to them whether or not to report that in the public health app. User adoption is key to success and we believe that these strong privacy protections are also the best way to encourage use of these apps.

Today, this technology is in the hands of public health agencies across the world who will take the lead and we will continue to support their efforts.

Google and Apple are also releasing draft technical documentation, including Bluetooth and cryptography specifications and framework documentation.

2. Privacy Reactions and Concerns Regarding Contact Tracing Applications

Many within the privacy community are focused on whether these types of applications meet the principles of “Privacy by Design”, with much emphasis being placed on using decentralized tracing rather than location data stored in central databases. The UK data protection authority (UK ICO) concluded on April 17, 2020 that proposals for the contact tracing framework itself “appear aligned with the principles of data protection by design and by default”, based on certain assumptions.  At the same time, France asked Apple to remove a limitation in its operating system that prevents contact tracing apps using its Bluetooth technology from running constantly in the background if the data is going to be moved off the device, a limit designed to protect users’ privacy, but one that France said stood in the way of the type of app it wanted to build.

It is important to recognize that technology (in the form of a contact tracing application) is only a part of the solution, and that many security and privacy issues arise not only from the technology itself, but in the purpose, process, and manner in which it is used. For employers considering the use of contact tracing technologies or applications leveraging the Apple/Google APIs, a number of questions need to be addressed. For example:

  • Will you require employee consent, and under what conditions? Will the contact tracing app continue to monitor after work hours are over?
  • How will you handle external requests (e.g., law enforcement, state/local government, hospitals, health authorities, nonprofits, etc.)?
  • Will your process be forward- or backward-looking? Will you penalize those who violate social distancing requirements based on this information, in order to prevent infection? Or simply wait until a positive test result indicates an infection, and then look back at contact history to notify those who were in contact with the individual?  The Apple and Google API appears to favor the backward-looking approach.
  • How will you ensure confidentiality among colleagues? South Korea’s and Israel’s approaches made exposure information more publicly accessible, which led in some cases to protests, vigilante reactions, and social stigma. Think through how to avoid social stigma towards any employees: even those who have not tested positive may face questions or rumors despite employer efforts to preserve confidentiality.  Policies should be clearly explained and emphasized to mitigate such misunderstandings and disproportionate reactions.
  • Also consider how to handle false positives and false negatives (see the sketch following this list). If you lift a lockdown on the theory that an app can control the infection, you could create a false sense of security that, once compromised, eventually gives way to ignoring the technology itself. One example is Singapore, which, despite using a widely credited tracing app, still had to return to lockdown. Its app examined whether an individual had been within two meters of someone with COVID-19 for at least 30 minutes; if so, the individual received a signal that they were possibly infected as well. This rule is both over-inclusive (Bluetooth signals pass through glass walls and windows) and under-inclusive (viral transmission can occur through kissing or intimate contact lasting less than 30 minutes).
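To make the over- and under-inclusiveness concrete, here is a deliberately naive Python sketch of a distance-and-duration threshold rule of the kind described above. The class and function names, and the thresholds, are illustrative only; this does not reproduce any real app’s logic.

```python
# Naive threshold rule: flag exposure if any contact was within ~2 meters for
# at least 30 minutes. Names and thresholds are hypothetical illustrations.
from dataclasses import dataclass
from typing import List


@dataclass
class ContactEvent:
    """One continuous period of Bluetooth proximity with another device."""
    estimated_distance_m: float  # inferred from signal attenuation (noisy)
    duration_min: float


def flag_possible_exposure(events: List[ContactEvent],
                           max_distance_m: float = 2.0,
                           min_duration_min: float = 30.0) -> bool:
    """Flag a user if any single contact crosses both thresholds.

    Over-inclusive: attenuation cannot tell that a wall or window separated
    the two devices. Under-inclusive: a brief high-risk contact shorter than
    the duration threshold is never flagged.
    """
    return any(
        e.estimated_distance_m <= max_distance_m
        and e.duration_min >= min_duration_min
        for e in events
    )


# A 20-minute close contact is missed; 40 minutes through a window is flagged.
print(flag_possible_exposure([ContactEvent(0.5, 20.0)]))  # False (misses risk)
print(flag_possible_exposure([ContactEvent(1.5, 40.0)]))  # True (false alarm)
```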

Many of these applications attempt to address these privacy concerns by notifying only the app users themselves (instead of the employer or public health agency), to encourage responsible behavior.

Much remains to be seen about how our society will balance the tension between privacy rights and public health and safety needs as it pertains to application of contact tracing technologies. Nonetheless, yesterday’s release marks a significant event in this continuing conversation.

Today, Senators Richard Blumenthal (D-CT) and Mark Warner (D-VA) introduced the Public Health Emergency Privacy Act (“PHEPA”) in the Senate. A companion House bill was introduced by Reps. Anna Eshoo (D-CA), Jan Schakowsky (D-IL), and Suzan DelBene (D-WA), and was co-sponsored by Reps. Yvette Clarke (D-NY), G.K. Butterfield (D-NC), and Tony Cárdenas (D-CA).   This and similar legislation has been introduced in the last week as health agencies and technology companies nationwide develop contact tracing and monitoring tools to contain the pandemic.

The Act would restrict data collected for public health purposes, limiting what can be collected, by whom, and for what purposes it can be used. For example:

  • It would require data minimization procedures for such data, and would require opt-in consent for any collection efforts.
  • It would formally mandate that data collected to fight the pandemic be deleted after the public health emergency.
  • It would protect personal data collected in connection with COVID-19 from being used for non-public health purposes.
  • It would prohibit conditioning the right to vote on the use of such services or on a medical condition.
  • It provides for both public enforcement (by the FTC) as well as a private right of action.
  • The private right of action specifies a range of statutory penalties ($100-$1,000 for negligent violations; $500-$5,000 for reckless, willful, or intentional violations), plus attorney fees and costs, and any other appropriate relief. It would also make the statutory violation sufficient injury to allege standing.

This Democratic legislation comes as a counterproposal after the Senate Republicans’ bill, the COVID-19 Consumer Data Protection Act, failed to gain Democratic support.  The Republican bill’s opt-in requirement was limited to data collected for purposes of tracking the spread of the virus, and the bill did not include the civil rights protections that are included in this legislation. It also did not include a private right of action. Both bills, however, include rules mandating transparency and consent, and controlling the use of data for purposes other than public health.

An unofficial copy of the legislation is available here on the website of the Electronic Privacy Information Center (EPIC).  We will update this post once it is available in the Congressional Record.

On May 1, 2020, the U.S. House of Representatives introduced H.R. 6666, the COVID-19 Testing, Reaching, And Contacting Everyone (“TRACE”) Act.  The bill, sponsored by Rep. Bobby Rush (D-IL), would authorize the Secretary of Health and Human Services to award grants to eligible entities to conduct diagnostic testing for COVID-19, to trace and monitor the contacts of infected individuals, and to support the quarantine of such contacts.  The bill contemplates such activities occurring at mobile health units and, as necessary, at individuals’ residences.  A grant recipient may use the grant funds: (1) to hire, train, compensate, and pay the expenses of individuals; and (2) to purchase personal protective equipment (“PPE”) in support of such contact tracing and other activity. Priority is given to applicants proposing to: (1) conduct activities in “hot spots and medically underserved communities”; and (2) hire residents of the area or community where the activities will occur.

In addition, the bill provides that Federal privacy requirements must be complied with, and that no provision of the bill, nor any action taken under its provisions, may supersede any Federal privacy or confidentiality requirement under Federal legislation, including the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”) and other laws.

The bill authorizes appropriations of $100,000,000,000 for fiscal year 2020, and such sums as may be necessary for fiscal year 2021 and any subsequent fiscal year during which the emergency period continues.  The bill has been introduced and referred to the House Committee on Energy and Commerce, where it currently awaits further action.

To view the bill, click here.

As previously announced, the leadership of several Senate committees introduced the “COVID-19 Consumer Data Protection Act” on May 7, 2020.

The Act would:

  • Require companies under FTC jurisdiction to obtain affirmative express consent from individuals to collect, process, or transfer their personal health, device, geolocation, or proximity information for the purposes of tracking the spread of COVID-19 (e.g., contact tracing).
  • Direct companies to disclose at the point of collection how data will be handled, to whom it will be transferred, and how long it will be retained.
  • Establish clear definitions about what constitutes aggregated and de-identified data to ensure sufficient technical and legal safeguards to protect consumer data from being re-identified.
  • Require companies to allow individuals to opt out of the collection, processing, or transfer of their personal health, geolocation, or proximity information.
  • Direct companies to provide public transparency reports describing data collection activities related to COVID-19.
  • Establish data minimization and security requirements for any personally identifiable information collected by a covered entity.
  • Require companies to delete or de-identify all personally identifiable information when it is no longer being used for the COVID-19 public health emergency.
  • Authorize state attorneys general to enforce the Act.

To read the bill, click here.

On May 4, Californians for Consumer Privacy (led by Alastair Mactaggart, the real estate investor and activist behind the original ballot initiative that led to the CCPA) announced in a letter that it had collected over 900,000 signatures to qualify the California Privacy Rights Act (“CPRA”) for the November 2020 ballot.  This version of the CPRA, commonly referred to as “CCPA 2.0”, would amend the CCPA to create new and additional privacy rights and obligations.  Specifically, it would:

  • Sensitive Personal Information.  Establish a new category of “sensitive personal information” to which new consumer privacy rights would apply. This category would be defined to include: Social Security number, driver’s license number, passport number, financial account information, precise geolocation, race, ethnicity, religion, union membership, personal communications, genetic data, biometric or health information, and information about sex life or sexual orientation.
  • Right to Correction. Grant consumers the right to request correction of inaccurate personal information held by a business.
  • Increased Fines and New Opt-In for Children’s Data.  Triple the fines for violating the CCPA’s existing opt-in-to-sale requirement for children’s data and create a new requirement to obtain opt-in consent to sell or share data from consumers under the age of 16.
  • Clarify Data Breach Liability. Amend the data breach liability provision to clarify that breaches resulting in the compromise of a consumer’s email address in combination with a password or security question and answer that would permit access to the consumer’s account are subject to the relevant provision.
  • Enforcement. Establish the California Privacy Protection Agency to enforce the law, instead of the California Attorney General’s Office.

To view the announcement, click here.

To view the proposed CPRA, click here.

On April 30, 2020, U.S. Sens. Roger Wicker (R-MS), chairman of the Senate Committee on Commerce, Science, and Transportation; John Thune (R-SD), chairman of the Subcommittee on Communications, Technology, Innovation, and the Internet; Jerry Moran (R-KS), chairman of the Subcommittee on Consumer Protection, Product Safety, Insurance and Data Security; and Marsha Blackburn (R-TN) announced plans to introduce the COVID-19 Consumer Data Protection Act.  The legislation would provide all Americans with more transparency, choice, and control over the collection and use of their personal health, geolocation, and proximity data. The bill would also hold businesses accountable to consumers if they use personal data to fight the COVID-19 pandemic.

The COVID-19 Consumer Data Protection Act would:

  • Require companies under the jurisdiction of the Federal Trade Commission to obtain affirmative express consent from individuals to collect, process, or transfer their personal health, geolocation, or proximity information for the purposes of tracking the spread of COVID-19.
  • Direct companies to disclose to consumers at the point of collection how their data will be handled, to whom it will be transferred, and how long it will be retained.
  • Establish clear definitions about what constitutes aggregate and de-identified data to ensure companies adopt certain technical and legal safeguards to protect consumer data from being re-identified.
  • Require companies to allow individuals to opt out of the collection, processing, or transfer of their personal health, geolocation, or proximity information.
  • Direct companies to provide transparency reports to the public describing their data collection activities related to COVID-19.
  • Establish data minimization and data security requirements for any personally identifiable information collected by a covered entity.
  • Require companies to delete or de-identify all personally identifiable information when it is no longer being used for the COVID-19 public health emergency.
  • Authorize state attorneys general to enforce the Act.

To view the Senate committee’s press release, click here.