Ascension Data & Analytics LLC, a data analytics company for the mortgage industry, has entered into a proposed settlement agreement with the Federal Trade Commission (FTC) following allegations that it violated the Gramm-Leach-Bliley Act’s (GLB) Safeguards Rule by failing to ensure that a third-party vendor was adequately securing data of mortgage holders. The FTC complaint states that Ascension contracted with the third-party vendor, OpticsML, to scan and store mortgage documents containing sensitive financial information of thousands of mortgage holders. OpticsML stored these documents on a cloud-based server and in a separate cloud-based storage location but failed to protect or encrypt the server and storage locations, which left them exposed on the internet from January 2018 to January 2019. As a result, approximately 52 unauthorized IP addresses accessed the documents, most of them belonging to computers outside of the United States, including addresses in Russia and China.

The FTC complaint concludes that Ascension violated Section 501(b) of the GLB Act (the Safeguards Rule), which requires financial institutions to protect the security, confidentiality, and integrity of customer information by developing, implementing, and maintaining a comprehensive information security program. The Safeguards Rule also requires financial institutions to oversee their third-party vendors and ensure that those vendors are capable of maintaining and implementing safeguards appropriate for the type of personal information collected from customers. The Safeguards Rule requires these types of measures to be included in the contracts between financial institutions and third-party vendors. The FTC complaint alleges that Ascension failed to take any formal steps to evaluate whether OpticsML could reasonably protect the personal information in the mortgage documents and failed to contractually require OpticsML to implement adequate safeguards.

The FTC and Ascension have now entered into a proposed settlement agreement to resolve these allegations. The settlement agreement requires Ascension to implement a comprehensive data security program, conduct biennial assessments of the effectiveness of that program, and provide yearly certifications to the FTC that Ascension is complying with the FTC’s order. Ascension must also report any future data breaches to the FTC within 10 days of notifying federal or state government agencies.

On December 23, 2020, a description of the proposed settlement agreement was published in the Federal Register. The agreement will be subject to public comment for 30 days, after which the Commission will decide whether to make the proposed agreement final.

To view the FTC Complaint, click here.

To view the Proposed Settlement Agreement, click here.

To view the Federal Register Notice, click here.

As the nation closely watches the election results coming in, the majority of votes counted in California suggest that the California Privacy Rights Act of 2020 (“CPRA,” commonly known as “CCPA 2.0”) is on track to pass. As of the information available at the time of this blog post, Proposition 24 on the California General Election ballot is likely to pass, with 6,342,807 votes (56.1%) in favor and 4,966,086 votes (43.9%) against, and 99.0% of precincts at least partially reporting.

The CPRA would amend the CCPA and require businesses to:

  • not share a consumer’s personal information upon the consumer’s request;
  • provide consumers with an opt-out option for having their sensitive personal information used or disclosed for advertising or marketing;
  • obtain permission before collecting data from consumers who are younger than 16;
  • obtain permission from a parent or guardian before collecting data from consumers who are younger than 13; and
  • correct a consumer’s inaccurate personal information upon the consumer’s request.

If passed, the CPRA would become operative on January 1, 2023, and would only apply to personal information collected after January 1, 2022.

On October 22, 2020, the National Institute of Standards and Technology (“NIST”) published NIST Technical Note (TN) 2111, “An Empirical Study on Flow-based Botnet Attacks Prediction”. The note, authored by Mitsuhiro Hatada and Matthew Scholl of NIST’s Information Technology Laboratory, presents a method to predict botnet attacks, such as mass spam email and distributed denial-of-service attacks (“DDoS”).  This is particularly timely as botnet threats continue to rise in the era of the Internet of Things (“IoT”), where the number, density, and connectivity of devices continue to increase.


The described method leverages measurement of command and control (C2) activities and automated labeling that associates those activities with attacks. The authors evaluated the method using a large-scale, real-world, long-term dataset. The note highlights that C2 metrics measured 30 to 60 hours before an attack are more predictive than metrics taken just before the attack occurs. The results show that the proposed method can predict an increase in attacks with an accuracy of 0.767. NIST intends for this work to support internet security by contributing to the development of further countermeasures against botnets.

To review the press release, click here.

To review the technical note, click here.

On October 12, 2020, California’s Attorney General proposed a third set of modifications to the California Consumer Privacy Act (“CCPA”) regulations. These proposed modifications come nearly two months after the final regulations were approved and made effective by the California Office of Administrative Law (“OAL”) on August 14, and less than a month before the California Privacy Rights Act (“CPRA”) is put to the voters on the statewide ballot on November 3, 2020.

Below, we summarize the proposed modifications and provide direct links at the bottom of this post. The deadline for comments is 5:00 p.m. (Pacific Time) on October 28, 2020:

  • Offline Notices of Opt Out Rights:

    Current section 999.306 requires businesses that “sell” personal information to provide a notice of consumers’ right to opt out. The current regulations provide for online notices and even require businesses that do not operate a website to provide an alternative documented method of informing consumers of the right to opt out. The proposed modification would add more specific instructions and examples. It would specifically require businesses that collect personal information offline (presumably even if they also collect it online) to provide notice by an offline method. For example, a business that collects personal information in a store can print the notice on paper or post signage, and a business that collects information over the phone may provide the notice orally.

  • Consumer Methods for Requesting Opt Out:

Section 999.315 addresses consumer opt-out requests. The proposed regulations insert a new subsection (h), which would require the business’s methods for submitting opt-out requests to be easy to execute and to require minimal steps, and which could not be so complicated as to subvert or impair the consumer’s attempts to opt out:

  1. Specifically, the process for submitting an opt-out request shall not require more steps than the process for opting in. The regulation also provides guidance on how to count the steps for this comparison.
  2. A business shall not use confusing language, such as double negatives (e.g., “Don’t Not Sell My Personal Information”), when providing opt-out choices.
  3. Unless otherwise permitted, a business shall not require consumers confirming an opt-out request to click through or listen to reasons why they should not opt out.
  4. The business’s process shall not require the consumer to provide any more personal information than is necessary to process the request.
  5. Upon clicking “Do Not Sell My Personal Information,” the consumer shall not be required to search or scroll through the text of a privacy policy or similar document to locate the opt-out request mechanism.
  • Authorized Agent Requests:

Section 999.326 addresses opt-out requests submitted by an authorized agent on behalf of a consumer. The current version allows a business to require that the consumer do the following: (1) provide the authorized agent signed permission to submit the request; (2) verify their own identity directly with the business; [and/or] (3) directly confirm with the business that they provided the authorized agent permission to submit the request. (The current regulations do not specify whether all or only one of these options may be required – there is no “and” or “or.”)

The proposed regulations modify this to allow a business to require the authorized agent to provide proof that the consumer gave the agent signed permission to submit the request. The business “may also” require the consumer to do “either of the following”:

  1. Verify their own identity directly with the business; or
  2. Directly confirm with the business that they provided the authorized agent permission to submit the request.

This proposed change would therefore clarify a business’s choices in complying with requests from authorized agents.

To view the redline of proposed modifications, click here.

To view the notice summary of proposed modifications, click here.

On October 7, 2020, the Office of the Comptroller of the Currency (“OCC”) announced that it had assessed a $400 million civil penalty against Citibank, N.A. regarding alleged deficiencies in its enterprise-wide risk management and data governance programs and its internal controls. In particular, the OCC found violations of 12 CFR Part 30, Appendix D (“OCC Guidelines Establishing Heightened Standards for Certain Large Insured National Banks, Insured Federal Savings Associations, and Insured Federal Branches”). The OCC also issued a cease and desist order requiring the bank to take “broad and comprehensive corrective actions to improve risk management, data governance and internal controls.” The order requires the bank to seek the OCC’s non-objection before making significant new acquisitions, and reserves the OCC’s authority to implement additional business restrictions or require changes in board composition or senior management should the bank fail to comply with the order or make timely, sufficient progress.

In the consent order, the OCC found the following deficiencies:

  • Failure to establish effective front-line units and independent risk management (12 C.F.R. Part 30, Appx D);
  • Failure to establish an effective risk governance framework (12 C.F.R. Part 30, Appx D);
  • Failure of the Bank’s enterprise-wide risk management policies, standards, and frameworks to adequately identify, measure, monitor, and control risks; and
  • Failure of compensation and performance management programs to incentivize effective risk management.

The order also identified deficiencies, noncompliance with 12 C.F.R. Part 30, Appendix D, or unsafe or unsound practices with respect to the Bank’s data quality and data governance, including risk data aggregation and management and regulatory reporting. The OCC determined that Board and senior management oversight was inadequate to ensure timely, appropriate action to correct the serious and longstanding deficiencies and unsafe or unsound practices in the areas of risk management, internal controls, and data governance.

The order states that this conduct contributed to past violations and noncompliance for which the OCC assessed civil money penalties in 2019. The order further states that the Bank has begun taking corrective action and has committed to taking all necessary and appropriate steps to remedy the identified deficiencies. The OCC penalty will be paid to the U.S. Treasury.

The Federal Reserve Board took a separate but related action against Citigroup, the bank’s holding company.

To view the press release, click here.

To view the consent order, click here.

On September 18, 2020, Brazil’s data protection law (Lei Geral de Proteção de Dados Pessoais, or “LGPD”) became effective, retroactive to August 16, 2020. Penalties do not begin until August 1, 2021, based on a previous delay passed by Brazil’s legislature, which had earlier rejected a provisional measure that would have postponed applicability of the LGPD. In addition, Brazil’s president issued a decree creating a new data protection authority, the Autoridade Nacional de Proteção de Dados (“ANPD”).

Ultimately, the LGPD will affect organizations doing business in Brazil in a way none of the previous privacy laws and norms have. General data protection provisions and principles are already found in Brazil’s federal constitution, the Brazilian Civil Code, and laws and regulations addressing consumer protection and employment, particular sectors such as financial institutions, health care providers, or telecommunications services providers, and particular professional activities such as medicine or law. Although the country already had several sectoral privacy laws and more than 40 laws and norms at the federal level, the LGPD is the first law to provide a comprehensive framework regulating the use and processing of all personal data.  In light of today’s digital economy and the perpetually expanding use of personal data, companies in all sectors are going to have to adjust and adapt their data collection practices to Brazil’s LGPD.

Influenced by the GDPR, the law sets forth, in 65 articles, the Brazilian conception of personal data and provides the legal bases for authorizing its use. A comparison of the LGPD to the GDPR provided by the International Association of Privacy Professionals (“IAPP”) can be found here.

By way of summary:

Jurisdiction. Like the GDPR, the LGPD provides for extraterritorial jurisdiction. Under Article 3, a personal data processor is subject to the LGPD when: (1) the data is collected or processed within Brazil; (2) the data is processed for the purpose of offering goods or services to individuals located in Brazil; or (3) the personal data was collected in Brazil. If one of these conditions is met, the location of the company’s headquarters is irrelevant, and the LGPD applies.

Scope of “personal data”. Personal data is broadly defined to encompass any information regarding an identified or “identifiable” natural person. It also includes any data that can be aggregated with other data to identify an individual. Given the rapid development of big data, this definition could be broadly interpreted to include almost any kind of data.

Sensitive personal data. Like GDPR, the law includes additional provisions specific to “sensitive personal data”, which is considered vulnerable to discrimination. This includes personal data concerning racial or ethnic origin, religious belief, political opinion, trade union or religious, philosophical or political organization membership, health or sex life, and genetic or biometric data. Such data may only be processed in limited circumstances.

Consumer Rights.  Article 18 enumerates consumer rights and requires they be made known to consumers in an easily accessible manner. These rights include:

  1. Confirmation of the existence of the processing;
  2. Access to the data;
  3. Correction of incomplete, inaccurate, or out-of-date data;
  4. Anonymization, blocking, or deletion of unnecessary or excessive data or data processed in noncompliance with the provisions of the law;
  5. Portability of the data to another service or product provider, by means of an express request and subject to commercial and industrial secrecy, pursuant to the regulation of the controlling agency;
  6. Deletion of personal data processed with the consent of the data subject, except in the situations provided in Article 16 of the law;
  7. Information about public and private entities with which the controller has shared data;
  8. Information about the possibility of denying consent and the consequences of such denial; and
  9. Revocation of consent.

Importantly, the LGPD expands upon the GDPR’s “right to be informed” by including both: (a) the right to be informed of the entities with which data is shared and (b) the separate right to be informed of what will happen if they refuse to consent. This provides consumers greater transparency into, and understanding of, the impact of their choices.

General principles. The law lays out 10 principles that should be considered when processing personal data: purpose, suitability, necessity, free access, quality of the data, transparency, security, prevention, non-discrimination, and accountability. Ultimately, the extent to which these principles are considered will assist the ANPD in determining whether a company is compliant.

Grounds for data processing.   Like the GDPR, the LGPD restricts data processing to certain enumerated scenarios as set forth in its text, one of which is after obtaining the valid consent of the data subject. Consent forms must be clear and include the purpose of processing, duration of processing, identity of the data controller, entities to whom the data will be disclosed and rights of the data subject, including their right to deny consent.

In the absence of valid consent, the law permits data processing in limited scenarios, including when processing is necessary to fulfill the legitimate interests of the controller. Importantly, these “legitimate interests” are subject to a balancing test against the data subject’s fundamental rights, in which those rights may ultimately outweigh the legitimate interests articulated.

Data Breaches.  The LGPD does not specify a timeline for data breach notification, but requires notice within a “reasonable time period” and that it contain certain specified information.  Controllers must also notify the ANPD and data subject if they experience a security incident that “may create risk or relevant damage to data subjects.”

Data Protection Officer. The LGPD does require a data protection officer. However, unlike the GDPR and other laws, Executive Order No. 869/18 indicates that the DPO does not have to be a natural person. Rather, companies, committees, or other internal groups may serve as DPOs. Alternatively, an organization may even outsource the position to a third party, such as a specialized company or law firm.

National Data Protection Authority and Enforcement. Brazil’s ANPD will be responsible for overseeing all compliance and for conducting the aforementioned balancing tests. An initial provision creating the ANPD was vetoed and, as a result, the ANPD was not officially established until the passage of Executive Order No. 869/18; it is therefore not yet fully operational. Once it is stood up, the ANPD will have various enforcement tools and administrative penalties available, such as:

  • A formal warning with deadline for corrective measures.
  • Fines of up to 2% of the gross revenue of the company, limited to R$50 million (approximately $9.4 million US) per infraction.
  • Daily fines for noncompliance, cumulatively up to the same limit.
  • Public disclosure of the infraction after proper investigation and confirmation of its occurrence.
  • Blocking of the personal data involved in the infraction until the situation is corrected.
  • Elimination of the personal data involved in the infraction.
  • Partial suspension of the database operation involved in the infraction for a maximum six-month period, extendable for an equal period, until the activity is brought into compliance.
  • Suspension of the processing activity involved in the infraction for a maximum of six months, with a possible six-month extension.
  • Partial or total prohibition of engaging in personal data processing activities.
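The fine cap above reduces to a simple formula: the lesser of 2% of gross revenue and R$50 million per infraction. A minimal sketch, assuming the straightforward reading of that cap (the ANPD's actual fine calculations will weigh additional factors, so this is illustrative only):

```python
# Illustrative sketch of the LGPD fine ceiling described above.
# Assumption: "2% of gross revenue, limited to R$50 million per infraction"
# is applied as a simple minimum; real ANPD dosimetry involves more factors.

FINE_RATE = 0.02            # 2% of the company's gross revenue
FINE_CAP_BRL = 50_000_000   # R$50 million cap per infraction

def max_lgpd_fine(gross_revenue_brl: float) -> float:
    """Upper bound on a single LGPD fine for a given gross revenue."""
    return min(FINE_RATE * gross_revenue_brl, FINE_CAP_BRL)

# A company with R$1 billion in gross revenue: 2% (R$20 million) applies.
print(max_lgpd_fine(1_000_000_000))   # 20000000.0
# A company with R$10 billion: the R$50 million cap binds instead.
print(max_lgpd_fine(10_000_000_000))  # 50000000
```

In other words, for any company with gross revenue above R$2.5 billion, the R$50 million cap, not the 2% rate, sets the maximum exposure per infraction.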

These penalties will only take effect in August 2021, and they must be applied directly by the ANPD. However, this body is not yet up and running, since the relevant regulation on its internal structure and staffing by civil servants and political appointees was only issued at the end of August this year. The ANPD will thus be fundamental in regulating and issuing guidance about the various provisions and themes covered by the law.

Conclusion. More will be known once the ANPD is up and running with guidance and interpretation, and begins enforcement activities. Much like with the CCPA’s January 1, 2020 statutory compliance date and its subsequent enforcement and regulations, companies are left in the meantime to try to determine the best path to compliance. For now, companies need to be aware that the law is effective and can be applied by the courts or other competent authorities.


On August 19, 2020, the California Appropriations Committee ordered Assembly Bill (“AB”) 1281 to a second reading. AB 1281 would extend the California Consumer Privacy Act’s (“CCPA”) exemptions for employee information and business-to-business (“B2B”) transactions until January 1, 2022. Specifically, AB 1281 would exempt information collected about a natural person in the course of such person acting as a job applicant, employee, owner, director, officer, medical staff member, or contractor. It would also exempt information reflecting a written or verbal communication or a transaction between the business and the consumer, if the consumer is a natural person who is acting as an employee, and whose communications or transactions with the business occur solely within the context of the business’s due diligence regarding a product or service. AB 1281 would only become operative if the California Privacy Rights Act (“CPRA” or “CCPA 2.0”) is not approved by voters during the November 2020 general election.

Two other bills, AB 660 and AB 1782, were also referred to the Appropriations Committee on August 19, 2020.  AB 660 would prohibit data collected, received, or prepared for purposes of contact tracing from being used, maintained, or disclosed for any purpose other than facilitating contact tracing efforts. It would also require all data collected, received, or prepared for purposes of contact tracing to be deleted within 60 days, except if that data is in the possession of a state or local health department.  AB 1782 would create the Technology-Assisted Contact Tracing Public Accountability and Consent Terms Act.  This would generally regulate public health entities and businesses that provide technology-assisted contact tracing. AB 1782 would also require a business or public health entity offering technology-assisted contact tracing to provide a simple mechanism for a user to revoke consent for the collection, use, maintenance, or disclosure of data and permit revocation of consent at any time.

To view AB 1281, click here.

To view AB 660, click here.

To view AB 1782, click here.

On Friday, August 14, 2020, the California Attorney General released the final CCPA regulations issued under the California Consumer Privacy Act of 2018 (“CCPA”) as approved by the California Office of Administrative Law (“OAL”), and filed them with the California Secretary of State.  During its review, the OAL made additional revisions to the CCPA regulations, which it stated were “non-substantive” and primarily for accuracy, consistency, grammar, and clarity, and for eliminating unnecessary or duplicative provisions. Per the Attorney General’s request, the regulations became effective on August 14, 2020, the day they were submitted to the Secretary of State by the Attorney General.

To view the final CCPA regulations, click here.

To view a redline version of the additional changes made by the OAL to the Attorney General’s proposed regulations, click here.

To view OAL’s statement of reasons detailing its additional changes, click here.

Yesterday, on August 10, 2020, the European Commission (“Commission”) and the Department of Commerce (“DoC”) issued a joint statement announcing they are beginning discussions to evaluate potential enhancements to the EU-U.S. Privacy Shield framework. These discussions have begun to address compliance with the recent Schrems II decision by the Court of Justice of the European Union (“CJEU”). Both entities recognized the importance of data protection in the EU and the U.S. as well as the significance of cross-border data transfers to the “nearly 800 million citizens on both sides of the Atlantic.” They also noted their shared commitment to privacy and the rule of law, as well as to further deepening their economic relationship.

To view the joint statement on the Commission’s website, click here.

To view the joint statement on the DoC’s website, click here.

Vermont Amends Data Breach Notification Law

On July 1, 2020, amendments to Vermont’s Security Breach Notice Act, 9 V.S.A. §§ 2330 & 2335, took effect along with a new “Student Online Personal Information Protection Act.”

Key amendments to the security breach act include:

  • An expanded definition of Personally Identifiable Information (“PII”). The definition now adds various ID numbers, unique biometric data, genetic information, and certain health or wellness records.
  • Expanded definition of security breach to include “login credentials.” Login credentials are defined by the amendment as “a consumer’s user name or email address in combination with a password or an answer to a security question that together permit access to an online account.” Businesses should treat login credentials the same as PII when considering whether a breach occurred and whether they have a general duty to notify, but login credentials differ from PII in how and to whom notice must be provided.
    • If only login credentials are breached (without breach of actual PII), a data collector is only required to notify the Vermont Attorney General (or the Department of Finance, as applicable) if the login credentials were acquired directly from the data collector or its agent. The law specifies different notification requirements depending on whether the breached login credential would permit access to an email account.
  • Narrows the Permissibility of Substitute Notice. Previously, substitute notice was permitted when the class of affected consumers to be provided written or telephonic notice exceeded 5,000, the cost of direct notice would exceed $5,000, or the data collector did not have sufficient contact information. Now, substitute notice is only permitted where the lowest cost of providing notice to affected consumers via written, email, or telephonic notice would exceed $10,000. This revision added email as a permissible form of notice and eliminated the number of affected consumers exceeding 5,000 as a basis for providing substitute notice. Because email allows companies to provide mass notice to affected consumers in a timely manner at low cost, it will be more difficult for data collectors to reach that $10,000 minimum.
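The amended test above turns on a single comparison: the cheapest permitted channel, multiplied across the affected class, against the $10,000 threshold. A minimal sketch under that reading (the per-consumer costs below are illustrative assumptions, not figures from the statute):

```python
# Illustrative sketch of the amended Vermont substitute-notice test:
# substitute notice is available only if the LOWEST-cost permitted channel
# (written, email, or telephonic) would still cost more than $10,000 in total.
# Per-consumer channel costs are hypothetical assumptions for illustration.

SUBSTITUTE_NOTICE_THRESHOLD = 10_000  # dollars

def substitute_notice_allowed(num_consumers: int,
                              cost_written: float,
                              cost_email: float,
                              cost_phone: float) -> bool:
    """Return True if the cheapest direct-notice channel exceeds the threshold."""
    lowest_total = num_consumers * min(cost_written, cost_email, cost_phone)
    return lowest_total > SUBSTITUTE_NOTICE_THRESHOLD

# 500,000 affected consumers with email at an assumed $0.01 per message:
# total direct-notice cost is $5,000, under the threshold, so substitute
# notice is NOT available even for a very large class.
print(substitute_notice_allowed(500_000, 1.50, 0.01, 2.00))  # False
```

This illustrates why the revision matters: under the old rule, a class of 500,000 would have qualified on headcount alone; under the new rule, cheap email notice keeps the lowest-cost total below $10,000.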

Vermont Enacts New Student Privacy Law

Vermont’s new Student Online Personal Information Protection Act updates its privacy law to include regulations specifically concerning the data of pre-K to 12th grade students. The law applies to website operators, online services, or mobile applications designed and marketed to, and used primarily by, pre-K to 12th grade schools.

Under the new law, enforceable by the Vermont Attorney General, operators are generally prohibited from:

  • Engaging in targeted advertising based on any information the operator has acquired because of the use of its site, service, or application for PreK-12 purposes;
  • Using information that is created or gathered by the operator’s site, service, or application to amass a profile about a student, except for PreK-12 purposes;
  • Selling, bartering, or renting a student’s information; or
  • Disclosing covered information to a third party, unless a specific exception applies (including certain disclosures for educational purposes).

Operators are also required to: (a) implement and maintain reasonable security procedures and practices; (b) delete a student’s covered information within a reasonable time period if the school or school district requests it; and (c) publicly disclose and provide the school with material information about the operator’s collection, use, and disclosure of covered information, including publishing terms of service, a privacy policy or similar document.

Operators may use or disclose covered information as required by law. Operators may also use covered information for legitimate research purposes in certain circumstances and may disclose the information to a state or local education agency for PreK-12 purposes, as permitted by state or federal law. Operators are also not prohibited from using covered information in the following scenarios, so long as the information is not associated with an identified student within the operator’s sites, services, applications, products, or marketing:

  • Improving educational products;
  • Demonstrating the effectiveness of the operator’s products or services, including their marketing;
  • Developing or improving educational sites, services, or applications;
  • Using recommendation engines to recommend to a student (1) additional content or (2) additional services, where both relate to an educational, other learning, or employment opportunity purpose, so long as the recommendation is not determined in whole or in part by payment or other consideration from a third party; or
  • Responding to a student’s request for information or feedback, so long as the response is not determined by payment or other consideration.

This subchapter does not:

  • Limit the authority of law enforcement to lawfully obtain content or information;
  • Limit the ability of an operator to use student data for adaptive or customized student learning purposes;
  • Apply to general audience websites, online services, online applications, or mobile applications;
  • Limit service providers from providing Internet connectivity to schools, students, or their families;
  • Prohibit an operator from marketing educational products directly to parents;
  • Impose a duty upon a provider of an electronic store, gateway, marketplace, or other means of purchasing or downloading software to review or enforce compliance with this law;
  • Impose a duty upon a provider of an interactive computer service to review or enforce compliance with this law;
  • Prohibit students from downloading, exporting, transferring, saving, or maintaining their own student-created data or documents; or
  • Supersede the federal Family Educational Rights and Privacy Act (FERPA) or rules adopted pursuant to the Act.

Finally, the law requires the Vermont Attorney General, in consultation with the Vermont Agency of Education, to examine the issue of student data privacy as it relates to FERPA and access to student data by data brokers, and determine whether to make any recommendations.


This post was co-authored with Kaylee Rose, a first-year law student at Cumberland School of Law.