In a recent decision from the Southern District of Florida, U.S. District Judge Robert N. Scola, Jr. denied class certification of a proposed class of paid Univision NOW subscribers who assert that Univision NOW’s use of the Meta Pixel violates the Video Privacy Protection Act (VPPA). The three proposed class representatives allege that D2C, LLC, doing business as Univision NOW, violated the VPPA by disclosing their personal viewing information using pixel software from Meta Platforms. The plaintiffs claim Univision NOW disclosed information linking them to specific videos that they watched. The plaintiffs sought certification of a class of Univision NOW subscribers whose viewing information was allegedly disclosed to Meta between April 2021 and May 2023. Judge Scola denied certification, finding that the plaintiffs failed to meet the numerosity requirement.

What is the VPPA?

In 1988, Congress passed the VPPA in response to concerns about consumer privacy in the age of video rentals. The legislation was spurred by the disclosure of Judge Robert Bork’s video rental history during his Supreme Court confirmation process. The VPPA prohibits videotape service providers from disclosing a consumer’s personally identifiable information (PII) together with their video viewing history, and it provides for actual damages, with liquidated damages of not less than $2,500 per violation. In recent years, there has been an increase in privacy class actions under the VPPA against website owners that offer video functionality on their websites.

The Univision NOW Case

Judge Scola’s decision hinged primarily on the issue of numerosity—one of the four key requirements for class certification under Federal Rule of Civil Procedure 23(a). To satisfy the numerosity requirement, the plaintiffs were required to show that the class of individuals affected by the purported VPPA violation was so numerous that joinder of all members would be impracticable. The plaintiffs initially argued that Univision NOW automatically disclosed the viewing information of its 35,845 subscribers, but acknowledged there were several impediments to Univision NOW’s transmission of information to Meta.

The court explained that the plaintiffs’ theory of automatic data transmission was undercut by their own concessions and Univision NOW’s expert testimony, which suggested several conditions must be met for the Pixel to actually transmit PII. Specifically, the court found that in addition to viewing or selecting a prerecorded video through Univision NOW’s website, a subscriber also must have (1) had a Facebook account at the time the video was selected; (2) used a web browser that did not block the Pixel by default; (3) been simultaneously logged into the subscriber’s own Facebook account while selecting the video; (4) been simultaneously logged into Facebook on the same device that the subscriber used to select the video; (5) been simultaneously logged into Facebook using the same browser through which the subscriber selected the video; and (6) not deployed any browser settings or add-on software that would have blocked the Pixel. Crucially, while the court found that the class was ascertainable, it also found that class certification was not warranted because the plaintiffs failed to carry their burden to show that Univision NOW disclosed the personally identifiable information and viewing records of even a single subscriber, including the three named plaintiffs.

Although the plaintiffs attempted to salvage class certification by reducing the proposed class to approximately 17,000 individuals, based on estimates of how many subscribers use Facebook and certain popular web browsers, Judge Scola ruled that these estimates were too speculative. Without a reliable means to determine class size, the court found the plaintiffs failed to meet the numerosity requirement.

Conclusion

Judge Scola’s decision to deny class certification in this case is a significant victory for Univision NOW. While the plaintiffs can still pursue individual claims, their failure to secure class certification limits the scope and potential impact of their lawsuit.

For companies that provide video content and use tracking technologies like pixels, this decision reinforces the need to closely monitor their data-sharing practices and ensure compliance with privacy laws.

As part of a new enforcement initiative called “Operation AI Comply,” the FTC recently announced that it has brought the following five enforcement actions against businesses that use or sell AI tools in a manner that the FTC alleges is deceptive or unfair:

  1. DoNotPay. The FTC brought suit against DoNotPay, which had claimed to be “the world’s first robot lawyer.” The company advertised its AI service as capable of allowing consumers to sue without a lawyer and generate valid legal documents quickly, aiming to replace the legal industry with AI. However, the FTC’s complaint states that DoNotPay did not test its chatbot’s effectiveness against human lawyers and lacked any retained attorneys.

    Additionally, DoNotPay offered a service claiming to check small business websites for legal violations based on just an email address, which the FTC alleged was also ineffective.

    As part of a proposed settlement, DoNotPay will pay $193,000 and must inform consumers about the limitations of its service. The settlement will also prevent the company from making unsupported claims about its ability to replace professional services.

    The FTC’s decision to pursue this action was unanimous, and the settlement is open for public comment before finalization.
  2. Ascend Ecom. The FTC filed suit against Ascend Ecom, an online business opportunity scheme. The FTC alleged Ascend Ecom misled consumers with false claims about its AI-powered tools that supposedly enable quick earnings through online storefronts.

    The complaint alleges that Ascend charged consumers tens of thousands of dollars to start online stores on platforms like Amazon and Etsy, while also requiring significant investments in inventory. Despite promises of generating substantial monthly income within two years, most consumers experienced financial losses instead, accumulating debt and negative bank balances.

    Additionally, the scheme is accused of pressuring consumers to alter or remove negative reviews and failing to honor a “guaranteed buyback” policy, even threatening to withhold it from dissatisfied customers. A federal court has temporarily halted the scheme and placed it under a receiver’s control while the case proceeds in court.

    The Commission vote authorizing the staff to file the complaint was 5-0. The complaint was filed in the U.S. District Court for the Central District of California.
  3. Ecommerce Empire Builders. The FTC charged Ecommerce Empire Builders (EEB) with misleading consumers about building an “AI-powered Ecommerce Empire” through costly training programs and “done for you” storefronts. The FTC alleges that the scheme promised participants the potential to earn millions, but that these profits rarely materialized.

    According to the FTC’s complaint, consumers paid as much as $35,000 for storefronts, only to find they generated little to no income. EEB’s marketing claimed clients could make $10,000 monthly, but the FTC alleges EEB lacked evidence to support such claims. Many consumers reported difficulty obtaining refunds, as EEB either denied requests or offered only partial refunds.

    A federal court has temporarily halted EEB’s operations and placed it under a receiver’s control while the case continues.  The FTC’s complaint was unanimously approved by the Commission and filed in the U.S. District Court for the Eastern District of Pennsylvania.
  4. Rytr. The FTC charged Rytr, an AI writing assistant service, with generating false consumer reviews and testimonials. The FTC alleges that since April 2021, Rytr allowed paid subscribers to produce unlimited detailed reviews based on minimal input, often resulting in misleading content that could deceive potential buyers. The FTC’s complaint further alleges that many subscribers created hundreds or even thousands of potentially false reviews.

    To settle the complaint, a proposed order would prevent Rytr from advertising or selling any service related to generating consumer reviews or testimonials. The Commission’s vote to file the complaint was 3-2, with two commissioners dissenting.

    Commissioner Andrew Ferguson issued a dissenting statement joined by Commissioner Melissa Holyoak. The dissent argues that the case against Rytr overextends the FTC’s enforcement powers by punishing the company for providing a generative AI tool that businesses could legitimately use, based on the mere possibility that the tool could be used for fraudulent or deceptive purposes, a theory the dissenting Commissioners argue could have a stifling effect on innovation.
  5. FBA Machine. In June 2024, the FTC took action against a business opportunity scheme that the FTC alleges falsely promised consumers guaranteed income through online storefronts using AI software. The scheme, known as Passive Scaling and later rebranded as FBA Machine, allegedly defrauded consumers of over $15.9 million with deceptive earnings claims.

    Following the FTC’s complaint, a federal court temporarily halted the scheme and appointed a receiver. The case is ongoing, and the complaint was filed in the U.S. District Court for the District of New Jersey, with the Commission voting 5-0 to authorize the action.

The five enforcement actions expand upon enforcement proceedings that the FTC has previously brought against other businesses using AI tools, including companies that used AI tools to offer services for creating online storefronts, enrolling in career training, sending anonymous messages, deploying facial recognition in retail stores, and performing DNA testing. Collectively, these enforcement actions signal that the FTC continues to make businesses’ use of AI tools, and its impact on consumers, a top enforcement priority.

On August 5, 2024, Illinois Governor J.B. Pritzker signed into law SB 2979, significantly amending the state’s Biometric Information Privacy Act (BIPA). This update represents a considerable decrease in the potential for exorbitant financial liabilities for businesses that engage with biometric data while still maintaining the statute’s robust protections for individuals’ biometric data. The amendment went into effect immediately.

Most significantly, SB 2979 redefines the scope of potential liability from $5,000 per collection or disclosure to $5,000 per individual. Previously, the Illinois Supreme Court held that BIPA’s framework allowed for each biometric data interaction—such as a fingerprint scan—to be treated as a separate infraction, potentially resulting in overwhelming cumulative penalties. This ruling raised the financial stakes associated with BIPA violations considerably, particularly within employment settings. SB 2979 consolidates these infractions by treating the initial collection of biometric data as a singular violation, irrespective of the number of collections or disclosures. This change aims to strike a balance between commercial and individual interests, reducing the threat of existential judgments against businesses while preserving the law’s core protective measures.

The amended law does not apply retroactively, and thus it will not influence any pre-existing litigation. This element of the amendment addresses concerns from numerous industry stakeholders who have faced extensive legal challenges and significant settlements under the original BIPA provisions.

In addition to narrowing the potential for damages, the amendments modify the definition of “biometric identifier” to exclude certain types of biological and medical data. The revisions also make clear that an “electronic signature” is valid for obtaining written consent.

Despite these business-friendly adjustments, BIPA continues to empower Illinois residents with a private right of action, a feature not commonly found in similar laws in other states. Businesses must still secure written informed consent before collecting biometric data, and they must still adhere to stringent data protection and storage policies.

The State of Texas and Meta Platforms Inc. (“Meta”) have agreed to a $1.4 billion settlement, to be paid out over five years, to resolve claims relating to Meta’s alleged use of facial recognition technology without user consent.  This settlement marks the largest privacy settlement obtained by a single state and is the first one obtained under the Texas Capture or Use of Biometric Identifier Act (“CUBI”).  

Meta’s subsidiary, Facebook, launched its facial recognition technology around 2010, allowing users to tag others in photos and videos.  Texas alleged Meta used the collected biometric data to train and enhance its facial recognition technology.  Facebook discontinued the technology in November 2021 after Facebook entered a $650 million settlement in a lawsuit that alleged the technology violated the Illinois Biometric Information Privacy Act (“BIPA”). 

Texas brought this suit shortly thereafter—in February 2022—alleging that Facebook violated CUBI by capturing and disclosing Facebook users’ and non-users’ biometric identifiers for a commercial purpose without their consent through its now-discontinued facial recognition technology.  Texas also alleged that Facebook violated CUBI by failing to destroy the collected biometric data within a reasonable time.

While Illinois has been the focus of compliance due to BIPA’s private right of action, this settlement may indicate that Texas will become increasingly active in enforcement—for both CUBI and its new comprehensive privacy law that went into effect on July 1.  Companies should take notice.

The Federal Trade Commission (FTC) continues to enforce and update its Health Breach Notification Rule (HBNR) amidst a fast-changing regulatory environment. A new rule, which took effect this week, expands the scope of the HBNR, as the FTC ramps up enforcement activity related to disclosures of identifiable health data, and other agencies implement changes to the Health Insurance Portability and Accountability Act (HIPAA), Part 2, and Information Blocking rules regulating similar data.

The Upshot

  • Via final rule, effective July 29, the FTC expanded the scope of the HBNR in efforts to “strengthen and modernize” the applicable regulations. The updated HBNR expands the methods by which regulated entities may make required notifications and updates timing requirements for making such notifications.
  • The updated HBNR requires entities that possess personal health records (PHR), but are not covered by HIPAA, to provide notice following a breach of unsecured data.
  • The FTC clarified that the HBNR is intended to apply to health care apps and similar technologies not covered by HIPAA. The FTC implemented these clarifications via expansions to existing definitions and other changes intended to improve the overall readability of the HBNR. The FTC recently began enforcing the HBNR, while agencies within HHS continue to publish guidance related to HIPAA and Information Blocking.

The Bottom Line

Companies that are not regulated by HIPAA but maintain health information must ensure HBNR compliance. Other entities—including “Part 2” (federally assisted substance use disorder treatment) programs, HIPAA-covered entities and their business associates, health care providers, information technology developers and information exchanges, and lawful holders of health information—should note recent shifts in the regulatory environment for maintaining identifiable health data.

Health Breach Notification Rule

Effective July 29, 2024, Federal Trade Commission (FTC) updates to its Health Breach Notification Rule (HBNR) will (1) clarify the scope of the HBNR in order to make clear its applicability to developers of electronic health apps; (2) revise key definitions for breaches and regulated entities; and (3) revise the method, timing, and content of required notices. The HBNR applies to “foreign and domestic vendors of personal health records, PHR-related entities, and third-party service providers” not covered by the Health Insurance Portability and Accountability Act (HIPAA).

As clarified, a “vendor of personal health records” is an entity, not covered by HIPAA, that “offers or maintains a personal health record,” and may include developers of mobile health applications. A “personal health record” will now be “an electronic record of PHR identifiable health information on an individual that has the technical capacity to draw information from multiple sources and that is managed, shared, and controlled by or primarily for the individual,” and may include a mobile health application. PHR-related entities are, essentially, those not covered by HIPAA that offer products and services through vendors of personal health records, including their mobile health applications. Finally, a “third-party service provider” is one that accesses, maintains, retains, modifies, records, stores, destroys, or otherwise holds, uses, or discloses unsecured PHR identifiable health information in furtherance of services provided to such entities. 

Taken as a whole, this means that the revised HBNR will apply to virtually any entity not covered by HIPAA that handles identifiable information that relates to the past, present, or future physical or mental health or condition of an individual, the provision of health care to an individual, or the past, present, or future payment for the provision of health care to an individual.

Under the revised HBNR, a “breach of security” will include any unauthorized releases of PHR identifiable health information, even if intentional. The HBNR requires vendors and PHR-related entities to notify affected individuals of any breaches of security, and further requires notice to the FTC and media for breaches affecting more than 500 individuals. The updated HBNR provides for certain electronic notifications and expands the required content of relevant notifications. 

Other Updates Affecting Health Information

Various federal agencies continue to issue notable guidance and expand enforcement efforts specific to health information. 

The updates to the HBNR, specifically, follow 2023’s first instance of FTC enforcement. In that instance, the FTC took issue with, among other things, the third-party collection and sharing, in alleged violation of the HBNR, of users’ medications and demographic data with advertisers in order to create medication-specific advertisements. Earlier this year, the FTC continued similar enforcement efforts against a Part 2 (substance use disorder diagnosis, treatment, or referral for treatment) program for alleged violations of the FTC Act and the Opioid Addiction Recovery Fraud Prevention Act of 2018. (Part 2 Programs should also note recent Office for Civil Rights (OCR) and Substance Abuse and Mental Health Services Administration regulatory alignment of substance use disorder confidentiality requirements and penalties with HIPAA). 

Notably, the alleged violations were based, in part, on the improper disclosure of identifiable information via “tracking technologies” and violations of HIPAA stemming from OCR’s subregulatory guidance related to such tracking technologies. That same tracking technology guidance has been subject to recent OCR revisions, as well as judicial challenges. For example, in June, the District Court for the Northern District of Texas vacated a portion of OCR’s guidance. Specifically, the court took issue with what it described at length as OCR’s attempts, via the tracking technology guidance, to “shoehorn additional information,” into the definition of “individually identifiable health information” provided by statute. 

Though promulgated by formal rulemaking (rather than subregulatory guidance), the revised HBNR similarly relies on new or expanded definitions of key terms, including “PHR identifiable health information” and “covered health care provider.”  

In addition, OCR recently released subregulatory guidance related to security via a concept paper outlining a number of currently voluntary cybersecurity recommendations (including reference to FDA guidance for cybersecurity recommendations applicable to medical devices) and cybersecurity performance goals. OCR indicated that formal updates to the HIPAA Security Rule will follow in 2024, potentially along with proposed changes to the Privacy Rule. Additionally, CMS recently published disincentives complementing applicable OIG penalties based on regulatory definitions set forth by the Office of the National Coordinator for Health Information Technology (ONC) in “Information Blocking” regulations promulgated in accordance with the Public Health Service and 21st Century Cures acts. Via these rules, various HHS agencies continue to interpret and clarify the applicability of statutory and regulatory definitions governing a wide array of health information. 

Health care entities and other lawful holders of health information should review these rules and maintain robust compliance measures for health information, prior to incurring any breach. Ballard Spahr’s Health Care and Privacy and Data Security attorneys are available to assist with any questions related to the HBNR, or other aspects of data privacy, security and breach reporting.

In this month’s webcast, Greg Szewczyk of Ballard Spahr’s Privacy and Data Security group is joined by Paolo Sbuttoni, a partner at Foot Anstey who specializes in technology and data. We compare the AI regulatory landscape in the European Union, the United Kingdom, and the United States. We also provide insight on the scope of recent regulations that have been enacted in the EU and U.K., and those that will be enacted on a state-by-state basis in the U.S.

At its July 16 meeting, the California Privacy Protection Agency (“CPPA”) discussed new enforcement focuses in addition to its current goals. While the new focuses are largely in line with general trends, they also serve as a reminder that specific and nuanced compliance decisions can make a big difference.

As the CPPA has made clear in multiple statements, its focus over the past year has largely been on monitoring privacy notices and policies, the right of deletion, and the handling and implementation of consumer requests.  For example, the CPPA issued a formal enforcement advisory in April emphasizing that businesses must not request unnecessary personal information when consumers opt out of data sales.

The discussion at the July 16 meeting continues this trend but adds four new focus areas: dark patterns, honoring consumer opt-out requests, providing proper notice of information sales and sharing, and the agency’s prioritization of cases that affect vulnerable groups.

The CPPA notes that investigations typically span about 18 months and are ongoing across a variety of sectors, and it encourages businesses to continue to be proactive in their privacy compliance efforts. The agency hopes to continue issuing enforcement advisories as it collaborates with other states and federal partners in addressing privacy issues.

Additionally, the agency is considering new draft regulations that would address a variety of privacy issues, including privacy rights around automated decision-making technology and artificial intelligence (“AI”). These regulations include increased audit requirements and rules guiding companies on using AI in their businesses. These proposals will be closely watched, as they may impact common practices, including those relating to employee monitoring.

In other words, although the state legislative season has ended for the year, the privacy compliance landscape continues to evolve.

Over the course of the past few months, the Office of Civil Rights (OCR) and the Office of the National Coordinator for Health Information Technology (ONC), both of which are divisions of the U.S. Department of Health and Human Services (HHS), have issued a series of new regulations and guidance related to the Health Insurance Portability and Accountability Act of 1996 (HIPAA).

The Upshot

  • OCR issued a final rule that modifies HIPAA to support reproductive health care privacy.
  • OCR issued new guidance which clarifies and revises how the HIPAA rules apply to a Regulated Entity’s use of tracking technologies, although a recent court decision struck down a significant portion of that guidance.
  • OCR published frequently asked questions to address notice and breach procedure questions related to the Change Healthcare cyber attack.
  • ONC issued a final rule that requires Health IT Modules to provide an “internet-based method” for an individual to request a restriction on the use or disclosure of their PHI.

The Bottom Line

Covered entities under HIPAA (including employer-sponsored health benefit plans), as well as their business associates, should be aware of these new rules and guidance in order to maintain compliance with HIPAA. Attorneys in Ballard Spahr’s Health Care Industry Group are continuously tracking the developments and are available for counsel.

In the first half of 2024, OCR and ONC have issued rules and guidance related to HIPAA on four topics of importance to health plans, health care clearinghouses, and health care providers that are subject to HIPAA, as well as their business associates (collectively “Regulated Entities”).

Reproductive Health Care Privacy Final Rule

On April 22, 2024, OCR issued a final rule to modify HIPAA to support reproductive health care privacy. The final rule makes a number of significant changes to the HIPAA regulations. For example, the new rule:

  • Prohibits the use or disclosure of Reproductive Health Care Information (RHI) by Regulated Entities for the purpose of investigating or imposing liability on any person for the mere act of seeking, obtaining, providing, or facilitating reproductive health care that is lawful under the circumstances in which it was provided, or to identify any person for such purposes. These prohibited purposes include, but are not limited to, law enforcement investigations, third-party investigations in furtherance of civil proceedings, state licensure proceedings, criminal prosecutions, and family law proceedings.
  • Requires Regulated Entities to obtain a signed attestation that certain requests, including subpoenas, for RHI are not for these prohibited purposes.
  • Requires Regulated Entities to modify their Notice of Privacy Practices to address reproductive health care privacy.
  • Includes a presumption that, for HIPAA purposes, reproductive health care was lawful under the circumstances in which it was provided, unless the Regulated Entity has “actual knowledge” that the care was not lawful.

Compliance is required by Dec. 23, 2024, except for required updates to the Notice of Privacy Practices that are required by Feb. 16, 2026.

OCR Guidance Regarding the Use of Tracking Technologies

On March 18, 2024, OCR issued new guidance on how the HIPAA rules apply to a Regulated Entity’s use of third-party tracking technologies, such as cookies and pixels. The new publication updates guidance that OCR originally published on these technologies in December 2022 and includes a number of significant revisions and clarifications. For example, the new guidance:

  • Clarifies that not all data elements collected by website tracking technologies constitute PHI. In order to constitute PHI, the information must be related to an individual’s past, present, or future health, health care, or payment for health care.
  • Suggests an alternative solution for dealing with a technology vendor who will not sign a Business Associate Agreement (BAA): the Regulated Entity can establish a BAA with a Customer Data Platform vendor, who would then de-identify online tracking information that includes PHI. The Customer Data Platform vendor can then only disclose de-identified information to tracking technology vendors.
  • Emphasizes that OCR is going to prioritize compliance with the HIPAA Security Rule in investigations into the use of online tracking technologies.

However, on June 20, 2024, the U.S. District Court for the Northern District of Texas vacated a significant portion of OCR’s tracking technology guidance on the grounds that it exceeded OCR’s statutory authority under HIPAA. Specifically, the court stated that metadata from a user’s search of a provider’s public-facing web page does not meet the definition of “individually identifiable health information” under HIPAA. As of now, the tracking technology guidance is still on the HHS website, but HHS has stated that it is evaluating its next steps in light of this recent decision.

OCR Updates and FAQs Regarding the Change Healthcare Cyber Attack

On April 19, 2024, OCR published a webpage with frequently asked questions (FAQs) concerning the Change Healthcare (a unit of UnitedHealth Group (UHG)) cybersecurity incident which occurred in late February 2024. OCR then updated the FAQs on May 31, 2024, to address additional concerns. In summary, the FAQs explain that:

  • OCR has initiated an investigation into the Change Healthcare cybersecurity incident to determine whether a breach of unsecured PHI occurred and into Change Healthcare’s and UHG’s compliance with the HIPAA Rules.
  • OCR is not prioritizing investigations of covered entities and business associates that engaged with Change Healthcare and UHG. However, the guidance reminds these other entities of their obligation to have BAAs in place and to make sure that timely breach notifications to HHS and the affected individuals are provided if and when they receive notice from Change Healthcare.
  • If a covered entity receives notice that it has been affected by a breach by Change Healthcare, it may delegate to Change Healthcare the task of providing the required HIPAA breach notifications on its behalf. Only one entity – which could be the covered entity itself, UHG, or Change Healthcare – needs to complete breach notifications to affected individuals and HHS, and a covered entity and Change Healthcare may cooperatively satisfy any breach obligations under HIPAA.

Health Data, Technology, and Interoperability: Certification Program Updates, Algorithm Transparency, and Information Sharing Final Rule

On February 8, 2024, ONC issued a final rule that, in part, supports the HIPAA Privacy Rule. Under the HIPAA Privacy Rule, covered entities are required to allow individuals to request a restriction on the use or disclosure of their PHI for treatment, payment, or health care operations and to have policies in place by which to accept or deny such requests. However, the HIPAA Privacy Rule does not specify a particular process to be used by individuals to make such requests or for the entity to accept or deny the request. In guidance that addresses various technical standards applicable to electronic health information, the ONC sets forth a standard that requires Health IT Modules to support an internet-based method for an individual to request such a restriction.

The authors express their thanks to Summer Associate Sofia E. Reed for her efforts in the preparation of this Briefing.

The Consumer Financial Protection Bureau (CFPB) has launched the process for independent standard-setting bodies to receive formal recognition, as part of its efforts to shift towards open banking in the United States.

On June 5, 2024, the CFPB finalized a rule outlining the minimum attributes that standard-setting bodies must exhibit to issue standards in compliance with CFPB’s proposed Personal Financial Data Rights Rule.  The Personal Financial Data Rights Rule, proposed in October 2023, is the first federal legal framework for open banking under Section 1033 of the 2010 Consumer Financial Protection Act.  This previously untapped legal authority gives consumers the right to control their personal financial data and assigns the task of implementing personal financial data sharing standards and protections to the CFPB.

As currently drafted, the Personal Financial Data Rights Rule would allow companies to use technical standards developed by standard-setting organizations recognized by the CFPB.  “Industry standard-setting bodies that operate in a fair, open, and inclusive manner have a critical role to play in ensuring a safe, secure, reliable, and competitive data access framework,” stated the CFPB in the proposal.

Under the rule launching the approval process, industry standard-setting bodies can apply to be recognized by the CFPB.  Those seeking approval must demonstrate the following attributes:

  • Openness: A standard-setting organization’s sources, procedures, and processes must be open to all interested parties, including public interest groups, consumer advocates, and app developers;
  • Balanced-Decision Making: The decision-making power to set standards must be balanced across all interested parties. There must be meaningful representation for large and small commercial entities, and balanced representation must be reflected at all levels of the standard-setting body;
  • Due Process: The standard-setting body must use documented and publicly available policies and procedures to provide a fair and impartial process, and an appeals process must be available for the impartial handling of procedural appeals;
  • Consensus: Standards development must proceed by consensus but not necessarily unanimity; and
  • Transparency: Procedures must be transparent to participants and publicly available.

The CFPB also outlined the application process, which begins with a request for recognition, followed by additional information requests from the CFPB and, potentially, public comment.  The CFPB will then review the available information against the requirements listed above, decide on the application, and, if the application is approved, officially recognize the organization as a standard-setting body.

The CFPB also has the power to (a) revoke standard-setters’ recognition if they fail to meet the qualifications and (b) impose a maximum recognition duration of five years, after which recognized standard-setters will have to apply for re-recognition.  This rule will take effect 30 days after its publication in the Federal Register.

Fair standards issued by standard-setters outside the agency will help companies comply with the proposed Personal Financial Data Rights Rule.  Interested standard-setters are encouraged to begin ensuring their adherence to this new rule.

In a reminder that open source products can carry significant risks beyond intellectual property, a vulnerability in a compression tool commonly used by developers has triggered widespread concerns. 

XZ Utils (“XZ”) is an open source data compression utility, first published in 2009 and widely used in Linux and macOS systems. The tool is primarily used for data compression and decompression and may reside on routers, switches, VPN concentrators and firewalls. XZ is also found on many smartphones and televisions and on most web servers on the internet. Despite its wide adoption, XZ was, like many open source tools, maintained primarily by a single volunteer working for free.

When this volunteer ran into personal matters, he turned over the maintenance responsibilities to JiaT75, known as Jia Tan (now widely believed to be a group of hackers operating under that alias). In February 2024, Jia Tan released XZ versions 5.6.0 and 5.6.1, which included malicious code. That code could be used to create a backdoor on affected devices, with the potential for stealing encryption keys or installing malware. Jia Tan took several steps to obfuscate the addition of the malicious code. For example, the code was not included in the public GitHub repository but only in the tarball releases. Further, the backdoor was deployed only in certain environments to avoid detection.

On March 29, 2024, a security researcher stumbled onto a software bug that led him to discover and report the XZ attack. The Cybersecurity & Infrastructure Security Agency (CISA) issued an alert recommending that users downgrade to an uncompromised version of XZ, and Linux vendors promptly issued press releases and remediation efforts to minimize the effects of the attack.
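As a practical illustration of CISA’s recommendation, a minimal sketch of checking an installed XZ binary against the compromised releases might look like the following. This is an assumption-laden example, not an official remediation tool: the version parsing is simplified, and the compromised version list (5.6.0 and 5.6.1) comes from the CISA alert described above.

```python
import re
import subprocess

# Versions identified as containing the backdoor, per the CISA alert.
COMPROMISED = {"5.6.0", "5.6.1"}

def parse_xz_version(version_output: str) -> str:
    """Extract the version number from `xz --version` output,
    e.g. 'xz (XZ Utils) 5.6.1' -> '5.6.1'."""
    match = re.search(r"(\d+\.\d+\.\d+)", version_output)
    if not match:
        raise ValueError("could not find a version number in output")
    return match.group(1)

def is_compromised(version: str) -> bool:
    """Report whether a version string matches a known-compromised release."""
    return version in COMPROMISED

def check_installed_xz() -> bool:
    """Run the local xz binary and report whether it is a compromised release."""
    output = subprocess.run(
        ["xz", "--version"], capture_output=True, text=True
    ).stdout
    return is_compromised(parse_xz_version(output))
```

In practice, organizations would rely on their distribution’s security advisories and patching tooling rather than ad hoc scripts, but the check above conveys the basic triage step: identify the installed version and compare it against the advisory.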

The XZ attack has raised questions within the open source community about the risks of having critical software maintained and governed by unpaid volunteers. The attack also serves as a reminder that even widely adopted software is at risk of attack, and companies should prepare for future attacks on their own software and that of their third-party vendors. As part of that process, companies should:

Know your dependencies. Be able to quickly identify, or ensure that third-party vendors can quickly identify, whether a certain software package with a known vulnerability is used in any of a company’s software.
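The dependency check described above can be sketched in a few lines. This is a hypothetical, simplified example: the advisory entries and inventory below are illustrative, and a real program would draw on a software bill of materials (SBOM) and a maintained vulnerability feed rather than a hard-coded list.

```python
# Illustrative advisory data: (package name, version) pairs known to be
# vulnerable. The xz-utils entries mirror the compromised releases above.
ADVISORIES = {
    ("xz-utils", "5.6.0"),
    ("xz-utils", "5.6.1"),
}

def find_vulnerable(dependencies):
    """Return the (name, version) pairs from the inventory that appear
    in the advisory list, sorted for stable output."""
    return sorted(dep for dep in dependencies if dep in ADVISORIES)

# A hypothetical inventory of a company's (or a vendor's) dependencies.
inventory = [
    ("openssl", "3.0.13"),
    ("xz-utils", "5.6.1"),
    ("zlib", "1.3.1"),
]

# Flags the compromised xz-utils release in the inventory.
print(find_vulnerable(inventory))
```

The point of the exercise is speed: when an advisory lands, a company (or its vendors) should be able to answer "do we run this?" in minutes, not days.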

Develop disaster recovery and incident response plans. Response plans are typically more comprehensive and more effective when they are created before an incident. Companies should consult a variety of subject matter experts, internally and externally, to evaluate the size and scope of their business activities, the amount and type of personal information they collect and store, the locations of their operations and applicable federal, state and sector-specific regulatory requirements.

Apply technology mitigation strategies. Limit access to critical software systems to only those employees and independent contractors who need access to such systems and monitor usage for any unusual behavior.

Review applicable policies and procedures often. Outside counsel can review policies and procedures to ensure compliance with the fast-changing regulatory landscape.

Review vendor contracts for liability protections. Scrutinize vendor contracts for appropriate risk shifting terms. Indemnification clauses may be appropriate and might include claims related to cybersecurity incidents, data breaches and regulatory liability. Representations and warranties might be used to represent that certain mitigation strategies will be deployed and compliance standards will be maintained. Insurance can be a helpful tool, and covenants to purchase adequate insurance might be considered. Further, it may be appropriate to carve out some cybersecurity incidents from a liability cap. In-house and outside counsel can review risk shifting terms in light of current market and legal trends.