Recently, a federal court issued the first ruling on the closely watched issue of fair use in copyright infringement cases involving AI. The court ruled in favor of the plaintiff on its direct infringement claim and held that the defendant’s use of the plaintiff’s material to train its AI model was not a fair use.

The Upshot

  • On February 11, 2025, the court in Thomson Reuters v. Ross Intelligence reconsidered its prior decision that the question of fair use needed to be decided by the jury and instead ruled on renewed summary judgment motions that defendant’s use was not fair use.
  • The case involved defendant’s alleged infringement of Thomson Reuters’ Westlaw headnotes. Ross licensed “Bulk Memos” from a third party to train Ross’s AI-based legal search engine. The “Bulk Memos” were created using Westlaw headnotes.
  • The court found that the headnotes were original and copyrightable, and granted summary judgment to Thomson Reuters on direct infringement for certain headnotes.
  • On Ross’s fair use defense, the court found that the use was commercial and not transformative. It also found that the use impacted both the legal research market and the market for data to train AI tools. Overall, the fair use analysis favored Thomson Reuters.
  • Courts are just starting to reach decisions in AI-based copyright cases. The fair use analysis provides guidance for how future courts will think about these issues.

The Bottom Line

This closely watched decision is significant as the first of its kind in the landscape of AI-related copyright litigation. While the infringement finding is fairly specific to the facts of the case, the fair use ruling will likely be influential for future courts’ analysis of this defense, particularly its discussion of the purpose and market impact of using copyrighted materials to train AI models. Ballard Spahr lawyers closely monitor this area of law to advise clients on issues of artificial intelligence and copyright infringement.

On February 11, 2025, Third Circuit Judge Stephanos Bibas, sitting by designation in the District of Delaware, issued a summary judgment decision in the closely watched copyright infringement dispute between Thomson Reuters and Ross Intelligence concerning Ross’s AI-based legal search engine. The court granted most of Thomson Reuters’ motion on direct copyright infringement, and held that Ross’s defenses, including fair use, failed as a matter of law. This case is significant as the first of its kind to address fair use in connection with artificial intelligence, though the court was careful to point out that this matter, unlike many others working their way through the court system, involved a non-generative AI system.

The underlying case concerns Ross’s AI-based legal search engine and Thomson Reuters’ claim that the use of Thomson Reuters’ Westlaw headnotes as training material for the AI tool constituted copyright infringement. Thomson Reuters’ Westlaw platform contains editorial content and annotations, like headnotes, that guide users to key points of law and case holdings. Ross, a competitor to Westlaw, made a legal research search engine based on artificial intelligence, and initially asked to license Westlaw content to train its product. When Thomson Reuters refused, Ross turned to a third party, LegalEase, which provided training data in the form of “Bulk Memos” consisting of legal questions and answers. The Bulk Memos were created using Westlaw headnotes.

Thomson Reuters brought claims of copyright infringement based on this use. In 2023, the court largely denied Thomson Reuters’ motions for summary judgment on copyright infringement and fair use, and held that those issues were properly decided by a jury. After reflection, the court “realized that [its] prior summary-judgment ruling had not gone far enough,” and invited the parties to renew their summary judgment briefing. This time, the court largely ruled in Thomson Reuters’ favor.

First, the court held that Thomson Reuters’ headnotes were sufficiently original to be copyrightable, even if they were based on the text of underlying court cases. The court found that “[i]dentifying which words matter and chiseling away the surrounding mass expresses the editor’s idea about what the important point of law from the opinion is,” and therefore has enough of a “creative spark” to overcome the low bar presented by the originality requirement. Similarly, Westlaw’s Key Number System was also sufficiently original, as Thomson Reuters had chosen a particular way to organize legal topics, even if it was not a novel one. The court then turned to actual copying and substantial similarity and granted summary judgment to Thomson Reuters on headnotes that “very closely track[ ] the language of the Bulk Memo question but not the language of the case opinion.” Other headnotes and the Key Number System were left for trial.

On fair use, the court granted summary judgment for Thomson Reuters, finding that Ross’s use was not fair. On the first fair use factor, the purpose and character of the use, the court found that Ross’s use was commercial and served the same purpose as Thomson Reuters’ product: a legal research tool. In the parlance of fair use law, Ross’s use was not “transformative.” The court also rejected Ross’s analogy to earlier computer-programming cases in which intermediate copying was necessary, and rejected Ross’s argument that the copying was permissible because the text of the headnotes was not reproduced in the final product.

The second and third factors (the nature of the material and how much was used) went to Ross, but the fourth factor, the likely effect on the market for the original work and “the single most important” of the four, went to Thomson Reuters. The court looked at both the current market for the original work and potential derivative markets, and found that Ross’s use impacted both the original market for legal research and the derivative market for data to train AI tools. The court found that it did “not matter whether Thomson Reuters has used the data to train its own legal search tools; the effect on a potential market for AI training data is enough.” Altogether, the four fair use factors favored Thomson Reuters, and it was granted summary judgment on fair use.

Looking beyond this opinion, it is the first decision to substantively address fair use in the context of artificial intelligence, so it will be an important guidepost for the multiple cases pending across the country, many of which involve companies that have used copyrighted works to train generative AI models. However, the opinion comes with an important caveat: “only non-generative AI” was at issue in the case. Generative AI models use their training data to create new text, image, video, or other outputs. Non-generative models, by contrast, analyze and classify data based on patterns learned from their training data. Cases involving generative AI may call for a different analysis of fair use factors such as transformativeness and the nature of the original works, but the opinion’s commentary on current and potential markets, as well as its willingness to weigh the four factors on summary judgment, may be highly applicable.

In short, this is an important decision but much remains unsettled in the law applying copyright to artificial intelligence. Ballard Spahr lawyers closely monitor developments concerning artificial intelligence and intellectual property, including copyright infringement and fair use. Our AI Legislation and Litigation Tracker provides a comprehensive view of AI-related legislative activities and important information about litigation matters with significant potential impact on clients.

On January 6, 2025, the U.S. Department of Health and Human Services (“HHS”) Office for Civil Rights (“OCR”) published a Notice of Proposed Rulemaking (“NPRM”) to amend the Health Insurance Portability and Accountability Act (“HIPAA”) Security Rule. The proposed changes, if enacted, would represent the first update to the HIPAA Security Rule since 2013.

The proposed updates, which apply to covered entities and business associates (collectively, “Regulated Entities”), aim to enhance cybersecurity measures within the healthcare sector, addressing the increasing frequency and sophistication of cyberattacks that threaten patient safety and the confidentiality of electronic protected health information (“ePHI”).

Below are some of the key proposals set forth in the NPRM:

  1. Strengthened Security Requirements: HHS proposes eliminating the current distinction between “required” and “addressable” provisions of the Security Rule, thereby requiring compliance with all implementation specifications. For example, with certain exceptions, ePHI would be required to be encrypted at rest and in transit. Regulated Entities would no longer be permitted to merely document a rationale for noncompliance with “addressable” implementation specifications. HHS also proposes new implementation specifications. As such, Regulated Entities would be required to adopt and strengthen security standards that keep pace with technological advancements and emerging threats, including by deploying anti-malware solutions, removing unnecessary software, disabling unused network ports, implementing multi-factor authentication for systems that handle ePHI, conducting vulnerability scans every six months, and performing annual penetration tests.
  2. Technology Asset Inventory and Network Map: Regulated Entities would be required to develop and maintain an inventory of their technology assets and create a network map illustrating the movement of ePHI within the Regulated Entities’ systems, which must be updated annually or when significant changes in the organizations’ operations or environment occur.
  3. Enhanced Risk Analyses: Regulated Entities would be required to include greater specificity when conducting a risk analysis, including, among other things:
    • “A review of the technology asset inventory and network map.
    • Identification of all reasonably anticipated threats to the confidentiality, integrity, and availability of ePHI.
    • Identification of potential vulnerabilities and predisposing conditions to the regulated entity’s relevant electronic information systems; [and]
    • An assessment of the risk level for each identified threat and vulnerability, based on the likelihood that each identified threat will exploit the identified vulnerabilities.”
      The written risk assessment would need to be reviewed, verified, and updated at least every 12 months, with evaluations conducted when there are changes in the environment or operations. A written risk management plan must be maintained and reviewed annually.
  4. Contingency and Incident Response Plans with Notification Procedures: Regulated Entities would be required to implement detailed plans for restoring systems within 72 hours, prioritize the restoration of critical systems, and establish and regularly test written security incident response plans. In addition, business associates and subcontractors would be required to notify covered entities within 24 hours of activating their contingency plans.
  5. Verification of Business Associates’ Safeguards: Business associates would be required to verify at least once every 12 months that they have deployed the technical safeguards required by the Security Rule to protect ePHI. The verification would consist of a written analysis of the business associate’s relevant electronic information systems by a subject matter expert and a written certification that the analysis has been performed and is accurate. Based on these written verifications, Regulated Entities would be required to conduct an assessment of the risks posed by new and existing business associate arrangements.

Along with the NPRM, OCR published a fact sheet that provides additional details on the proposed updates.

Public comments to the proposed rule are due on or before March 7, 2025, although it is possible that the change in Administrations later this month could affect the progress of this and other proposed rules. While HHS undertakes the rulemaking, the current Security Rule remains in effect.

The Dutch Data Protection Authority (the “Dutch DPA”) imposed a €4.75 million (approximately $5 million USD) fine on Netflix in connection with a data access investigation that began in 2019. The investigation arose out of a complaint filed by the nonprofit privacy and digital rights organization noyb, which is run by European privacy campaigner Max Schrems.

In a press release dated December 18, 2024, the Dutch DPA stated that Netflix “did not give customers sufficient information about what the company does with their personal data between 2018 and 2020.”  In particular, the Dutch DPA alleged Netflix’s privacy notice was not clear about the following:

  • the purposes of and the legal basis for collecting and using personal data;
  • which personal data are shared by Netflix with other parties, and why precisely this is done;
  • how long Netflix retains the data; and
  • how Netflix ensures that personal data remain safe when the company transmits them to countries outside Europe.

Furthermore, the Dutch DPA stated that customers did not receive sufficient information when they asked Netflix what data the company collects about them. According to the Dutch DPA, Netflix has since updated its privacy statement to improve the relevant disclosures.

Netflix has objected to the fine.

On December 3, 2024, the Consumer Financial Protection Bureau (CFPB) published its long-anticipated proposed rule aimed at regulating data brokers under the Fair Credit Reporting Act (FCRA). Although the CFPB’s future is uncertain under the incoming administration, the rule, if implemented, would significantly expand the reach of the FCRA.

In the accompanying press release, the CFPB stated that its “proposal would ensure data brokers comply with federal law and address critical threats from current data broker practices, including” national security and surveillance risks; criminal exploitation; and violence, stalking, and personal safety threats to law enforcement personnel and domestic violence survivors.  The CFPB expanded on these stated risks in a separate fact sheet.

To address these risks, the proposed rule would treat data brokers like credit bureaus and background check companies: Companies that sell data about income or financial tier, credit history, credit score, or debt payments would be considered consumer reporting agencies required to comply with the FCRA, regardless of how the information is used.  So, the rule would turn data brokers’ disclosure of such information into the communication of consumer reports subject to FCRA’s regulation.  The CFPB did not propose any express exceptions for use of credit header data for fraud prevention, identity verification, compliance with Bank Secrecy Act or Know-Your-Customer requirements, or law enforcement uses.    

If enacted, the proposed rule would significantly impact the data broker industry and restrict the information that data brokers can sell to third parties.  It would also likely increase compliance costs for all data brokers—regardless of the types of data in which they deal.  Unsurprisingly, as with other CFPB initiatives of late, industry reactions were immediate and clear.  For example, the Consumer Data Industry Association (CDIA) expressed concerns that the proposed rule could have “severe unintended consequences for public safety, law enforcement, and the consumer economy.”  Specifically, the CDIA noted that the proposed rule could make “it harder to identify and prevent fraudulent schemes” and that it “may become more difficult for police to identify and track fugitives or locate missing and exploited children.”  It therefore called “on the CFPB to engage in a more collaborative approach with industry stakeholders and lawmakers to address data privacy concerns without compromising the integrity and efficiency of the credit reporting system that has long been the envy of the world.”

In any event, the proposed rule has a 90-day comment period, meaning that the comment period alone will run until March 3, 2025. Based on the incoming Trump administration’s apparent position toward the CFPB and the FCRA, it seems unlikely that the rule will go into effect as proposed. But until anything becomes formal, companies that would be impacted by the proposed rule should still consider submitting comments to ensure that their interests are protected.

On December 5, 2024, the Colorado Department of Law adopted amended rules to the Colorado Privacy Act (CPA). 

The DOL had released the first set of the proposed amended rules—which relate to the interpretative guidance and opinion letter process, biometric identifier consent, and additional requirements for the personal data of minors—on September 13, 2024. The Attorney General discussed the proposed rules at the 2024 Annual Colorado Privacy Summit, sought and received comments from the public, and revised the rules. The adopted rules will now be sent to the Attorney General, who will issue a formal opinion. After that formal opinion is issued, the rules will be filed with the Secretary of State, and they will become effective 30 days after they are published in the state register.

On November 7, 2024, Michigan lawmakers in the Senate introduced the Reproductive Data Privacy Act (“RDPA”), also known as Senate Bill 1082 (SB 1082).  The bill aims to strengthen privacy protections for sensitive reproductive health data, including information on menstrual cycles, fertility, and contraception. 

The RDPA is largely modeled after Washington’s My Health, My Data Act, but it more narrowly applies to organizations that provide reproductive health-related products or services, such as diagnostic testing, fertility apps, or abortion care.  The bill regulates these organizations’ collection and processing of “reproductive health data,” which is defined to mean information that is linked or reasonably linkable to an individual and that identifies the individual’s past, present, or future reproductive health status.  The RDPA includes the following notable provisions:

  1. Consumer Control and Consent:  Entities must notify individuals and obtain explicit consent before collecting or processing their reproductive health data.  Additionally, consumers have the right to access and delete their reproductive health data and to revoke consent for its sharing or sale.
  2. Restrictions on Data Use and Disclosure:  Data sharing with third parties or government agencies is prohibited without a warrant, legal obligation, or the individual’s consent.  The bill bans geofencing practices around reproductive health service locations to prevent tracking or targeting individuals. 
  3. Data Minimization:  The RDPA mandates that information may only be collected for one of the following enumerated purposes:
    • To provide a product, service, or service feature to the individual to whom the reproductive health data pertains when that individual requested the product, service, or service feature by subscribing to, creating an account with, or otherwise contracting with the covered entity or service provider;
    • To initiate, manage, execute, or complete a financial or commercial transaction or to fulfill an order for a specific product or service requested by an individual to whom the reproductive health data pertains, including, but not limited to, associated routine administrative, operational, and account servicing activity such as billing, shipping, storage, and accounting;
    • To comply with an obligation under a law of Michigan or federal law; or
    • To protect public safety or public health.

      Entities are prohibited from retaining reproductive health data for longer than necessary to achieve these purposes.
  4. Enforcement and Penalties:  The Michigan Attorney General would oversee enforcement, and individuals could sue for damages ranging from $100 to $750 per violation.  Additional remedies like injunctions and declaratory relief are also included.

Supporters seek to pass the legislation before the year’s end, prior to President-elect Donald Trump assuming office.  The bill, however, must first pass through the Senate Committee on Housing and Human Services before it can be advanced to the Senate floor for potential amendment and vote.  If approved by the Senate, it would then be referred to the House for further consideration.

On November 12, 2024, the Consumer Financial Protection Bureau (CFPB) released a report examining the carve outs and limitations contained in comprehensive state privacy laws relating to financial institutions.  In an accompanying press release, the CFPB stated that in its assessment, “privacy protections for financial information now lag behind safeguards in other sectors of the economy.”

As the CFPB’s report notes, eighteen states had passed comprehensive privacy laws (nineteen, counting Florida, whose law has unusually high applicability thresholds).  However, all of these state privacy laws contain some level of carve-outs or limitations for financial institutions.  Some state laws have a full entity-level exemption, under which financial institutions regulated by the Gramm-Leach-Bliley Act (GLBA) are entirely exempt from the scope of the law.  Under other laws, non-public personal information (NPI) regulated by the GLBA is exempted from the scope of the state privacy law.  Additionally, state privacy laws also contain exemptions for information regulated by the Fair Credit Reporting Act (FCRA).  Accordingly, financial information processed by financial institutions is, in large part, exempted from state privacy laws.

The CFPB report goes on to describe that the federal laws regulating financial information do not contain the same consumer privacy rights that are contained in state privacy laws—rights such as the right to know what data businesses have about them, to correct inaccurate information, or to request that the business delete the information it holds about them.

Importantly, the report’s conclusion is that state policymakers should assess gaps in existing state privacy laws and consider whether their consumers are adequately protected under those laws.  Seen in the context of the recent election, this advice is not surprising.  Indeed, recent CFPB initiatives like the Open Banking Rule—which would afford consumers rights similar to those offered under state privacy laws—could be halted by the new administration through the Congressional Review Act or enjoined by ongoing litigation.  It is therefore expected that the current CFPB leadership would look for ways to secure its achievements through other avenues.

What is notable, however, is how this change would reshape the scope of state privacy laws.  To date, the discussion of financial institution exemptions has centered on entity-level versus data-level exemptions.  No state has adopted a comprehensive privacy law that fully covers NPI already regulated by the GLBA.  But, with the report, the CFPB now argues that the GLBA’s general preemption provision would not prohibit such application.  If a state takes the CFPB up on its suggestion, it would mark a radical shift in privacy law—and operational changes—in the financial world.

On November 14, 2024, the California Privacy Protection Agency (“CPPA”), which is tasked with enforcing the California Consumer Privacy Act (the “CCPA”), announced it settled with two data brokers, Growbots, Inc. and UpLead LLC, for failing to register and pay required fees under Senate Bill 362, also known as the Delete Act. The companies will each pay fines—$35,400 for Growbots and $34,400 for UpLead—and agree to cover the CPPA’s legal costs for violations that occurred between February and July 2024.

The Delete Act, signed into law in 2023, mandates that data brokers register with the CPPA and pay an annual fee to fund the California Data Broker Registry.  The Delete Act imposes fines of $200 per day for failing to register by the deadline.  The registration fees are used to fund efforts like the development of the Data Broker Requests and Opt-Out Platform (“DROP”), which is a first-of-its-kind deletion mechanism that will allow consumers to request data deletion from all brokers with a single action. The CPPA expects that DROP will be available to consumers in 2026 via the CPPA website.

These recent settlements, in addition to newly adopted regulations by the CPPA (which further clarify data broker registration requirements under the Delete Act and require data brokers to disclose specific information about their exempt data collection practices), highlight the CPPA’s continued focus on the privacy practices of data brokers.

On October 22, 2024, the Consumer Financial Protection Bureau (“CFPB”) issued its final rule implementing Section 1033 of the Dodd-Frank Act (the “Final Rule” or the “Open Banking Rule”), granting consumers greater access rights to the data their financial institutions hold.  Although there are some differences, the Final Rule largely tracks the Proposed Rule announced by the CFPB on October 19, 2023, with the largest concession coming in the form of extended compliance dates.

The Final Rule was immediately met with criticism from industry groups: the Bank Policy Institute and the Kentucky Bankers Association filed a lawsuit in the U.S. District Court for the Eastern District of Kentucky on the day the Final Rule was issued, seeking injunctive relief and alleging that the CFPB exceeded its statutory authority.

Scope of the Final Rule

The Final Rule applies to data providers, third parties, and data aggregators.  “Data provider” is defined to mean a financial institution under Regulation E, a card issuer under Regulation Z, or any other person that controls or possesses information concerning a covered consumer financial product or service that the consumer obtained from that person.  Digital wallet providers are specifically listed as an example.  While some commenters pushed the CFPB to expand the scope of data providers, it declined to do so at this time, although it did explain that it intends to do so in the future.

“Third parties” are defined to mean any person or entity that is not the consumer to whom the covered data pertains or the data provider that controls or possesses that data.  To become an “authorized third party,” an entity must comply with the authorization procedures outlined in the Final Rule.  The Final Rule also has additional requirements for “data aggregators,” which are defined to mean persons that are retained by, and provide services to, authorized third parties to enable access to covered data.

The Final Rule defines covered data to mean transaction information, account balance information, information to initiate payment to or from a Regulation E account, terms and conditions, upcoming bill information, and basic account verification information.  The Final Rule includes examples for some, but not all, of those categories, and it does not contain any express exclusions for de-identified or anonymized data.

Substance of Final Rule

The Final Rule requires data providers to provide a right of access to authenticated consumers and authenticated third parties (including data aggregators acting on behalf of an authorized third party) to the most recently updated covered data.  Access must be provided in an electronic format that is transferable to consumers and third parties and usable in a separate system (known as portability under privacy laws), and data providers cannot impose any fee or charge on consumers or third parties.  The CFPB has stated that the purpose of this requirement is to encourage competition, while critics have stated that it will allow third parties to profit from consumer data at the expense of banks and other data providers.

Data providers must also establish and maintain two interfaces—one for consumers, and one for developers.  The developer interface is defined to mean the interface through which a data provider receives requests for covered data and makes covered data available to authorized third parties, and it must satisfy several requirements relating to format, performance, and security.  Adhering to a qualified industry standard (one issued by a recognized industry standard-setting body) would constitute indicia of compliance and would provide a safe harbor in some instances.  The CFPB’s rule outlining the qualifications to become a recognized industry standard-setting body was finalized in June.

Data providers will also need to make certain information publicly available in both human- and machine-readable formats, a requirement that goes well beyond the standard annual privacy policy update.  Additionally, data providers will need to maintain written policies and procedures relating to data availability and accuracy, as well as data retention and access requests.

With respect to third parties, the Final Rule contains a three-part authorization procedure to become an authorized third party: providing the consumer with an authorization disclosure, certifying that the third party agrees to specific obligations, and obtaining the consumer’s express informed consent.  The Final Rule allows data aggregators to perform the third party authorization, subject to specific requirements.

The Final Rule also imposes limitations on third parties’ secondary uses of consumer data, explicitly prohibiting the use of consumer data for targeted advertising, cross-selling of other products or services, and the sale of data.  Many commenters requested greater clarity on the secondary use limitations, especially on how to distinguish primary from secondary uses, and sought carve-outs for de-identified data.  The Final Rule did not specifically address de-identified data or how data may be used to train artificial intelligence or algorithms, but it did explicitly allow for the use of covered data for “uses that are reasonably necessary to improve the product or service the consumer requested.”

It is also worth noting that the Final Rule carried through numerous other specific requirements relating to data security, data retention, consent revocation, reauthorization, and written policies and procedures.

Compliance Timelines

In perhaps the biggest change from the Proposed Rule, the CFPB extended the earliest compliance timeline.  Under the Proposed Rule, the largest depository institutions would have had to comply within six months after publication, while the smallest institutions would have had four years to comply.

Under the Final Rule, the largest depository institutions—defined to mean those that hold at least $250 billion in total assets—will have until April 1, 2026 to comply.  While this extended compliance date is obviously welcome news, the threshold for a company to fall within the category of the largest depository group was previously set at $500 billion in total assets under the Proposed Rule, which means more institutions will now be subject to the new initial deadline set forth in the Final Rule.

Depository institutions with between $250 billion and $10 billion in total assets will have until April 1, 2027; those with between $10 billion and $3 billion have until April 1, 2028; those with between $3 billion and $1.5 billion have until April 1, 2029; those with between $1.5 billion and $850 million have until April 1, 2030; and those with less than $850 million in total assets are exempt from the Final Rule entirely.

Reception and Criticisms

On the same day that the CFPB issued the Final Rule, the Bank Policy Institute filed a lawsuit in federal court challenging aspects of the CFPB’s rulemaking under Section 1033 of the Dodd-Frank Act.  The complaint asks the court to set aside the Final Rule in its entirety pursuant to the Administrative Procedure Act, and to enter an order permanently enjoining the CFPB from enforcing the Final Rule. 

Other industry groups have been similarly critical of the Final Rule.  In particular, many organizations and groups in the banking industry have voiced the following criticisms in response to the Final Rule:

  • under the Final Rule, third parties are able to profit, at no cost, from a system built and maintained by banks, and that banks are not able to exercise control over customer data once it is transferred to third parties;
  • the CFPB was mistaken in not affirmatively and explicitly sunsetting in the Final Rule the practice of “screen scraping,” a method whereby third parties or data aggregators collect data from a website or application by using consumer credentials to log into consumer accounts; and
  • the new compliance deadlines in the Final Rule, while extended, will still be difficult for organizations to meet, given that qualified industry standards have yet to be issued by any recognized industry standard-setting body.

*          *          *

Compliance with the Final Rule will be a long and arduous process for data providers, third parties, and aggregators alike, requiring updates to technical processes and legal procedures. Indeed, for some companies, the Final Rule will require not just updates to account for its specific requirements, but also a more comprehensive overhaul of their underlying security procedures to align with the security standards set forth in the federal Gramm-Leach-Bliley Act.  Companies would be wise to start assessing the impact of the Final Rule on their operations now, even if implementation of some of the technical updates will need to wait until standard-setting bodies are formed.

In a recent decision from the Southern District of Florida, U.S. District Judge Robert N. Scola, Jr. denied class certification of a proposed class of paid Univision NOW subscribers who assert that Univision NOW’s use of the Meta Pixel violates the Video Privacy Protection Act (VPPA). The three proposed class representatives allege D2C, LLC, doing business as Univision NOW, violated the VPPA by disclosing their personal viewing information using pixel software from Meta Platforms. The plaintiffs claim Univision NOW disclosed information linking them to specific videos that they watched. The plaintiffs sought class certification of Univision NOW subscribers whose viewing information was allegedly disclosed to Meta between April 2021 and May 2023. Judge Scola denied class certification, finding the plaintiffs failed to meet the numerosity requirement for class certification.

What is the VPPA?

In 1988, Congress passed the VPPA in response to concerns about consumer privacy in the age of video rentals. The legislation was spurred by the disclosure of Judge Robert Bork’s video rental history, which emerged during his Supreme Court confirmation process. The VPPA precludes video tape service providers from disclosing a consumer’s personally identifiable information (PII) together with their video viewing history, and it provides for actual or liquidated damages of $2,500 per violation. In recent years, there has been an increase in privacy class actions under the VPPA against website owners with video functionality on their websites.

The Univision Now Case

Judge Scola’s decision hinged primarily on the issue of numerosity—one of the four key requirements for class certification under Federal Rule of Civil Procedure 23(a). To satisfy the numerosity requirement, the plaintiffs were required to show that the number of individuals affected by the purported VPPA violation was large enough that it would be impracticable to bring each case individually. The plaintiffs initially argued that Univision NOW automatically disclosed the viewing information of its 35,845 subscribers, but acknowledged there were several impediments to Univision NOW’s transmission of information to Meta.

The court explained that the plaintiffs’ theory of automatic data transmission was undercut by their own concessions and Univision NOW’s expert testimony, which suggested several conditions must be met for the Pixel to actually transmit PII. Specifically, the court found that in addition to viewing or selecting a prerecorded video through Univision NOW’s website, a subscriber also must have (1) had a Facebook account at the time the video was selected; (2) used a web browser that did not block the Pixel by default; (3) been simultaneously logged into the subscriber’s own Facebook account while selecting the video; (4) been simultaneously logged into Facebook on the same device that the subscriber used to select the video; (5) been simultaneously logged into Facebook using the same browser through which the subscriber selected the video; and (6) not deployed any browser settings or add-on software that would have blocked the Pixel.  Crucially, while the court found that the class was ascertainable, it also found that class certification was not warranted because the plaintiffs failed to carry their burden of showing that Univision NOW disclosed the personally identifiable information and video viewing records of even a single subscriber—including the three named plaintiffs.

Although the plaintiffs attempted to salvage class certification by reducing the proposed class to approximately 17,000 individuals, based on estimates of how many subscribers use Facebook and certain popular web browsers, Judge Scola ruled that these estimates were too speculative.  Without a reliable means of determining class size, the court found the plaintiffs failed to meet the numerosity requirement.

Conclusion

Judge Scola’s decision to deny class certification in this case is a significant victory for Univision NOW. While the plaintiffs can still pursue individual claims, their failure to secure class certification limits the scope and potential impact of their lawsuit.

For companies that provide video content and use tracking technologies like pixels, this decision reinforces the need to closely monitor their data-sharing practices and ensure compliance with privacy laws.