On December 5, 2024, the Colorado Department of Law adopted amended rules to the Colorado Privacy Act (CPA). 

The DOL had released the first set of the proposed amended rules—which relate to the interpretative guidance and opinion letter process, biometric identifier consent, and additional requirements for the personal data of minors—on September 13, 2024. The Attorney General discussed the proposed rules at the 2024 Annual Colorado Privacy Summit, sought and received comments from the public, and revised the rules. The adopted rules will now be sent to the Attorney General, who will issue a formal opinion. After that formal opinion is issued, the rules will be filed with the Secretary of State, and they will become effective 30 days after they are published in the state register.

On November 7, 2024, lawmakers in the Michigan Senate introduced the Reproductive Data Privacy Act (“RDPA”), also known as Senate Bill 1082 (SB 1082).  The bill aims to strengthen privacy protections for sensitive reproductive health data, including information on menstrual cycles, fertility, and contraception.

The RDPA is largely modeled after Washington’s My Health, My Data Act, but it more narrowly applies to organizations that provide reproductive health-related products or services, such as diagnostic testing, fertility apps, or abortion care.  The bill regulates these organizations’ collection and processing of “reproductive health data,” which is defined to mean information that is linked or reasonably linkable to an individual and that identifies the individual’s past, present, or future reproductive health status.  The RDPA includes the following notable provisions:

  1. Consumer Control and Consent:  Entities must notify individuals and obtain explicit consent before collecting or processing their reproductive health data.  Additionally, consumers have the right to access, delete, and revoke consent for the sharing or selling of their reproductive health data.
  2. Restrictions on Data Use and Disclosure:  Data sharing with third parties or government agencies is prohibited without a warrant, legal obligation, or the individual’s consent.  The bill bans geofencing practices around reproductive health service locations to prevent tracking or targeting individuals. 
  3. Data Minimization:  The RDPA mandates that information may only be collected for one of the following enumerated purposes:
    • To provide a product, service, or service feature to the individual to whom the reproductive health data pertains when that individual requested the product, service, or service feature by subscribing to, creating an account with, or otherwise contracting with the covered entity or service provider;
    • To initiate, manage, execute, or complete a financial or commercial transaction or to fulfill an order for a specific product or service requested by an individual to whom the reproductive health data pertains, including, but not limited to, associated routine administrative, operational, and account servicing activity such as billing, shipping, storage, and accounting;
    • To comply with an obligation under a law of Michigan or federal law; or
    • To protect public safety or public health.

      Entities are prohibited from retaining reproductive health data for longer than necessary to achieve these purposes.
  4. Enforcement and Penalties:  The Michigan Attorney General would oversee enforcement, and individuals could sue for damages ranging from $100 to $750 per violation.  Additional remedies like injunctions and declaratory relief are also included.

Supporters seek to pass the legislation before the year’s end, prior to President-elect Donald Trump assuming office.  The bill, however, must first pass through the Senate Committee on Housing and Human Services before it can be advanced to the Senate floor for potential amendment and vote.  If approved by the Senate, it would then be referred to the House for further consideration.

On November 12, 2024, the Consumer Financial Protection Bureau (CFPB) released a report examining the carve outs and limitations contained in comprehensive state privacy laws relating to financial institutions.  In an accompanying press release, the CFPB stated that in its assessment, “privacy protections for financial information now lag behind safeguards in other sectors of the economy.”

As the CFPB’s report notes, eighteen states had passed comprehensive privacy laws (nineteen, counting Florida, whose law applies only above unusually high revenue thresholds).  However, all of these state privacy laws have some level of carve outs or limitations for financial institutions.  Some state laws have a full entity-level exemption, where financial institutions regulated by the Gramm-Leach-Bliley Act (GLBA) are entirely exempt from the scope of the law.  Under other laws, non-public personal information (NPI) regulated by the GLBA is exempted from the scope of the state privacy law.  State privacy laws also contain exemptions for information regulated by the Fair Credit Reporting Act (FCRA).  Accordingly, financial information processed by financial institutions is, in large part, exempted from state privacy laws.

The CFPB report goes on to describe that the federal laws regulating financial information do not contain the same consumer privacy rights that are contained in state privacy laws—rights such as the right to know what data businesses have about them, to correct inaccurate information, or to request the business delete the information about them. 

Importantly, the report’s conclusion is that state policymakers should assess gaps in existing state privacy laws, and that they should consider whether their consumers are adequately protected under their state laws.  Seen in the context of the recent election, this advice is not surprising.  Indeed, recent CFPB initiatives like the Open Banking Rule—which would afford consumers rights similar to those offered under state privacy laws—could be halted by the new administration through the Congressional Review Act or enjoined by ongoing litigation.  It is therefore expected that the current CFPB leadership would look for ways to secure its achievements through other avenues.

What is notable, however, is how this change would reshape the scope of state privacy laws.  To date, the discussion of financial institution exemptions has centered on entity-level versus data-level carve outs.  No states have adopted comprehensive privacy laws that fully cover NPI that is already regulated by the GLBA.  But, with the report, the CFPB now argues that the GLBA’s general preemption provision would not prohibit such application.  If a state takes the CFPB up on its request, it would mark a radical shift in privacy law—and operational changes—in the financial world.

On November 14, 2024, the California Privacy Protection Agency (“CPPA”), which is tasked with enforcing the California Consumer Privacy Act (the “CCPA”), announced it settled with two data brokers, Growbots, Inc. and UpLead LLC, for failing to register and pay required fees under Senate Bill 362, also known as the Delete Act. The companies will each pay fines—$35,400 for Growbots and $34,400 for UpLead—and agree to cover the CPPA’s legal costs for violations that occurred between February and July 2024.

The Delete Act, signed into law in 2023, mandates that data brokers register with the CPPA and pay an annual fee to fund the California Data Broker Registry.  The Delete Act imposes fines of $200 per day for failing to register by the deadline.  The registration fees are used to fund efforts like the development of the Data Broker Requests and Opt-Out Platform (“DROP”), which is a first-of-its-kind deletion mechanism that will allow consumers to request data deletion from all brokers with a single action. The CPPA expects that DROP will be available to consumers in 2026 via the CPPA website.
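The fine arithmetic under the Delete Act is straightforward, and the announced settlement amounts back-solve to the length of each broker's registration lapse. A minimal sketch in Python (the $200-per-day figure comes from the Delete Act as described above; the implied day counts are an inference, not a stated term of the settlements):

```python
# Delete Act fines accrue at $200 per day of missed registration.
DAILY_FINE = 200  # dollars per day past the registration deadline

def delete_act_fine(days_unregistered: int) -> int:
    """Total fine for a data broker that missed the registration deadline."""
    return DAILY_FINE * days_unregistered

# Back-solving the implied lapse from each announced settlement amount:
growbots_days = 35_400 // DAILY_FINE   # 177 days
uplead_days = 34_400 // DAILY_FINE     # 172 days
print(growbots_days, uplead_days)      # 177 172
```

At that rate, even a few months of missed registration accumulates a five-figure fine, which is consistent with the amounts the CPPA announced here.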

These recent settlements, in addition to newly adopted regulations by the CPPA (which further clarify data broker registration requirements under the Delete Act and require data brokers to disclose specific information about their exempt data collection practices) highlight the CPPA’s continued focus on the privacy practices of data brokers.

On October 22, 2024, the Consumer Financial Protection Bureau (“CFPB”) issued its final rule implementing Section 1033 of the Dodd-Frank Act (the “Final Rule” or the “Open Banking Rule”), granting consumers greater access rights to the data their financial institutions hold.  Although there are some differences, the Final Rule largely tracks the Proposed Rule announced by the CFPB last year on October 19, 2023, with the largest concession coming in the form of the extended compliance timeline.

The Final Rule was immediately met with criticism from industry groups, with the Bank Policy Institute and Kentucky Bankers Association filing a lawsuit in the U.S. District Court for the Eastern District of Kentucky on the day the Final Rule was issued, seeking injunctive relief and alleging that the CFPB exceeded its statutory authority.

Scope of the Final Rule

The Final Rule applies to data providers, third parties, and data aggregators.  “Data provider” is defined to mean a financial institution under Regulation E, a card issuer under Regulation Z, or any other person that controls or possesses information concerning a covered consumer financial product or service that the consumer obtained from that person.  Digital wallet providers are specifically listed as an example.  While some commenters pushed the CFPB to expand the scope of data providers, it declined to do so at this time, although it did explain that it intends to do so in the future.

“Third parties” are defined to mean any person or entity that is not the consumer to whom the covered data pertains or the data provider that controls or possesses that data.  To become an “authorized third party,” entities must comply with the authorization procedures outlined in the Final Rule.  The Final Rule also has additional requirements for “data aggregators,” which are defined to mean a person that is retained by, and provides services to, authorized third parties to enable access to covered data.

The Final Rule defines covered data to mean transaction information, account balance information, information to initiate payment to or from a Regulation E account, terms and conditions, upcoming bill information, and basic account verification information.  The Final Rule includes examples for some, but not all, of those categories, and it does not contain any express exclusions for de-identified or anonymized data.

Substance of Final Rule

The Final Rule requires data providers to provide a right of access to authenticated consumers and authenticated third parties (including data aggregators acting on behalf of an authorized third party) to the most recently updated covered data.  Access must be in an electronic format that is transferrable to consumers and third parties and usable in a separate system (known as portability under privacy laws), and data providers cannot impose any fee or charge on consumers or third parties.  The CFPB has stated that the purpose of this requirement is to encourage competition, while critics have stated that it will allow third parties to profit from consumer data at the expense of banks and other data providers.

Data providers must also establish and maintain two interfaces—one for consumers, and one for developers.  The developer interface is defined to mean the interface through which a data provider receives requests for covered data and makes available covered data to authorized third parties, and it would need to satisfy several requirements relating to format, performance, and security.  Adhering to a qualified industry standard would constitute an indicia of compliance that would provide a safe harbor in some instances.  The CFPB’s rule outlining the qualifications to become a recognized industry standard-setting body, which can issue such standards, was finalized in June.

Data providers will also need to make certain information publicly available in both human- and machine-readable formats, disclosures that go well beyond the standard annual privacy policy updates.  Additionally, data providers will need to maintain written policies and procedures relating to data availability and accuracy, as well as data retention and access requests.

With respect to third parties, the Final Rule contains a three-part authorization procedure to become an authorized third party: providing the consumer with an authorization disclosure, certifying that the third party agrees to specific obligations, and obtaining the consumer’s express informed consent.  The Final Rule allows data aggregators to perform the third party authorization, subject to specific requirements.

The Final Rule also imposes limitations on a third party’s secondary uses of consumer data, explicitly prohibiting the use of consumer data for targeted advertising, the cross-selling of other products or services, and the sale of data.  Many commentators requested greater clarity on the secondary use limitations, especially on how to determine primary versus secondary uses, and sought carve outs for de-identified data.  The Final Rule did not specifically address de-identified data or how data may be used to train artificial intelligence or algorithms, but it did explicitly allow for the use of covered data for “uses that are reasonably necessary to improve the product or service the consumer requested.”

It is also worth noting that the Final Rule carried through numerous other specific requirements relating to data security, data retention, consent revocation, reauthorization, and written policies and procedures.

Compliance Timelines

In perhaps the biggest change from the Proposed Rule, the CFPB extended the earliest compliance timeline.  Under the Proposed Rule, the largest depository institutions would have had to comply within six months after publication, while the smallest institutions would have had four years to comply.

Under the Final Rule, the largest depository institutions—defined to mean those that hold at least $250 billion in total assets—will have until April 1, 2026 to comply.  While this extended compliance date is obviously welcome news, the threshold for a company to fall within the category of the largest depository group was previously set at $500 billion in total assets under the Proposed Rule, which means more institutions will now be subject to the new initial deadline set forth in the Final Rule.

Depository institutions with total assets between $250 billion and $10 billion will have until April 1, 2027; those with total assets between $10 billion and $3 billion have until April 1, 2028; those between $3 billion and $1.5 billion have until April 1, 2029; those between $1.5 billion and $850 million have until April 1, 2030; and those with less than $850 million in total assets are exempt from the Final Rule entirely.
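The tiered deadlines can be expressed as a simple lookup. A sketch in Python using the asset floors and dates above; treating each floor as inclusive is an assumption for illustration, and the exact boundary treatment should be confirmed against the rule text:

```python
from typing import Optional

# (minimum total assets in dollars, compliance deadline) per the Final Rule's tiers.
TIERS = [
    (250_000_000_000, "April 1, 2026"),
    (10_000_000_000, "April 1, 2027"),
    (3_000_000_000, "April 1, 2028"),
    (1_500_000_000, "April 1, 2029"),
    (850_000_000, "April 1, 2030"),
]

def compliance_deadline(total_assets: int) -> Optional[str]:
    """Return the deadline for a depository institution, or None if exempt."""
    for floor, deadline in TIERS:
        if total_assets >= floor:  # assumed inclusive at each floor
            return deadline
    return None  # under $850 million: exempt from the Final Rule

print(compliance_deadline(300_000_000_000))  # April 1, 2026
print(compliance_deadline(5_000_000_000))    # April 1, 2028
print(compliance_deadline(500_000_000))      # None (exempt)
```

Walking the tiers from the largest floor down means each institution picks up the earliest deadline it qualifies for, mirroring how the rule phases in by size.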

Reception and Criticisms

On the same day that the CFPB issued the Final Rule, the Bank Policy Institute filed a lawsuit in federal court challenging aspects of the CFPB’s rulemaking under Section 1033 of the Dodd-Frank Act.  The complaint asks the court to set aside the Final Rule in its entirety pursuant to the Administrative Procedure Act, and to enter an order permanently enjoining the CFPB from enforcing the Final Rule. 

Other industry groups have been similarly critical of the Final Rule.  In particular, many organizations and groups in the banking industry have voiced the following criticisms in response to the Final Rule:

  • under the Final Rule, third parties are able to profit, at no cost, from a system built and maintained by banks, and banks are not able to exercise control over customer data once it is transferred to third parties;
  • the CFPB was mistaken in not affirmatively and explicitly sunsetting the practice of “screen scraping” in the Final Rule, a method whereby third parties or data aggregators collect data from a website or application by using consumer credentials to log into consumer accounts; and
  • the new compliance deadlines in the Final Rule, while extended, will still be difficult for organizations to meet given that qualified industry standards have yet to be set by any recognized industry standard-setting body.

*          *          *

Compliance with the Final Rule will be a long and arduous process for data providers, third parties, and aggregators alike, requiring an update to technical processes and legal procedures. Indeed, for some companies, the Final Rule will require not just updates to account for the specific requirements set forth in the Final Rule, but also a more comprehensive overhaul to their underlying security procedures to align with the security standard set forth in the federal Gramm-Leach-Bliley Act.  Companies would be wise to start assessing the impact of the Final Rule on their operations now, even if implementation of some of the technical updates will need to be delayed until standard setting bodies are formed.

In a recent decision from the Southern District of Florida, U.S. District Judge Robert N. Scola, Jr. denied class certification of a proposed class of paid Univision NOW subscribers who assert that Univision NOW’s use of the Meta Pixel violates the Video Privacy Protection Act (VPPA). The three proposed class representatives allege D2C, LLC, doing business as Univision NOW, violated the VPPA by disclosing their personal viewing information using pixel software from Meta Platforms. The plaintiffs claim Univision NOW disclosed information linking them to specific videos that they watched. The plaintiffs sought class certification of Univision NOW subscribers whose viewing information was allegedly disclosed to Meta between April 2021 and May 2023. Judge Scola denied class certification, finding the plaintiffs failed to meet the numerosity requirement for class certification.

What is the VPPA?

In 1988, Congress passed the VPPA in response to concerns about consumer privacy in the age of video rentals. This legislation was spurred by concerns over disclosure of Judge Bork’s video rental history, which emerged during his SCOTUS confirmation hearing. The VPPA precludes videotape service providers from disclosing a consumer’s personal identifying information (PII), together with their video viewing history, and it provides for actual or liquidated damages of $2,500 per violation of the law. In recent years, there has been an increase in new privacy class actions under the VPPA against website owners with video functionality on their websites.

The Univision Now Case

Judge Scola’s decision hinged primarily on the issue of numerosity—one of the four key requirements for class certification under Federal Rule of Civil Procedure 23(a). To satisfy the numerosity requirement, plaintiffs were required to show that the number of individuals affected by the purported VPPA violation was large enough that it would be impractical to bring each case individually. The plaintiffs initially argued that Univision NOW automatically disclosed the viewing information of its 35,845 subscribers, but acknowledged there were several impediments to Univision NOW’s transmission of information to Meta.

The court explained that the plaintiffs’ theory of automatic data transmission was undercut by their own concessions and Univision NOW’s expert testimony, which suggested several conditions must be met for the Pixel to actually transmit PII. Specifically, the court found that in addition to viewing or selecting a prerecorded video through Univision NOW’s website, a subscriber also must have (1) had a Facebook account at the time the video was selected; (2) used a web browser that did not block the Pixel by default; (3) been simultaneously logged into the subscriber’s own Facebook account while selecting the video; (4) been simultaneously logged into Facebook on the same device that the subscriber used to select the video; (5) been simultaneously logged into Facebook using the same browser through which the subscriber selected the video; and (6) not deployed any number of browser settings or add-on software that would have blocked the Pixel.  Crucially, while the court found that the class was ascertainable, it also found that class certification was not warranted because the plaintiffs failed to carry their burden to show that Univision NOW disclosed the personally identifiable information of and record of videos viewed by even a single subscriber—including that of the three named plaintiffs.

Although the plaintiffs attempted to save prospects of class certification by reducing the potential class to approximately 17,000 individuals based on estimates of individuals who use Facebook and individuals who use certain popular web browsers, Judge Scola ruled that these estimates were too speculative.  Without a means to determine class size, the court found the plaintiffs failed to meet the numerosity requirement.

Conclusion

Judge Scola’s decision to deny class certification in this case is a significant victory for Univision NOW. While the plaintiffs can still pursue individual claims, their failure to secure class certification limits the scope and potential impact of their lawsuit.

For companies that provide video content and use tracking technologies like pixels, this decision reinforces the need to closely monitor their data-sharing practices and ensure compliance with privacy laws.

As part of a new enforcement initiative called “Operation AI Comply,” the FTC recently announced that it has brought the following five enforcement actions against businesses that use or sell AI tools in a manner that the FTC has alleged is deceptive and unfair:

  1. DoNotPay. The FTC brought suit against DoNotPay, which had claimed to be “the world’s first robot lawyer.” The company advertised its AI service as capable of allowing consumers to sue without a lawyer and generate valid legal documents quickly, aiming to replace the legal industry with AI. However, the FTC’s complaint states that DoNotPay did not test its chatbot’s effectiveness against human lawyers and lacked any retained attorneys.

    Additionally, DoNotPay offered a service claiming to check small business websites for legal violations based on just an email address, which was also found ineffective.

    As part of a proposed settlement, DoNotPay will pay $193,000 and must inform consumers about the limitations of its service. The settlement will also prevent the company from making unsupported claims about its ability to replace professional services.

    The FTC’s decision to pursue this action was unanimous, and the settlement is open for public comment before finalization.
  2. Ascend Ecom. The FTC filed suit against Ascend Ecom, an online business opportunity scheme. The FTC alleged Ascend Ecom misled consumers with false claims about its AI-powered tools that supposedly enable quick earnings through online storefronts.

    The complaint alleges that Ascend charged consumers tens of thousands of dollars to start online stores on platforms like Amazon and Etsy, while also requiring significant investments in inventory. Despite promises of generating substantial monthly income within two years, most consumers experienced financial losses instead, accumulating debt and negative bank balances.

    Additionally, the scheme is accused of pressuring consumers to alter or remove negative reviews and failing to honor a “guaranteed buyback” policy, even threatening to withhold it from dissatisfied customers. A federal court has temporarily halted the scheme and placed it under a receiver’s control while the case proceeds in court.

    The Commission vote authorizing the staff to file the complaint was 5-0. The complaint was filed in the U.S. District Court for the Central District of California.
  3. Ecommerce Empire Builders. The FTC charged Ecommerce Empire Builders (EEB) with misleading consumers about building an “AI-powered Ecommerce Empire” through costly training programs and “done for you” storefronts.  The FTC alleges that the scheme promised participants the potential to earn millions, but that these profits rarely materialized.

    According to the FTC’s complaint, consumers paid as much as $35,000 for storefronts, only to find they generated little to no income. EEB’s marketing claimed clients could make $10,000 monthly without evidence to support such claims. Many consumers reported difficulty obtaining refunds, as EEB either denied requests or offered partial refunds.

    A federal court has temporarily halted EEB’s operations and placed it under a receiver’s control while the case continues.  The FTC’s complaint was unanimously approved by the Commission and filed in the U.S. District Court for the Eastern District of Pennsylvania.
  4. Rytr. The FTC charged Rytr, an AI writing assistant service, with generating false consumer reviews and testimonials. The FTC alleges that since April 2021, Rytr allowed paid subscribers to produce unlimited detailed reviews based on minimal input, often resulting in misleading content that could deceive potential buyers. The FTC’s complaint further alleges that many subscribers created hundreds or even thousands of potentially false reviews.

    To settle the complaint, a proposed order would prevent Rytr from advertising or selling any service related to generating consumer reviews or testimonials. The Commission’s vote to file the complaint was 3-2, with two commissioners dissenting.

    Commissioner Andrew Ferguson issued a dissenting statement joined by Commissioner Melissa Holyoak. The dissent argues that the case against Rytr overextends the FTC’s enforcement powers by punishing Rytr for providing a generative AI tool that businesses could use legitimately, based on the mere possibility that the tool could be used for fraudulent or deceptive purposes, an approach the Commissioners argue could stifle innovation.
  5. FBA Machine. In June 2024, the FTC took action against a business opportunity scheme that the FTC alleges falsely promised consumers guaranteed income through online storefronts using AI software. The scheme, known as Passive Scaling and later rebranded as FBA Machine, allegedly defrauded consumers of over $15.9 million with deceptive earnings claims.

    Following the FTC’s complaint, a federal court temporarily halted the scheme and appointed a receiver. The case is ongoing, and the complaint was filed in the U.S. District Court for the District of New Jersey, with the Commission voting 5-0 to authorize the action.

The five enforcement actions expand upon earlier proceedings the FTC has brought against businesses using AI tools, including claims against companies that used AI tools to offer services for creating online storefronts, enrolling consumers in career training, sending anonymous messages, performing facial recognition in retail stores, and conducting DNA testing.  Collectively, these enforcement actions signal that the FTC continues to make the use of AI tools by businesses, and its impact on consumers, a top enforcement priority.

On August 5, 2024, Illinois Governor J.B. Pritzker signed into law SB 2979, significantly amending the state’s Biometric Information Privacy Act (BIPA). This update represents a considerable decrease in the potential for exorbitant financial liabilities for businesses that engage with biometric data while still maintaining the statute’s robust protections for individuals’ biometric data. The amendment went into effect immediately.

Most significantly, SB 2979 redefines the scope of potential liability from $5,000 per collection or disclosure to $5,000 per individual. Previously, the Illinois Supreme Court held that BIPA’s framework allowed for each biometric data interaction—such as a fingerprint scan—to be treated as a separate infraction, potentially resulting in overwhelming cumulative penalties. This ruling raised the financial stakes associated with BIPA violations considerably, particularly within employment settings. SB 2979 consolidates these infractions by treating the initial collection of biometric data as a singular violation, irrespective of the number of collections or disclosures. This change aims to strike a balance of commercial and individual interests, reducing the threat of existential judgments against businesses while preserving the law’s core protective measures.
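The scale of the change is easiest to see with a worked comparison of the two liability theories. A hypothetical sketch; the workforce size and scan counts are invented for illustration, and only the $5,000 per-violation figure comes from the statute:

```python
PER_VIOLATION = 5_000  # statutory damages per intentional/reckless violation

def exposure_per_scan(employees: int, scans_per_employee: int) -> int:
    """Pre-amendment theory: every scan is a separate violation."""
    return employees * scans_per_employee * PER_VIOLATION

def exposure_per_person(employees: int) -> int:
    """Post-SB 2979: a single violation per individual, however many scans."""
    return employees * PER_VIOLATION

# Hypothetical: 100 employees clocking in/out daily for a year (~500 scans each).
print(exposure_per_scan(100, 500))   # 250000000  ($250 million)
print(exposure_per_person(100))      # 500000     ($500,000)
```

Under these invented numbers, the amendment shrinks a nine-figure theoretical exposure to six figures for the same conduct, which is the "existential judgment" risk the legislation targets.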

The amended law does not apply retroactively, and thus it will not influence any pre-existing litigation. This element of the amendment addresses concerns from numerous industry stakeholders who have faced extensive legal challenges and significant settlements under the original BIPA provisions.

In addition to narrowing the potential for damages, the amendments modify the definition of “biometric identifier” to exclude certain types of biological and medical data. The revisions also make clear that an “electronic signature” is valid for obtaining written consent.

Despite these business-friendly adjustments, BIPA continues to empower Illinois residents with a private right of action, a unique feature not commonly found in similar laws across other states. Businesses must still secure written informed consent before collecting biometric data, and they must still adhere to stringent data protection and storage policies.

The State of Texas and Meta Platforms Inc. (“Meta”) have agreed to a $1.4 billion settlement, to be paid out over five years, to resolve claims relating to Meta’s alleged use of facial recognition technology without user consent.  This settlement marks the largest privacy settlement obtained by a single state and is the first one obtained under the Texas Capture or Use of Biometric Identifier Act (“CUBI”).  

Meta’s subsidiary, Facebook, launched its facial recognition technology around 2010, allowing users to tag others in photos and videos.  Texas alleged Meta used the collected biometric data to train and enhance its facial recognition technology.  Facebook discontinued the technology in November 2021, after entering a $650 million settlement in a lawsuit alleging that the technology violated the Illinois Biometric Information Privacy Act (“BIPA”).

Texas brought this suit shortly thereafter—in February 2022—alleging that Facebook violated CUBI by capturing and disclosing Facebook users’ and non-users’ biometric identifiers for a commercial purpose without their consent through its now-discontinued facial recognition technology.  Texas also alleged that Facebook violated CUBI by failing to destroy the collected biometric data within a reasonable time.

While Illinois has been the focus of compliance due to BIPA’s private right of action, this settlement may indicate that Texas will become increasingly active in enforcement—for both CUBI and its new comprehensive privacy law that went into effect on July 1.  Companies should take notice.

The Federal Trade Commission (FTC) continues to enforce and update its Health Breach Notification Rule (HBNR) amidst a fast-changing regulatory environment. A new rule, which took effect this week, expands the scope of the HBNR, as the FTC ramps up enforcement activity related to disclosures of identifiable health data, and other agencies implement changes to the Health Insurance Portability and Accountability Act (HIPAA), Part 2, and Information Blocking rules regulating similar data.

The Upshot

  • Via final rule, effective July 29, the FTC expanded the scope of the HBNR in an effort to “strengthen and modernize” the applicable regulations. The updated HBNR expands the methods by which regulated entities may make required notifications and updates the timing requirements for making such notifications.
  • The updated HBNR requires entities that possess personal health records (PHR) but are not covered by HIPAA to provide notice following a breach of unsecured data.
  • The FTC clarified that the HBNR is intended to apply to health care apps and similar technologies not covered by HIPAA. The FTC implemented these clarifications via expansions to existing definitions and other changes intended to improve the overall readability of the HBNR. The FTC recently began enforcing the HBNR, while Department of Health and Human Services (HHS) agencies continue to publish guidance related to HIPAA and Information Blocking.

The Bottom Line

Companies that are not regulated by HIPAA but maintain health information must ensure HBNR compliance. Other entities—including “Part 2” (federally assisted substance use disorder treatment) programs, HIPAA-covered entities and their business associates, health care providers, information technology developers and information exchanges, and lawful holders of health information—should note recent shifts in the regulatory environment for maintaining identifiable health data.

Health Breach Notification Rule

Effective July 29, 2024, the FTC’s updates to the Health Breach Notification Rule (HBNR) (1) clarify the scope of the rule, making clear its applicability to developers of electronic health apps; (2) revise key definitions for breaches and regulated entities; and (3) revise the method, timing, and content of required notices. The HBNR applies to “foreign and domestic vendors of personal health records, PHR-related entities, and third-party service providers” not covered by HIPAA.

As clarified, a “vendor of personal health records” is an entity, not covered by HIPAA, that “offers or maintains a personal health record,” and may include developers of mobile health applications. A “personal health record” will now be “an electronic record of PHR identifiable health information on an individual that has the technical capacity to draw information from multiple sources and that is managed, shared, and controlled by or primarily for the individual,” and may include a mobile health application. PHR-related entities are, essentially, those not covered by HIPAA that offer products and services through vendors of personal health records, including their mobile health applications. Finally, a “third-party service provider” is one that accesses, maintains, retains, modifies, records, stores, destroys, or otherwise holds, uses, or discloses unsecured PHR identifiable health information in furtherance of services provided to such entities. 

Taken as a whole, this means that the revised HBNR will apply to virtually any entity not covered by HIPAA that handles identifiable information that relates to the past, present, or future physical or mental health or condition of an individual, the provision of health care to an individual, or the past, present, or future payment for the provision of health care to an individual.

Under the revised HBNR, a “breach of security” includes any unauthorized release of PHR identifiable health information, even if the release was intentional. The HBNR requires vendors and PHR-related entities to notify affected individuals of any breach of security, and further requires notice to the FTC and the media for breaches affecting more than 500 individuals. The updated HBNR also permits certain electronic notifications and expands the required content of those notifications.

Other Updates Affecting Health Information

Various federal agencies continue to issue notable guidance and expand enforcement efforts specific to health information. 

The HBNR updates follow the FTC’s first enforcement action under the rule, brought in 2023. In that action, the FTC took issue with, among other things, the sharing of users’ medication and demographic data with third-party advertisers to create medication-specific advertisements, in alleged violation of the HBNR. Earlier this year, the FTC continued similar enforcement efforts against a Part 2 (substance use disorder diagnosis, treatment, or referral for treatment) program for alleged violations of the FTC Act and the Opioid Addiction Recovery Fraud Prevention Act of 2018. (Part 2 programs should also note recent Office for Civil Rights (OCR) and Substance Abuse and Mental Health Services Administration rulemaking aligning substance use disorder confidentiality requirements and penalties with HIPAA.)

Notably, the alleged violations were based, in part, on the improper disclosure of identifiable information via “tracking technologies” and on violations of HIPAA stemming from OCR’s subregulatory guidance on such tracking technologies. That same guidance has been subject to recent OCR revisions, as well as judicial challenges. For example, in June, the District Court for the Northern District of Texas vacated a portion of OCR’s guidance. Specifically, the court took issue with what it described at length as OCR’s attempt, via the tracking technology guidance, to “shoehorn” additional information into the statutory definition of “individually identifiable health information.”

Though promulgated by formal rulemaking (rather than subregulatory guidance), the revised HBNR similarly relies on new or expanded definitions of key terms, including “PHR identifiable health information” and “covered health care provider.”  

In addition, OCR recently released subregulatory guidance related to security via a concept paper outlining a number of currently voluntary cybersecurity recommendations (including reference to FDA guidance for cybersecurity recommendations applicable to medical devices) and cybersecurity performance goals. OCR indicated that formal updates to the HIPAA Security Rule will follow in 2024, potentially along with proposed changes to the Privacy Rule. Additionally, CMS recently published disincentives complementing applicable OIG penalties based on regulatory definitions set forth by the Office of the National Coordinator for Health Information Technology (ONC) in “Information Blocking” regulations promulgated in accordance with the Public Health Service and 21st Century Cures acts. Via these rules, various HHS agencies continue to interpret and clarify the applicability of statutory and regulatory definitions governing a wide array of health information. 

Health care entities and other lawful holders of health information should review these rules and maintain robust compliance measures for health information before any breach occurs. Ballard Spahr’s Health Care and Privacy and Data Security attorneys are available to assist with any questions related to the HBNR or other aspects of data privacy, security, and breach reporting.