In the span of just a couple of days, the California Privacy Protection Agency (CalPrivacy) announced two significant privacy enforcement actions, highlighting the increasing scrutiny of companies’ handling of personal data. These actions underscore the agency’s commitment to ensuring that businesses comply with privacy laws designed to protect individuals’ rights, particularly with respect to transparency and ease of data control for consumers. The cases involve a youth sports media company and the automotive giant Ford, both of which were alleged to have engaged in practices that violated consumers’ opt-out rights.

In the action against PlayOn Sports, CalPrivacy took particular issue with the fact that PlayOn directed users to opt out through the Network Advertising Initiative and the Digital Advertising Alliance as opposed to providing its own opt-out mechanism. CalPrivacy also alleged a failure to recognize opt-out signals and insufficient privacy notices. In its public announcement, CalPrivacy’s head of enforcement stated that “[s]tudents trying to go to prom or a high school football game shouldn’t have to leave their privacy rights at the door.” PlayOn was fined $1.1 million and agreed to modify its practices.

In a separate action, CalPrivacy alleged that Ford added unnecessary friction to the opt-out process, making it cumbersome for consumers to exercise their right by requiring email verification. The agency acknowledged that Ford “didn’t intend” to require consumers to verify their identities, but it stated that the action shows it “will pursue violations regardless of intent.” As part of the settlement, Ford will pay a fine and has committed to streamlining its opt-out procedures.

These enforcement actions serve as an important reminder that regulators are still extremely focused on public-facing aspects of privacy regimes, and especially the granular details of opt-out mechanisms. Companies should review their processes carefully.
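The opt-out signal at issue in cases like PlayOn’s is typically the Global Privacy Control (GPC), which participating browsers transmit as the `Sec-GPC: 1` request header. Below is a minimal sketch of server-side recognition, assuming a framework-agnostic headers dictionary; the `record_opt_out` helper is hypothetical, not part of any real library:

```python
def gpc_opt_out_requested(headers: dict[str, str]) -> bool:
    """Return True if the request carries a GPC opt-out signal.

    The GPC specification defines a single truthy header value: "1".
    """
    return headers.get("Sec-GPC", "").strip() == "1"


def handle_request(headers: dict[str, str], user_id: str, record_opt_out) -> None:
    """Honor the signal automatically, with no extra verification steps.

    record_opt_out is a hypothetical callback that persists the consumer's
    opt-out of sale/sharing -- the point is that no email verification or
    redirect to a third-party site stands between the signal and the result.
    """
    if gpc_opt_out_requested(headers):
        record_opt_out(user_id, source="gpc-signal")
```

The design point the regulators stress is frictionlessness: the signal itself is the opt-out, and a business-provided mechanism must not add verification steps or hand the consumer off to an industry group’s site.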

A new bill introduced in Connecticut—Connecticut Senate Bill 117, An Act Concerning Breaches of Security Involving Electronic Personal Information—would create mandatory forensic examination requirements for entities that experience a “massive breach of security,” defined as a data breach affecting at least 100,000 Connecticut residents, and would impose substantial penalties for noncompliance.

SB 117 would require entities that experience a “massive breach of security” to:

  • Immediately retain a qualified third-party forensic examiner to conduct a forensic examination of the computer or computer system that was the subject of the data breach and to prepare a detailed forensic report disclosing how the breach occurred and its root causes;
  • Submit the detailed forensic report to the Connecticut Attorney General within 90 days of discovering the breach; and
  • Face civil penalties of $100,000 for small businesses and $500,000 for other entities for noncompliance.

The entity that experiences a massive data breach bears the cost of the forensic examination and report, regardless of whether the entity retains a third party itself or fails to do so and the Connecticut Attorney General retains a forensic examiner on its behalf. SB 117 would grant the Connecticut Attorney General authority to retain a qualified third party to perform the forensic examination and prepare the forensic report if an entity fails to comply.

If enacted, Connecticut would be the first state to impose automatic forensic examination and reporting requirements for incidents based on a numerical threshold. The bill also raises serious issues regarding the disclosure of confidential, proprietary, and privileged information.

In any event, given the scale of the potential penalties and the mandatory nature of the new requirements, entities that collect, store, or process personal information of Connecticut residents should closely monitor SB 117’s progress in the General Assembly. If it passes, companies should establish protocols for engaging qualified third-party forensic examiners immediately upon discovery of a massive data breach and ensure their incident response plans accommodate the 90-day reporting deadline to the Connecticut Attorney General.

On February 5, 2026, Governor Henry McMaster signed South Carolina’s Age-Appropriate Code Design Act into law. South Carolina joins California, Maryland, and Vermont in enacting age-appropriate design legislation that seeks to regulate website design and advertising that appeals specifically to minors.

The statute applies to online services that conduct business in South Carolina, are “reasonably likely to be accessed” by minors, and meet specified revenue or data-processing thresholds, including annual gross revenues exceeding $25 million, processing personal data of 50,000 or more consumers, or deriving at least 50% of revenue from data sales.

South Carolina’s act targets website design elements that encourage excessive engagement by minors, including the following specific features:

  • infinite scroll,
  • auto-playing videos,
  • gamification mechanics,
  • visible engagement metrics,
  • notifications,
  • in-game purchases, and
  • appearance-altering filters.

Websites that offer any of these covered features, and either know minors are visiting or are directed to minors as defined under COPPA, must disable the features by default and offer easy-to-use controls that enable parents to monitor and restrict their child’s usage of the website. The statute also prohibits targeted advertising to minors, restricts the collection of precise geolocation data, and bans “dark patterns,” or user interfaces designed to subvert user autonomy.

The South Carolina Attorney General holds exclusive enforcement authority, with covered services facing treble damages for violations; where the conduct is found to be willful and wanton, the corporation’s directors and officers can be held personally liable. Dark-pattern usage triggers additional exposure under the South Carolina Unfair Trade Practices Act. To ensure compliance, companies must submit annual public reports prepared by independent auditors detailing design features, data practices, and age-verification methods.

While the statute may ultimately be challenged on First Amendment grounds, companies should not delay compliance. Recommended steps include conducting a threshold analysis, auditing design features against the restricted elements, implementing default privacy settings and parental controls for known minors, revising data practices to meet minimization requirements, eliminating targeted advertising and dark patterns, and engaging auditors ahead of the July 1 reporting deadline. Given the immediate effective date and treble-damages exposure, affected companies should prioritize compliance.

State privacy enforcement is entering a new phase, and Connecticut is quickly becoming a jurisdiction to watch.  In its third annual Connecticut Data Privacy Act (CTDPA) enforcement report, the Office of Attorney General William Tong disclosed for the first time that it has opened multiple active investigations into how messaging platforms, gaming services, and AI chatbot providers collect, use, and protect the personal information of children and teens.

This report is the first to reflect enforcement under Connecticut’s expanded minors’ privacy protections, which took effect in October 2024.  For companies operating nationally, the timing is critical.  Connecticut is joining California and Colorado in elevating minors’ data protection to a top enforcement priority, signaling a broader, coordinated shift in state-level privacy oversight.

Where Connecticut is Focusing Its Investigations

  1. Messaging Apps

    The Connecticut AG issued a notice of violation and inquiry letter to a messaging platform widely used by children and teens, citing deficiencies in privacy notice disclosures and opt-out mechanisms.  The investigation reflects a growing focus on whether platforms:

    • know or deliberately ignore the presence of minor users;
    • restrict adults from sending unsolicited messages to minors; and
    • obtain valid consent for the collection and use of minors’ precise geolocation data.

    The message is clear: platforms cannot remain passive when minors are on their services.

  2. Gaming Platforms

    In May 2025, the Connecticut AG sent an inquiry letter to a popular gaming provider focused on the potential sale and targeted advertising use of children’s personal data.  Testing of the provider’s iOS and Android apps purportedly revealed the use of advertising SDKs commonly associated with targeted advertising.  In a subsequent report, the AG stated:

    “Companies may not willfully blind themselves to users’ age and must adjust their tracking technologies to account for the heightened protections afforded to minors under the CTDPA.”

    Connecticut has also joined several other states in sending a joint letter to gaming studios and their subsidiaries, identifying alleged deficiencies in privacy disclosures and consent processes related to minors’ data.  Separately, the Connecticut AG is investigating a digital advertising data broker that offers SDKs to app developers—including apps directed at minors—for potential violations of both the CTDPA and the Connecticut Unfair Trade Practices Act (CUTPA).

  3. AI Chatbots

    Perhaps most notable is the Connecticut AG’s ongoing investigation into a technology company that provides a chatbot platform for alleged harm to minors due to certain design features. Connecticut also recently joined a coalition of 42 Attorneys General in sending letters to major artificial intelligence companies demanding more quality control and other safeguards over chatbot products. Existing federal and state privacy, data breach, and unfair and deceptive acts laws apply in this space, and Connecticut has made clear it will use them.

New Rules for Minors’ Data in 2026

Connecticut’s legislature has significantly strengthened the CTDPA’s protections for minors, with key amendments set to take effect in July 2026.  Key changes include:

  • A ban on targeted advertising and the sale of minors’ personal data.  Consent will no longer be a permissible workaround.
  • A prohibition on “addictive design features.”  Controllers may not use design features intended to significantly increase, sustain, or extend a minor’s use of an online service.
  • Strict limits on precise geolocation data.  Collection will be permitted only when strictly necessary to provide the relevant service.
  • Mandatory minors-specific privacy impact assessments.  These must be completed before engaging in high-risk processing, not after.
  • Expanded definitions of covered harms.  The law now expressly includes physical violence, material harassment, and sexual abuse or exploitation.

Additional amendments taking effect in July 2026 include lowered applicability thresholds, so that all sensitive data processing and all sales of personal data will be covered. The definition of “sensitive data” has been expanded to include disability or treatment information, non-binary or transgender status, certain financial and government identifier information, and “neural” data. Companies will be required to disclose in their privacy notices whether they collect, use, or sell personal data for the purpose of training large language models.  The enforcement report further signals potential future legislative action, including tighter limits on “publicly available” data, a standalone genetic privacy law, AI-specific safeguards for children, and stronger universal opt-out mechanisms.

Practical Takeaways for Businesses

In light of Connecticut’s ramped-up enforcement activity and the July 2026 amendments, there is little room to delay.  Companies should consider taking the following steps now:

  • Audit minors’ data flows.  Identify where children’s data is collected, shared, or monetized, and unwind practices that will soon be prohibited. Minors-specific data protection assessments should already be underway.
  • Reassess age detection and verification practices.  The Attorney General has explicitly rejected “willful blindness” to users’ ages. Companies must evaluate whether they have actual or constructive knowledge of minor users and whether their controls align with CTDPA expectations.
  • Scrutinize product design for “addictive” features.  Autoplay, infinite scroll, streaks, and engagement-driven notifications may pose compliance risks under the 2026 amendments.
  • Treat AI and chatbot products as high-risk.  Connecticut has made clear that AI is not a regulatory blind spot. Existing privacy, security, and unfair practices laws will be enforced.
  • Strengthen vendor oversight.  Recent multistate enforcement actions demonstrate rising expectations around third-party risk management, including robust contractual controls, ongoing monitoring, and swift remediation when vendors deviate from agreed practices.

Connecticut’s February 2026 disclosure of active investigations into messaging apps, gaming platforms, and AI chatbots marks a significant escalation in state-level enforcement focused on children’s privacy.  With new prohibitions on targeted advertising, data sales, addictive design features, and geolocation collection set to take effect in July 2026, businesses have a narrow window to achieve compliance.

Companies that process personal data of Connecticut residents, particularly children and teens, should act now to assess their compliance posture and address any gaps before the next round of enforcement actions.

Two customers shopping for the same product on the same website at the same time may see two different prices.  This scenario is a growing reality in today’s data-driven marketplace, and California regulators are paying attention.  On Data Privacy Day 2026, California Attorney General Rob Bonta announced a new investigative sweep targeting “surveillance pricing”—a practice in which businesses use personal information to set individualized prices for consumers.  For online retailers and service providers, this probe raises important questions about how customer data is collected, used, and disclosed.

The California Consumer Privacy Act secures several key privacy rights for California consumers, including:

  • The right to know about the personal information a business collects about them and how it is used and shared;
  • The right to delete personal information collected from them (with some exceptions);
  • The right to opt out of the sale or sharing of their personal information, including via the Global Privacy Control (GPC); and
  • The right to non-discrimination for exercising their CCPA rights.

Of particular relevance to pricing practices, the CCPA includes a “purpose limitation” that restricts how businesses can use personal information.  Under this principle, businesses may use personal information only for purposes consistent with consumers’ reasonable expectations.  Businesses must disclose in their privacy policies how they collect, use, share, and sell consumers’ personal information, and these policies must include information on consumers’ privacy rights and how to exercise them.

A Track Record of Active Enforcement Under the CCPA

CCPA enforcement is nothing new for Attorney General Bonta.  He has consistently demonstrated a commitment to robust enforcement of California’s privacy law and has targeted a range of data practices.  In July 2025, the Attorney General announced the largest CCPA settlement to date, resolving allegations that a company’s use of online tracking technology on its website violated the CCPA.  In 2024, the CCPA investigative sweep focused on compliance by streaming services and connected TVs.  In August 2022, the Attorney General announced a settlement resolving a sweep of companies that allegedly failed to honor the user-enabled privacy control (GPC) signal to stop the sale of personal information.  Other sweeps have addressed the location data industry, employee information, opt-out requests on mobile apps, and business loyalty programs.

What Is Surveillance Pricing?

In plain terms, “surveillance pricing” is the use of consumers’ personal information to set targeted, individualized prices for products and services.

This can result in different consumers being offered different prices for the same product at the same time, often without any disclosure to the consumer.  Unless a business discloses that it uses a consumer’s personal information to set prices, surveillance pricing may be invisible to the consumer.

What Businesses Are Being Asked to Disclose

The California AG’s inquiry letters seek detailed information on businesses’ data-driven pricing practices, including:

  • Companies’ use of consumer personal information to set prices;
  • Policies and public disclosures regarding personalized pricing;
  • Any pricing experiments undertaken by companies; and
  • Measures companies are taking to comply with laws governing algorithmic pricing, competition, and civil rights.

Attorney General Bonta emphasized that practices like surveillance pricing “may undermine consumer trust, unfairly raise prices, and when conducted without proper disclosure or beyond reasonable expectations, may violate California law.”  The Federal Trade Commission and New York have pursued similar enforcement.

Although the current sweep is centered on surveillance pricing in retail, grocery, and hospitality, the California AG’s public statements make clear that enforcement is driven by the nature of the data use, not the sector.  The CCPA’s purpose-limitation and reasonable-expectations principles have been construed to apply broadly, and other uses of personal information that significantly influence economic terms for consumers could come under scrutiny.

Legal Risks Under the CCPA: Key Considerations for Businesses

Businesses that use data to set individualized prices, directly or indirectly, should consider the following:

  • Review data use practices: Assess whether consumer personal information is used for pricing decisions and whether such use is disclosed in a manner consistent with CCPA requirements.
  • Update privacy disclosures: Ensure privacy policies accurately reflect pricing practices and purposes for data collection.
  • Assess reasonable expectations: Consider whether targeted pricing strategies align with what consumers would reasonably expect based on the business’s relationship with them.
  • Prepare for enforcement: If contacted by the California AG’s office, respond promptly and ensure records, data mapping, and compliance documentation are up to date.

The California AG’s surveillance pricing probe marks a significant development in privacy enforcement.  Privacy enforcement is no longer just about data collection and sharing; it is now about how businesses use consumer data to influence prices.  As Attorney General Bonta stated, “Consumers have the right to understand how their personal information is being used, including whether companies are using their data to set the prices that Californians pay.”  Businesses would be well advised to review their data practices now and ensure that privacy disclosures and pricing strategies are aligned with California’s evolving regulatory expectations.

A sharp contrast in the speed of obtaining appellate review is emerging between two key privacy statutes. While the U.S. Supreme Court is set to resolve a circuit split over the Video Privacy Protection Act (VPPA), litigants grappling with the California Invasion of Privacy Act (CIPA)—a statute one federal judge recently described as a “total mess”—still await interpretive guidance from even a single appellate court.

Some predicted that appellate review of CIPA might be coming soon, based on a motion for interlocutory appeal recently filed in Fregosa v. Mashable, Inc. On January 23, 2026, however, the district court denied the request. At issue in that case was the interpretation of CIPA’s “pen register” provision and whether it applies to website tracking technologies. Citing a handful of federal district courts that had adopted similar interpretations of CIPA, the court found there were no “substantial grounds for difference of opinion” to justify an appeal—despite the growing number of state courts that have adopted a conflicting interpretation of the statute.

By contrast, trial courts in VPPA cases have seemed more open to allowing appellate review of ambiguities within the statute. The United States Supreme Court will hear Salazar v. Paramount Global to decide who qualifies as a “consumer” after the federal circuits split 2-2 on the issue. At stake in Salazar is whether the VPPA applies narrowly to subscribers of “audiovisual materials” or broadly to all of a company’s subscribers.

With CIPA’s interpretation in flux and a landmark VPPA decision pending, the legal landscape for website tracking is volatile. Businesses should stay informed and proactively manage compliance risk.


On February 5, 2026, Florida Attorney General James Uthmeier announced the creation of a first-of-its-kind specialized civil and criminal unit, named Consumer Harm from International and Nefarious Actors, or “CHINA” for short. The unit will be dedicated to investigating and prosecuting foreign corporations, particularly those with Chinese ownership, that collect consumer data from Florida residents.

The new division, housed within the Florida Attorney General’s Office, will leverage state consumer protection laws to pursue subpoenas, investigations, and lawsuits against companies whose data practices may expose Floridians to foreign adversaries. The AG’s office made clear that health care companies, especially those collecting biometric and demographic data, will be a top enforcement priority.

For companies operating in the health tech, biometrics, or consumer data space, Florida’s new enforcement unit warrants close attention. Businesses with any foreign ownership, particularly ties to China, should review their data collection, transmission, and disclosure practices. Transparency with consumers about international data flows may become not just a best practice but an enforcement flashpoint. If other states follow Florida’s lead, companies could soon face a patchwork of aggressive state-level scrutiny layered on top of existing federal oversight.

China’s internet regulatory authority and top prosecutors have recently released a series of enforcement actions and cases aimed at highlighting enforcement priorities in the data security realm over the last year. In 2025, enforcement under the Chinese Cybersecurity Law, the Data Security Law, the Personal Information Protection Law, and the Regulations on the Security Management of Network Data focused primarily on data security compliance, illegal cross-border transfers, and violations of personal information rights.

Companies were cited for violations related to over-collection of data, noncompliance with network data security rules, improper storage, allowing unrestricted access to sensitive information, and violating consent requirements.

Chinese authorities have stated that enforcement priorities in 2026 will include increased oversight of the “digital sphere,” including the illegal collection and misuse of personal data. Enforcement actions in 2026 are also expected to focus on data leaks and improper network governance, with increased penalties for data breaches.

To avoid becoming a focus of China’s regulatory authorities, companies should focus on:

  • Ensuring adequate safeguards are in place to protect data in transit and at rest;
  • Completing cybersecurity classification and grading, where required;
  • Implementing effective security management and access policies, including strong password requirements;
  • Completing necessary assessments or certifications for cross-border transfers, where required; and
  • Publishing adequate notices disclosing personal information collection and processing practices, and not exceeding the practices disclosed.

While specific compliance requirements will vary by business, any data collection, storage, or processing occurring in China will be subject to increased scrutiny in 2026, requiring businesses to take a closer look to ensure proper data security is in place.

Importantly, although safeguards may overlap across different international laws and standards, cross-border transfer and data localization principles pose more difficult operational challenges for businesses. With enforcement ramping up, businesses should carefully consider their global compliance strategy.

Navigating the 2026 CCPA Updates

As forecasted, effective January 1, 2026, businesses subject to the California Consumer Privacy Act (CCPA) must comply with newly updated regulations. For some businesses, complying with these updates will require implementing or updating policies and procedures related to, among other things, risk assessments, cybersecurity audits, and the use of automated decision-making technologies. Businesses should review the updated regulations to determine whether they might be affected and, if so, implement a plan to promptly ensure compliance.

Outlined below are a few of the most notable CCPA updates businesses should be aware of:

  1. Risk Assessments
    • Businesses that engage in certain processing activities, including selling or sharing personal information, will need to conduct risk assessments. For new processing activities (beginning after January 1, 2026), risk assessments must be conducted prior to commencement of the new activity. Businesses that conduct a risk assessment in a given calendar year must submit an attestation of the risk assessment to the California Privacy Protection Agency (“Agency”) by April 1 of the following year.
    • For activity that occurred prior to January 1, 2026, and continued thereafter, businesses must conduct a risk assessment by December 31, 2027, and provide an attestation on or before April 1, 2028.
    • Risk assessments completed for other state laws can demonstrate compliance if they also satisfy the requirements of the CCPA regulations.
  2. Automated Decision-Making Technology (ADMT) Rules
    • The updated regulations impose new requirements for businesses that use ADMT to make “significant decisions” about consumers. Those requirements will take effect in 2027. “Significant decisions” include granting or denying services like financial or lending products, housing, educational admissions or opportunities, job or contracting opportunities and compensation, or healthcare services.
    • Obligations associated with ADMT use include:
      • Providing consumers with pre-use notice of and access to information describing the manner in which ADMT is used and informing them of their associated opt-out and access rights;
      • Offering consumers an ADMT opt-out, unless an exception applies;
      • Conducting risk assessments, as applicable; and
      • Updating privacy notices, as applicable.
  3. Cybersecurity Audits
    • The CCPA regulations will require certain businesses to conduct mandatory cybersecurity audits. The scope of the audits contains a long list of specifics, which generally tracks established audit standards such as the NIST Cybersecurity Framework. Businesses will also have to submit annual certifications of completion to the Agency.
    • While the deadlines for submitting certifications begin in 2028, businesses should be aware that implementing a compliant cybersecurity program—which often must include, among other things, incident response management, access controls, data inventory, retention and disposal procedures, and vendor oversight—requires collaboration across the business and can be very time consuming.
  4. Broadened Definition of “Sensitive Personal Information”
    • The updates also imported the statutory definition of “sensitive personal information,” with the addition of “personal information collected and analyzed concerning a consumer’s health, sex life, or sexual orientation,” and “personal information of consumers that the business has actual knowledge [or willfully disregards] are less than 16 years of age.”
    • With the broadened definition, businesses should reassess their need for notices, opt-outs, and back-end procedures related to the Right to Limit.
  5. Other Notable Updates
    • Stricter requirements relating to the use of dark patterns, highlighting the need for careful consideration of cookie banners and opt-out menus;
    • Additional notice requirements for businesses that disclose personal information collected through augmented or virtual reality devices;
    • Updates to data subject rights procedures; and
    • New transparency requirements, including an in-app privacy policy posting requirement.

Although many of the deadlines outlined above may seem distant, businesses should audit their current processing activities for compliance now rather than discover potential issues on the eve of the applicable deadlines.

The past year set up a clear clash between federal deregulatory efforts and state-level AI rulemaking, and 2026 is poised to be the year that conflict materializes in earnest.  The Trump Administration signaled a strong preference for scaling back AI-specific rules while exploring avenues to preempt state and local measures, even as a growing number of states moved forward with their own frameworks. In short, 2025 laid the groundwork, and 2026 is likely to deliver the confrontation.

On the federal side, the Administration’s posture included both legislative and policy initiatives aimed at limiting state restrictions on AI.  Although a proposed 10-year moratorium on enforcing state AI laws was removed during negotiations over the One Big Beautiful Bill Act, America’s AI Action Plan soon followed, instructing agencies to consider using preemption to curb “burdensome” state AI regulations. In the final weeks of 2025, the White House issued an executive order, Ensuring a National Policy Framework for Artificial Intelligence (the “Executive Order”). The Executive Order directs the U.S. Attorney General to establish an AI Litigation Task Force to challenge state AI laws deemed unconstitutional or preempted and tasks the Administration with developing a national AI legislative framework that would preempt conflicting state rules.

States, however, did not stand still.  California’s SB 53 established a first-in-the-nation set of standardized safety disclosure and governance obligations for developers of frontier AI systems, underscoring state willingness to regulate despite federal headwinds. Colorado’s Anti-Discrimination in AI Law remained intact through the 2025 session and is scheduled to take effect in June 2026, setting a near-term compliance deadline that will shape risk assessments and product planning.  Even traditionally deregulatory states like Texas pursued aggressive enforcement under existing biometric laws against alleged AI-driven facial recognition practices.

Looking ahead, expect 2026 to feature litigation over the scope of preemption, increased enforcement actions from federal agencies, and a push toward a federal legislative framework, alongside continued state innovation in AI governance.  Despite the uncertainty, companies should continue to comply with applicable state AI laws because the Executive Order itself cannot overturn state law. Only Congress and the courts have that power, and until they act, state laws remain enforceable. For companies, that means preparing for a two-track reality: monitoring and implementing state obligations while tracking federal moves that could reshape, narrow, or delay those obligations. The result is likely to be a dynamic, contested compliance environment throughout 2026, rather than quick regulatory convergence.