State privacy enforcement is entering a new phase, and Connecticut is quickly becoming a jurisdiction to watch.  In its third annual Connecticut Data Privacy Act (CTDPA) enforcement report, the Office of Attorney General William Tong disclosed for the first time that it has opened multiple active investigations into how messaging platforms, gaming services, and AI chatbot providers collect, use, and protect the personal information of children and teens.

This report is the first to reflect enforcement under Connecticut’s expanded minors’ privacy protections, which took effect in October 2024.  For companies operating nationally, the timing is critical.  Connecticut is joining California and Colorado in elevating minors’ data protection to a top enforcement priority, signaling a broader, coordinated shift in state-level privacy oversight.

Where Connecticut is Focusing Its Investigations

  1. Messaging Apps

    The Connecticut AG issued a notice of violation and inquiry letter to a messaging platform widely used by children and teens, citing deficiencies in privacy notice disclosures and opt-out mechanisms.  The investigation reflects a growing focus on whether platforms:

    • know or deliberately ignore the presence of minor users;
    • restrict adults from sending unsolicited messages to minors; and
    • obtain valid consent for the collection and use of minors’ precise geolocation data.

    The message is clear: platforms cannot remain passive when minors are on their services.

  2. Gaming Platforms

    In May 2025, the Connecticut AG sent an inquiry letter to a popular gaming provider focused on the potential sale and targeted advertising use of children’s personal data.  Testing of the provider’s iOS and Android apps purportedly revealed the use of advertising SDKs commonly associated with targeted advertising.  In a subsequent report, the AG stated:

    “Companies may not willfully blind themselves to users’ age and must adjust their tracking technologies to account for the heightened protections afforded to minors under the CTDPA.”

    Connecticut has also joined several other states in sending a joint letter to gaming studios and their subsidiaries, identifying alleged deficiencies in privacy disclosures and consent processes related to minors’ data.  Separately, the Connecticut AG is investigating a digital advertising data broker that offers SDKs to app developers—including apps directed at minors—for potential violations of both the CTDPA and the Connecticut Unfair Trade Practices Act (CUTPA).

  3. AI Chatbots

    Perhaps most notable is the Connecticut AG’s ongoing investigation into a technology company that provides a chatbot platform, based on alleged harm to minors from certain design features. Connecticut also recently joined a coalition of 42 Attorneys General in sending letters to major artificial intelligence companies demanding more quality control and other safeguards over chatbot products. Existing federal and state privacy, data breach, and unfair and deceptive acts laws apply in this space, and Connecticut has made clear it will use them.

New Rules for Minors’ Data in 2026

Connecticut’s legislature has significantly strengthened the CTDPA’s protections for minors, with key amendments set to take effect in July 2026.  The changes include:

• A ban on targeted advertising and the sale of minors’ personal data.  Consent will no longer be a permissible workaround.
• A prohibition on “addictive design features.”  Controllers may not use design features intended to significantly increase, sustain, or extend a minor’s use of an online service.
• Strict limits on precise geolocation data.  Collection will be permitted only when strictly necessary to provide the relevant service.
• Mandatory minors-specific privacy impact assessments.  These must be completed before engaging in high-risk processing, not after.
• Expanded definitions of covered harms.  The law now expressly includes physical violence, material harassment, and sexual abuse or exploitation.

Additional amendments taking effect in July 2026 include lowered applicability thresholds, so that all sensitive data processing and all sales of personal data will be covered. The definition of “sensitive data” has been expanded to include disability or treatment information, non-binary or transgender status, certain financial and government identifier information, and “neural” data. Companies will be required to disclose in their privacy notices whether they collect, use, or sell personal data for the purpose of training large language models.  The enforcement report further signals potential future legislative action, including tighter limits on “publicly available” data, a standalone genetic privacy law, AI-specific safeguards for children, and stronger universal opt-out mechanisms.

Practical Takeaways for Businesses

In light of Connecticut’s ramped-up enforcement activity and the July 2026 amendments, there is little room to delay.  Companies should consider taking the following steps now:

• Audit minors’ data flows.  Identify where children’s data is collected, shared, or monetized, and unwind practices that will soon be prohibited. Minors-specific data protection assessments should already be underway.
• Reassess age detection and verification practices.  The Attorney General has explicitly rejected “willful blindness” to users’ ages. Companies must evaluate whether they have actual or constructive knowledge of minor users and whether their controls align with CTDPA expectations.
• Scrutinize product design for “addictive” features.  Autoplay, infinite scroll, streaks, and engagement-driven notifications may pose compliance risks under the 2026 amendments.
• Treat AI and chatbot products as high-risk.  Connecticut has made clear that AI is not a regulatory blind spot. Existing privacy, security, and unfair practices laws will be enforced.
• Strengthen vendor oversight.  Recent multistate enforcement actions demonstrate rising expectations around third-party risk management, including robust contractual controls, ongoing monitoring, and swift remediation when vendors deviate from agreed practices.

Connecticut’s February 2026 disclosure of active investigations into messaging apps, gaming platforms, and AI chatbots marks a significant escalation in state-level enforcement focused on children’s privacy.  With new prohibitions on targeted advertising, data sales, addictive design features, and geolocation collection set to take effect in July 2026, businesses have a narrow window to achieve compliance.

Companies that process personal data of Connecticut residents, particularly children and teens, should act now to assess their compliance posture and address any gaps before the next round of enforcement actions.

Two customers shopping for the same product on the same website at the same time may see two different prices.  This scenario is a growing reality in today’s data-driven marketplace, and California regulators are paying attention.  On Data Privacy Day 2026, California Attorney General Rob Bonta announced a new investigative sweep targeting “surveillance pricing”—a practice in which businesses use personal information to set individualized prices for consumers.  For online retailers and service providers, this probe raises important questions about how customer data is collected, used, and disclosed.

The California Consumer Privacy Act secures several key privacy rights for California consumers, including:

• The right to know about the personal information a business collects about them and how it is used and shared;
• The right to delete personal information collected from them (with some exceptions);
• The right to opt-out of the sale or sharing of their personal information, including via the Global Privacy Control (GPC); and
• The right to non-discrimination for exercising their CCPA rights.

Of particular relevance to pricing practices, the CCPA includes a “purpose limitation” that restricts how businesses can use personal information.  Under this principle, businesses are limited in their use of personal information to purposes that are consistent with the reasonable expectations of consumers.  Businesses must disclose in their privacy policies how they collect, use, share, and sell consumers’ personal information, and these policies must include information on consumers’ privacy rights and how to exercise them.

A Track Record of Active Enforcement Under the CCPA

CCPA enforcement is nothing new for Attorney General Bonta.  He has consistently demonstrated a commitment to robust enforcement of California’s privacy law and has targeted a range of data practices.  In July 2025, the Attorney General announced the largest CCPA settlement to date, resolving allegations that a company’s use of online tracking technology on its website violated the CCPA.  In 2024, the CCPA investigative sweep focused on compliance by streaming services and connected TVs.  In August 2022, the Attorney General announced a settlement resolving a sweep of companies that had allegedly failed to honor the user-enabled privacy control (GPC) signal to stop the sale of personal information.  Other sweeps have addressed the location data industry, employee information, opt-out requests on mobile apps, and business loyalty programs.
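For engineering teams, the GPC signal referenced in these sweeps is a concrete technical artifact: under the Global Privacy Control proposal, a participating browser sends the `Sec-GPC: 1` request header (and exposes `navigator.globalPrivacyControl` client-side). The sketch below shows one minimal, illustrative way a server might detect the signal and suppress sale/sharing flows; the function names and the `allowSaleOrSharing` flag are hypothetical, and this is not legal advice on what honoring the signal requires.

```javascript
// Minimal sketch: detecting the Global Privacy Control (GPC) signal
// server-side via the Sec-GPC request header defined in the GPC proposal.

function gpcOptOutRequested(headers) {
  // HTTP header names are case-insensitive; normalize keys before lookup.
  const normalized = Object.fromEntries(
    Object.entries(headers).map(([name, value]) => [name.toLowerCase(), value])
  );
  // The proposal defines "1" as the value signaling an opt-out preference.
  return normalized["sec-gpc"] === "1";
}

// Hypothetical helper: gate sale/sharing of personal information on the signal.
function buildTrackingConsent(headers) {
  return { allowSaleOrSharing: !gpcOptOutRequested(headers) };
}
```

For example, `buildTrackingConsent({ "Sec-GPC": "1" })` returns `{ allowSaleOrSharing: false }`, which downstream code could use to skip firing third-party advertising tags for that request.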

What Is Surveillance Pricing?

In plain terms, “surveillance pricing” is the use of consumers’ personal information to set targeted, individualized prices for products and services.

This can result in different consumers being offered different prices for the same product at the same time, often without any disclosure to the consumer.  Unless a business discloses that it uses a consumer’s personal information to set prices, surveillance pricing may be invisible to the consumer.

What Businesses Are Being Asked to Disclose

The California AG’s inquiry letters seek detailed information on businesses’ data-driven pricing practices, including:

• Companies’ use of consumer personal information to set prices;
• Policies and public disclosures regarding personalized pricing;
• Any pricing experiments undertaken by companies; and
• Measures companies are taking to comply with laws governing algorithmic pricing, competition, and civil rights.

Attorney General Bonta emphasized that practices like surveillance pricing “may undermine consumer trust, unfairly raise prices, and when conducted without proper disclosure or beyond reasonable expectations, may violate California law.”  The Federal Trade Commission and New York have pursued similar enforcement efforts.

Although the current sweep is centered on surveillance pricing in retail, grocery, and hospitality, the California AG’s public statements make clear that enforcement is driven by the nature of the data use, not the sector.  The CCPA’s purpose-limitation and reasonable-expectations principles have been construed to apply broadly, and other uses of personal information that significantly influence economic terms for consumers could come under scrutiny.

Legal Risks Under the CCPA: Key Considerations for Businesses

Businesses that may use data to set individualized prices, directly or indirectly, should consider the following:

• Review data use practices: Assess whether consumer personal information is used for pricing decisions and whether such use is disclosed in a manner consistent with CCPA requirements.
• Update privacy disclosures: Ensure privacy policies accurately reflect pricing practices and purposes for data collection.
• Assess reasonable expectations: Consider whether targeted pricing strategies align with what consumers would reasonably expect based on the business’s relationship with them.
• Prepare for enforcement: If contacted by the California AG’s office, respond promptly and ensure records, data mapping, and compliance documentation are up to date.

The California AG’s surveillance pricing probe marks a significant development in privacy enforcement.  Privacy enforcement is no longer just about data collection and sharing; it is now about how businesses use consumer data to influence prices.  As Attorney General Bonta stated, “Consumers have the right to understand how their personal information is being used, including whether companies are using their data to set the prices that Californians pay.”  Businesses would be well advised to review their data practices now and ensure that privacy disclosures and pricing strategies are aligned with California’s evolving regulatory expectations.

A sharp contrast in the speed of obtaining appellate review is emerging between two key privacy statutes. While the U.S. Supreme Court is set to resolve a circuit split over the Video Privacy Protection Act (VPPA), litigants grappling with the California Invasion of Privacy Act (CIPA)—a statute one federal judge recently described as a “total mess”—still await interpretative guidance from even a single appellate court.

Some predicted that appellate review of CIPA might be coming soon, based on a motion for interlocutory appeal that was recently filed in Fregosa v. Mashable, Inc. However, on January 23, 2026, the district court denied the request. At issue in that case was the interpretation of CIPA’s “pen register” provision and whether it applied to website tracking technologies. Citing a handful of federal district courts that had adopted similar interpretations of CIPA, the court found there were no “substantial grounds for difference of opinion” to justify an appeal—despite the growing number of state courts that have adopted a conflicting interpretation of CIPA.

By contrast, trial courts in VPPA cases have seemed more open to allowing appellate review of ambiguities within the statute. The United States Supreme Court will hear Salazar v. Paramount Global to decide who qualifies as a “consumer” after federal circuits split 2-2 on the issue. At stake in Salazar is whether the VPPA applies narrowly to subscribers of “audiovisual materials” or broadly to all of a company’s subscribers.

With CIPA’s interpretation in flux and a landmark VPPA decision pending, the legal landscape for website tracking is volatile. Businesses must stay informed and proactively manage compliance risk. Follow for updates on these critical developments.

#PrivacyLaw #VPPA #CIPA #SCOTUS

On February 5, 2026, Florida Attorney General James Uthmeier announced the creation of a first-of-its-kind specialized civil and criminal unit, named Consumer Harm from International and Nefarious Actors, or “CHINA” for short. The unit will be dedicated to investigating and prosecuting foreign corporations, particularly those with Chinese ownership, that collect consumer data from Florida residents.

The new division, housed within the Florida Attorney General’s Office, will leverage state consumer protection laws to pursue subpoenas, investigations, and lawsuits against companies whose data practices may expose Floridians to foreign adversaries. The AG’s office made clear that health care companies, especially those collecting biometric and demographic data, will be a top enforcement priority.

For companies operating in the health tech, biometrics, or consumer data space, Florida’s new enforcement unit warrants close attention. Businesses with any foreign ownership, particularly ties to China, should review their data collection, transmission, and disclosure practices. Transparency with consumers about international data flows may become not just a best practice but an enforcement flashpoint. If other states follow Florida’s lead, companies could soon face a patchwork of aggressive state-level scrutiny layered on top of existing federal oversight.

China’s internet regulatory authority and top prosecutors have recently released a series of enforcement actions and cases aimed at highlighting enforcement priorities in the data security realm over the last year. In 2025, enforcement under the Chinese Cybersecurity Law, the Data Security Law, the Personal Information Protection Law, and the Regulations on the Security Management of Network Data focused primarily on data security compliance, illegal cross-border transfers, and violations of personal information rights.

Companies were cited for violations related to over-collection of data, noncompliance with network data security rules, improper storage, allowing unrestricted access to sensitive information, and violating consent requirements.

Chinese authorities have stated that enforcement priorities in 2026 will include increased oversight in the “digital sphere,” including the illegal collection and misuse of personal data. Enforcement actions in 2026 are also expected to focus on data leaks and improper network governance, with increased penalties for data breaches.

To avoid becoming a focus of China’s regulatory authorities, companies should focus on:

• Ensuring adequate safeguards are in place to protect data in transit and at rest;
• Completing cybersecurity classification and grading, where required;
• Implementing effective security management policies and access policies, including strong password requirements;
• Completing necessary assessments or certifications, where required, for cross-border transfers; and
• Publishing adequate notices disclosing personal information collection and processing practices, and not exceeding those disclosed practices.

While specific compliance requirements will vary depending on the business, any data collection, storage, or processing occurring in China will be subject to increased scrutiny in 2026, requiring businesses to take a closer look to ensure proper data security is in place.

Importantly, although these safeguards may overlap across different international laws and standards, cross-border transfer and data localization principles pose harder operational challenges for businesses. With enforcement ramping up, businesses should carefully consider their global compliance strategy.

Navigating the 2026 CCPA Updates

As forecasted, effective January 1, 2026, businesses that are subject to the California Consumer Privacy Act (CCPA) must comply with newly updated regulations. For some businesses, complying with these updates will require the implementation of or updates to policies and procedures related to, among other things, risk assessments, cybersecurity audits, and the use of Automated Decision-Making Technologies. Businesses should review the updated regulations to determine if they might be affected and, if so, implement a plan to promptly ensure compliance.

Outlined below are just a few of the most notable CCPA updates businesses should be aware of:

  1. Risk Assessments
    • Businesses that engage in certain processing activities, including selling or sharing personal information, will need to conduct risk assessments. For new processing activities (beginning after January 1, 2026), risk assessments must be conducted prior to commencement of the new activity. Businesses that conduct a risk assessment in a given calendar year must submit an attestation of the risk assessment to the California Privacy Protection Agency (“Agency”) by April 1 of the following year.
    • For activity that occurred prior to January 1, 2026, and continued thereafter, businesses must conduct a risk assessment by December 31, 2027, and provide attestation on or before April 1, 2028.
    • Risk assessments completed for other state laws can demonstrate compliance if they also satisfy the requirements of the CCPA regulations.
  2. Automated Decision-Making Technology (ADMT) Rules
    • The updated regulations impose new requirements for businesses that use ADMT to make “significant decisions” about consumers. Those requirements will take effect in 2027. “Significant decisions” include granting or denying services like financial or lending products, housing, educational admissions or opportunities, job or contracting opportunities and compensation, or healthcare services.
    • Obligations associated with ADMT use include:
      • Providing consumers with pre-use notice of and access to information describing the manner in which ADMT is used, and informing them of their associated opt-out and access rights;
      • Offering consumers an ADMT opt-out, unless an exception applies;
      • Conducting risk assessments, as applicable; and
      • Updating privacy notices, as applicable.
  3. Cybersecurity Audits
    • The CCPA regulations will require certain businesses to conduct mandatory cybersecurity audits. The scope of the audits contains a long list of specifics, which generally tracks established audit standards such as the NIST Cybersecurity Framework. Businesses will also have to submit annual certifications of completion to the Agency.
    • While the deadlines for submitting certifications begin in 2028, businesses should be aware that implementing compliant cybersecurity programs—which often must include, among other things, incident response management, access controls, data inventory, retention and disposal procedures, and vendor oversight—often requires collaboration across the business and can be very time consuming.
  4. Broadened Definition of “Sensitive Personal Information”
    • The updates also imported the statutory definition of “sensitive personal information,” with the addition of “personal information collected and analyzed concerning a consumer’s health, sex life, or sexual orientation,” and “personal information of consumers that the business has actual knowledge [or willfully disregards] are less than 16 years of age.”
    • With the broadened definition, businesses should reassess their need for notices, opt-outs, and back-end procedures related to the Right to Limit.
  5. Other Notable Updates
    • Stricter requirements relating to the use of dark patterns, highlighting the need for careful consideration relating to cookie banners and opt-out menus;
    • Additional notice requirements for businesses that disclose personal information collected through augmented or virtual reality devices;
    • Updates to data subject rights procedures; and
    • New transparency requirements, including an in-app privacy policy posting requirement.

Although many of the deadlines outlined above seem distant, businesses should be auditing their current processing activities for compliance now rather than discovering potential issues right before the applicable deadlines.

The past year set up a clear clash between federal deregulatory efforts and state-level AI rulemaking, and 2026 is poised to be the year that conflict materializes in earnest.  The Trump Administration signaled a strong preference for scaling back AI-specific rules while exploring avenues to preempt state and local measures, even as a growing number of states moved forward with their own frameworks. In short, 2025 laid the groundwork, and 2026 is likely to deliver the confrontation.

On the federal side, the Administration’s posture included both legislative and policy initiatives aimed at limiting state restrictions on AI.  Although a proposed 10-year moratorium on enforcing state AI laws was removed during negotiations over the One Big Beautiful Bill Act, the America’s AI Action Plan soon followed, instructing agencies to consider using preemption to curb “burdensome” state AI regulations. In the final weeks of 2025, the White House issued an executive order, Ensuring a National Policy Framework for Artificial Intelligence (the “Executive Order”). The Executive Order directs the U.S. Attorney General to establish an AI Litigation Task Force to challenge state AI laws deemed unconstitutional or preempted and tasks the Administration with developing a national AI legislative framework that would preempt conflicting state rules.

States, however, did not stand still.  California’s SB 53 established a first-in-the-nation set of standardized safety disclosure and governance obligations for developers of frontier AI systems, underscoring state willingness to regulate despite federal headwinds. Colorado’s Anti-Discrimination in AI Law remained intact through the 2025 session and is scheduled to take effect in June 2026, setting a near-term compliance deadline that will shape risk assessments and product planning.  Even traditionally deregulatory states like Texas pursued aggressive enforcement under existing biometric laws against alleged AI-driven facial recognition practices.

Looking ahead, expect 2026 to feature litigation over the scope of preemption, increased enforcement actions from federal agencies, and a push toward a federal legislative framework, alongside continued state innovation in AI governance.  Despite the uncertainty, companies should continue to comply with applicable state AI laws because the Executive Order itself cannot overturn state law. Only Congress and the courts have the power to do so, and until then, state laws remain enforceable. For companies, that means preparing for a two-track reality: monitor and implement state obligations while tracking federal moves that could reshape, narrow, or delay those obligations. The result is likely to be a dynamic, contested compliance environment throughout 2026, rather than quick regulatory convergence.

Over the last few years, businesses, nonprofits, and other website operators have seen thousands of lawsuits and arbitrations filed under the California Invasion of Privacy Act (CIPA) alleging that the use of ubiquitous cookies and pixels on websites violates CIPA’s wiretap and pen register provisions. The California legislature considered curbing that explosion of litigation with SB 690, which was introduced in the 2025 session with some enthusiasm but ultimately stalled.

Thus, although there is some hope of relief as the legislative session picks back up in 2026, organizations are left in the same situation as before—balancing business needs and value with risk mitigation.

SB 690 at a Glance

CIPA was originally enacted in 1967 to address traditional wiretapping and eavesdropping concerns, which, at the time, primarily involved telephonic communications. It therefore did not address digital technologies or adtech now used for website marketing. Nonetheless, in recent years, several thousand lawsuits and arbitrations have sought to apply CIPA in precisely that context.

In 2025, California lawmakers advanced SB 690 to modernize CIPA for the internet era. The amendment aims to tamp down lawsuits targeting routine website technologies by adding a broad “commercial business purpose” exemption to CIPA’s wiretap and pen-register provisions and, thus, clarify that use of these technologies for business purposes would not trigger CIPA liability. Early versions of the bill included a retroactivity provision that would have applied to all pending cases as of January 1, 2026, but that provision was removed in the Senate amid criticism from consumer advocate groups.

Why SB 690 Stalled in 2025

Despite unanimous passage in the Senate, SB 690 did not clear the Assembly before the 2025 legislative session adjourned, and it was thus designated a two-year bill. Reports indicate the bill stalled in the Assembly Judiciary process, with lawmakers citing the need for further stakeholder dialogue and competing policy priorities, leaving businesses to face continued litigation through at least 2026.

The primary resistance to the bill came from privacy and consumer groups who argued that SB 690’s proposed exemption was too broad and would shield opaque tracking practices that CCPA enforcement has not yet addressed. Removal of the bill’s retroactivity provision would reduce immediate litigation relief, and plaintiffs’ filings continued apace through the close of the year amid split trial court decisions on CIPA’s scope.

What’s Next for SB 690 in 2026

Notably, it was SB 690’s author and primary sponsor who paused the bill in the Assembly in 2025, citing opposition from consumer privacy advocates and attorneys’ groups. So, it remains to be seen whether SB 690 will advance at all, at least in its current form, in 2026.

SB 690 is eligible for reconsideration as a two-year bill in the 2026 session, which reconvenes January 5. The last day to introduce bills is February 20, and the final day to pass bills is August 31. Expect policy committee activity in the spring, fiscal deadlines in May, and floor-only periods in late May and during the final August push. If revived, SB 690 is likely to remain prospective only, reinforcing the importance of near-term compliance and risk mitigation for existing site technologies.

What Businesses and Organizations Should Do Now

Until reform is enacted, CIPA suits over pixels, analytics, chatbots, search bar functions, and replay tools will continue, complete with statutory damages of $5,000 per alleged violation and inconsistent early-dismissal outcomes. Further, even if SB 690 eventually passes in its current form, relief would not be coming until at least 2027, if not longer. Organizations therefore need to take steps now to assess their risk.

Risk mitigation for each specific organization will necessarily vary, but it will generally involve assessing practices, identifying any higher risk processing activities, and updating disclosures, consents, and vendor governance as appropriate.

On December 19, 2025, New York Governor Kathy Hochul vetoed the New York Health Information Privacy Act (NY HIPA), a health data privacy bill that would have afforded consumer protections to non-HIPAA health data.

Although NY HIPA resembled existing laws, like Washington’s My Health My Data Act, it had several important differences that would have greatly expanded its impact—including by applying to employee data, data held by financial institutions subject to the Gramm-Leach-Bliley Act, and data that had been de-identified in accordance with HIPAA. NY HIPA would have also required regulated entities to maintain a publicly available retention schedule and dispose of an individual’s regulated health information pursuant to that schedule, subject to certain regulatory requirements.

Unsurprisingly, NY HIPA was the subject of intense lobbying on both sides of the debate. Ultimately, Governor Hochul stated in her veto memo that the legislation, as written, is too broad, “creating potentially significant uncertainty about the information subject to regulation and compliance challenges.”

NY HIPA could still technically pass into law if two-thirds of the members of each house vote to override the Governor’s veto. While NY HIPA did originally pass with that level of support, the likelihood of overriding a veto is very slim from a historical standpoint.

In many ways, 2025 was a relatively tame year for privacy legislation, and the NY HIPA veto is a fitting conclusion. As we move into 2026, companies should carefully monitor whether states push more aggressively, as well as the emerging fight between the states and federal government over the authority to regulate AI and privacy issues.

On October 13, 2025, California Governor Gavin Newsom vetoed S.B. 7, which would have required human oversight of certain types of employment decisions made solely by automated decision systems (“ADS”).  Had Gov. Newsom signed the bill, it would have required California employers using automated systems for actions such as hiring, firing, and disciplining to implement human oversight and explain certain decisions made by AI. The bill would have also required robust notices and granted employees and contractors access rights to data used by ADS.

In his letter notifying the California State Senate of the veto, Gov. Newsom cited concerns that S.B. 7 would have imposed “overly broad restrictions” on employer deployment of ADS. For example, the requirements could be interpreted to extend to “innocuous” technology such as scheduling and workflow management tools. Industry groups opposing the bill argued it would have also carried massive compliance costs, particularly for small businesses.

Gov. Newsom shared the bill author’s concerns about the unregulated use of ADS and the need to afford employees protections with respect to ADS, but wrote that legislatures “should assess the efficacy of [such] regulations to address these concerns.” Still, California employers face restrictions on certain uses of ADS under the regulations recently finalized by the California Privacy Protection Agency.