On March 15, 2023, the Iowa House passed Senate File 262 on a 97-0 vote. The Bill had previously passed the Iowa Senate on March 6, 2023. If Iowa Governor Kim Reynolds ultimately signs the Bill, Iowa would join California, Colorado, Connecticut, Utah, and Virginia as the sixth U.S. state with a comprehensive consumer data privacy law. The Bill would also become law if it is sent to the governor and remains unsigned for three days.

The proposed Iowa privacy law follows a path similar to that of other U.S. state privacy laws. At a high level, the Iowa Bill would:

  • Apply to entities that control or process the personal data of at least 100,000 consumers, or that control or process the personal data of at least 25,000 consumers and derive over 50% of their gross revenue from the sale of personal data.
  • Provide the Iowa Attorney General with exclusive enforcement authority.
  • Provide consumers with: A right to confirm whether a controller is processing the consumer’s personal data; a right to request deletion of personal data provided by the consumer; a right to obtain a copy of the consumer’s personal data from the controller; and a right to opt out of the sale of personal data.
  • Require a controller to present consumers with clear notice and an opportunity to opt out of the processing of their sensitive data, if processing is for a nonexempt purpose. Likewise, the Bill would require a controller who sells personal data to third parties to clearly and conspicuously disclose such activity to consumers, as well as the manner in which a consumer may exercise the right to opt out of such activity.
  • Require a contract between a controller and processor to include: That each person processing personal data is subject to a duty of confidentiality with respect to the data being processed; that at the controller’s direction, the processor must delete or return all personal data to the controller as requested at the end of the service, unless retention of the personal data is required by law; that upon the reasonable request of the controller, the processor make available all information in the processor’s possession necessary to demonstrate the processor’s compliance with the Bill; and that any engaged subcontractor or agent enter into a written contract that requires the subcontractor or agent to meet the duties of the processor with respect to personal data.

However, the Bill would not provide consumers with a private right of action or a right to correct inaccuracies in their data, nor would it require businesses to recognize “do not track” signals.

At the time of this post, the Bill has been messaged back to the Senate. If passed into law, its provisions would take effect on January 1, 2025. As drafted, the Bill would not provide for additional rulemaking.

The AI application ChatGPT quickly became a household name, but it is already morphing into a more advanced version of generative AI. At the same time, Microsoft’s redesigned Bing search engine will soon run on a new, next-generation OpenAI large language model. While these tools have demonstrated that generative AI has tremendous operational and business potential, a constellation of privacy and data security risks arising from their use has become visible.

In a two-part guest article series for the Cybersecurity Law Report, Ballard Spahr Privacy & Data Security attorneys Phil Yannella, Greg Szewczyk, Tim Dickens, and Emily Klode explore these issues. The first article covers AI collection and use issues under U.S. and E.U. privacy laws and regulations. Part two will address product liability, healthcare and employment risks, issues under wiretapping laws, and practical compliance measures. The article is paywalled; subscribers can access it here.

On March 8, 2023, the U.K. Secretary of State for Science, Innovation and Technology announced the publication of the Data Protection and Digital Information (No. 2) Bill. This new version of the Data Protection and Digital Information Bill will effectively supersede the prior draft, which was first published in July 2022.

The Bill would not alter the fundamental principles of the existing UK GDPR—which is very similar to the EU GDPR—allowing companies that are already compliant to remain compliant. Instead, as the Secretary explained when it was announced, the Bill was designed with input from businesses and data experts and is intended to create less “red tape” than the existing European GDPR. Notably, the new Bill has several business-friendly features, such as:

  • Reduced Record Keeping: A controller that carries out processing of personal data would no longer be required to maintain appropriate records of processing unless such processing is likely to result in a high risk to the rights and freedoms of individuals.
  • Removal of U.K. Representative Requirement: The Bill would omit Article 27 of the existing UK GDPR, which requires controllers or processors not established in the UK to appoint a representative that is physically located in the UK.
  • Clarified “Legitimate Interest”: Processing with a “legitimate interest” would now expressly include processing for purposes of direct marketing, processing for intra-group transmission of personal data, and processing that is necessary for the purposes of ensuring the security of a network or information systems. Further, the explanatory notes state that these express examples are illustrative only and non-exhaustive, and that a data controller may process personal data for other legitimate activities, “providing the processing is necessary for the activity and appropriate consideration is given to the potential impact of the processing on the rights and interests of data subjects.” Specifically, Article 6 of the UK GDPR will still limit such processing “where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child.”
  • International Data Transfer Standards: The Bill would allow for data transfers to a third country or international organization if “the standard of the protection provided for data subjects with regard to general processing of personal data in the country or by the organisation is not materially lower than the standard of the protection provided for data subjects.”
  • Cookies: Consent would no longer be required for cookies when those cookies are used to: (1) collect information for statistical purposes about how the service or website is used, with a view to making improvements to the service or website; (2) enable the way the website appears or functions to adapt to the preferences of the subscriber or user; (3) update software, if that is the cookie’s sole purpose; or (4) in the case of an emergency. However, for items (1) through (3), the user must still be provided with “clear and comprehensive information about the purpose of the storage or access” and be given a “simple” means of objecting to the storage.

The Bill is currently awaiting its second reading to be scheduled in the House of Commons.

Even if the Bill progresses, it will not radically alter (or even require change to) compliance regimes.  However, the Bill is notable in that it may indicate that the UK could be testing new paths that diverge from the EU in the future.

In a landmark decision that will have widespread effects, the Illinois Supreme Court ruled that a claim accrues each time—rather than just the first time—that data is collected in violation of the Biometric Information Privacy Act (BIPA).  Because BIPA provides statutory damages for each violation, this ruling exponentially increases potential damages, especially in the employment context.

BIPA requires informed consent prior to the collection of biometric information.  In the employment context, we have seen many class actions relating to fingerprint timekeeping technology.  In White Castle, the issue arose as to whether each employee could assert only one claim, or whether they could assert a claim for each fingerprint scan taken without consent.  Given the importance of the issue, the Seventh Circuit certified the issue to the Illinois Supreme Court last year.

On February 17, the Illinois Supreme Court issued its ruling. Rejecting White Castle’s argument that only the first scan requires consent, the Court held that the plain language of the statute supports that “collection” occurred each time the plaintiff-employees’ fingerprints were scanned.

When coupled with the recent finding that a five-year statute of limitations applies, this ruling could result in staggering damages. For example, if an employee clocked in using this technology five days a week for fifty weeks a year over that five-year period, the statutory damages for that single employee would be $1.25 million for negligent violations and $6.25 million for intentional violations. When applying these numbers to a large workforce, it is easy to imagine damages crippling companies and their insurers.
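
For readers who want to check that math, here is a minimal back-of-the-envelope sketch in Python, assuming BIPA’s statutory amounts of $1,000 per negligent violation and $5,000 per intentional or reckless violation and a single scan per workday:

    # Hypothetical per-employee BIPA exposure under the per-scan accrual rule.
    # Assumes one fingerprint scan per workday and the statutory amounts of
    # $1,000 per negligent violation and $5,000 per intentional/reckless violation.
    scans = 5 * 50 * 5                    # 5 days/week x 50 weeks/year x 5-year limitations period = 1,250 scans
    negligent_exposure = scans * 1_000    # $1,250,000
    intentional_exposure = scans * 5_000  # $6,250,000
    print(f"Negligent: ${negligent_exposure:,}; Intentional: ${intentional_exposure:,}")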

There may yet be some glimmer of hope for class defendants. The opinion leaves a potential opening to challenge damage awards of such magnitude: the court suggested that damages under BIPA may be “discretionary rather than mandatory” and explained that courts presiding over class actions have the discretion to fashion class awards to avoid “the financial destruction of a business.”

In any event, BIPA class actions have already flooded the courts over the last few years. The White Castle holding will likely add fuel to the fire, both in the number of suits filed and in settlement positions. It may also have downstream effects on insurance coverage, as the market may correct in a similar fashion to what we saw with cyber premiums after the rise in ransomware.

At the very least, companies need to take a very thorough look at the technology being used to properly assess and mitigate their risk.

2022 proved to be an historic year for privacy and data security, and 2023 is likely to follow suit.  With privacy compliance deadlines looming under three state laws, a surge in data privacy litigation, new federal cyber-regulations, new state laws governing children’s data and new EU legislation regulating digital services – privacy lawyers will be busy this year. In this webcast, Ballard Spahr partners Phil Yannella and Greg Szewczyk will discuss the main privacy issues that are likely to dominate headlines in 2023.

On Friday, January 27, California Attorney General Rob Bonta announced an investigative sweep of businesses that provide mobile apps, issuing warning letters to those that AG Bonta alleges failed to comply with the California Consumer Privacy Act (CCPA).  This sweep focused specifically on “popular retail, travel, and food service industry apps” that failed to comply with consumer opt-out requests or otherwise failed to offer mechanisms for consumers to stop the sale of their personal information.   

Investigative sweeps have been fairly common of late, and it isn’t a surprise to see a focus on mobile apps.  But, in addition to hitting mobile apps that did not offer a mechanism to opt out of the sale of personal information, the sweep also focused on mobile apps that could not process requests submitted through authorized agents—including those sent by the mobile app “Permission Slip.”  Permission Slip is an app that files requests on users’ behalf, instructing companies to stop selling data.  In some ways, Permission Slip is similar to the Global Privacy Control (GPC), which was the focus of the Attorney General’s Sephora action last fall. 

The regulatory emphasis on user-enabled mechanisms, whether through apps or browser extensions, adds another layer of complexity for businesses attempting to implement their CPRA compliance efforts. And, while the GPC had been publicly endorsed by the California Attorney General months before the Sephora action, Permission Slip is a relatively new and unknown app. It therefore raises the question of which apps and extensions businesses have to honor in order to stay in compliance with the CPRA and other privacy laws. With a likely flood of new products and services, it is no small issue.

For more information, please see the Attorney General’s press release, available here, as well as the Permission Slip app, available here.

With Colorado joining California as the only other state with rules implementing a comprehensive privacy law, businesses and practitioners have been anxiously watching to see whether a California-compliant privacy policy would also be compliant with the Colorado Privacy Act (“CPA”). And, as the Colorado Attorney General has made clear, interoperability is an important guiding principle in the Colorado rulemaking process. However, the Colorado Attorney General has made equally clear that interoperability is just one principle—when the office believes there is a better way of handling an issue, it will diverge from other states’ practices. In the initial draft of the Colorado rules, it became clear that privacy policies are one such area. And while the revised draft of the Colorado rules takes steps to try to increase interoperability, a comparison shows that Colorado is still taking a new, “purpose-driven” approach.

For years, most privacy policies followed the same core structure—what information is collected, how it is used, and how it is shared. These three types of disclosures were not linked to each other, so consumers were not entirely sure how a company might be using or sharing their specific information. For example, a consumer may know that a company collects contact information when they sign up for its newsletter and when they file a customer complaint. The consumer may also know the company sells information to third parties who will then market to them. But the consumer does not know what information is actually sold to those third parties.

With the advent of the California Consumer Privacy Act (“CCPA”), we saw a new structure begin to emerge that was information-driven. Under this model, a business had to disclose to consumers what statutorily defined categories of personal information it collects, whether it sold each category, and the categories of third parties to whom each category of information is sold. To comply with these requirements (and to ensure that consumers understood what the statutory categories of information included), many businesses used some version of the “California Chart”:

Categories | Examples | Sold | Third Parties to Whom Sold
Identifiers | Real name, alias, postal address, unique personal identifier, online identifier, IP address, email address, account name, social security number, driver’s license number, passport number, or other similar identifiers | Yes | Business Partners

Going back to the earlier example, consumers would now know that the business sells “Identifiers,” which could include name and email address. But, they still would not know whether the business sells all names and email addresses regardless of whether they were collected for the newsletter or through customer complaints. The California Privacy Rights Act (“CPRA”) expanded the information needed in the California Chart, but it kept the same information-driven approach.

The draft rules for the Colorado Privacy Act took a fundamentally different, purpose-driven approach. Under this approach, for each purpose of collection, companies will need to disclose what types of information are collected, whether that information is used for targeted advertising or sales, and the third parties to whom it is sold. To satisfy this new approach, businesses would need to use a new “Colorado Chart”:

Purpose | Categories of PI | Targeted Advertising / Sales | Third Parties to Whom Sold
Newsletter | Contact Information | No / Yes | Business Partners
Customer Service | Contact Information | No / No | N/A

Again using the same example, consumers can now see whether the information they provided for the newsletter is sold, and also whether the information they provided for customer service is processed differently. This approach is in many ways the crux of the Colorado privacy policy rule. Indeed, as the Colorado Attorney General has explained, consumers may very well have different expectations based on the context in which they provide their information. If a consumer provides their name to receive a company’s newsletter, it may be reasonable to expect that the company may use that data for targeted marketing or sales. But, if the consumer provides the same data to complain about a defective product, their expectation may differ.

After the initial draft of the Colorado rules was released, it was widely recognized that this purpose-based approach was different from the California information-based approach. However, when the Colorado Attorney General released revised rules, many commentators seemed to read them as meaning that the California Chart would satisfy Colorado requirements. But looking at the actual changes, it appears that the Colorado approach is still very much purpose-driven: the Colorado rules still require businesses to disclose the same set of information (i.e., the categories of information, whether it is used for targeted advertising and sales, and the categories of third parties to whom it is sold), but “linked in a way that gives Consumers a meaningful understanding of how their Personal Data will be used when they provide that Personal Data to the Controller for a specified purpose.” The California Chart—or any information-driven disclosure—simply does not link the disclosures in this manner because they are tied to the type of information and not to the purpose. While a company could theoretically alter the California Chart to break out purposes for each category of information, this exercise would likely be confusing.
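
To make the structural difference concrete, here is a minimal, purely illustrative sketch of the two disclosure models; the category and purpose names are hypothetical examples rather than language drawn from either set of rules:

    # Information-driven (California-style) disclosure: keyed by statutory category.
    # A consumer can see that "Identifiers" are sold, but not whether identifiers
    # collected for one purpose are treated differently from those collected for another.
    information_driven = {
        "Identifiers": {
            "examples": ["name", "email address", "IP address"],
            "sold": True,
            "sold_to": ["Business Partners"],
        },
    }

    # Purpose-driven (Colorado-style) disclosure: the same facts keyed by the purpose
    # of collection, so the consumer can compare how data provided for the newsletter
    # is handled versus data provided for customer service.
    purpose_driven = {
        "Newsletter": {
            "categories": ["Contact Information"],
            "targeted_advertising": False,
            "sold": True,
            "sold_to": ["Business Partners"],
        },
        "Customer Service": {
            "categories": ["Contact Information"],
            "targeted_advertising": False,
            "sold": False,
            "sold_to": [],
        },
    }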

Simply put, unless another revised draft of the Colorado rules changes course, privacy policies appear to be one area where companies likely cannot find a “lowest common denominator” for uniform compliance across the board. Instead, it is an area where the “laboratories of democracy” are testing new approaches in an effort to find what strikes the best balance between protecting consumers and enabling businesses to function without overwhelming compliance costs. Companies should therefore resist the urge to believe that complying with the CPRA automatically means that they are complying with the CPA.

Many privacy professionals may have missed it, but in the run-up to the New Year — while many U.S. companies were focused on complying with the California Privacy Rights Act (CPRA) — Congress passed an appropriations bill that contains significant new cybersecurity requirements for medical device companies. The Omnibus Appropriations Bill, which was signed into law on December 29, 2022, contains provisions amending the Federal Food, Drug, and Cosmetic Act to further mandate the implementation of cybersecurity controls for certain internet-connected medical devices. Specifically, any ‘device’ (as the term is broadly defined under 21 U.S.C.S. 321(h)) must comply with the new requirements if the device: (1) includes software that is validated, installed, or authorized by the sponsor; (2) has the ability to connect to the internet; and (3) contains any technological characteristics that could be vulnerable to cybersecurity threats.

The new rules go into effect 90 days after the passage of the Bill (March 29, 2023). Thereafter, any sponsor submitting a cyber device to the FDA must:

  1. Submit to the Secretary a plan to monitor, identify, and address, as appropriate, in a reasonable time, postmarket cybersecurity vulnerabilities and exploits, including coordinated vulnerability disclosure and related procedures;
  2. Design, develop, and maintain processes and procedures to provide a reasonable assurance that the device and related systems are cybersecure, and make available postmarket updates and patches to the device and related systems to address: (a) on a reasonably justified regular cycle, known unacceptable vulnerabilities; and (b) as soon as possible out of cycle, critical vulnerabilities that could cause uncontrolled risks; and
  3. Provide to the Secretary a software bill of materials, including commercial, open-source, and off-the-shelf software components.

Further, the new amendments authorize the FDA to draft regulations containing additional requirements that “demonstrate reasonable assurance that the device and related systems are cybersecure” or regulations that exempt certain devices or device types from the new requirements. While there are no express timing requirements for the draft regulations, the new amendments do require the FDA to update its existing “Content of Premarket Submissions for Management of Cybersecurity in Medical Devices” guidance within two years and, additionally, require the FDA to update its public-facing guidance regarding improving the cybersecurity of devices within 180 days.

Medical device manufacturers should carefully review their current cybersecurity controls for covered devices and keep a close eye out for the new FDA guidance and regulations. As always in the world of data privacy, if you blink, you may miss a new law or regulation.

2022 proved to be an historic year for privacy and data security. Connecticut and Utah joined the list of states that have now passed comprehensive data privacy laws, bringing the total to five (5) states. For the first time, federal privacy legislation advanced through committee in the House, and though the American Data Privacy and Protection Act (ADPPA) has not yet been passed by the full House, there is still a chance it may happen in this Congress. The Executive Branch flexed its regulatory muscles in 2022, issuing a number of Executive Orders and regulations designed to tame the scourge of ransomware. The FTC continued its focus on corporate surveillance and signaled its intention to focus more closely on dark patterns, data minimization, and children’s data.

We expect that 2023 will also be a very busy year for privacy and data security. Meta Pixel and chatbot litigation, which exploded last year, is likely to expand. Companies will spend this coming year complying with the California Privacy Rights Act (CPRA) and Virginia Consumer Data Protection Act (VCDPA), which became effective on January 1, 2023, and preparing for compliance with the Colorado, Utah, and Connecticut privacy laws. Many companies will also spend 2023 bracing for new legal and regulatory requirements that will become final later this year or in 2024, such as the proposed SEC cyber reporting requirements, the new GLBA Safeguards Rule, the NY DFS cyber regulations, breach reporting for critical infrastructure, the California Age-Appropriate Design Code, and the EU Digital Services Act – while planning for a world without cookies.

We will be releasing a webcast and podcast to discuss these topics in more detail.  But for now, here are our predictions for 2023.

State Privacy Laws’ Time in the Spotlight

While it is hard to predict the future, 2023 may very well be the tipping point for state privacy compliance.  Indeed, in this year alone, five states will have comprehensive privacy laws go into effect, two of those states will finalize detailed regulations, enforcement of the old CCPA provisions already went live without a cure period, and there will almost certainly be new laws passed during state legislative sessions.  In other words, the already complicated patchwork of privacy laws is continuing to grow in importance and size.     

State Regulators Will Remain Focused on Website and Mobile App Analytics

Website analytical tools—whether used for marketing or for obtaining better insight into how users interact with the platform—have already been in regulators’ crosshairs. For example, in its September 2022 CCPA enforcement action against Sephora, the California Attorney General made clear that it considers both “the trade of personal information for analytics” and “the trade of personal information for an advertising option” to be sales under the CCPA. What the Attorney General did not make clear is which specific “common analytical tools” it considers to fall within these categories.

In 2022, there was little incentive for businesses to fight the allegations because there was a cure period. However, since that cure period expired on January 1—and because “sales” are part of existing CCPA provisions subject to live enforcement—2023 will likely see enforcement actions involving detailed questions of which analytical tools constitute sales. These actions may also call into question the legitimacy of positions taken by various tech companies that label themselves as service providers in their terms and conditions.

Privacy Policies Will Continue to Diverge

With five different privacy laws going into effect in one year, businesses have been looking for a “lowest common denominator” strategy of compliance.  And, as the Colorado Attorney General has made clear, interoperability is an important guiding principle in the Colorado rulemaking process.  However, the Colorado Attorney General made equally clear that interoperability is just one principle—when the office believes there is a better way of handling an issue, it will diverge from other state practices.  Privacy policies are one such area.

In the initial draft of the Colorado Privacy Act rules, privacy policies are purpose-driven—that is, the disclosures relating to how a controller processes and shares personal data are tied to the purpose for which it is collected. By contrast, the CCPA and CPRA rules are driven by the category of information. And while the revised draft of the Colorado rules does take steps to try to increase interoperability, a comparison of how the disclosures must work in practice shows that they are still very much purpose-driven. We will explore this concept in more detail in a dedicated post, but for now, suffice it to say that the common CCPA/CPRA approach will not comply with the CPA rules.

Even prior to 2023, many companies chose to have a separate California privacy policy. With the different approaches of California and Colorado, as well as different terminology under the CPRA (e.g., the word “share”), we expect to see this divergence continue.

Data Minimization Drives Better Data Mapping and Inventories

All five state laws going into effect in 2023 have express data minimization provisions.  As discussed below, data minimization has become an increasing focus of regulators, including in the data breach litigation and enforcement context. 

We similarly expect to see a focus on data minimization in the state privacy law compliance context, as it is clearly a “common sense” issue that resonates with regulators.  The corollary to this prediction is that companies will create better data maps and inventories to document their compliance.  Indeed, while data maps are critical to fully complying with data subject access requests, in this context, they are essentially an operational tool.  In the data minimization context, they can become evidence of compliance.  Accordingly, we expect to see companies continue to invest in developing strong data maps and inventories throughout 2023.

More States Will Pass Both Comprehensive and Specialized Privacy Laws

For the past several years, we have seen a number of states introduce comprehensive privacy acts and specialized privacy acts (such as biometric identifier acts). Only a handful have made it across the finish line.

It is almost certain that 2023 will bring another season of privacy bills in state legislatures and assemblies across the country, and that most of these will stall or fail as in other years. However, it is likely that some states will be able to pass comprehensive privacy laws.

Predicting legislation is rarely advisable. However, Maryland, Massachusetts, Michigan, and Minnesota all switched in 2023 to a Democratic trifecta of both legislative chambers and the governorship, so keep an eye on these states. Also watch for additional states to pass biometric identifier laws, as the high-profile nature of BIPA lawsuits raises the issue across the country. To the extent these laws have different exclusions for federally regulated industries, they could create huge compliance burdens.

Data Privacy and Breach Litigation Will Expand

Data privacy litigation had been trending upwards for many years.  In 2021, there were roughly 1,200 data privacy or breach class actions filed (not including TCPA or FCRA claims).  In 2022, we saw a marked increase in new privacy class actions driven primarily by favorable court rulings and new technologies.  For a variety of reasons, described in more detail below, we expect this trend to continue in 2023. 

VPPA Litigation Will Continue   

One of the more surprising trends in 2022 was a resurgence in class action litigation under the Video Privacy Protection Act (VPPA). This rarely enforced law was passed in the late 1980s, in the wake of Congressional outrage over media reports of Judge Bork’s video rental history, which emerged during his SCOTUS confirmation hearing. The law has a very specific purpose: it requires consumer consent for the disclosure by videotape service providers of a consumer’s video viewing history, and it provides liquidated damages for a violation of the law. For decades, plaintiff’s attorneys have been trying, with limited success, to apply this antiquated and highly specific law to internet streaming activities.

In 2022 a new variant of VPPA litigation emerged. The new claims focus on websites’ usage of the Meta Pixel, a tracking tool that enables the sharing of a consumer’s website activity with Meta. The typical VPPA complaint alleges that a website that shows videos shares the plaintiff’s video viewing history, without consent, with Meta via the Pixel. As many websites have thousands of daily visitors accessing videos, the potential statutory damages for a class of website subscribers can quickly reach seven and even eight figures. What separates this latest wave of VPPA lawsuits from prior cases is the assertion that the Meta Pixel transmits the Facebook ID, which plaintiffs claim personally identifies consumers and is not merely an anonymized number.

There have been at least 70 VPPA class action lawsuits filed in the past eight months. Media and news organizations – which often embed videos on their websites – have been a particular target for plaintiff’s lawyers. Defendants have advanced a number of arguments in their dismissal motions, including that the Facebook ID is not personally identifiable data within the meaning of the VPPA and that website operators that merely post news-related videos are not video tape service providers. Most of the recent class actions are still in the pleadings stage, and only a few courts have thus far ruled on motions to dismiss. Until there is a clear consensus among federal courts on the viability of VPPA claims, we can expect to see a continued stream of VPPA litigation in 2023 and beyond.

Wiretap Litigation Will Expand

The use of the Meta Pixel also gave rise in 2022 to a number of class action lawsuits under state wiretap laws. Plaintiffs in these cases allege that the Meta Pixel allows Meta to intercept consumer communications with a website while in transit. A major driver of these claims is a pair of favorable decisions by the Third and Ninth Circuits, both of which permitted wiretap claims to go forward against companies based on their usage of certain kinds of third-party website tools. The Third Circuit case, Popa v. Harriet Carter Gifts, focused on the use of tracking cookies that allegedly enabled a third-party digital marketing entity to intercept the plaintiff’s communications with the website operator. The Ninth Circuit case, Javier v. Assurance IQ, centered on an insurance quote tool operated by a third party. In both cases, the appellate courts held that the sharing of consumer communications with a third party constituted an interception that required consumer consent under wiretap laws.

The Third and Ninth Circuits’ broad view of what constitutes an interception under state wiretap laws theoretically embraces a wide array of third-party tools that integrate with websites. Not surprisingly, this has led to a surge of wiretap class actions, particularly in California and Pennsylvania. Both states’ wiretap laws allow for liquidated damages, require two-party consent, and provide for liability against a website operator that aids and abets a third party’s interception of a communication. At least 60 wiretap class actions have been filed since August 2022 in these two states.

As noted, many of these wiretap class actions focus on the Meta Pixel. Recently, however, plaintiff’s lawyers have been asserting wiretap claims based on the usage of “chatbots,” session replay software, and insurance quote tools. An even more recent variant of wiretap litigation focuses on hospitals’ alleged usage of the Meta Pixel on patient portals, which plaintiffs allege results in the unauthorized sharing of ePHI with Meta. These patient portal cases often assert claims under the Stored Communications Act (SCA) and the Electronic Communications Privacy Act (ECPA), as well as state wiretap laws.

As with VPPA litigation, most of the new wiretap class actions are still in the pleadings stage.  Defendants have asserted a number of grounds for dismissal, including that the tools at issue do not capture the “contents” of communications as required under wiretap laws, and that the companies alleged to have intercepted such communications are service providers, not third parties under the law.  Few courts, however, have ruled on these issues to date.

Given the broad way in which courts are reading wiretap laws, it is highly likely that we will continue to see a steady stream of wiretap class action filings in 2023.  We also expect that plaintiffs will expand the focus of wiretap allegations to include other pixels, tracking cookies, and embedded website technologies operated by third parties. 

Data Breach Class Actions Likely to Stay Steady

For at least a half-dozen years, the number of data breach class actions filed each year has slowly trended upwards, and we expect that 2023 will follow this pattern. This is somewhat surprising in the wake of TransUnion v. Ramirez, in which the Supreme Court held that plaintiffs could not establish federal standing for monetary damages based on the mere risk of future harm. Although courts within some circuits have dismissed putative breach class actions based on TransUnion, a number of courts have held that breach plaintiffs do have federal standing to proceed with their claims. One line of reasoning used by such courts is that the breach itself gives rise to a present harm—such as emotional distress—separate from the risk of future harm. Plaintiff’s lawyers have also become adept at finding plaintiffs who have suffered out-of-pocket expenses arising from a data breach. Given the huge number of data breaches that occur every year in the U.S., we expect that data breach class actions will continue to trend upwards.

BIPA Class Actions May Be Poised to Expand (or Slow Down)

For many years, plaintiffs have filed upwards of 500 class action lawsuits each year under the Illinois Biometric Information Privacy Act (BIPA), making Illinois one of the epicenters of data privacy litigation in the country. In 2022, plaintiff’s attorneys continued their recent focus on cosmetic and other companies using “facial try-on” tools, as well as companies using voiceprint technologies and ID verification tools.

There were a number of court rulings in 2022 addressing a wide range of defenses to BIPA claims, with most going against defendants and Illinois courts rejecting a variety of arguments raised in defense.

One of the few favorable rulings for BIPA defendants last year involved higher education institutions that used remote exam proctoring software that allegedly captured biometric data through scans of students’ facial geometry. Illinois courts agreed with the higher ed defendants that they are covered by the Gramm-Leach-Bliley Act (GLBA) exemption to the law.

2023 could be a momentous year for BIPA litigation as we await rulings from the Illinois Supreme Court on two significant issues that could expand, or perhaps constrict, BIPA litigation.  In Tims v. Black Horse Carriers, the Court will address whether a one (1) year statute of limitations applies to all BIPA claims—or just BIPA claims involving “publication” of biometric data.  In Cothron v. White Castle Systems, the Illinois Supreme Court will also address whether certain BIPA claims accrue only once upon the initial collection or disclosure of biometric information, or each time a company collects or discloses biometric information.

Look to Federal Regulators For Next Wave of Privacy Litigation

Privacy litigation has often tracked issues of federal regulatory concern. For example, online tracking litigation began a decade ago, shortly after the FTC and other regulators became focused on the issue of commercial surveillance. Using federal regulation as a guide to privacy litigation, here are a few areas where we may see increased litigation in 2023.

Children’s Data

Over the past several years, the FTC has been very focused on children’s data. The FTC’s recent settlement with Epic Games, which included a $275 million penalty for alleged COPPA violations, may give rise to litigation premised on illegal or deceptive practices to collect or share children’s data, particularly in-game.

Data Minimization

Data minimization has been another recent focus of the FTC, which recently settled a claim against Drizly that required the company to delete unnecessary data. Look for breach class actions that include allegations that defendants failed to delete consumer data in a timely manner.

Dark Patterns

This is a focus not only of the FTC but of state regulators as well. The CPRA regulations, for example, include very detailed examples of illegal “dark patterns” that may steer consumers into making choices they otherwise would not have made. We have seen some regulators pursue dark patterns claims as well (see, e.g., the District of Columbia’s settlement with Google for use of dark patterns in connection with location tracking). It would not be surprising to see plaintiff’s lawyers begin to assert that certain online disclosures and consent mechanisms wrongfully mislead consumers into purchasing decisions.

Employee Monitoring

This has been a focus of several recent state and local laws, most notably in New York City. Plaintiff’s lawyers may use this recent focus on employee privacy rights to pursue data sharing claims against third-party monitoring companies.

Artificial Intelligence

The Holy Grail for privacy litigation may be artificial intelligence, which has two of the key hallmarks of data privacy litigation: it operates largely in the dark – “surreptitiously,” to borrow a favorite allegation – and has the potential to negatively impact consumers. Thus far, there has not been a lot of regulation concerning AI – apart from the use of AI to make discriminatory housing and credit decisions – but that may be changing. Many of the new state privacy laws seek to regulate the usage of AI, and we expect to see regulations in California and Colorado that may develop some legal guardrails. These regulations may provide greater transparency around the operation of certain AI tools and provide the legal basis for consumer fraud or UDAAP claims.

Cyber Security – Preparing for the Coming Wave of New Regulatory Requirements

In the wake of the huge spike in ransomware attacks in 2020 and 2021, the federal government dedicated significant resources in 2022 to hardening security controls and accelerating reporting obligations for critical infrastructure, public companies, and financial institutions. Some states, as well as other countries, also proposed new cyber regulations in 2022 that will become effective in 2023 or 2024. One of the key challenges for affected industries in 2023 will be planning to meet the enhanced reporting or security requirements of these new regulations.

Here is a run-down on the status of these new cyber regulations and laws:

SEC Disclosure Requirements

On March 9, 2022, the SEC proposed a new rule to enhance and standardize disclosures regarding cybersecurity incidents, risk management, strategy, and governance. If approved, public companies will be required to disclose material cybersecurity incidents within four (4) business days of determining that a material incident has occurred.

The proposed rule also would require public companies to provide updated disclosures relating to previously disclosed cybersecurity incidents. Further, the proposed rule would require disclosures regarding the company’s cyber risk management program.

If finalized in 2023 in its current form, the new SEC reporting requirements will have a significant impact on how public companies manage and disclose cyber incidents.

GLBA Safeguards

On November 15, 2022, the Federal Trade Commission (“FTC”) announced that it was delaying the compliance deadline for eight of the amendments to the Safeguards Rule until June 9, 2023, citing concerns raised by the Small Business Administration’s Office of Advocacy about a shortage of qualified personnel to implement information security programs.

The new Safeguards Rule was originally set to take effect on December 9, 2022. Covered financial institutions will now have a six-month extension to address certain provisions.

While the extension of the effective date was certainly welcome news, covered financial institutions that have not begun their compliance efforts should not wait any longer. Indeed, new operational requirements (such as encryption at rest and multifactor authentication) and contractual issues (such as possible amendments to existing vendor contracts) may require significant ramp-up time.

Cyber Incident Reporting for Critical Infrastructure Act of 2022

The Cyber Incident Reporting for Critical Infrastructure Act of 2022 (“CIRCIA”) was signed into law on March 15, 2022, and requires entities in critical infrastructure sectors to report certain cyber incidents to the Cybersecurity and Infrastructure Security Agency (“CISA”) not later than 72 hours after the covered entity reasonably believes that the covered cyber incident has occurred. CIRCIA will also require any federal entity receiving a report on a cyber incident to share that report with CISA within 24 hours. CISA will subsequently have to make information received under CIRCIA available to certain federal agencies within 24 hours.

As a first step toward a Notice of Proposed Rulemaking, on September 12, 2022, CISA published a Request for Information seeking public comment (which closed on November 14, 2022) on a wide range of aspects of the CIRCIA regulations.

NYDFS Updated Cybersecurity Regulation

On November 9, 2022, the New York Department of Financial Services (“NYDFS”) officially released revised proposed amendments to its cybersecurity regulation, which addresses cybersecurity requirements for financial services companies, along with a 60-day comment period (which expired on January 9, 2023). The original amendments were released on July 29, 2022, but after a 60-day comment period, the NYDFS determined additional edits were warranted.

In addition to new federal and state cyber regulations, there are a number of international cyber regulations affecting multinational corporations that are pending or may become final in 2023. 

DORA

The EU has endeavored to strengthen the IT security of financial institutions to ensure that the financial sector in Europe remains resilient through a severe operational disruption by passing the Digital Operational Resilience Act (“DORA”). DORA sets requirements for financial institutions for cyber/ICT risk management, incident reporting, resilience testing, and third-party outsourcing.

GDPR Updates for Breach Reporting by Controllers Not Established in the EU

In late 2022, the European Data Protection Board (“EDPB”) opened a comment period for the first post-GDPR update to its 2018 data breach notification guidelines. The proposed updates to the guidelines impose more onerous personal data breach notification obligations on controllers who are not established in the EU but are subject to the extra-territorial provisions of the GDPR. Under the proposed updates, such controllers must report a breach to the supervisory authority of every Member State in which affected data subjects reside, within the 72-hour time limit. Such a requirement would be a heavy burden. In 2023, the EDPB will review the comments and possibly revise the proposed guidelines.

European Union—More Fines and More Data Regulation

2023 is likely to be another year of large fines for U.S. tech companies operating in Europe. In addition to the more traditional actions brought under the GDPR, operators of online platforms, hosting services, and providers of network infrastructure will have to comply with the new requirements of the EU’s Digital Services Act.  The requirements of this Act vary depending on the size and business practices of the organization, but generally include new transparency and disclosure requirements as well as new controls on the dissemination of illegal content. With fines reaching as high as 6% of annual worldwide turnover, large US tech companies are sure to be a continued lightning rod for new enforcement actions.

Despite this new addition, international businesses of all sizes must continue to be wary of the compliance obligations of the GDPR. The French data protection authority’s (CNIL’s) recent action against Microsoft for violations, including use of an asymmetric cookie banner, demonstrates the need for continued caution.

Turning to brighter news, 2023 may carry with it a finalized US adequacy decision through an updated version of the Privacy Shield framework. Following the Biden administration’s October Executive Order and release of regulations implementing the Data Protection Review Court, the European Commission published a draft adequacy decision reflecting a positive assessment of the US privacy framework. While the new framework will certainly be subject to challenges, it was designed specifically to avoid being invalidated by a ‘Schrems III.’  Assuming a best-case scenario, we may see a finalized framework and adequacy decision in the next six to eight months.  Further, in the event of an EU-US adequacy determination, the UK-US determination would likely follow swiftly behind. 

Ad Tech—It’s All About Consent

With the California Privacy Rights Act (CPRA) operational as of January 1, 2023, and Google announcing its shift towards eliminating cookie tracking in Chrome starting mid-year, it is likely we will see contextual advertising become increasingly important for companies looking for ways to reach their customers. While it is unlikely the cookie will disappear entirely, we may see it become irrelevant as replacement forms of tracking—such as mobile advertising identifiers (MAIDs), which reach mobile devices based on device ID, and other tracking technologies that comply with the slate of privacy laws now in effect—become the norm.

That being said, consent is likely going to be—and should be—first on companies’ minds. Companies should be cognizant of their tracking activities, and those that continue to engage in targeted advertising will need to incorporate consent and opt-outs into their business practices or adopt alternative tracking technologies to satisfy their advertisers as well as any applicable legal requirements.