Happy (belated) New Year! 2020 marks the second anniversary of CyberAdviser. In the world of data privacy and cybersecurity, a great deal has happened over that span of time, including the enactment of the GDPR, the LGPD (Brazil's new privacy law), and the CCPA; the continued expansion of data breach and biometrics litigation; important U.S. federal and state enforcement activity; enactment of the CLOUD Act; guidance from the Supreme Court regarding Article III standing (especially critical in privacy-related litigation) and privacy protections for mobile devices; numerous data breaches (over 5,000 reported breaches, affecting 8 billion records, in 2019); historic FTC settlements with Facebook and Equifax; and the development of new AI and machine learning technologies raising new privacy and security concerns, among other important developments. Here are our 10 most-read blog posts of 2019:

Analysis: Verifying Consumer Requests Under the CCPA

Privacy Legislation Proposed in New York

Denmark DPA Rules on How GDPR Applies to Voice Recordings

Analysis: Notice Provisions of CCPA Proposed Regulations

California Senate Judiciary Committee Advances Amendments to the CCPA    

Proposed Expansion of CCPA’s Private Right of Action Defeated in State Senate

Texas Legislature Weighing Proposed New Privacy Laws        

California Legislature Adopts Five Amendments to CCPA, But Largely Rejects Industry Efforts

Connecticut Becomes Latest State to Enact Insurance Data Security Law

New York State Data Privacy Law Fails   

2020 promises to be a very active year for this blog. Already several states have proposed CCPA-style privacy laws. It is also likely that other states will pass biometric protection laws, and data broker registration laws. The FTC is also expected to announce new proposed regulations to the Safeguards Rule in the coming year. India has a new proposed privacy law that we are closely monitoring. The Supreme Court will be hearing a challenge to the constitutionality of the TCPA. We will be blogging about these issues as they develop. We will also be tracking litigation under the CCPA’s new private right of action for data breaches, and enforcement actions by state AGs with regard to data privacy.

We want to thank our many readers around the world who continue to make this blog such a success. If you’d like to learn more about Ballard Spahr’s Privacy & Data Security Group, please visit our website.

Andrew Smith, Director of the FTC’s Bureau of Consumer Protection, recently announced the following three major improvements that have been made to FTC orders in data security cases:

  1. Specificity: To counter past criticisms that FTC orders to implement comprehensive information security programs were too vague, FTC orders will now require specific security safeguards that address specific allegations in the complaint brought against each company.
  2. Third-party assessor accountability: FTC orders will now give the FTC authority to approve (and re-approve every two years) the third-party assessors that are tasked with reviewing comprehensive data security programs.  Assessors can no longer be a rubber stamp, but must provide the FTC with documents supporting conclusions reached in any assessment, so that the FTC can investigate compliance with and enforce its orders.
  3. Executive responsibility: Borrowing from other legal regimes, such as the New York Department of Financial Services Cybersecurity Regulations, FTC orders will now require companies to present annually to their Boards on their written information security programs, so that senior officers can provide annual certifications of compliance to the FTC.  (Director Smith stated that he believes that holding individuals personally accountable under oath is an effective compliance mechanism to incentivize high-level oversight of, and appropriate attention to, data security.)

In his announcement, Director Smith referenced several 2019 FTC data security orders that reflect these improvements.  Companies that find themselves subject to FTC investigation should be mindful of, and prepared for, the evolving nature of the FTC's data security orders in these areas.

On November 22nd, the CFPB issued a press release announcing that a stipulated final judgment and order (Order) were filed in the U.S. District Court for the Southern District of New York against Sterling Infosystems, Inc. (Sterling) to resolve allegations that the employment background screening company violated the Fair Credit Reporting Act (FCRA).

Have you ever looked at a product online and realized it was following you around the internet? Have you ever visited a different website and seen the item you were just thinking about purchasing? These friendly reminders are due to cookies: small text files stored on your browser when you visit or interact with a website or advertisement.
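For readers curious what such a cookie actually looks like on the wire, here is a minimal sketch using Python's standard library (the cookie name, value, and domain below are invented for illustration, not taken from any real ad network):

```python
from http.cookies import SimpleCookie

# Illustrative sketch of the kind of cookie described above: a small
# name/value pair an advertiser might store on your browser. The cookie
# name, value, and domain here are hypothetical.
cookie = SimpleCookie()
cookie["viewed_item"] = "sku-12345"                   # hypothetical product ID
cookie["viewed_item"]["domain"] = "ads.example.com"   # hypothetical ad domain
cookie["viewed_item"]["path"] = "/"
cookie["viewed_item"]["max-age"] = 60 * 60 * 24 * 30  # persists ~30 days

# The header a server would send to store this cookie on the browser:
header = cookie["viewed_item"].OutputString()
print("Set-Cookie:", header)
```

Because the cookie persists across visits (here, for roughly 30 days), other pages that can read it can recognize the same browser later, which is what makes the "following you around" effect possible.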

In this podcast, Ballard Spahr consumer financial services partner Chris Willis talks with Scott Ferris, CEO of Attunely, a provider of machine learning (ML) and artificial intelligence (AI) technology to the debt collection industry.  The podcast addresses how changes in consumer behavior have impacted collections, technology’s role in collections,  how ML/AI can improve profitability, and impediments to adopting ML/AI.  Phil Yannella, Leader of Ballard’s Privacy & Data Security Group, also discusses how the GDPR, CCPA and other US state privacy laws attempt to regulate ML/AI.  Check it out!


Following on the heels of a few relatively small HIPAA settlements, the U.S. Department of Health and Human Services Office of Civil Rights (OCR) announced that it has imposed $2,154,000 in civil monetary penalties against Jackson Health System in Florida for its failure to meet HIPAA privacy and security requirements.  The OCR announcement and accompanying information detail violations that included:

  • The unauthorized access by an employee to the records of more than 24,000 patients over a five-year period (the employee admitted to selling protected health information of more than 2,000 patients for purposes of identity theft).
  • The unauthorized access by staff members to protected health information about a professional athlete who received services at the health system (with some of the information revealed on public media).
  • The loss of certain patient records.
  • The failure to conduct adequate risk assessments, undertake appropriate measures to manage risks that were identified, and review logs that might have shown inappropriate access to information.
  • The failure to implement and maintain adequate policies and procedures to respond to breaches and the failure to report breaches timely and fully.

Significantly, this case did not involve a settlement between OCR and the health system.  The health system did engage with OCR in the course of the investigation, but ultimately chose to accept the civil monetary penalty.  As a result, the materials do not include a specific corrective action plan for the health system to follow under OCR supervision. The materials do identify measures that the health system has undertaken to improve its privacy and security programs.

Settlement agreements typically provide limited information.  By contrast, the notices published in this case provide not only details about the health system’s violations, but information about how OCR determined the amount to assess in civil monetary penalties.  It considered various factors, including the nature and extent of the violations and the harm resulting from those violations, the history of the health system’s compliance, and the health system’s financial condition and cooperation in the investigation.  OCR also took into account the health system’s mitigating and corrective actions.

Notwithstanding the size of the civil monetary penalty, it could have been larger.  OCR chose to group violations into three broad categories, relating to failures in the security management process, information access management, and the provision of notice to HHS.  It viewed the first two of these failures as attributable to reasonable cause.  New limits cap penalties for any one type of violation arising from reasonable cause at $100,000 per year.  As a result, most of the civil monetary penalty in this case is attributable to the health system’s failure to provide OCR with timely and accurate notice of a breach caused by a loss of paper records.  OCR viewed this failure as one of willful neglect, for which penalties were capped at $1.5 million, even though this violation was seen as lasting only 31 days.

The materials published by the OCR serve as a warning about issues that might arise, particularly with regard to the implementation of policies designed to prevent and detect HIPAA violations.  They also provide insight into how OCR is prepared to both impose significant civil monetary penalties and temper the amount of those penalties, even in situations that do not involve a formal settlement agreement.

For businesses, one of the more worrisome scenarios under the CCPA occurs when they mistakenly provide personal information of a consumer to the wrong party in response to a consumer request, whether because of fraud or simple mistake. Because the definition of data breach under the CCPA is very broad, the unauthorized sharing of personal information with the wrong party could theoretically give rise to a civil cause of action with statutory penalties of $100-$750, per consumer. As a result, businesses have been anxiously waiting to see how the proposed Regulations would address the consumer verification process.
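To see why this scenario worries businesses, consider the arithmetic: statutory penalties of $100 to $750 per consumer scale quickly (the consumer count below is hypothetical, chosen only to show the order of magnitude):

```python
# Hypothetical exposure under the CCPA's statutory damages range of
# $100-$750 per consumer. The number of affected consumers is invented.
consumers_affected = 10_000

low_exposure = consumers_affected * 100    # $100 per consumer
high_exposure = consumers_affected * 750   # $750 per consumer

print(f"Potential exposure: ${low_exposure:,} to ${high_exposure:,}")
```

Even a mistaken disclosure affecting a modest user base can put seven-figure statutory damages in play, which is why the verification rules matter so much.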

The good news for businesses is that the proposed Regulations provide significant detail concerning the verification process, and those details will likely assuage the concerns of many businesses about potential litigation.

As a general matter, the proposed Regulations require that businesses verify consumers wherever possible using personal information collected from the consumer (or use a third-party identification service that complies with the same requirements). Businesses should use reasonable security measures to detect fraudulent identity verification attempts and to prevent unauthorized access to, or deletion of, a consumer's personal information.

Password Protected Accounts

For password-protected accounts, the proposed Regulations allow businesses to verify the consumer’s identity through its existing authentication practices if those practices are otherwise consistent with the CCPA regulations. Businesses must also require that consumers making requests through password-protected accounts re-authenticate themselves before responding to a deletion or right to know request. If the business believes that there is fraudulent or malicious activity on a password-protected account, it may require additional verification procedures to confirm the consumer request is authentic.

Two-Tier Verification For Non-Password Protected Accounts

The proposed Regulations outline a two-tier verification process for non-password-protected accounts. This process requires that businesses verify requests to know categories of personal information to a "reasonable degree of certainty." To meet this standard, businesses can match two pieces of consumer-provided personal information against personal information maintained by the business. For requests to know specific pieces of information, businesses must verify the consumer to a reasonably high degree of certainty, which can be accomplished by matching three pieces of consumer-provided personal information against personal information retained by the business. The proposed Regulations give businesses discretion in verifying requests to delete, depending on the sensitivity of the personal information. The breakdown below lays this out.


Right to Know (Categories of Personal Information): Two steps. Using information provided by the consumer, verification must be to a reasonable degree of certainty, which may include matching at least two data points provided by the consumer.

Right to Know (Specific Pieces of Information): Three steps. Using information provided by the consumer, verification must be to a reasonably high degree of certainty, which may include matching at least three data points provided by the consumer and obtaining a signed declaration, under penalty of perjury, that the requestor is the consumer whose personal information is the subject of the request.

Right to Delete: All deletion requests require verification at the time of the request, and re-verification before any data is deleted. A business may use its discretion, based on the sensitivity of the data, whether to use two- or three-step verification. For example, deletion of browsing history may require only a reasonable degree of certainty, whereas deletion of family photos may require a reasonably high degree of certainty.
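The tiered matching logic can be sketched in a few lines (a simplified illustration only; the field names and the way the signed declaration is modeled are assumptions layered on the two- and three-point matching thresholds in the proposed Regulations):

```python
# Simplified sketch of the proposed Regulations' two-tier verification:
# two matched data points for a "reasonable degree of certainty" (requests
# to know categories), three matched data points plus a signed declaration
# for a "reasonably high degree of certainty" (requests to know specific
# pieces). Field names and values below are hypothetical.

def count_matches(provided: dict, on_file: dict) -> int:
    """Count consumer-provided data points that match the business's records."""
    return sum(1 for field, value in provided.items()
               if on_file.get(field) == value)

def verify(request_type: str, provided: dict, on_file: dict,
           signed_declaration: bool = False) -> bool:
    matches = count_matches(provided, on_file)
    if request_type == "know_categories":
        return matches >= 2                          # reasonable degree
    if request_type == "know_specific":
        return matches >= 3 and signed_declaration   # reasonably high degree
    raise ValueError(f"unhandled request type: {request_type}")

on_file = {"email": "pat@example.com", "zip": "94105", "phone": "555-0100"}

# Two matching data points suffice for a request to know categories,
# but not for a request to know specific pieces of information.
print(verify("know_categories",
             {"email": "pat@example.com", "zip": "94105"}, on_file))
print(verify("know_specific",
             {"email": "pat@example.com", "zip": "94105"},
             on_file, signed_declaration=True))
```

For deletion requests, the Regulations leave the choice between the two thresholds to the business based on data sensitivity, so a real implementation would pick the tier per data category rather than hard-coding it.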



The proposed verification process, especially for a right to know specific pieces of information collected about the consumer, sets a high standard—particularly insofar as the proposed Regulations discourage businesses from using sensitive information such as SSN for matching purposes (as discussed below). The declaration envisioned by the proposed Regulations adds an additional hurdle for consumers.

Businesses will have to make a determination of what pieces of personal information to use for matching purposes based on what information they are holding for consumers, and the sensitivity of such information. One likely consequence, however, is that the added verification procedures will reduce the number of verifiable requests to know specific pieces of information as well as requests to delete that businesses must honor. Don’t be surprised if privacy advocates decry the heightened verification procedures in the upcoming Public Comment period.

Shielding Sensitive Data From the Verification and Response Process

The proposed Regulations directly address the concern discussed above about potential civil causes of action stemming from mistakes made during the consumer request process by explicitly prohibiting businesses from providing the following sensitive information in response to a request to know:

  • social security number
  • driver’s license number
  • state identification number
  • medical and health information
  • financial account number
  • account passwords, or
  • security questions and answers.

The proposed Regulations, furthermore, prohibit businesses from requesting such information in the verification process “unless necessary.”

The CCPA’s private right of action only applies to unauthorized access and theft, exfiltration or disclosure of personal information as defined under the California breach notification statute—which is notably narrower than the definition of personal information under the CCPA and similar to the sensitive data listed above. The effect of the proposed Regulations, then, is largely to prohibit businesses from unnecessarily collecting or disclosing the kinds of data that might trigger a cause of action under the CCPA in the event of a breach.

Use of Authorized Agents to Make Consumer Requests

The CCPA allows consumers to use authorized agents to make consumer requests, increasing the potential for fraud where bad actors impersonate authorized agents or even use bots to scam businesses into providing consumer information. The proposed Regulations allow businesses to guard against this by requiring that such agents produce a signed authorization from the consumer. Businesses can also require that consumers using agents separately verify their identity to the business to prevent fraud.

Requirements Where a Business Cannot Verify a Consumer

If there is no reasonable method by which a business can verify the identity of the consumer, the business must state so in its response to the request. If it is not possible for the business to verify requests from a whole category of consumers, however, the proposed Regulations allow the business to say so in its privacy policy, along with an explanation of why it has no method to identify those consumers.

The use of a categorical disclosure may be particularly useful for a business that receives requests to know from website visitors who do not have accounts with the business and have not otherwise provided information that the business can use to verify their identity.



To the surprise of some, the proposed CCPA Regulations issued last Thursday don't address many of the well-discussed ambiguities under the law (such as what "valuable consideration" means in the context of a sale of personal information). Rather, the proposed Regulations address a number of technical, nuts-and-bolts compliance issues concerning how businesses must make required privacy disclosures, provide opt-out notices, and verify and respond to consumer requests.

We’ll discuss the issues the proposed Regulations don’t address in a future post. But for now, we turn our attention to specific sections of the proposed Regulations, highlighting any new legal compliance issues the proposed Regulations raise.

We'll start with Article 2, which covers "Notices to Consumers." The CCPA requires that businesses disclose to consumers, at or before the time of collection, the categories of personal information to be collected and the purposes for which that information will be used. Businesses are prohibited from collecting or using personal information that is not disclosed prior to collection, and must obtain explicit consent from consumers for any new use that was not previously disclosed.

Online v. Offline Disclosures

The first point to note about the Notice provisions of the proposed Regulations is that they explicitly differentiate between online and offline privacy notices. This clarifies an ambiguity in the CCPA itself, which requires notice at or before collection but doesn’t explain how businesses can provide notices online and offline.

Here’s what the proposed Regulations say about the mechanics of providing Notice:


Notice of personal information collected from consumers and the business or commercial purposes for which it will be used:
  • Online: Must be provided prior to collection of any personal data. Businesses can fulfill this obligation by providing a link to the section of the privacy policy that contains the Notice information.
  • Offline: Must be provided prior to collection of any personal data. Different, non-exclusive options include providing the Notice in paper disclosures or posting an in-store sign with the URL for the Notice.

Opt-out notice:
  • Online: If the business sells personal information, it must furnish a "Do Not Sell My Personal Information" or "Do Not Sell My Info" link in the Notice.
  • Offline: The business must provide the URL for the webpage to which the "Do Not Sell My Personal Information" or "Do Not Sell My Info" links direct users.

The proposed Regulations identify a non-exclusive list of mechanisms for providing offline Notices. One option that is not mentioned, but would certainly seem appropriate, is for businesses to provide a clearly labeled URL directing consumers to the online privacy policy in the text of any paper disclosures a business makes to consumers. If posted signage identifying the URL is a permissible way to provide an offline Notice, it would seem to follow that providing the same information in a paper disclosure would also be permissible.

Linking to Online Privacy Policy In Lieu of Providing Separate Notice

The second thing to note about the Notice provisions of the proposed Regulations is that the content of the Notice is a subset of the information that businesses must provide in the privacy policy. (The privacy policy must contain the categories of personal information the business has collected about consumers in the prior 12 months.)

The proposed Regulations appear to acknowledge the partial redundancy by creating a process by which businesses that operate a website can provide a link to the California-specific portions of their privacy policy containing the same information as required by the Notice, in lieu of providing a separate Notice. Businesses that wish to take advantage of this provision will need to include the uses of personal information collected from consumers in their privacy policy (which the CCPA doesn't currently require).

Accessibility Requirements

Third, the Notice provisions of the proposed Regulations now include an accessibility requirement. More specifically, the required Notice (as well as the opt-out notice and privacy policy) must be accessible to consumers with disabilities. At a minimum, businesses must provide “information on how a consumer with a disability may access the notice in an alternative format.”

It is likely that many businesses had not considered online accessibility issues before the proposed Regulations came out. Many businesses have already gone through an ADA website accessibility analysis; those that have not should consider doing so now.

New Attestation Requirements

The fourth and final provision of Article 2 that’s noteworthy appears in section (d), which makes clear that a business that does not collect information directly from consumers does not need to provide a Notice of collection. Before selling any such personal information, however, the business must either:

  • Contact the consumer directly to provide the required Notice, and Notice of Opt-Out; or
  • Contact the source of the personal information and confirm the source provided the required Notice and obtain a signed attestation describing how the source gave the Notice at collection along with an example of the Notice.

The upshot of this new requirement is that businesses that sell or share data with third-party data brokers, lead generators, or others in the business of selling data can assume that they will see requests from these third parties for attestations. For such businesses, it may make sense to consider the content and format of the attestations, as well as how they might automate the process of providing them if the volume is likely to be significant.

In our next post, we will address the new verification procedures outlined in the proposed Regulations.


The California Attorney General’s Office released its long-awaited proposed CCPA Regulations this afternoon.  The proposed Regulations are 24 pages long, and address a number of important technical compliance issues including how businesses should:

  • provide just-in-time notice to consumers of the personal information collected;
  • provide notice to consumers of the right to opt out of the sale of personal information;
  • provide notice to consumers of financial incentives;
  • provide a CCPA compliant privacy policy;
  • provide methods for consumers to submit requests to know and requests to delete their personal information;
  • respond to consumer requests to know and requests to delete their personal information;
  • respond to consumer requests to access or delete household information;
  • respond to requests to opt-out;
  • respond to requests to opt-in after consumers exercise their right to opt out of the sale of personal information; and
  • verify consumer requests.

The AG's office also released a 97-page Initial Statement of Reasons (including appendices).

Ballard’s Privacy & Data Security lawyers are carefully reviewing the proposed Regulations. We will post our thoughts on the effect of the proposed Regulations, what they mean from a compliance standpoint, what issues the proposed Regulations fail to address, and what’s next for the CCPA in the coming days.

The perplexing question of what U.S. companies must do to comply with EU "cookie" law became slightly clearer with the recent decision of the European Court of Justice (CJEU) in Planet49 GmbH, but numerous questions remain. A main source of confusion about cookies is the interplay between two EU privacy laws: the ePrivacy Directive and the GDPR. The former governs, among other things, the placement of cookies and marketing pixels on the browsers of website visitors, while the latter governs the subsequent processing of personal data, which in many cases includes cookie data. Some cookies, in other words, are subject to the ePrivacy Directive but not the GDPR. Another complication is that the ePrivacy Directive does not have extra-territorial effect, whereas the GDPR does.

Many privacy professionals had hoped that the CJEU’s ruling in Planet49 would provide some much-needed clarity to a muddled legal picture. And it does, sort of.


The case involved participation in a lottery organized by Planet49 GmbH, an online gaming company. To enter the lottery, internet users were prompted to enter their personal information and were then presented with two checkboxes. The first required the user to agree to be contacted by other businesses with promotional offers. The second, which was pre-ticked, indicated consent to the placement of cookies. To participate in the lottery, the first checkbox had to be ticked.

The first question referred to the CJEU concerned whether the use of a pre-ticked box was sufficient to obtain valid consent for placing cookies on a user’s device. The second question referred to the CJEU was whether service providers need to give users information specifically about the duration of the operation of the cookies and access by third parties.

How Planet49 Establishes Some Clear Guidelines

The CJEU's recent ruling in Planet49 helps to clarify some of the rules governing the placement of cookies. First, the CJEU ruled that Planet49's use of pre-ticked boxes is not a sufficient basis to establish consent. The writing had been on the wall for pre-ticked boxes even before the GDPR became effective, so this part of the ruling is not surprising.

What makes the ruling significant is the Court's finding that consent requires some action on the part of the user. Although the ruling technically addresses consent under the ePrivacy Directive, it suggests that inferring consent from passive activities, such as the continued browsing of a website, might not meet the GDPR's more exacting "affirmative consent" standard. This finding aligns with recent guidance from the ICO, the CNIL, and the German Data Protection Authority (DPA), all of which have aligned consent under the ePrivacy Directive with the GDPR standard and explicitly stated that continued browsing of a website alone does not constitute consent. The CJEU did not go quite this far, but the days may be numbered for cookie banners that infer consent from continued browsing.

Second, the ruling makes clear that data controllers must gather consent for the placement of all non-essential cookies on a user’s device. This includes analytic cookies, which are commonly used by most companies with a website.

Lastly, the CJEU ruling requires that data controllers disclose the duration of cookie retention, as well as any sharing of cookies with third parties, in order to satisfy the ePrivacy Directive's requirement that consent be a "freely given, specific and informed indication of the user's wishes." The ruling does not state what the maximum retention period for a cookie should be, but some EU regulators have suggested retention periods in recent guidance.

What the CJEU Ruling Doesn’t Resolve

For U.S. companies with physical operations in the EU, the CJEU ruling leaves a number of thorny issues unaddressed. In particular, one open question is what, if any, user actions short of physically clicking an "Accept" button might constitute valid cookie consent. While continued browsing might not be sufficiently active to establish consent, would clicking out of a cookie banner be?

The CJEU ruling also does not address whether cookie walls (whereby access to a website requires that visitors agree to the use of cookies) are permissible under the ePrivacy Directive or the GDPR. EU data regulators are split on this question, with the CNIL and the German DPA holding that cookie walls are not permissible, whereas the ICO has held that cookie walls may, under certain circumstances, be valid.

The subsequent processing of tracking cookies for the placement of targeted ads is another issue that remains muddled in the wake of the CJEU opinion. The ICO has taken the position that a data controller cannot rely on legitimate interests as a basis for subsequent processing of cookies, particularly tracking cookies. The CNIL and the German DPA have not gone so far as the ICO and appear to leave open the possibility that legitimate interests may be permissible for subsequent processing of cookies.

For U.S. companies that don’t have an establishment in the EU, compliance is even more complicated insofar as these companies may be subject to the GDPR but not the ePrivacy Directive. The GDPR only governs the processing of cookies or other online identifiers that gather or contain personal information whereas the ePrivacy Directive covers the placement of any cookie or file on a user’s browser. Thus, for U.S. companies that don’t maintain an EU establishment, it remains unclear whether the guidance of EU data regulators regarding analytic cookies, for example, applies.


The bottom line is that for U.S. companies doing business in Europe, the CJEU's recent ruling provides some important guardrails for fashioning cookie banners and policies, but numerous questions remain unresolved. Until an ePrivacy Regulation is adopted, U.S. companies will likely follow the proverbial herd, trying their best to hide in a crowd of other companies also struggling to understand where the lines lie.