The widespread use of social media platforms makes them ideal for companies trying to reach a large audience.  Companies in the pharmaceutical and consumer products industries frequently maintain their own social media accounts and partner with celebrities, physicians, patients, and “influencers”—i.e., individuals who have achieved online celebrity and whose posts reach a wide audience—to endorse their products through social media campaigns.  Although U.S. regulatory agencies have been closely monitoring the development of these advertising platforms, the Food & Drug Administration (FDA) and the Federal Trade Commission (FTC) have both recently announced efforts to modernize their understanding of the impact that endorsers have on consumers, signaling the likelihood of more aggressive enforcement in the near future.

The FDA has proposed two studies geared towards evaluating the impact of different types of endorsers (celebrity, physician, patient, and influencer) and payment disclosures on consumers. The agency has invited comments on:

  • whether the proposed collection of information is necessary for the proper performance of FDA’s functions, including whether the information will have practical utility;
  • the accuracy of FDA’s estimate of the burden of the proposed collection of information, including the validity of the methodology and assumptions used;
  • ways to enhance the quality, utility, and clarity of the information to be collected; and
  • ways to minimize the burden of the collection of information on respondents, including through the use of automated collection techniques, when appropriate, and other forms of information technology.

The comment period ends on March 30, 2020.

The FTC is currently engaging in a systematic review of its regulations and guides, and is accepting comments on its existing “Guides Concerning the Use of Endorsements and Testimonials in Advertising” (the Guides).  The Guides serve an advisory purpose, assisting businesses and others to conform their endorsement and advertising practices to the requirements of Section 5 of the FTC Act.   The topics that the FTC is seeking comments on include the following key areas:

  • modifications to the Guides that are necessary in response to technological, economic, or environmental changes;
  • the effectiveness and necessity of disclosing material connections;
  • consumers’ understanding of disclosures of material connections, with an emphasis on young consumers;
  • the practice of offering incentives to individuals who are not endorsers in exchange for positive reviews;
  • the practice of soliciting feedback and funneling satisfied customers to review sites and dissatisfied consumers to further customer service resolution centers; and
  • the use of affiliate links.

Commissioner Rohit Chopra released a statement on February 12, 2020 in which he encouraged “[codifying] elements of the existing endorsement guides into formal rules so that violators can be liable for civil penalties.” Businesses interested in having their input considered by the FTC should submit their responses to the FTC’s request for comments before the comment deadline of April 21, 2020.

The actions taken by these regulatory agencies reflect a growing interest in how companies use endorsers to market consumer products and suggest that the regulatory landscape may soon evolve. Ballard Spahr will continue to monitor this space for further developments.  In the meantime, FDA- and FTC-regulated companies should consider submitting comments to the appropriate regulatory authority and revisiting their advertising practices with regard to endorsements.


On Friday, February 7, 2020, the California Attorney General’s (AG) Office released modified regulations to the California Consumer Privacy Act (CCPA).  The modified regulations incorporate amendments to the CCPA signed into law after the AG’s Office promulgated regulations in October 2019. The modified regulations also reflect public comments made during the initial comment period, which concluded in December 2019.  Overall, the modified regulations provide helpful clarifications that should lessen compliance burdens for a number of industries.  Of note, the modified regulations:

  1. Limit Definition of Personal Information.  The modified regulations clarify that “personal information” does not include information that a business collected but cannot reasonably link to a consumer.  For example, “if a business collects the IP addresses of visitors to its website but does not link the IP address to any particular consumer or household, and could not reasonably link the IP address with a particular consumer or household” then the IP address would not be “personal information.”  This limitation is particularly important for businesses that do not have a direct relationship with California consumers and instead collect personal information only through their websites.
  2. Define Reasonable Accessibility.  The initial proposed regulations included a new requirement that privacy policies and online notices be reasonably accessible, without offering any definition of the standards.  The modified regulations state that reasonable accessibility means compliance with generally recognized industry standards, such as the Web Content Accessibility Guidelines, v2.1 – the prevailing standard used for ensuring compliance with the Americans with Disabilities Act (ADA) website accessibility requirements.
  3. Require Just-in-Time Notice for Unexpected Data Collection.  The modified regulations state, “When a business collects personal information from a consumer’s mobile device for a purpose that the consumer would not reasonably expect, it shall provide a just-in-time notice containing a summary of the categories of personal information being collected and a link to the full notice at collection. For example, if the business offers a flashlight application and the application collects geolocation information, the business shall provide a just-in-time notice, such as through a pop-up window when the consumer opens the application, which contains the information required by this subsection.” This requirement aligns with Federal Trade Commission (FTC) guidelines and the 2020 Network Advertising Initiative (NAI) Code of Conduct.
  4. Removal of Webform Requirement.  The modified regulations remove a requirement set forth in the initial proposed regulations requiring businesses to provide two or more methods for consumers to submit consumer access requests, one of which was an interactive webform. The modified regulations permit businesses to meet this requirement by providing a toll-free number and a designated email address.
  5. Limiting Search Obligations in Response to Right to Know Requests.  The modified regulations clarify that a business is not required to search for personal information in response to a right to know request where the business: does not maintain the personal information in a searchable or reasonably accessible form; the business maintains the personal information for legal or compliance purposes; the business does not sell or use the personal information for a commercial purpose; and the business describes to the consumer the categories of records that may contain personal information that the business did not search. This limitation partly addresses the question of whether (and when) right to know requests include access to data held in hard to search, unstructured systems.
  6. Opt-Out Buttons.  The modified regulations include examples of compliant opt-out buttons.
  7. Streamlining Requirements for Data Brokers.  The initial proposed regulations required that a company selling information it had collected indirectly ensure that the first-party business had issued a “notice at collection” to the consumer.  The current draft removes this requirement provided these third parties register as data brokers and include a link to their privacy policy, which contains opt-out instructions.

Other changes to the regulations have the effect of limiting additional compliance burdens for businesses.  As expected, however, the modified regulations do not provide additional clarity regarding the meaning of “sale/sell/selling” or define what “reasonable data security” means.

The AG’s Office will accept public comments on the modified regulations until February 24, 2020.  The regulations are expected to be finalized in April or May 2020.

Although the U.S. Department of Health and Human Services (HHS) Office for Civil Rights (OCR) may yet announce one or two year-end settlements, it appears that 2019 will be known more for the implementation of changes in HIPAA enforcement policy than for any of the particular matters that OCR resolved.  Last April, OCR announced that it would lower the maximum penalties assessed for most categories of HIPAA violations.  Previously, the same maximum $1.5 million cap applied to all categories of violations, regardless of severity.  The new policy lowered the limit to:

  • $25,000 when an entity does not know, and by exercising reasonable diligence would not have known, of the violation;
  • $100,000 when the violation is due to reasonable cause; and
  • $250,000 when the violation arises from willful neglect but is corrected.

Only violations that result from willful neglect and are not corrected remain subject to the $1.5 million cap.

It initially appeared that the new enforcement policy was producing a dramatic reduction in all settlement amounts.  A settlement reached before (although announced after) publication of the new policy resulted in a $3 million penalty payment, while the penalty amounts for the first few settlements that followed the guidance never topped $100,000.  However, as 2019 progressed, OCR announced a number of larger settlements.  For example:

  • OCR imposed a penalty of approximately $2.15 million against Jackson Health System for violations that included staff members’ unauthorized access to the protected health information (PHI) of a professional athlete, the unauthorized access by an employee of records of more than 24,000 patients (records that were eventually sold), and the loss of certain patient records. The Health System waived its right to a hearing and did not contest OCR’s Notice of Proposed Determination.
  • OCR imposed a penalty of $1.6 million against the Texas Health and Human Services Commission after it discovered a vulnerability in a web application designed to collect and report information for Medicaid waiver programs. The Commission discovered the breach when an unauthorized user reported gaining access to the application without entering credentials.  Following an investigation, OCR determined that PHI had been placed on a public server that allowed an undetermined number of unauthorized users to view names, Social Security numbers, Medicaid numbers and treatment information of approximately 6,500 individuals.
  • OCR secured a settlement of $3 million with the University of Rochester Medical Center (URMC) after URMC reported that a flash drive containing PHI was lost and an unencrypted laptop that contained PHI was subsequently stolen from a treatment facility. Following an investigation, OCR determined that URMC had failed to conduct a thorough risk analysis of vulnerabilities of the electronic PHI (ePHI) in its possession and to implement sufficient policies and procedures safeguarding the movement of hardware and media containing ePHI within and outside of the facility, including a failure to sufficiently encrypt ePHI.
  • Sentara Hospitals agreed to pay approximately $2.2 million to settle allegations that it inappropriately disclosed PHI of 577 patients when it mailed the billing statements for these patients to the wrong addresses. Sentara reported the breach to OCR, but incorrectly limited its report to eight individuals based on its erroneous understanding that it was required to report breaches only if they disclosed specific medical information, such as a patient’s diagnosis or treatment.  In addition, Sentara failed to report all affected individuals even after OCR advised it of its requirement to report all violations.

In total, half of the announced OCR actions involved more than $1.5 million in penalties (it is worth keeping in mind that the annual cap applies per type of violation, so multiple types of violations may result in assessments that exceed the $1.5 million per-type cap), while the remaining half ranged from $10,000 to $100,000.  Two of the smaller settlements involved OCR’s first enforcement actions related to its Right of Access Initiative, which focuses on the rights of individuals to receive copies of their medical records in a timely manner without being overcharged.  Both of those actions resulted in settlements of $85,000.

2019 continued a trend set in prior years by starting slowly.  Eight of the ten assessments announced by OCR occurred after mid-September.

Absent any late-breaking announcements for actions resolved at the end of last year, the total assessments for 2019 will amount to a little more than $12 million.  That amount is less than half the record-setting amount of 2018, although the number of actions resolved was similar.  It is difficult to assess how much of this decrease is attributable to the new enforcement policy; 2018 appears to have been an anomaly, with more than half of that year’s assessments arising from the $16 million settlement with Anthem.

Given the sharp division between large and small settlements, it appears that OCR is making distinctions that place violations in different categories of severity and that treat certain violations as being of the same or a different type.  At this time, there is only a small sample with limited information as to how OCR makes these distinctions.  OCR announcements for matters in which it assessed penalties without reaching an agreement provide significantly more information about how OCR views certain issues, but still leave much to speculation.

Ultimately, parsing out what leads to larger vs. smaller penalties should not guide an entity’s approach on how to address HIPAA’s privacy and security requirements.  Health care providers, health benefit plans, healthcare clearinghouses, and their respective business associates ought to take diligent measures to safeguard PHI and otherwise comply with HIPAA.  If a violation does occur, it should be addressed promptly and thoroughly to minimize the harm to individuals and to prevent it from happening again.  And, even though the penalties may be relatively small, those subject to HIPAA should aim to respond timely and appropriately to an individual’s request for records.

Happy (belated) New Year! 2020 marks the second anniversary of CyberAdviser.  In the world of data privacy and cybersecurity, a great deal has happened over that span of time, including the enactment of the GDPR, the LGPD (Brazil’s new privacy law), and the CCPA; the continued expansion of data breach and biometrics litigation; important U.S. federal and state enforcement activity; enactment of the CLOUD Act; guidance from the Supreme Court regarding Article III standing (especially critical in privacy-related litigation) and privacy protections for mobile devices; numerous data breaches (over 5,000 reported breaches, affecting 8 billion records in 2019); historic FTC settlements with Facebook and Equifax; and the development of new AI and machine learning technologies raising new privacy and security concerns, among other important developments. Here are our 10 most-read blog posts of 2019:

Analysis: Verifying Consumer Requests Under the CCPA

Privacy Legislation Proposed in New York

Denmark DPA Rules on How GDPR Applies to Voice Recordings

Analysis: Notice Provisions of CCPA Proposed Regulations

California Senate Judiciary Committee Advances Amendments to the CCPA    

Proposed Expansion of CCPA’s Private Right of Action Defeated in State Senate

Texas Legislature Weighing Proposed New Privacy Laws        

California Legislature Adopts Five Amendments to CCPA, But Largely Rejects Industry Efforts

Connecticut Becomes Latest State to Enact Insurance Data Security Law

New York State Data Privacy Law Fails   

2020 promises to be a very active year for this blog. Already several states have proposed CCPA-style privacy laws. It is also likely that other states will pass biometric protection laws, and data broker registration laws. The FTC is also expected to announce new proposed regulations to the Safeguards Rule in the coming year. India has a new proposed privacy law that we are closely monitoring. The Supreme Court will be hearing a challenge to the constitutionality of the TCPA. We will be blogging about these issues as they develop. We will also be tracking litigation under the CCPA’s new private right of action for data breaches, and enforcement actions by state AGs with regard to data privacy.

We want to thank our many readers around the world who continue to make this blog such a success. If you’d like to learn more about Ballard Spahr’s Privacy & Data Security Group, please visit our website.

Andrew Smith, Director of the FTC’s Bureau of Consumer Protection, recently announced the following three major improvements that have been made to FTC orders in data security cases:

  1. Specificity: To counter past criticisms that FTC orders to implement comprehensive information security programs were too vague, FTC orders will now require specific security safeguards that address specific allegations in the complaint brought against each company.
  2. Third-party assessor accountability: FTC orders will now give the FTC authority to approve (and re-approve every two years) the third-party assessors that are tasked with reviewing comprehensive data security programs.  Assessors can no longer be a rubber stamp, but must provide the FTC with documents supporting conclusions reached in any assessment, so that the FTC can investigate compliance with and enforce its orders.
  3. Executive responsibility: Following other legal regimes, such as the New York Department of Financial Services Cybersecurity Regulation, FTC orders will now require companies to present to their boards each year on their written information security programs, so that senior officers can provide annual certifications of compliance to the FTC.  (Director Smith stated that he believes that holding individuals personally accountable under oath is an effective compliance mechanism to incentivize high-level oversight of, and appropriate attention to, data security.)

In his announcement, Director Smith referenced several 2019 FTC data security orders that reflect these improvements.  Companies that find themselves subject to FTC investigation should be mindful of, and prepared for, the evolving nature of the FTC’s data security orders.

On November 22nd, the CFPB issued a press release announcing that a stipulated final judgment and order (Order) were filed in the U.S. District Court for the Southern District of New York against Sterling Infosystems, Inc. (Sterling) to resolve allegations that the employment background screening company violated the Fair Credit Reporting Act (FCRA).

Have you ever looked at a product online and realized it was following you around the internet? Have you ever visited a different website and seen the item you were just thinking about purchasing? These friendly reminders are due to cookies: small text files stored on your browser when you visit or interact with a website or advertisement.

In this podcast, Ballard Spahr consumer financial services partner Chris Willis talks with Scott Ferris, CEO of Attunely, a provider of machine learning (ML) and artificial intelligence (AI) technology to the debt collection industry.  The podcast addresses how changes in consumer behavior have impacted collections, technology’s role in collections,  how ML/AI can improve profitability, and impediments to adopting ML/AI.  Phil Yannella, Leader of Ballard’s Privacy & Data Security Group, also discusses how the GDPR, CCPA and other US state privacy laws attempt to regulate ML/AI.  Check it out!


Following on the heels of a few relatively small HIPAA settlements, the U.S. Department of Health and Human Services Office for Civil Rights (OCR) announced that it has imposed $2,154,000 in civil monetary penalties against Jackson Health System in Florida for its failure to meet HIPAA privacy and security requirements.  The OCR announcement and accompanying information detail violations that included:

  • The unauthorized access by an employee to the records of more than 24,000 patients over a five-year period (the employee admitted to selling protected health information of more than 2,000 patients for purposes of identity theft).
  • The unauthorized access by staff members to protected health information about a professional athlete who received services at the health system (with some of the information revealed on public media).
  • The loss of certain patient records.
  • The failure to conduct adequate risk assessments, undertake appropriate measures to manage risks that were identified, and review logs that might have shown inappropriate access to information.
  • The failure to implement and maintain adequate policies and procedures to respond to breaches and the failure to report breaches timely and fully.

Significantly, this case did not involve a settlement between OCR and the health system.  The health system did engage with OCR in the course of the investigation, but ultimately chose to accept the civil monetary penalty.  As a result, the materials do not include a specific corrective action plan for the health system to follow under OCR supervision. The materials do identify measures that the health system has undertaken to improve its privacy and security programs.

Settlement agreements typically provide limited information.  By contrast, the notices published in this case provide not only details about the health system’s violations, but information about how OCR determined the amount to assess in civil monetary penalties.  It considered various factors, including the nature and extent of the violations and the harm resulting from those violations, the history of the health system’s compliance, and the health system’s financial condition and cooperation in the investigation.  OCR also took into account the health system’s mitigating and corrective actions.

Notwithstanding the size of the civil monetary penalty, it could have been larger.  OCR chose to group violations into three broad categories, relating to failures in the security management process, information access management, and the provision of notice to HHS.  It viewed the first two of these failures as attributable to reasonable cause.  New limits cap penalties for any one type of violation arising from reasonable cause at $100,000 per year.  As a result, most of the civil monetary penalty in this case is attributable to the health system’s failure to provide OCR with timely and accurate notice of a breach caused by a loss of paper records.  OCR viewed this failure as one of willful neglect, for which penalties were capped at $1.5 million, even though this violation was seen as lasting only 31 days.

The materials published by the OCR serve as a warning about issues that might arise, particularly with regard to the implementation of policies designed to prevent and detect HIPAA violations.  They also provide insight into how OCR is prepared to both impose significant civil monetary penalties and temper the amount of those penalties, even in situations that do not involve a formal settlement agreement.

For businesses, one of the more worrisome scenarios under the CCPA occurs when they mistakenly provide personal information of a consumer to the wrong party in response to a consumer request, whether because of fraud or simple mistake. Because the definition of data breach under the CCPA is very broad, the unauthorized sharing of personal information with the wrong party could theoretically give rise to a civil cause of action with statutory penalties of $100 to $750 per consumer. As a result, businesses have been anxiously waiting to see how the proposed Regulations would address the consumer verification process.

The good news for businesses is that the proposed Regulations provide significant detail concerning the verification process, and those details will likely assuage the concerns of many businesses about potential litigation.

As a general matter, the proposed Regulations require that businesses verify consumers wherever possible using personal information collected from the consumer (or use a third-party identification service that complies with the same requirements). Businesses should use reasonable security measures to detect fraudulent identity verification activity and prevent the unauthorized access to or deletion of a consumer’s personal information.

Password Protected Accounts

For password-protected accounts, the proposed Regulations allow businesses to verify the consumer’s identity through its existing authentication practices if those practices are otherwise consistent with the CCPA regulations. Businesses must also require that consumers making requests through password-protected accounts re-authenticate themselves before responding to a deletion or right to know request. If the business believes that there is fraudulent or malicious activity on a password-protected account, it may require additional verification procedures to confirm the consumer request is authentic.

Two-Tier Verification For Non-Password Protected Accounts

The proposed Regulations outline a two-tier verification process for non-password protected accounts. This process requires that businesses verify requests to know categories of personal information to a “reasonable degree of certainty.” To meet this standard, businesses could match two pieces of consumer-provided personal information with personal information maintained by the business. For requests to know specific pieces of information, businesses must verify the consumer to a reasonably high degree of certainty, which can be accomplished by matching three pieces of consumer-provided personal information with personal information retained by the business. The proposed Regulations provide businesses with discretion for verifying requests to delete, depending on the sensitivity of the personal information. The table below lays this out.


Abbreviated Right to Know (Categories of Personal Information)

Two steps: Using information provided by the consumer, verification must be to a reasonable degree of certainty, which may include matching at least two data points provided by the consumer.

Right to Know Specific Pieces of Information

Three steps: Using information provided by the consumer, verification must be to a reasonably high degree of certainty, which may include matching at least three data points provided by the consumer and obtaining a signed declaration under penalty of perjury that the requestor is the consumer whose personal information is the subject of the request.

Right to Delete

All deletion requests require verification at the time of the request, and re-verification before any data is deleted.  The business may use its discretion, based on the sensitivity of the data, whether to use two- or three-step verification.  For example, deletion of browsing history may require only a reasonable degree of certainty, whereas deletion of family photos may require a reasonably high degree of certainty.

The proposed verification process, especially for a right to know specific pieces of information collected about the consumer, sets a high standard—particularly insofar as the proposed Regulations discourage businesses from using sensitive information such as SSN for matching purposes (as discussed below). The declaration envisioned by the proposed Regulations adds an additional hurdle for consumers.

Businesses will have to make a determination of what pieces of personal information to use for matching purposes based on what information they are holding for consumers, and the sensitivity of such information. One likely consequence, however, is that the added verification procedures will reduce the number of verifiable requests to know specific pieces of information as well as requests to delete that businesses must honor. Don’t be surprised if privacy advocates decry the heightened verification procedures in the upcoming Public Comment period.
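For teams building request-intake tooling around these rules, the tiered matching logic described above can be sketched in a few lines of code. This is a purely hypothetical illustration, not legal guidance: the field names, the case-insensitive matching rule, and the choice to default deletion requests to the stricter tier are our own assumptions, not requirements drawn from the proposed Regulations.

```python
# Hypothetical sketch of the CCPA two-tier verification logic.
# Field names and matching rules are illustrative assumptions only.

RELIABLE_FIELDS = {"name", "email", "postal_address", "phone", "last_order_id"}

def count_matches(provided: dict, on_file: dict) -> int:
    """Count data points the requester supplied that match our records."""
    return sum(
        1
        for field, value in provided.items()
        if field in RELIABLE_FIELDS
        and on_file.get(field) is not None
        and str(value).strip().lower() == str(on_file[field]).strip().lower()
    )

def verify_request(request_type: str, provided: dict, on_file: dict,
                   signed_declaration: bool = False) -> bool:
    matches = count_matches(provided, on_file)
    if request_type == "know_categories":
        # "Reasonable degree of certainty": at least two matching data points.
        return matches >= 2
    if request_type == "know_specific":
        # "Reasonably high degree of certainty": at least three matching data
        # points plus a signed declaration under penalty of perjury.
        return matches >= 3 and signed_declaration
    if request_type == "delete":
        # Business discretion based on sensitivity; this sketch defaults to
        # the stricter tier for all deletion requests.
        return matches >= 3 and signed_declaration
    raise ValueError(f"unknown request type: {request_type}")
```

A real implementation would also need the re-authentication, fraud-detection, and record-keeping steps the proposed Regulations describe; the point of the sketch is only the two-versus-three data-point distinction.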

Shielding Sensitive Data From the Verification and Response Process

The proposed Regulations directly address the concern discussed above about potential civil causes of action stemming from mistakes made during the consumer request process by explicitly prohibiting businesses from disclosing the following sensitive information in response to a request to know:

  • Social Security number;
  • driver’s license number;
  • state identification number;
  • medical and health information;
  • financial account number;
  • account passwords; or
  • security questions and answers.

The proposed Regulations, furthermore, prohibit businesses from requesting such information in the verification process “unless necessary.”

The CCPA’s private right of action applies only to the unauthorized access and exfiltration, theft, or disclosure of personal information as defined under the California breach notification statute, a definition that is notably narrower than the CCPA’s definition of personal information and similar to the sensitive data listed above. The effect of the proposed Regulations, then, is largely to prohibit businesses from unnecessarily collecting or disclosing the kinds of data that might trigger a cause of action under the CCPA in the event of a breach.

Use of Authorized Agents to Make Consumer Requests

The CCPA allows consumers to use authorized agents to make consumer requests, increasing the potential for fraud where tricksters impersonate authorized agents or even use bots to scam businesses into providing consumer information. The proposed Regulations allow businesses to guard against this by requiring that such agents produce a signed authorization from the consumer. Businesses can also require that consumers using agents separately verify their identity to the business to prevent fraud.

Requirements Where a Business Cannot Verify a Consumer

If there is no reasonable method by which a business can verify the identity of the consumer, the business must state so in response to the request. If it is not possible for the business to verify requests from a whole category of consumers, however, the proposed Regulations allow the business to state so in the privacy policy along with an explanation as to why it has no method to identify the consumer.

The use of a categorical disclosure may be particularly useful for a business that receives requests to know from website visitors who do not have accounts open with the business and otherwise haven’t provided any other information that the business can use to verify their identity.