The Office for Civil Rights of the U.S. Department of Health and Human Services has issued guidance clarifying how HIPAA’s Privacy Rule permits covered entities (in particular, health care providers and health plans) and their business associates to contact former COVID-19 patients about donating blood plasma containing COVID-19 antibodies. The guidance follows the FDA’s approval of the use of such plasma to treat current COVID-19 patients.

The guidance observes that covered entities under HIPAA may also use former COVID-19 patients’ protected health information (PHI) for certain health care operations purposes that are not related to the care of that particular patient. For example, a covered entity may use and potentially disclose such PHI if it would help that entity with the case management of current COVID-19 patients.

However, the guidance also addresses the limits that apply to the use or disclosure of such information.  Specifically, a covered entity or its business associate may not use or disclose former COVID-19 patients’ information on behalf of a third party. Covered entities should be especially careful not to use or disclose PHI for marketing purposes, which may happen, for example, if PHI is used or disclosed to encourage former COVID-19 patients to donate at a particular blood or plasma donation center.

In the case of health care operations, covered entities must also make reasonable efforts to use or disclose only the minimum amount of PHI necessary for the particular purpose.

On August 19, 2020, the United States District Court for the Northern District of California granted preliminary approval of the class action settlement in In re Facebook Biometric Information Privacy Litigation, 3:15-cv-03747-JD.  If the settlement receives final approval, Facebook would pay $650 million to Illinois class members as compensation for violations of the Illinois Biometric Information Privacy Act (“BIPA”)—a $100 million increase from the settlement proposal that the District Court denied earlier this year.

The case arose from allegations that Facebook violated BIPA by collecting and storing class members’ biometric data in the form of scans of their faces without prior notice or consent.  Facebook harvested the scans in connection with its “Tag Suggestions” program, which looks for and identifies people’s faces in photographs uploaded to Facebook to promote user tagging.  BIPA provides statutory damages of up to $1,000 per negligent violation and up to $5,000 per intentional or reckless violation.  Plaintiffs estimated that millions of Illinois residents were Facebook users whose biometric data had been collected in violation of BIPA.

As the Court recognized, the case was “fiercely litigated” for over five years, with fights over standing, summary judgment, and class certification.  Notably, during the pendency of an interlocutory appeal to the Ninth Circuit, the Illinois Supreme Court largely adopted the District Court’s pro-plaintiff interpretation of BIPA in its 2019 decision in Rosenbach v. Six Flags Entm’t Corp.  Just as the case was about to be set for a jury trial, the parties advised the District Court that they had reached a settlement in principle, pursuant to which Facebook would pay $550 million.  However, on June 4, 2020, the District Court denied the parties’ request for preliminary approval, citing concerns about an unduly steep discount on BIPA’s statutory damages and the sufficiency of notice to class members.

After renegotiating, the parties asked the District Court to grant preliminary approval for a new agreement, pursuant to which Facebook agreed to pay $650 million into a non-reversionary cash fund.  The District Court conducted an evidentiary hearing on July 23, 2020, and thereafter granted preliminary approval.  In doing so, the District Court noted that the additional $100 million “substantially allays the Court’s concerns about the potential inadequacy of payments to class members in light of BIPA’s statutory penalties.”  The District Court also approved the notice to class members, which includes email notice, Facebook news feed notice, publication notice, a settlement website, targeted internet ad campaigns, and CAFA notice.  A final approval hearing is set for January 7, 2021.

The Facebook settlement should serve as a reminder to take BIPA compliance seriously.  While the scale of Facebook’s users makes the settlement newsworthy, the lessons apply to businesses of all sizes—if you collect biometric data from Illinois consumers or employees, you must obtain written consent prior to collection and comply with other obligations.  Failure to do so renders you at risk of a class action for statutory damages and attorneys’ fees.

On August 14, 2020, the California Office of Administrative Law (“OAL”) approved in part and withdrew in part the Regulations regarding the California Consumer Privacy Act (“CCPA”).  While most of the changes are non-substantive, the OAL withdrew certain provisions of the Regulations and resubmitted them to the Attorney General’s Office for further review.  Approved sections went into effect immediately.

Among the more notable provisions withdrawn was Section 999.305(a)(5), which would have required businesses to obtain express consent from consumers before using previously collected information for a materially different purpose.  In its place, businesses must comply with Section 1798.100(b) of the CCPA, which prohibits businesses from “us[ing] personal information collected for additional purposes without providing the consumer with notice consistent with this section.”  Because initial notice can generally be accomplished through an online privacy policy, it appears that updating an online privacy policy may suffice when a business intends to start using previously collected personal information for an additional purpose.

The OAL also made three changes relating to consumers’ opt-out right:  (1) withdrawing a provision that required businesses that substantially interacted with consumers offline to provide notice of the right to opt-out via an offline method; (2) withdrawing a provision that required businesses to make the opt-out process “easy” and to “require minimal steps”; and (3) requiring businesses to title their opt-out page “Do Not Sell My Personal Information” rather than “Do Not Sell My Info.”  While the last change is non-substantive, it is a concrete change that many businesses may need to make.

Finally, while much has been made of the OAL’s withdrawal of a provision stating that businesses may deny a request from an authorized agent who does not submit proof of authorization from the consumer (previously Section 999.326(c)), that change may have little practical effect.  Indeed, Section 999.326(a)(1) still allows businesses to require that a consumer provide the authorized agent with signed permission to submit requests to know or delete.  Similarly, Section 999.315(f) allows businesses to deny a request to opt-out submitted by an authorized agent if the agent cannot provide the consumer’s signed permission.  Accordingly, businesses arguably may still deny requests when they cannot verify that an authorized agent has the consumer’s written permission to submit requests on the consumer’s behalf.

Under California law, the Attorney General’s Office has one year to resubmit the withdrawn sections after further review and possible revision.  Whether the Attorney General’s Office revises and resubmits those provisions will likely be influenced by whether the California Privacy Rights Act passes on the upcoming ballot.

On July 13, 2020, the Federal Trade Commission (FTC) held a workshop titled “Information Security and Financial Institutions: FTC Workshop to Examine Safeguards Rule.” This workshop discussed the proposed amendments to the Gramm-Leach-Bliley Act’s (GLBA) Safeguards Rule, which requires financial institutions to develop, implement, and maintain a comprehensive information security program. The GLBA Safeguards Rule has not been updated since it went into effect in 2003. The workshop explored the cost of information security for financial institutions, the availability of information security services for smaller financial institutions, and other issues raised in comments received in response to the FTC’s notice of proposed rulemaking.

During the workshop, FTC staff provided the following insights into the proposed amendments to the GLBA Safeguards Rule:

  • Designate one qualified individual to be responsible for overseeing the information security program. Although the term Chief Information Security Officer (CISO) is used in the proposed amendments, the FTC staff clarified that the qualified person does not necessarily need to carry the title of a CISO. The FTC staff noted that the necessary qualifications for the responsible individual will likely be dependent on the information security needs of each financial institution.
  • Base the information security program on a written risk assessment that must include certain criteria for determining risk and explain how the program will address those risks. The FTC staff expressly stated that risk assessments are expected to be performed on a routine basis; financial institutions cannot complete a risk assessment once and then never again.
  • Provide security awareness training to personnel. The FTC staff recommended that all employees receive basic security training, but information security personnel should receive more in-depth security training. The FTC staff noted that financial institutions may use a third party service provider to conduct these trainings.
  • Implement an information security program that includes access controls, developing information inventories, implementing secure development practices, conducting audits, implementing secure disposal requirements, developing change management procedures, and monitoring the activity of authorized users. The FTC staff emphasized that it is up to the financial institution to determine how to implement the various requirements and that each financial institution should be free to choose a solution that works best for each financial institution’s respective information security program.
  • Implement encryption and multifactor authentication. The FTC staff indicated their belief that financial institutions should have the flexibility to determine how to implement encryption and multifactor authentication. However, the FTC staff noted that in the event it is not feasible for a financial institution to implement encryption or multifactor authentication, the financial institution should come up with alternative controls that have been reviewed and approved by the qualified individual in charge of the financial institution’s information security program.
  • Financial institutions that maintain information about fewer than 5,000 consumers would be exempted from most of the written requirements. The FTC staff explained that the exemption was drawn by data volume rather than business size, so that small financial institutions with small budgets that nonetheless have access to tens of thousands of consumers’ data are still expected to implement security controls appropriate to the amount of data they collect, not to the size of their business.

The deadline to submit comments on the proposed amendments to the GLBA Safeguards Rule is August 12, 2020. Financial institutions subject to the Safeguards Rule should review their current information security programs in light of the proposed amendments to determine what changes may be needed.

On July 16, 2020, the European Court of Justice (Court) ruled in the “Schrems II” case that one of the most commonly used cross-border data transfer mechanisms between the European Union (EU) and the United States (US), the EU-US Privacy Shield Framework (Privacy Shield), is invalid. The Court reasoned that when European data subjects’ personal data is transferred to a third country, the recipient must afford that data roughly the same level of protection that the personal data is guaranteed to have within the EU by the General Data Protection Regulation (GDPR). That assessment must also take into account the third country’s legal system, including whether and how its public authorities may access the personal data and whether such access is subject to the protections guaranteed within the EU.

The Court found that the surveillance laws in the U.S. allow for the U.S. government to access the personal data of Europeans that is transferred to the U.S. and that the Privacy Shield does not protect Europeans’ personal data from such U.S. government surveillance. Furthermore, the Court found that Europeans are not afforded the right to bring actions in U.S. courts to prevent this type of access as they could in the EU. Therefore, the Court ruled that the adequacy decision that forms the basis for the Privacy Shield is invalid, because the Privacy Shield is not able to offer Europeans an equivalent level of protection as they would be entitled to in the EU. This means those businesses that currently rely on the Privacy Shield, which includes over 5,000 active participants, will need to find an alternative mechanism to transfer personal data from the EU to the US.

By contrast, the Court upheld one of the other mechanisms for transfers to the U.S.—the standard contractual clauses, which Schrems had also challenged. The Court reasoned that although standard contractual clauses do not bind the authorities of third countries, the data exporter and the data importer are both required to verify, prior to the transfer, whether the data importer can afford data subjects appropriate safeguards, enforceable rights, and effective legal remedies. On that basis, the Court found that the standard contractual clauses can protect personal data with roughly the same level of protection that personal data is guaranteed to have under the GDPR.

In a press conference given by the European Commission, Věra Jourová, Vice-President for Values and Transparency, highlighted that the European Commission is working to modernize the standard contractual clauses and that the requirements of this ruling will be incorporated into any future updates of the standard contractual clauses. Jourová also commented that businesses can still rely on binding corporate rules for the transfer of personal data from the EU to the US.

Businesses that are currently Privacy Shield certified should start examining alternative transfer mechanisms. Whether they choose standard contractual clauses or binding corporate rules, businesses that transfer EU data to the U.S. must provide appropriate safeguards, enforceable rights, and effective legal remedies to data subjects whose information they receive.

The Financial Crimes Enforcement Network (“FinCEN”) just issued another Advisory pertaining to two consumer fraud schemes exacerbated by the COVID-19 pandemic. This Advisory focuses on “imposter schemes” and “money mule schemes,” which we discuss below.

This most recent Advisory is the latest in a string of pronouncements relating to the pandemic by FinCEN, which has stated that it regularly will issue such documents. As we have blogged, FinCEN issued an Advisory on May 18 regarding medical scams related to the pandemic, and issued a companion Notice that “provides detailed filing instructions for financial institutions, which will serve as a reference for future COVID-19 advisories.” On April 3, 2020, FinCEN also updated its March 16, 2020 COVID-19 Notice in order to assist “financial institutions in complying with their Bank Secrecy Act (“BSA”) obligations during the COVID-19 pandemic, and announc[ing] a direct contact mechanism for urgent COVID-19-related issues.”

The most recent Advisory again provides a list of potential red flags that FinCEN believes financial institutions should be monitoring for, in order to detect, prevent, and report such suspicious activity. As we previously have commented, although such lists can be helpful to financial institutions, they ultimately may impose de facto heightened due diligence requirements. The risk is that, years from now, after memories of the stressors currently imposed by COVID-19 have faded, some regulators may focus only on perceived historical BSA/AML compliance failures and invoke these lists not merely as efforts by FinCEN to assist financial institutions in deterring crime, but as instances in which FinCEN put financial institutions on notice.

Further, the most recent Advisory suffers from the fact that its list of red flags for imposter schemes is best directed at consumers themselves, rather than at financial institutions offering services to consumers: many of the red flags pertain to anomalies in the communications sent directly by fraudsters to targeted consumer victims – information that financial institutions rarely possess.

On April 30, 2020, U.S. Senators from across multiple committees joined together to announce legislation that would protect consumer privacy rights in the wake of the COVID-19 pandemic. Sen. Roger Wicker (R-MS), Chair of the Senate Committee on Commerce, Science, and Transportation; Sen. John Thune (R-SD), Chair of the Subcommittee on Communications, Technology, Innovation, and the Internet; Sen. Jerry Moran (R-KS), Chair of the Subcommittee on Consumer Protection, Product Safety, Insurance and Data Security; and Sen. Marsha Blackburn (R-TN) plan to introduce a bill that would provide individuals with transparency, choice, and control over the collection and use of their personal health, geolocation, and proximity data, while also holding businesses accountable to individuals if those businesses use personal information in response to COVID-19.

The COVID-19 Consumer Data Protection Act relies on notice and consent to protect personal information. The bill would require disclosures about how personal information will be used, to whom it might be transferred, and how long it would be held. Covered purposes would include tracking the spread, signs, or symptoms of COVID-19; measuring compliance with social distancing guidelines; monitoring compliance with COVID-19 orders or directives issued by federal, state, or local governments; and conducting contact tracing of COVID-19 cases. These disclosures would allow individuals to make informed choices about whether to give express consent for a business to “collect, process, or transfer the covered data of an individual.” Individuals would also have the right to opt out at any time after giving consent, and upon receiving an opt-out request, a business must honor it within 14 days by stopping the collection, processing, or transfer of the personal information or by de-identifying it.

Once personal information has been collected, businesses will be required to “establish, implement, and maintain reasonable administrative, technical, and physical data security policies and practices to protect against risks to the confidentiality, security, and integrity” of the personal information. Furthermore, businesses will be required to delete or de-identify personal information when it is no longer being used for a COVID-19 purpose. Businesses would also be required to issue publicly available reports every 30 days about: the aggregate number of individuals whose data has been collected, processed or transferred; the categories of data that were collected, processed or transferred; the purposes for which data was collected, processed or transferred; and those to whom it was transferred.

The bill contains a number of exemptions, including for: aggregated, de-identified, or publicly available data; information from education records that is already subject to the Family Educational Rights and Privacy Act; health information already subject to the Health Insurance Portability and Accountability Act; and for compliance with legal obligations.

The bill would task the Federal Trade Commission (FTC) with the responsibility to issue “guidelines recommending best practices” for data minimization of personal information being collected and/or processed for a COVID-19 purpose. The FTC and state attorneys general would also be empowered to enforce the new requirements.

Opposition to the bill will likely focus on the failure of the bill to create an individual private right of action to enforce the privacy rights created as well as the inclusion of a preemption clause that would prevent states from adopting, enforcing, or continuing to maintain any law that is “related to the collection, processing, or transfer of covered data” in the bill.

Ballard privacy and data security lawyers Philip Yannella and Greg Szewczyk have authored an article for the Cybersecurity Law Report discussing the privacy and data security issues raised by Zoombombing, including potential first- and third-party liability.  The article is available to subscribers here.

With the ongoing COVID-19 crisis leaving businesses of all sizes concerned about the short- and medium-term future, the intimidating task of considering a liquidation or restructuring is inevitably starting to become a reality.  Although privacy in the bankruptcy context is nothing new—especially in the context of personally identifiable information (“PII”) held by a company—it is an issue that many companies have overlooked.  However, by taking proactive measures, a business can transform the personal data it holds from a reorganization liability into an asset.

Whether a set of PII can be sold is one of the more common ways privacy issues come into play during liquidation and reorganization proceedings.  In 2005, Congress amended the Bankruptcy Code to prohibit sales of PII when the debtor “discloses to an individual a policy prohibiting the transfer of [PII] to persons that are not affiliated with the debtor and if such policy is in effect on the date of the commencement of the case.”  11 U.S.C. § 363(b)(1).  The 2005 amendment defines PII broadly to include an individual’s name, physical address, email address, telephone number, and various types of financial information.

Further, courts have interpreted the amendment as prohibiting PII sales during bankruptcy proceedings unless the privacy policy discloses such potential sales—i.e., a privacy policy silent on such sales impliedly prohibits the sale.  Indeed, even where a privacy policy seems to generally reference the possibility of PII sales during a reorganization, courts have refused to allow the sale without restrictive conditions.  See In re Borders Group, Inc., No. 11-10614 (MG), 2011 Bankr. LEXIS 4606 (Bankr. S.D.N.Y. Sept. 27, 2011).  And importantly, as demonstrated in the high-profile bankruptcies of RadioShack in 2015 and Toysmart in 2000, the conditions courts place on PII sales can be so restrictive that companies are better off paying to destroy the PII rather than include it in the sale.

In addition to privacy policies, companies subject to HIPAA, GLBA, and the new California Consumer Privacy Act would also face additional requirements before a PII sale could be approved as part of a reorganization plan.  Further, apart from privacy, a company’s failure to implement and properly document its information security program could significantly impact the value of its non-PII assets.

Companies are understandably tightening their belts as they try to weather the COVID-19 storm.  However, by taking a fresh look at their privacy and information security programs and making relatively minor changes, companies may be able to turn potential liabilities into assets—and thereby better position themselves to emerge as a going concern.