On July 13, 2020, the Federal Trade Commission (FTC) held a workshop titled “Information Security and Financial Institutions: FTC Workshop to Examine Safeguards Rule.” This workshop discussed the proposed amendments to the Gramm-Leach-Bliley Act’s (GLBA) Safeguards Rule, which requires financial institutions to develop, implement, and maintain a comprehensive information security program. The GLBA Safeguards Rule has not been updated since it went into effect in 2003. The workshop explored the cost of information security for financial institutions, the availability of information security services for smaller financial institutions, and other issues raised in comments received in response to the FTC’s notice of proposed rulemaking.

During the workshop, FTC staff provided the following insights into the proposed amendments to the GLBA Safeguards Rule:

  • Designate one qualified individual to be responsible for overseeing the information security program. Although the term Chief Information Security Officer (CISO) is used in the proposed amendments, the FTC staff clarified that the qualified individual does not necessarily need to carry the title of CISO. The FTC staff noted that the necessary qualifications for the responsible individual will likely depend on the information security needs of each financial institution.
  • Base the information security program on a written risk assessment that must include certain criteria for determining risk and describe how the information security program will address those risks. The FTC staff expressly stated that there is an expectation that risk assessments are to be done on a routine basis; financial institutions cannot complete a risk assessment once and then never revisit it.
  • Provide security awareness training to personnel. The FTC staff recommended that all employees receive basic security training, but information security personnel should receive more in-depth security training. The FTC staff noted that financial institutions may use a third party service provider to conduct these trainings.
  • Implement an information security program that includes access controls, developing information inventories, implementing secure development practices, conducting audits, implementing secure disposal requirements, developing change management procedures, and monitoring the activity of authorized users. The FTC staff emphasized that it is up to each financial institution to determine how to implement the various requirements and to choose the solution that works best for its own information security program.
  • Implement encryption and multifactor authentication. The FTC staff indicated their belief that financial institutions should have the flexibility to determine how to implement encryption and multifactor authentication. However, the FTC staff noted that in the event it is not feasible for a financial institution to implement encryption or multifactor authentication, the financial institution should come up with alternative controls that have been reviewed and approved by the qualified individual in charge of the financial institution’s information security program.
  • Exempt financial institutions that maintain information about fewer than 5,000 consumers from most of the written requirements. The FTC staff explained that the exemption was drafted so that small financial institutions with small budgets that nonetheless have access to tens of thousands of consumers’ data are still expected to implement security controls appropriate to the amount of data they collect, not necessarily to the size of their business.

The deadline to submit comments about the proposed amendments to the GLBA Safeguards Rule is August 12, 2020. Financial institutions that are subject to the GLBA Safeguards Rule should review their current information security program in light of the proposed amendments to determine how any changes may affect their information security programs.

On July 16, 2020, the European Court of Justice (Court) ruled in the “Schrems II” case that one of the most commonly used cross-border data transfer mechanisms between the European Union (EU) and the United States (US), the EU-US Privacy Shield Framework (Privacy Shield), is invalid. The Court reasoned that when transferring European data subjects’ personal data to a third country, the business in the third country must be able to protect this personal data with roughly the same level of protection that the personal data is guaranteed to have within the EU under the General Data Protection Regulation (GDPR). However, the Court said there must also be an assessment of how the third country’s legal system and public authorities may access the personal data and whether this access affords the protections guaranteed within the EU.

The Court found that U.S. surveillance laws allow the U.S. government to access the personal data of Europeans that is transferred to the U.S. and that the Privacy Shield does not protect Europeans’ personal data from such U.S. government surveillance. Furthermore, the Court found that Europeans are not afforded the right to bring actions in U.S. courts to prevent this type of access as they could in the EU. Therefore, the Court ruled that the adequacy decision that forms the basis for the Privacy Shield is invalid, because the Privacy Shield cannot offer Europeans a level of protection equivalent to that to which they would be entitled in the EU. This means those businesses that currently rely on the Privacy Shield, which includes over 5,000 active participants, will need to find an alternative mechanism to transfer personal data from the EU to the US.

By contrast, the Court upheld one of the other mechanisms of transfers to the U.S.—the standard contractual clauses, which Schrems had also challenged. The Court reasoned that while standard contractual clauses do not bind the authorities of third countries—and therefore do not suffer from the same deficiencies as the Privacy Shield—the data exporter and the data importer are both required to verify, prior to the transfer, whether the data importer can afford data subjects appropriate safeguards, enforceable rights, and effective legal remedies. On that basis, the Court found that the standard contractual clauses adequately protect personal data with roughly the same level of protection that personal data is guaranteed to have under the GDPR.

In a press conference given by the European Commission, Věra Jourová, Vice-President for Values and Transparency, highlighted that the European Commission is working to modernize the standard contractual clauses and that the requirements of this ruling will be incorporated into any future updates of the standard contractual clauses. Jourová also commented that businesses can still rely on binding corporate rules for the transfer of personal data from the EU to the US.

Businesses that are currently Privacy Shield certified should start examining different transfer mechanisms as an alternative to the Privacy Shield. Whether they choose standard contractual clauses or binding corporate rules, businesses that transfer EU data to the U.S. must afford appropriate safeguards, enforceable rights, and effective legal remedies to the data subjects whose information they receive.

The Financial Crimes Enforcement Network (“FinCEN”) just issued another Advisory pertaining to two consumer fraud schemes exacerbated by the COVID-19 pandemic. This Advisory focuses on “imposter schemes” and “money mule schemes,” which we discuss below.

This most recent Advisory is the latest in a string of pronouncements relating to the pandemic by FinCEN, which has stated that it regularly will issue such documents. As we have blogged, FinCEN issued an Advisory on May 18 regarding medical scams related to the pandemic, and issued a companion Notice that “provides detailed filing instructions for financial institutions, which will serve as a reference for future COVID-19 advisories.” On April 3, 2020, FinCEN also updated its March 16, 2020 COVID-19 Notice in order to assist “financial institutions in complying with their Bank Secrecy Act (“BSA”) obligations during the COVID-19 pandemic, and announc[ing] a direct contact mechanism for urgent COVID-19-related issues.”

The most recent Advisory again provides a list of potential red flags that FinCEN believes financial institutions should be monitoring for in order to detect, prevent, and report such suspicious activity. As we previously have commented, although such lists can be helpful to financial institutions, they ultimately may impose de facto heightened due diligence requirements. The risk is that, years from now, after memories of the stressors currently imposed by COVID-19 have faded, some regulators may focus only on perceived historical BSA/AML compliance failures and will invoke these lists not merely as efforts by FinCEN to assist financial institutions in deterring crime, but as instances in which FinCEN put financial institutions on notice.

Further, the most recent Advisory suffers from the fact that its list of red flags for imposter schemes is best directed at consumers themselves, rather than at financial institutions offering services to consumers: many of the red flags pertain to anomalies in the communications sent directly by fraudsters to targeted consumer victims – information that financial institutions rarely possess.

On April 30th, U.S. Senators from across multiple committees joined together to announce legislation that would protect consumer privacy rights in the wake of the COVID-19 pandemic. Sen. Roger Wicker (R-MS), Chair of the Senate Committee on Commerce, Science, and Transportation; Sen. John Thune (R-SD), Chair of the Subcommittee on Communications, Technology, Innovation, and the Internet; Sen. Jerry Moran (R-KS), Chair of the Subcommittee on Consumer Protection, Product Safety, Insurance and Data Security; and Sen. Marsha Blackburn (R-TN) plan to introduce a bill that would provide individuals with transparency, choice, and control over the collection and use of their personal health, geolocation, and proximity data, while also holding businesses accountable to individuals if those businesses use personal information in response to COVID-19.

The COVID-19 Consumer Data Protection Act relies on notice and consent to protect personal information. The bill would require disclosures about how personal information will be used, to whom it might be transferred, and for how long it would be held. Covered purposes would include tracking the spread, signs, or symptoms of COVID-19; measuring compliance with social distancing guidelines; monitoring compliance with COVID-19 orders or directives issued by federal, state, or local governments; and conducting contact tracing of COVID-19 cases. These disclosures would allow individuals to make informed choices about whether to give express consent for a business to “collect, process, or transfer the covered data of an individual.” Individuals would also have the right to opt out at any time after giving consent, and upon receiving an opt-out request, a business must honor it within 14 days by stopping any such collection, processing, or transferring of the personal information or by de-identifying the personal information.

Once personal information has been collected, businesses will be required to “establish, implement, and maintain reasonable administrative, technical, and physical data security policies and practices to protect against risks to the confidentiality, security, and integrity” of the personal information. Furthermore, businesses will be required to delete or de-identify personal information when it is no longer being used for a COVID-19 purpose. Businesses would also be required to issue publicly available reports every 30 days about: the aggregate number of individuals whose data has been collected, processed or transferred; the categories of data that were collected, processed or transferred; the purposes for which data was collected, processed or transferred; and those to whom it was transferred.

The bill contains a number of exemptions, including for: aggregated, de-identified, or publicly available data; information from education records that is already subject to the Family Educational Rights and Privacy Act; health information already subject to the Health Insurance Portability and Accountability Act; and for compliance with legal obligations.

The bill would task the Federal Trade Commission (FTC) with the responsibility to issue “guidelines recommending best practices” for data minimization of personal information being collected and/or processed for a COVID-19 purpose. The FTC and state attorneys general would also be empowered to enforce the new requirements.

Opposition to the bill will likely focus on its failure to create an individual private right of action to enforce the privacy rights it creates, as well as on the inclusion of a preemption clause that would prevent states from adopting, enforcing, or continuing to maintain any law “related to the collection, processing, or transfer of covered data.”

Ballard privacy and data security lawyers Philip Yannella and Greg Szewczyk have authored an article for the Cybersecurity Law Report discussing the privacy and data security issues raised by Zoombombing, including potential first- and third-party liability. The article is available to subscribers here.

With the ongoing COVID-19 crisis leaving businesses of all sizes concerned about the short- and medium-term future, the intimidating task of considering a liquidation or restructuring is inevitably starting to become a reality.  Although privacy in the bankruptcy context is nothing new—especially in the context of personally identifiable information (“PII”) held by a company—it is an issue that many companies have overlooked.  However, by taking proactive measures, a business can transform the personal data it holds from a reorganization liability into an asset.

Whether a set of PII can be sold is one of the more common ways privacy issues come into play during liquidation and reorganization proceedings.  In 2005, Congress amended the Bankruptcy Code to prohibit sales of PII when the debtor “discloses to an individual a policy prohibiting the transfer of [PII] to persons that are not affiliated with the debtor and if such policy is in effect on the date of the commencement of the case.”  11 U.S.C. § 363(b)(1).  The 2005 amendment defines PII broadly to include an individual’s name, physical address, email address, telephone number, and various types of financial information.

Further, courts have interpreted the amendment as prohibiting PII sales during bankruptcy proceedings unless the privacy policy discloses such potential sales—i.e., a privacy policy silent on such sales impliedly prohibits them.  Indeed, even where a privacy policy seems to generally reference the possibility of PII sales during a reorganization, courts have refused to allow the sale without restrictive conditions.  See In re Borders Group, Inc., No. 11-10614 (MG), 2011 Bankr. LEXIS 4606 (Bankr. S.D.N.Y. Sept. 27, 2011).  And importantly, as demonstrated in the high-profile bankruptcies of RadioShack in 2015 and Toysmart in 2000, the conditions courts place on PII sales can be so restrictive that companies are better off paying to destroy the PII rather than including it in the sale.

In addition to privacy policies, companies subject to HIPAA, GLBA, and the new California Consumer Privacy Act would also face additional requirements before a PII sale could be approved as part of a reorganization plan.  Further, apart from privacy, a company’s failure to implement and properly document its information security program could significantly impact the value of its non-PII assets.

Companies are understandably tightening their belts as they try to weather the COVID-19 storm.  However, by taking a fresh look at their privacy and information security programs and making relatively minor changes, companies may be able to turn potential liabilities into assets—and therefore better position themselves to emerge as a going concern.

In light of COVID-19, many organizations are taking advantage of free video conferencing capabilities offered by Zoom. Almost overnight, Zoom has become one of the most popular video conferencing services among businesses and schools. Daily Zoom users have skyrocketed from 10 million in December 2019 to 200 million in March 2020.

Businesses subject to the California Consumer Privacy Act (“CCPA”) that have begun exploring the possibility of collecting data from visitors to their facilities to track potential coronavirus exposure and to allow/deny entry must take into consideration the fact that, by doing so, they would almost certainly be collecting data that would constitute personal information under the CCPA. For businesses subject to the CCPA, the question arises as to whether such a practice is permissible.  

As an initial matter, businesses should ensure that they have provided adequate notice of the collection and usage.  Depending on the nature of the business, that notice could be made through their online privacy policy or in-store (or facility) signage.  If the business already collects this type of personal information and is using it for a new purpose related to the coronavirus, Section 999.305(a)(5) of the California Attorney General’s proposed regulations may require the business to directly notify such individuals and obtain consent for the materially different use of the personal information.

Even with the proper notice, businesses must also consider what they will do if facility visitors seek to exercise their deletion rights—and whether deleting such information renders any such screening program dangerously flawed.  The CCPA provides nine exceptions that allow a business to “deny” a request for deletion.  However, the CCPA does not include exceptions for public health crises or emergencies.  Further, although it may depend on the locality, this type of usage would likely not constitute “complying with a legal obligation,” and therefore would not fall under the exception in Cal. Civ. Code § 1798.105(d)(8).

Accordingly, if a business wishes to deny a request for deletion and stay within the bounds of the CCPA, it must interpret another exception as encompassing using personal information to ensure the safety of employees and visitors and to curb the spread of a global pandemic.  One possibility is that screening individuals constitutes an internal use that is “solely internal” and “reasonably aligned with the expectations of the consumer based on the consumer’s relationship with the business.”  Cal. Civ. Code § 1798.105(d)(7).  Similarly, it could constitute an internal use “in a lawful manner that is compatible with the context in which the consumer provided the information.”  Cal. Civ. Code § 1798.105(d)(9).  While both of these exceptions could likely be read broadly enough to allow a colorable argument, both also require that the use be strictly internal.  To the extent a business may use such information externally—such as in conjunction with governmental or health agencies when determining potential contamination connections—the exceptions may not apply.

Another possibility is that screening individuals could fall under the “detect[ing] security incidents” exception, which is not limited to strictly internal use.  Cal. Civ. Code § 1798.105(d)(2).  The security incidents exception traditionally applies to information used for information security and anti-fraud purposes.  However, “security incident” is not defined in the CCPA, and it is therefore not statutorily limited to that context.  Businesses could thus take the position that detecting visitors with coronavirus amounts to detecting a security incident.

Given the current crisis, it seems highly unlikely that the California Attorney General’s Office would focus its resources on businesses that are using information to try to prevent the spread of coronavirus—so long as businesses are not profiting from the information they are collecting.  Further, enforcement is not set to commence until July 1, 2020.  Nonetheless, businesses should still be trying to ensure that their practices during this crisis comply with applicable laws, including the CCPA.  While none of the CCPA’s deletion exceptions directly fits using personal information to screen for coronavirus, they do provide some cover for businesses that feel such steps are necessary to ensure the safety of their employees and patrons.  So long as businesses are not using this data for other reasons, they likely have a defensible position in the unlikely event that the California Attorney General investigates the practice.

Health care providers, health plans, and others who are subject to HIPAA are sure to have questions about when they may disclose information about individuals who have contracted, or been exposed to, Coronavirus (COVID-19).

To address these questions, the Office for Civil Rights (OCR) of the U.S. Department of Health and Human Services has issued guidance.  First, it published a bulletin reminding us that the privacy rules of HIPAA continue to apply in an emergency while identifying when the rules allow for the responsible use and disclosure of protected health information in the case of a serious contagion.  OCR supplemented that guidance with a second bulletin and an announcement that provide relief from certain requirements to hospitals and telemedicine providers.

The First Bulletin:  Basic HIPAA Guidance

The threshold question under HIPAA is whether HIPAA applies at all. It is important to remember that HIPAA’s privacy rules extend only to covered entities (health plans, health care clearinghouses, and most health care providers) and their business associates. If an employee notifies his or her employer that the employee is self-quarantining because he or she has tested positive for the virus, the employer would not be subject to HIPAA’s requirements with regard to that information. But if an employer finds out that an employee has the virus from the employer’s health plan, that information would be subject to HIPAA.

Even if HIPAA does not apply, its requirements may serve as a useful touchstone for how to handle personally identifiable information in difficult situations.

Under HIPAA, an individual’s protected health information (PHI) may be disclosed without the individual’s authorization in various circumstances, including:

  • to providers for the treatment of patients;
  • to appropriate authorities engaged in public health activities;
  • to individuals at risk for contracting or spreading the virus (if permitted by other applicable laws);
  • to an individual’s friends and family members involved in the individual’s care (with the individual’s verbal consent or, often, tacit permission);
  • to a person in a position to prevent or lessen a serious and imminent threat to the health and safety of an individual or the public (consistent with other applicable laws and standards for ethical conduct).

Thus, information may be disclosed to the Centers for Disease Control and Prevention and to state and local health departments that are collecting information about the spread of the virus, and HIPAA will not prevent reasonable and appropriate action to alert individuals who have been exposed to the virus.

However, covered entities still need to be mindful of HIPAA’s requirements to safeguard PHI from inappropriate uses and disclosures. Covered entities and business associates must continue to take care to use and disclose only the minimum amount of PHI necessary and to verify the identity and, where appropriate, authority of individuals making inquiries. In view of the attention that the virus is receiving, particular care should be taken in communications with the media.

The Second Bulletin:  Relief for Hospitals

Effective March 15, certain hospitals will not be subject to penalty or sanction under HIPAA if they fail to comply with the following HIPAA requirements:

  • obtaining a patient’s agreement to speak with family members or friends involved in the patient’s care. See 45 CFR 164.510(b).
  • honoring a request to opt out of the facility directory. See 45 CFR 164.510(a).
  • distributing a notice of privacy practices. See 45 CFR 164.520.
  • addressing a patient’s request for privacy restrictions. See 45 CFR 164.522(a).
  • addressing a patient’s request for confidential communications. See 45 CFR 164.522(b).

This waiver is limited in scope and duration.  It extends only to hospitals that have instituted a disaster protocol and that are located in an emergency area identified in the HHS Secretary’s January 31, 2020 public health emergency declaration.  The waiver extends only up to 72 hours from the time a hospital implements its disaster protocol.

The Announcement:  Relief for Telemedicine Providers

Effective March 17, OCR will not impose penalties on telemedicine providers who, in good faith, communicate with patients through any non-public facing communication product.  The policy applies to video and audio products and to communications about all telemedicine issues, not only issues pertaining to COVID-19.  Thus, a provider could video chat with a patient about a sprained ankle on Apple FaceTime, Facebook Messenger video chat, Google Hangouts video, Skype, or a similar service.  Providers should enable all available encryption and privacy modes when using these applications and are encouraged to notify patients that the use of such applications introduces certain privacy risks.

The relief does not extend to public facing applications, such as Facebook.

Providers may seek out services that aim to be HIPAA-compliant from vendors that will enter into business associate agreements.  However, OCR will not impose penalties for “the lack of a BAA with video communication vendors or any other noncompliance with the HIPAA Rules that relates to the good faith provision of telehealth services during the COVID-19 nationwide public health emergency.”

Absent a specific exception, individuals and entities that are subject to HIPAA must comply with its privacy and security requirements.  Those requirements include provisions that allow for the proper use and disclosure of protected health information in a number of ways relevant to the current public health emergency.  OCR has provided enforcement relief to telemedicine providers and certain hospitals for a limited range of HIPAA violations.  Covered entities and business associates under HIPAA should watch for additional guidance, and should be mindful that the current state of emergency will end at some, as yet undefined, date in the future and with it, the specific relief offered by OCR will also likely end.