In April 2019, the California Assembly Privacy and Consumer Protection Committee rejected a proposal known commonly as the “Privacy for All Act” (AB-1760), which among other things would have provided a private right of action for all violations of the California Consumer Privacy Act (CCPA). The rejection of AB-1760 was a blow to consumer privacy advocates. A similar measure, SB-561, would also have provided a private right of action for all privacy violations. That bill has also been defeated, meaning that the CCPA’s private right of action provisions will not be expanded this year.

Following the speedy enactment of the California Consumer Privacy Act (CCPA or Act) in June 2018, business and consumer advocates alike have been pressuring California lawmakers to clarify the many ambiguities raised by the Act’s sweeping requirements. California lawmakers recently responded to these calls for greater clarity by proposing a slate of amendments to address some of the more controversial provisions of the CCPA, including the definition of “personal information,” requirements regarding information sharing, and the scope of industry exemptions.

After a quiet winter, the Department of Health and Human Services’ Office for Civil Rights (OCR) has revived with the spring, issuing a set of frequently asked questions (FAQs) and two recent announcements.

The FAQs address the situation where an individual requests that a covered entity disclose protected health information (“PHI”) to an app. The covered entity must generally comply with the request, even if the app is unsecured. It may be prudent to advise the individual of concerns about the app, but the individual has the right under HIPAA to access most PHI held by a covered entity and to direct where the covered entity should send that information.

In the case of unauthorized access to PHI that has been transmitted to an app, the liability of a covered entity, or of an electronic health records (“EHR”) developer acting as a business associate of the covered entity, will be determined by the relationship with the app. If, for example, there is no relationship with the app developer and the app does not perform functions on behalf of the covered entity, the FAQs provide comfort that neither the covered entity nor the EHR developer will be exposed to penalties under HIPAA in the event of unauthorized access to the transmitted PHI.

Conversely, if a relationship exists (for example, the app developer serves as a business associate of the EHR developer, or the app performs functions for the covered entity), the EHR developer and the covered entity may be exposed to liability. Covered entities should consider building appropriate contractual protections into their business associate agreements to safeguard against such liabilities.

Within two weeks of publishing the new FAQs, the OCR issued a notification that it is reducing the maximum penalties that will apply to certain types of HIPAA violations.  For penalty purposes, the OCR breaks HIPAA violations into four categories based on the severity of the violation.  Prior to the new guidance, only the minimum penalty per violation increased with severity.  The maximum penalty that could be imposed was the same for each category: $1.5 million for any type of violation per year.  Under the new guidance, the maximum remains at $1.5 million for the most serious category of violations, but is lowered significantly for other types of violations.  Going forward, the maximum penalty will be:

  • Where the entity did not know, and by exercising reasonable diligence would not have known, of the violation: $25,000 per type of violation per year.
  • Where the violation arises from reasonable cause: $100,000 per type of violation per year.
  • Where the violation arises from willful neglect and is corrected: $250,000 per type of violation per year.
  • Where the violation arises from willful neglect and is not corrected: $1.5 million per type of violation per year.

However, the reduction in the maximum penalties does not necessarily translate to a significant reduction in what the OCR will seek in enforcement actions, at least not with respect to the resolution agreements that the OCR has historically announced.  Those resolution agreements typically pertain to situations where the OCR finds serious, uncorrected violations, often of more than one type.

One week after this notification, the OCR seemed to signal that it will continue to seek large settlement amounts when it followed the notice of penalty reductions with an announcement of a $3 million settlement with Touchstone Medical Imaging, LLC for violations that exposed the PHI of more than 300,000 patients.  Although the OCR reached this resolution agreement before issuing its notice of the penalty reductions, the reduction likely would have made no difference with regard to the Touchstone breach: the OCR found several HIPAA violations, including issues with the timing and thoroughness of the health care provider’s investigation of the incident, which in turn delayed its provision of notice of the breach to affected individuals.

The practical changes that will come from the reduction in penalties remain to be seen.  Covered entities and their business associates under HIPAA may take comfort that relatively minor violations that are quickly addressed will not result in multimillion-dollar liabilities, although, based on past settlement announcements, it seems unlikely that the OCR would have enforced the HIPAA requirements so harshly in any event.  On the other hand, it appears as if the OCR will continue to seek substantial monetary penalties for significant, uncorrected breaches.

Denmark’s Data Protection Authority (DPA) ruled on April 11, 2019 that affirmative consent is required when companies record customer telephone calls. Because voice recordings constitute personal data under the European Union’s (EU) General Data Protection Regulation (GDPR), international companies that communicate by telephone with EU customers will need to take steps to ensure GDPR compliance.

In this case, Denmark’s largest telecommunications company, TDC A/S, provided disclosures to its customers that calls may be recorded for training purposes, but the company offered no mechanism for customers to opt in or opt out of the recording. During one such call, the customer requested that the call not be recorded, but the service agent said there was no way to turn off the recording. The Denmark DPA rejected the company’s arguments that its recording practices served a legitimate interest, such as the improvement of its customer service, and concluded that the company’s telephone recording practices violated the GDPR.

Recently, legislators in Texas introduced two bills relating to consumer privacy and data protection: H.B. No. 4518, the Texas Consumer Privacy Act (“Texas CPA”) and H.B. No. 4390, the Texas Privacy Protection Act (“TPPA”). These bills bear a strong resemblance to the California Consumer Privacy Act (the “California CPA”), and would lay the groundwork for extensive administrative schemes protecting consumers’ rights to their personal information.

Texas CPA

The Texas CPA bears a strong similarity to the California CPA. The Texas CPA, which, if adopted, would take effect September 1, 2020, would apply to companies that do business in Texas, collect consumers’ personal information, and meet at least one of the following thresholds:

  • Derive at least 50% of their annual revenue from selling consumers’ personal information; or
  • Exceed $25 million in gross annual revenue (with that amount subject to adjustment by the Texas Attorney General every two years); or
  • Buy, sell, or receive the personal information of at least 50,000 consumers, households, or devices for commercial purposes.
The Texas CPA would also apply to entities owned by companies that would be subject to the law. Similar to the California CPA, the Texas CPA contains express provisions governing rulemaking, implementation, and enforcement of the law. Notably, the legislation highlights various consumer rights, including (but not limited to):
  • A consumer’s right to disclosure, from the business, of the personal information the business collected.
  • A consumer’s right to deletion of the personal information that the business collected (with some limited, specific exceptions).
  • A consumer’s right to opt out of the sale of his or her personal information.

Utah Governor Gary Herbert is expected to sign a new privacy law in the coming weeks that will make his state the first to protect private electronic data stored with third-party providers from government access without a warrant.

Under the legislation passed unanimously by the Utah Legislature earlier this month, law enforcement agencies need a warrant to obtain information about an individual from wireless communications providers, email platforms, search engine providers, or social media companies.

While much of the focus over the past two years has been on laws protecting consumer privacy rights, protecting private information from disclosure to law enforcement has also generated attention. Traditionally, the general rule at both the federal and state levels has been that law enforcement agencies can access information held by third-party providers because individuals have no reasonable expectation of privacy in personal information they share with third parties.

On March 20, 2019, the Supreme Court declined to address the adequacy of an $8.5 million Google privacy class action settlement and instead remanded to a lower court to determine whether the class action plaintiffs had standing to assert a claim under the Stored Communications Act (“SCA”).  The Court’s holding serves as a reminder that, despite the recent trend of finding standing for privacy violations, standing can still be an open issue.

Frank v. Gaos arose out of Google’s use of “referrer headers,” whereby Google allegedly transmitted users’ search terms to the servers that hosted the webpages the users selected as a result of the searches.  Plaintiffs alleged that Google’s transmission of users’ search terms violated the SCA, which prohibits an entity providing an electronic communication service to the public from “knowingly divulg[ing] to any person or entity the contents of a communication while in electronic storage by that service.”  After lengthy motion practice, Google agreed to pay $8.5 million, most of which would be distributed to six non-profit cy pres recipients selected by class counsel and Google to “promote public awareness and education, and/or to support research, development, and initiatives, related to protecting privacy on the Internet.”  Five class members objected to the settlement on several grounds relating to fairness.

During the pendency of the class action and settlement, the Supreme Court issued its 2016 ruling in Spokeo, Inc. v. Robins, which held that Article III standing requires a concrete injury even in the context of a statutory violation.  However, when the objecting class members’ appeal reached the Supreme Court, no party made any arguments relating to standing.  Nonetheless, the Solicitor General filed a brief as amicus curiae urging the Supreme Court to vacate and remand for the lower courts to address standing under the Spokeo standard.  The Supreme Court ordered supplemental briefing on the issue and ultimately remanded for the lower courts to do just that, emphasizing that its opinion should not be interpreted “as expressing a view on any particular resolution of the standing question.”  Justice Thomas filed a lone dissent to the per curiam opinion, arguing that “[b]y alleging the violation of ‘private dut[ies] owed personally’ to them ‘as individuals,’ the plaintiffs established standing.”

Over recent years, the trend among lower courts and state supreme courts has been to find standing for privacy violations even where the plaintiff did not sustain actual damage beyond a violation of his or her statutory right.  Although the Court did not express a view on whether standing exists for such a claim under the SCA, its holding demonstrates—to plaintiffs, defendants, state legislatures, and Congress—that the issue of statutory standing in privacy cases has not been resolved.

The FTC has proposed amendments to its 2003 Safeguards Rule and its 2000 Privacy Rule, applicable to financial institutions under the Gramm-Leach-Bliley Act (GLBA). The proposed changes are informed by the FTC’s enforcement experience and are intended to keep pace with technological developments.

Following numerous privacy complaints, the Bavarian State Office for Data Protection Supervision (BayLDA) recently conducted a random audit of 40 companies and found widespread problems with their cookie disclosures. The purpose of the audit was to determine whether website users were able to obtain transparent information regarding the use and tracking of their information by third-party providers. Ultimately, the BayLDA found that all 40 companies were in violation of the GDPR.

Based on its findings, the BayLDA announced that it is considering fining these companies under GDPR provisions governing website cookie and tracking practices. Since none of the audited companies was technology-focused, the BayLDA’s findings should serve as a warning to all companies, no matter their industry. Below, we highlight the main takeaways from the BayLDA audit.

All Companies Are At Risk

The BayLDA did not discriminate when it selected companies to audit. While major technology companies have been at the forefront of these compliance discussions, the BayLDA audit shows that no company is safe and that all companies are potentially subject to oversight and enforcement by Data Protection Authorities. This audit should be a warning to all companies that have yet to comply with GDPR.

Cookie Banners Beware

All companies should be especially aware of the BayLDA findings regarding the use of cookie banners. The audit found that most cookie banners merely interfered with the user-friendliness of the websites’ services and were wholly ineffective in protecting users from unknown tracking.

Transparency Requires More Than Common Naming Techniques

The BayLDA findings also call for transparency on a more granular level. In particular, disclosures must be more specific as to the kinds of cookies being used. The BayLDA suggests identifying the actual cookie utilized, rather than using broad descriptors such as “performance” or “analytic” cookies. Many companies already provide this level of granular disclosure, but many do not.

Affirmative Consent of Users Is Not Automatic

One of the more problematic findings reported by the BayLDA is that the majority of companies automatically dropped tracking cookies on users as soon as the user visited a company’s website. In the view of the BayLDA, the timing of the cookie drop means that no audited company obtained active consent from users prior to the cookie drop. Rather, user tracking began before the user could make an informed decision as to the collection and processing of his or her data. Even if continued browsing of a website constitutes active consent (an issue that has not been clearly decided), such consent cannot reasonably be inferred if tracking begins before the user continues browsing.  Meanwhile, the German Data Protection Authority has advised that it will release guidance on cookies and consent in the future.

The rules governing the use of cookies, and cookie disclosures more generally, are among the more complex and unsettled areas of European privacy law. While the BayLDA’s audit does not rise to the level of formal guidance or regulation, the findings do point toward an emerging consensus, given the respect the BayLDA commands among EU data privacy regulators. If nothing else, US companies subject to the GDPR should pay careful attention to the findings and consider modest changes to their policies while formal guidance and regulation develop.

As tax season winds on, the W-2 form scam has emerged as one of the most dangerous and common phishing email schemes of the season.

W-2s are information-rich documents containing an employee’s name, Social Security number, address, salary, and other personal information. Each year, cyber criminals target these documents in order to sell the sensitive information contained therein and to submit fraudulent tax returns in hopes of defrauding the IRS.