Data Privacy

New Data Disposal Law in Delaware Requires Action by Impacted Businesses

While the federal government continues its inaction on data security bills pending in Congress, some U.S. states have been busy at work on this issue over the summer. A new Delaware law, H.B. 295, signed into law on July 1, 2014 and effective January 1, 2015, provides a private right of action under which a court may order up to triple damages when a business improperly disposes of personal identifying information at the end of its life cycle. In addition to this private right of action, the Delaware Attorney General may file suit or bring an administrative enforcement proceeding against the offending business if doing so is in the public interest.

Under the law, personal identifying information is defined as:

A consumer’s first name or first initial and last name in combination with any one of the following data elements that relate to the consumer, when either the name or the data elements are not encrypted:

  • his or her signature,
  • full date of birth,
  • social security number,
  • passport number, driver’s license or state identification card number,
  • insurance policy number,
  • financial services account number, bank account number,
  • credit card number, debit card number,
  • any other financial information or
  • confidential health care information, including all information relating to a patient’s health care history, diagnosis, condition, treatment or evaluation obtained from a health care provider who has treated the patient, which explicitly or by implication identifies a particular patient.

Interestingly, this new law exempts from its coverage banks and financial institutions that are merely subject to the Gramm-Leach-Bliley Act, but it exempts health insurers and health care facilities only if they are subject to and in compliance with the Health Insurance Portability and Accountability Act (HIPAA), and credit reporting agencies only if they are subject to and in compliance with the Fair Credit Reporting Act (FCRA).

Given that the HIPAA and FCRA exemptions apply only to entities in compliance with those statutes, we expect plaintiffs’ attorneys to press the private right of action and triple damages in every case where a HIPAA- or FCRA-covered entity fails to properly dispose of personal identifying information, on the theory that the failure itself evidences noncompliance with HIPAA or FCRA, thus canceling the exemption. Note, however, that some courts have refused to allow state law claims of improper data disposal to proceed where they were preempted by federal law. See, e.g., Willey v. JP Morgan Chase, Case No. 09-1397, 2009 U.S. Dist. LEXIS 57826 (S.D.N.Y. July 7, 2009) (dismissing individual and class claims alleging improper data disposal based on state law, finding they were preempted by the FCRA).

The takeaway?  Companies that collect, receive, store or transmit personal identifying information of residents of the state of Delaware (or any of the 30+ states in the U.S. that now have data disposal laws on the books) should examine their data disposal policies and practices to ensure compliance with these legal requirements.  In the event a business is alleged to have violated one of [...]
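How to "properly dispose" of such data is left to the business. One technique often discussed in this context is crypto-shredding: keep each consumer’s records encrypted at rest and destroy the corresponding key at end of life, leaving the ciphertext unrecoverable. The sketch below is a minimal illustration of that idea only, not a statement of what the Delaware law requires; it assumes the third-party cryptography package, and the in-memory dicts are hypothetical stand-ins for a real key vault and datastore.

```python
# Crypto-shredding sketch: records are encrypted at rest with a per-consumer
# key; "disposal" destroys the key, so the ciphertext becomes permanently
# unreadable even if copies of it linger elsewhere.
from cryptography.fernet import Fernet

key_vault: dict[str, bytes] = {}   # consumer_id -> encryption key (hypothetical vault)
datastore: dict[str, bytes] = {}   # consumer_id -> encrypted record (hypothetical store)

def store_record(consumer_id: str, personal_info: str) -> None:
    """Encrypt a consumer's personal identifying information at rest."""
    key = Fernet.generate_key()
    key_vault[consumer_id] = key
    datastore[consumer_id] = Fernet(key).encrypt(personal_info.encode())

def read_record(consumer_id: str) -> str:
    """Decrypt a record for authorized business use."""
    return Fernet(key_vault[consumer_id]).decrypt(datastore[consumer_id]).decode()

def dispose_record(consumer_id: str) -> None:
    """End-of-life disposal: destroying the key makes the data unrecoverable."""
    del key_vault[consumer_id]
    del datastore[consumer_id]  # the ciphertext itself can also be deleted

store_record("c-123", "Jane Doe, DL# 1234567")   # invented example values
print(read_record("c-123"))
dispose_record("c-123")   # after this, the record cannot be recovered
```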

Continue Reading





Article 29 Working Party Defends BCR-P to European Institutions

On 12 June 2014, the Article 29 Data Protection Working Party sent a letter to the President of the European Parliament defending Binding Corporate Rules for Processors (BCR-P) and urging the EU institutions to discuss them in connection with the forthcoming EU General Data Protection Regulation.

In its letter, the Working Party clarifies its views on BCR-P, outlines the safeguards that BCR-P offer and addresses the concerns that have led some to call for dropping BCR-P. The letter suggests that these issues should be covered during future trialogues between the EU Council, the European Commission (both of which received copies of the letter) and the European Parliament.

Background

Binding Corporate Rules (BCR) represent one of the ways that a data controller can overcome the general prohibition, contained in the EU Data Protection Directive (95/46/EC), on cross-border transfers of personal data to countries outside the EEA that do not offer adequate levels of data protection. Broadly, BCR are legally enforceable corporate rules applied by company groups which, upon approval by the relevant national data protection authority, are deemed to ensure sufficient protection for international transfers between group companies.

In December 2011, the European Commission announced that BCR would be updated in the new EU General Data Protection Regulation. Whilst BCR apply only to data controllers, the Working Party is a proponent of BCR-P (which apply similarly to data processors rather than data controllers) and, in June 2012, established a BCR-P framework. In brief, BCR-P permit data processors, on the instruction of data controllers, to forward personal data to their group companies, otherwise known as “sub-processing”. The Working Party has officially permitted companies to apply for BCR-P since January 2013. To date, three international organisations have had BCR-P approved by their national data protection authorities, with a further 10 currently under review.

In Defence of BCR-P

In its letter, the Working Party encloses an explanatory document setting out the main guarantees offered to data controllers, data subjects and data protection authorities generally, relating to:

  • Use of external sub-processors;
  • Conflicts between applicable legislation and BCR-P and/or service agreements, including access by law enforcement authorities;
  • Controllers’ rights;
  • Data subjects’ rights;
  • Processors’ obligations towards data protection authorities; and
  • Implementation of accountability measures.

The Working Party also stresses the high level of protection that BCR-P offer to international transfers of personal data, which, according to the Working Party, represent the “optimal solution” for promoting data protection principles abroad. By contrast, the Working Party suggests that model clauses and Safe Harbour do not offer a comparable level of protection.

In response to calls for the European Parliament to drop BCR-P from future legislation due to a lack of guarantees to frame sub-processing activities, the Working Party clarifies that BCR-P offer greater levels of protection than those currently provided by the European Parliament. Furthermore, the Working Party concludes that dropping BCR-P would create legal uncertainty and represent a loss generally to those organisations with approved BCR-P or those currently [...]

Continue Reading





Supreme Court Prohibits Warrantless Mobile Phone Searches, Underscores Individual Right to Privacy

The Supreme Court of the United States’ recent decision prohibiting warrantless mobile phone searches incident to arrest underscores unique privacy concerns raised by modern technology. The decision has an immediate impact on an individual’s rights under the Fourth Amendment, and may also have an impact on evolving areas of white collar and employment law.

Read the full article.





Article 29 Working Party Publishes Statement on the Risk-Based Approach to Data Protection

On May 30, 2014, the European Union’s Article 29 Data Protection Working Party adopted its “Statement on the role of a risk-based approach in data protection legal frameworks” (WP218). The Working Party, made up of EU member state national data protection authorities, confirmed its support for a risk-based approach in the EU data protection legal framework, particularly in relation to the proposed reform of the current data protection legislation. However, with a view to “setting the record straight,” the Working Party also addresses its concerns as to the interpretation of such an approach and sets out its “key messages” on the issue.

Approaching Risk

In support of the risk-based approach, which broadly calls for compliance obligations proportionate to the risks involved in data processing, the Working Party sets out examples of its application in the current Data Protection Directive (95/46/EC) and the proposed General Data Protection Regulation. The Working Party confirms that the risk-based approach must result in the same level of protection for data subjects, no matter the size of the particular organisation or the amount of data processed. However, the Working Party clarifies that the risk-based approach should not be interpreted as an alternative to established data protection rights, but instead as a “scalable and proportionate approach to compliance.” Consequently, the Working Party accepts that low-risk data processing may involve less stringent obligations on data controllers than comparatively high-risk data processing.

Key Messages

To conclude its views on the risk-based approach, the Working Party establishes 13 key messages – in summary:

  1. Protection of personal data is a fundamental right and any processing should respect that right;
  2. Whatever the level of risk involved, data subjects’ legal rights should be respected;
  3. While the levels of accountability obligations can vary according to the risk of the processing, data controllers should always be able to demonstrate compliance with their data protection obligations;
  4. While fundamental data protection principles relating to data controllers should remain the same whatever the risks posed to data subjects, such principles are still inherently scalable;
  5. Accountability obligations should be varied according to the type and risk of processing involved;
  6. All data controllers should document their processing, although the form of documentation can vary according to the level of risk posed by the processing;
  7. Objective criteria should be used when determining risks which could potentially negatively impact a data subject’s rights, freedoms and interests;
  8. A data subject’s rights and freedoms primarily concern the right to privacy, but also encompass other fundamental rights, such as freedom of speech, thought and movement, the prohibition on discrimination, and the rights to liberty, conscience and religion;
  9. Where specific risks are identified, additional measures should be taken – data protection authorities should be consulted regarding high-risk processing;
  10. While pseudonymising techniques are important safeguards that can be taken into account when assessing compliance, such techniques alone do not justify a reduced regime of accountability obligations;
  11. The risk-based approach should be assessed on a very wide scale, taking into account every potential or actual adverse effect;
  12. The legitimate [...]

Continue Reading




The EU Article 29 Working Party’s Opinion on Privacy and Anonymity: It’s Harder Than You Think

On April 10, 2014, the European Union’s Article 29 Data Protection Working Party adopted “Opinion 05/2014 on Anonymisation Techniques” (WP216). The Working Party, made up of the national data protection authorities of the EU member states, acknowledges that there is no one-size-fits-all solution and that most anonymisation techniques have inherent limitations.

However, the publication of the Opinion is timely, as ever-increasing amounts of data are being captured via devices and networks, stored cheaply and interrogated ever more creatively as  technologies evolve. This wholesale collection and processing of data may provide clear benefits for society, individuals and organisations, but, under EU law, such benefits must be derived lawfully, and that requires respecting the protection of the individual’s personal data and the right to a private life.
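The Opinion’s caution can be made concrete with pseudonymisation, one of the techniques the Working Party reviews: replacing a direct identifier with a keyed hash removes the name from the data, but every record belonging to the same person remains linkable, and anyone holding the key can regenerate the mapping. Below is a minimal sketch using only the Python standard library; the key and field names are illustrative.

```python
# Pseudonymisation via keyed hashing (HMAC). The function is deterministic,
# so the same person always maps to the same pseudonym -- records stay
# linkable to each other, which is the residual risk the Opinion highlights.
import hashlib
import hmac

SECRET_KEY = b"rotate-and-guard-this-key"   # illustrative; never hard-code in practice

def pseudonymise(identifier: str) -> str:
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

records = [
    {"email": "jane@example.com", "condition": "asthma"},
    {"email": "jane@example.com", "condition": "hypertension"},
]
for record in records:
    record["subject"] = pseudonymise(record.pop("email"))

# Both records now carry the same pseudonym: the direct identifier is gone,
# yet the individual's records can still be tied together (and re-identified
# by anyone who holds SECRET_KEY or can enumerate likely inputs).
print(records)
```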

Read the full article.





The California AG’s New Guide on CalOPPA – A Summary for Privacy Pros

Last week, the California Attorney General’s Office (AGO) released a series of recommendations entitled Making Your Privacy Practices Public (Guide) designed to help companies meet the requirements of California’s Online Privacy Protection Act (CalOPPA) and “provide privacy policy statements that are meaningful to consumers.”

As we have previously discussed, CalOPPA requires website operators to disclose (1) how they respond to Do Not Track (DNT) signals from browsers and other mechanisms that express the DNT preference, and (2) whether third parties use or may use the site to track (i.e., collect personally identifiable information about) individual California residents “over time and across third party websites.” Since the disclosure requirements became law, however, there has been considerable confusion among companies about how exactly to comply, and some maintain that, despite W3C efforts, there is still no industry-wide accepted definition of what it means to “respond” to DNT signals. As a result, the AGO engaged in an outreach process, bringing stakeholders together to comment on draft recommendations over a period of several months, culminating in the AGO’s publication of the final Guide.

The Guide is just that – a guide – rather than a set of binding requirements. However, the recommendations in the Guide do seem to present a road map for how companies might steer clear of an AGO enforcement action in this area. As a result, privacy professionals may want to match up the following key recommendations from the Guide with their existing privacy policies, to confirm that they align or to determine whether adjustments are necessary and appropriate:

  • Scope of the Policy:  Explain the scope of the policy, such as whether it covers online or offline content, as well as other entities such as subsidiaries.
  • Availability:  Make the policy “conspicuous,” which means:
    • for websites, put a link on every page that collects personally identifiable information (PII);
    • for mobile apps that collect PII, put a link at the point of download and from within the app – for example, a link accessible from the “about,” “information” or “settings” page.
  • Do Not Track:
    • Prominently label the section of your policy regarding online tracking, for example: “California Do Not Track Disclosures”.
    • Describe how you respond to a browser’s Do Not Track signal or similar mechanisms within your privacy policy instead of merely providing a link to another website (see the sketch after this list); when evaluating how to “describe” your response, consider:
      • Do you treat users whose browsers express the DNT signal differently from those without one?
      • Do you collect PII about browsing activities over time and across third-party sites if you receive the DNT signal?  If so, describe your uses of the PII.
    • If you choose to link to an online program rather than describe your own response, provide the link with a general description of what the program does.
  • Third Party Tracking:
    • Disclose whether third parties are or may be collecting PII.
    • When drafting the disclosure [...]
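On the mechanics behind the Do Not Track items above: the DNT preference reaches a website as a simple HTTP request header, DNT: 1, so “responding” to the signal ultimately comes down to a branch in server-side code, and the privacy policy should describe whichever branch the code actually takes. A minimal sketch of such a check follows; the flag names are hypothetical, and real handlers will differ by site.

```python
# A site "responds" to Do Not Track by inspecting the DNT request header,
# which is "1" when the user has expressed the preference, and adjusting
# its tracking behavior accordingly.
def tracking_decisions(headers: dict[str, str]) -> dict[str, bool]:
    dnt_enabled = headers.get("DNT") == "1"
    return {
        # Hypothetical flags a real request handler might act on:
        "set_tracking_cookies": not dnt_enabled,
        "share_with_third_party_trackers": not dnt_enabled,
    }

print(tracking_decisions({"DNT": "1"}))   # preference expressed: tracking off
print(tracking_decisions({}))             # no preference expressed
```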

Continue Reading




Incorporating Risk Analysis Into Your HIPAA Strategy

Proactive healthcare organizations that build a robust privacy and security compliance program – one that would stand up well to federal HIPAA audits – are generally rewarded when it comes to data breach avoidance and remediation. An important piece of that equation is performing consistent risk analyses.

McDermott partner Edward Zacharias was interviewed by HealthITSecurity to discuss these topics and more.

Read the full interview.





The New Normal: Big Data Comes of Age

On May 1, 2014, the White House released two reports addressing the public policy implications of the proliferation of big data. Rather than trying to slow the accumulation of data or place barriers on its use in analytic endeavors, the reports assert that big data is the “new normal” and encourage the development of policy initiatives and legal frameworks that foster innovation, promote the exchange of information and support public policy goals, while at the same time limiting harm to individuals and society. This Special Report provides an overview of the two reports, puts their conclusions and recommendations into context, and extracts key takeaways for businesses grappling with what these reports – and this “new normal” – mean for them.

Read the full article.





Thinking Outside the HIPAA Box

On Wednesday, May 7, the Federal Trade Commission (FTC) held the third of its Spring Seminars on emerging consumer privacy issues. This session focused on consumer-generated health information (CHI). CHI is data generated by consumers’ use of the Internet and mobile apps that relates to an individual’s health. The “H” in CHI defies easy definition but likely includes, at minimum, data generated from internet or mobile app activity related to seeking information about specific conditions, disease/medical condition management tools, support and shared experiences through online communities, or tools for tracking diet, exercise or other lifestyle data.

In the United States, many consumers (mistakenly) believe that all of their health-related information is protected, at the federal level, by the Health Insurance Portability and Accountability Act (HIPAA). HIPAA does offer broad privacy protections for health-related information, but only for identifiable health information (referred to as “Protected Health Information” or “PHI”) received by or on behalf of a “covered entity” or a third party working for a covered entity. Covered entities are, essentially, health plans and health care providers who engage in reimbursement transactions with health plans. When HIPAA was enacted in 1996, PHI was the primary type of health information, but CHI, which is generally not also PHI, has changed that. As FTC Commissioner Julie Brill noted in her opening remarks, CHI is “health data stored outside the HIPAA silo.”

Without the limitations imposed by HIPAA, online service providers and mobile apps generally can (except where state law requires otherwise) treat CHI like the other digital non-health data they collect from consumers. As a result, the FTC expressed concern that CHI may be aggregated, shared and linked in ways that consumers did not foresee and may not understand.

The panelists at the FTC seminar discussed the difficulty of defining CHI, and whether and how it differs from other kinds of data collected from consumers. One panelist noted that whether a consumer considers his or her CHI sensitive is highly individualized. For example, are the heart rate and exercise data collected by mobile fitness apps sensitive? Would the answer change if these data points were linked with other data points that began to suggest other health or wellness indicators, such as weight? Would the answer change if that linked data was used to predict socioeconomic status – which is often linked to certain health, wellness and lifestyle indicators – or used to inform risk rating or direct-to-consumer targeted advertising?

Panelists also discussed the larger and more general question of how to define privacy in a digital economy and how to balance privacy with the recognized benefits of data aggregation and data sharing.  These questions are compounded by the difficulty of describing data as being anonymized or de-identified – foundational principles in most privacy frameworks – because the quality of being “identifiable” in the digital economy may depend on the proximity of a piece of data to other pieces of data.
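The identifiability point is the classic linkage problem: a dataset stripped of names can often be re-identified by joining its remaining quasi-identifiers, such as ZIP code and birth date, against another dataset that retains names. A toy sketch of that join, using invented data:

```python
# Toy linkage attack: the "de-identified" health records carry no names, but
# joining on quasi-identifiers (zip + birthdate) against a public roster that
# does carry names re-identifies the individual. All data here is invented.
deidentified_health = [
    {"zip": "19901", "birthdate": "1980-03-14", "condition": "diabetes"},
]
public_roster = [
    {"name": "J. Smith", "zip": "19901", "birthdate": "1980-03-14"},
    {"name": "A. Jones", "zip": "19901", "birthdate": "1975-07-02"},
]

for health in deidentified_health:
    for person in public_roster:
        if (health["zip"], health["birthdate"]) == (person["zip"], person["birthdate"]):
            print(person["name"], "->", health["condition"])   # J. Smith -> diabetes
```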

Though the “how” and “what” of additional [...]

Continue Reading





Disclosures Need Not Contain Customers’ Actual Names to Violate the Video Privacy Protection Act, Rules Hulu Court

In the latest of a string of victories for the plaintiffs in the Video Privacy Protection Act (VPPA) class action litigation against Hulu, LLC, the U.S. District Court for the Northern District of California ruled that Hulu’s sharing of certain customer information with Facebook, Inc. may have violated the VPPA, even though Hulu did not disclose the actual names of its customers.  The ruling leaves Hulu potentially liable for the disclosures under the VPPA and opens the door to similar claims against other providers of online content.

The decision by U.S. Magistrate Judge Laurel Beeler addressed Hulu’s argument on summary judgment that it could not have violated the VPPA because Hulu “disclosed only anonymous user IDs and never linked the user IDs to identifying data such as a person’s name or address.” The court rejected Hulu’s argument, stating that “[Hulu’s] position paints too bright a line.” Noting that the purpose of the VPPA was to prevent the disclosure of information “that identifies a specific person and ties that person to particular videos that the person watched,” the court held that liability turned on whether Hulu’s disclosures were “merely an anonymized ID” or “closer to linking identified persons to the videos they watched.”

Under this principle, the court held that Hulu’s disclosures to comScore, a metrics company that Hulu employed to analyze its viewership for programming and advertising purposes, did not violate the VPPA. According to the court, Hulu’s disclosures to comScore included anonymized user IDs and other information that could theoretically have been used to identify particular individuals and their viewing choices, but the plaintiffs had no evidence that comScore actually used the information in that way. As the evidence did not “suggest any linking of a specific, identified person and his video habits,” the court held that the disclosures to comScore did not support a claim under the VPPA.

But the court held that Hulu’s disclosures to Facebook had potentially violated the VPPA. Those disclosures included certain cookies that Hulu sent to Facebook in order to load a Facebook “Like” button on users’ web browsers. The court held that the cookies Hulu sent to Facebook to accomplish this task “together reveal information about what the Hulu user watched and who the Hulu user is on Facebook.” The court noted that this disclosure was “not merely the transmission of a unique, anonymous ID”; rather, it was “information that identifies the Hulu user’s actual identity on Facebook” as well as the video that the user was watching. Thus, the court held, Hulu’s disclosures to Facebook potentially violated the VPPA.
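Mechanically, the court’s description tracks how embedded third-party widgets work in general: when a page loads a widget from a third party’s servers, the browser sends that third party its cookie for the third-party domain (identifying the user there) together with a Referer header carrying the embedding page’s URL (which can name the video being watched). The sketch below is a schematic of that general pattern, not Hulu’s or Facebook’s actual implementation; all values are invented.

```python
# Schematic of what a third party receives when its widget is embedded on a
# video watch page: the Cookie header identifies the user to the third party,
# and the Referer header reveals the page -- and thus the title -- being viewed.
def widget_request(page_url: str, third_party_cookie: str) -> dict:
    """Approximates the browser's request for an embedded widget."""
    return {
        "url": "https://social.example/widget/like",   # hypothetical widget endpoint
        "headers": {
            "Referer": page_url,            # e.g., a /watch/<title> URL
            "Cookie": third_party_cookie,   # the user's identity on that service
        },
    }

request = widget_request(
    "https://video.example/watch/romeo-and-juliet",
    "user_id=invented-account-id",
)
# Combined, the two headers tie an identified account to a specific video.
print(request["headers"])
```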

The Court’s ruling that disclosure of seemingly anonymous IDs can potentially lead to liability under the VPPA should cause companies that are potentially covered by the law to reexamine the ways in which they provide data to third parties.  Such companies should carefully consider not only what information is disclosed but also how the recipients of that data can reasonably be expected [...]

Continue Reading




