Data Privacy

Is There an End in Sight for EU Data Protection Reform?

On 5 November 2014, Peter Hustinx, the European Data Protection Supervisor (EDPS), together with Germany’s Federal Data Protection Commissioner, Andrea Voßhoff, held a panel discussion on the state of play and prospects for EU data protection reform.

Although participants identified a number of key outstanding issues to be resolved prior to the conclusion of the reform process, there was some optimism that such issues could be overcome, and the process completed, before the end of 2015.

Background

The EDPS is an independent supervisory authority, appointed by the European Parliament and the Council, whose mission is to protect personal information and privacy and to promote and supervise data protection in the European Union’s institutions and bodies.  The role of the EDPS includes, inter alia, advising the European Commission, the European Parliament and the Council on privacy legislation and policies, and working with other data protection authorities (DPAs) to promote consistent data protection throughout Europe.

The proposed data protection regulation is intended to replace the 1995 Data Protection Directive (95/46/EC) (the Directive) and aims not only to give individuals more control over their personal data, but also to make it easier for companies to work across borders by harmonising laws across all EU Member States.  The European Parliament and its Civil Liberties, Justice and Home Affairs (LIBE) Committee have driven progress on the new data protection laws, but there has been frustration at the Council of Ministers over its slow progress.  Following the European Parliament’s vote in March 2014 in favour of the new data protection laws, the next steps include the full Ordinary Legislative Procedure (co-decision procedure), which requires the European Parliament and the Council to reach agreement together.

The panel comprised institutional representatives and key figures involved in the EU Data Protection Reform Package, including Stefano Mura (Head of the Department for International Affairs at Italy’s Ministry of Justice), Jan Albrecht MEP (Vice-Chair and Rapporteur of the European Parliament LIBE Committee) and Isabelle Falque-Pierrotin (President of the CNIL and Chair of the Article 29 Working Party).  The purpose of the discussion was to consider the outstanding issues and next steps needed to finalise the proposals on EU data protection reform, particularly in the context of the recent CJEU rulings on data retention and the right to be forgotten.

Key Messages

The key points raised during the panel discussion included:

  • There is optimism that the reform process will be completed in the next year subject to resolving outstanding issues, such as:
    • Whether public authority processing should be included in the proposed data protection regulation – Andrea Voßhoff commented that this issue was being considered by the Council of Ministers Committee in relation to the introduction of a clause preventing the lowering of standards by national laws.  Stefano Mura added that while there is a desire for both a uniform approach between the EU Member States and a right for Member States to regulate their own public sectors, a [...]


When Seeking Cyber Coverage, Preparation is Key

In 2014, major data breaches were reported at retailers, restaurants, online marketplaces, software companies, financial institutions and a government agency, among others.  According to the nonprofit Privacy Rights Clearinghouse, 567 million records have been compromised since 2006.  Companies with data at risk should consider purchasing so-called cybersecurity insurance to help them weather the storms created by assaults on their information infrastructure.  A company’s insurance broker and insurance lawyer can be of significant help in procuring insurance that meets the company’s needs.

As an additional benefit, preparation for the cybersecurity insurance underwriting process itself likely will decrease the risk of a debilitating cyber incident.  The underwriting process for cybersecurity insurance is focused on the system that a company employs to protect its sensitive data, and can be detailed and exhaustive.  Like other insurance carriers, cybersecurity insurance carriers use the underwriting process to investigate prospective policyholders and ascertain the risks the carriers are being asked to insure.  Before applying for cybersecurity insurance, companies should perform due diligence on their information systems and correct as many potential risks as possible before entering the underwriting process.

Applicants for cybersecurity insurance may expect to answer questions about prior data breaches, information-technology vendors, antivirus and security protocols, and the types of data in their custody.  Carriers might also ask about “continuity plans” for the business, the company’s security or privacy policies, whether those policies are the product of competent legal advice, whether the company’s networks can be accessed remotely and, if so, what security measures are in place.  The investigation might even extend to a company’s employment practices, such as password maintenance and whether departing employees’ network access is cancelled prior to termination.  If a company has custody of private health information, carriers might delve into the company’s compliance with the Health Insurance Portability and Accountability Act of 1996.  Anything that makes a company more or less at risk for a data breach is fair game in the cybersecurity underwriting process.

Due diligence and corrective action prior to approaching an insurance company should yield three related results.  First, it should reduce the company’s risk of a data breach.  Because the insurance carriers are focused on what makes a company a larger or smaller risk to underwrite, companies can use carriers’ underwriting questions as a roadmap to improving the security of their information-technology systems.  Second, it should make the company more attractive to the prospective insurance company.  Insurance companies obviously prefer policyholders that do not present substantial risk of claims.  A company’s ability to present its systems as safe and secure will give a carrier a greater degree of comfort in reviewing and approving the application for insurance.  Finally, it should reduce the company’s premium for cybersecurity insurance.  Premium rates have a simple, direct relationship with risk.  As a policyholder’s risk profile increases, so too does the premium.  Shoring up gaps in a company’s security profile therefore should pay dividends in lower insurance costs.

Companies with sensitive data in their care should investigate options for cybersecurity insurance.  In [...]


You Are Invited: Join FTC Chairwoman Ramirez on November 12 at Our Menlo Park Office for a Conversation on Privacy and Technology

Will you be in the Bay Area on November 12?  You are invited to join Federal Trade Commission (FTC) Chairwoman Edith Ramirez at McDermott’s office in Menlo Park, California, for a conversation on privacy and technology.  The FTC is celebrating its 100th anniversary, and this will be Chairwoman Ramirez’s first visit to the Bay Area since her appointment.  Come ask the tough questions, join the lively conversation and mark this important visit as Chairwoman Ramirez talks about all things privacy and technology with some of the top tech teams in the country.  Please RSVP, as space is limited.  A complimentary networking reception with Chairwoman Ramirez will immediately follow the program.

To register, please click here.





Are You Monitoring Your French Employees? Make Sure You Have Registered That Activity with the CNIL!

French employers must declare employee monitoring to the French Data Protection Authority (CNIL) in advance if they want to use evidence obtained from that monitoring in court.  The use of an employee’s company mailbox for personal purposes is tolerated under French law, when reasonable.  Where that use is considered abusive, however, it may constitute misconduct for which the employer can impose sanctions.

Employers generally use monitoring software to discourage abuse and to establish evidence of it.  Such software may be lawful provided the employer follows the rules stipulated by the French Labor Code and the French Data Protection Act to ensure the protection of personal data.  In particular, the employer must inform and consult the works council, inform the employees affected by the software, and file a formal declaration of the proposed monitoring activities with the CNIL – except where a Data Protection Correspondent (Correspondant Informatique et Libertés) has been appointed.

These requirements must be met before the monitoring software is implemented.  If these steps are not fulfilled, the software and the monitoring activity remain illicit, and the employer cannot rely on evidence obtained through the software to establish the employee’s misconduct.

The requirement to comply with French data privacy law was reinforced by the French Social Supreme Court in a case where an employer’s software, which monitored company mailbox traffic, had detected that an employee had sent or received 1,228 personal messages.  The employer’s declaration of the software to the CNIL, however, had been filed only after the employee’s dismissal process had begun.

The Social Supreme Court ruled that the employer could not use the data collected and, more generally, that any data collected by an automated personal data processing tool before its declaration to the CNIL constitutes an illicit means of evidence.

This decision marks the first time that the French Social Supreme Court has officially ruled that prior declaration to the CNIL is a necessary condition for the validity of evidence in this context.  The conclusion and rationale are similar to those of a 2013 decision in which the French Supreme Commercial Court held a sale of client files null and void for failure to comply with CNIL registration obligations.  Together, these rulings demonstrate once again that data protection is becoming a key issue in all legal areas, including employment law.





California Continues to Lead with New Legislation Impacting Privacy and Security

At the end of September, California Governor Edmund G. Brown, Jr., approved six bills designed to enhance and expand California’s privacy laws.  These new laws are scheduled to take effect in 2015 and 2016.  Businesses should be mindful of these new laws and their respective requirements when dealing with personal information and when responding to data breaches.

Expansion of Protection for California Residents’ Personal Information – AB 1710

Under current law, any business that owns or licenses certain personal information about a California resident must implement reasonable security measures to protect the information and, in the event of a data or system breach, must notify affected persons.  See Cal. Civil Code §§ 1798.81.5-1798.83.  Current law also prohibits individuals and entities from posting, displaying, or printing an individual’s social security number, or requiring individuals to use or transmit their social security number, unless certain requirements are met.  See Cal. Civil Code § 1798.85.

The bill makes three notable changes to these laws.  First, in addition to businesses that own and license personal information, businesses that maintain personal information must comply with the law’s security and notification requirements.  Second, in the event of a security breach, businesses now must not only notify affected persons, but also provide “appropriate identity theft prevention and mitigation services” to the affected persons at no cost for at least 12 months, if the breach exposed or may have exposed specified personal information.  Third, in addition to the current restrictions on the use of social security numbers, individuals and entities now also may not sell, advertise to sell, or offer to sell any individual’s social security number.

Expansion of Constructive Invasion of Privacy Liability – AB 2306

Under current law, a person can be liable for constructive invasion of privacy if the person uses a visual or auditory enhancing device and attempts to capture any type of visual image, sound recording, or other physical impression of the person in a personal or familial activity under circumstances in which the person had a reasonable expectation of privacy.  See Cal. Civil Code § 1708.8.

The bill expands the reach of the current law by removing the limitation requiring the use of a “visual or auditory enhancing device” and imposing liability if the person uses any device to capture a visual image, sound recording, or other physical impression of a person in a personal or familial activity under circumstances in which the person had a reasonable expectation of privacy.

The law will also continue to impose liability on those who acquire the image, sound recording, or physical impression of the other person, knowing that it was unlawfully obtained.  Those found liable under the law may be subject to treble damages, punitive damages, disgorgement of profits and civil fines.

Protection of Personal Images and Videos (“Revenge Porn” Liability) – AB 2643

Assembly Bill 2643 creates a private right of action against a person who intentionally distributes by any means, without consent, material that exposes a person’s intimate body parts or the [...]


Article 29 Working Party Discusses the Right to be Forgotten

On 18 September 2014, the European Union’s Article 29 Data Protection Working Party published a press release outlining its recent plenary session discussions on the so-called “right to be forgotten” or “de-listed.”

In the press release, the Working Party confirms that search engines, as data controllers, are obliged to consider requests to be de-listed, and it establishes a “tool box” for European data protection authorities to ensure a common approach to handling complaints arising from refusals to de-list.

Background

The Working Party, made up of EU member state national data protection authorities, is an independent advisory body on data protection and privacy, set up under Article 29 of the Data Protection Directive (95/46/EC) (DPD) in order to contribute to the DPD’s uniform application.

The purpose of its latest plenary session held on 16 and 17 September 2014 was to discuss the aftermath of the European Court of Justice’s (ECJ) May 2014 ruling which recognised an EU citizen’s right to have the results of searches conducted against their name and containing their personal information removed where such information was inaccurate, inadequate, irrelevant or excessive for the purposes of data processing.

Key Messages

The Working Party has acknowledged that there is high public demand for the right to be forgotten, based on the number of complaints received by European data protection authorities relating to refusals by search engines to de-list since the ECJ ruling.

The Working Party has agreed that there is a need for a uniform approach to the handling of de-listing complaints.  It has therefore proposed that:

  • It is necessary to put in place a network of dedicated contact persons within European data protection authorities to develop common case-handling criteria; and
  • Such a network will provide data protection authorities with a record of decisions taken on complaints and a dashboard to assist in reviewing similar, new or more difficult cases.

Going forward, the Working Party will continue to review how search engines comply with the ECJ’s ruling; it has already held a consultation process with search engines and media companies over the summer.





GPEN Publishes Privacy Sweep Results

On 10 September 2014, the Global Privacy Enforcement Network (GPEN) published the results of its privacy enforcement survey or “sweep” carried out earlier in 2014 with respect to popular mobile apps.  The results of the sweep are likely to lead to future initiatives by data protection authorities to protect personal information submitted to mobile apps.

The purpose of the sweep was to determine the transparency of the privacy practices of some 1,211 mobile apps and involved the participation of 26 data protection authorities across the globe.  The results of the sweep suggest that a high proportion of the apps downloaded did not sufficiently explain how consumers’ personal information would be collected and used.

Background

GPEN was established in 2010 on the recommendation of the Organisation for Economic Co-operation and Development.  GPEN aims to create cooperation between data protection regulators and authorities throughout the world in order to strengthen personal privacy globally.  GPEN is currently made up of 51 data protection authorities across some 39 jurisdictions.

Over the course of a week in May 2014, GPEN’s “sweepers” – made up of 26 data protection authorities across 19 jurisdictions, including the UK Information Commissioner’s Office (ICO) – participated in the survey by downloading and briefly interacting with the most popular apps released by developers in their respective jurisdictions, in an attempt to recreate a typical consumer’s experience.  In particular, GPEN intended the sweep to increase public and commercial awareness of data protection rights and responsibilities, as well as to identify specific high-level issues that may become the focus of future enforcement actions and initiatives.

Sweep Results

The key negative findings of the GPEN sweep include:

  • 85 percent of apps failed to clearly explain how personal information would be processed.
  • 59 percent of apps did not clearly indicate basic privacy information (with 11 percent failing to include any privacy information whatsoever).
  • 31 percent of apps were excessive in their permission requests to access personal information.
  • 43 percent of the apps had not sufficiently tailored their privacy communications for the mobile app platform – often instead relying on full version privacy policies found on websites.

However, the sweep results also highlighted a number of examples of best practices for app developers, including:

  • Many apps provided clear, easy-to-read and concise explanations about exactly what information would be collected, how and when it would be used and, in some instances, explained specifically and clearly what would not be done with the information collected.
  • Some apps provided links to the privacy policies of their advertising partners and opt-out elections in respect of analytic devices.
  • There were good examples of privacy policies specifically tailored to the app platform, successfully making use of just-in-time notifications (warning users when personal information was about to be collected or used), pop-ups and layered information, allowing for consumers to obtain more detailed information if required.

Many of the GPEN members are expected to take further action following the sweep results.  For its part, the UK ICO has commented that in light [...]


France About to Embark on a Cookies Sweep Day

Impending sweep day to verify compliance with guidelines on cookies

During the week of September 15–19, 2014, France’s privacy regulator, the Commission Nationale de l’Informatique et des Libertés (CNIL), is organizing a “cookies sweep day” to examine compliance with its guidelines on cookies and other online trackers.

Starting in October 2014, the CNIL will also be conducting onsite and remote inspections to verify compliance with its guidelines on cookies.

Depending on the findings of the sweep and inspections, the CNIL may issue warnings or financial sanctions to non-compliant websites and applications.

Investigations gaining momentum

France is not the only country stepping up its data privacy efforts.  Sweeps parallel to the CNIL’s September 2014 operation will be undertaken simultaneously by data protection authorities across the European Union.  The purpose of the coordinated action is to compare websites’ practices on the information given to internet users and the methods used to obtain their consent for cookies.

Nor is this the first time such a sweep has been organized in France.  In May 2013, the CNIL joined 19 counterparts worldwide in an audit of the 2,180 most visited websites and applications.  In that operation, known as “Internet Sweep Day”, the CNIL examined the compliance of 250 frequently visited websites and found that 99 percent of websites visited by French internet users collect personal information.  Of those that provided information on their data privacy policy, a considerable number did not render it easily accessible, clearly articulated or even written in French.

Compliance made simpler through CNIL guidelines

EU Directive 2002/58 on Privacy and Electronic Communications imposes an obligation to obtain prior consent before placing or accessing cookies and similar technologies on web users’ devices, an obligation incorporated into French law by Article 32-II of the French Data Protection Act.

Not all cookies require prior consent by internet users.  Exempt are cookies used “for the sole purpose of carrying out the transmission of a communication over an electronic communications network” and those that are “strictly necessary for the provision of an information service explicitly requested by the subscriber or user.”

For those cookies that require prior consent, the CNIL will verify how consent is obtained.  Under the CNIL guidelines, consent may be obtained either through an actual click or by the user’s continued navigation within the site while a banner informing him or her of the website’s use of cookies remains displayed.
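For website operators, the consent logic described above can be illustrated with a short, purely hypothetical sketch.  The class and cookie names below are invented for illustration, and this is not CNIL-endorsed or legally sufficient code; it simply models refusing to set non-exempt cookies until consent is obtained by click or continued navigation, and honouring withdrawal of consent:

```typescript
// Hypothetical sketch of consent-gated cookie setting, illustrating the
// CNIL guidance described above.  Class and cookie names are invented.

type ConsentSignal = "click" | "continued-navigation";

// Cookies exempt from prior consent (transmission-only or strictly
// necessary cookies under Directive 2002/58).
const EXEMPT_COOKIES = new Set(["session-id", "load-balancer"]);

class CookieGate {
  private consent: ConsentSignal | null = null;

  // Record consent given by an actual click, or by the user's continued
  // navigation while the information banner is displayed.
  recordConsent(signal: ConsentSignal): void {
    this.consent = signal;
  }

  // Users must be able to withdraw consent at any time.
  withdrawConsent(): void {
    this.consent = null;
  }

  // Non-exempt cookies (e.g., advertising) may be set only after consent.
  canSet(cookieName: string): boolean {
    return EXEMPT_COOKIES.has(cookieName) || this.consent !== null;
  }
}

const gate = new CookieGate();
console.log(gate.canSet("session-id"));  // exempt: always allowed
console.log(gate.canSet("ad-tracker"));  // no consent yet: blocked
gate.recordConsent("continued-navigation");
console.log(gate.canSet("ad-tracker"));  // consent recorded: allowed
```

The key design point, reflecting the guidelines, is that the gate blocks individual advertising cookies rather than access to the service itself, since users who refuse cookies must still be able to use the site.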

Website owners can rely on tools made available by the CNIL to ensure their compliance with the cookie requirements.  In particular, a set of guidelines released by the CNIL in December 2013 explains how to obtain consent for the use of cookies and other online trackers in compliance with EU and French data protection requirements.

Under the CNIL guidelines, owners of websites may not force internet users to accept cookies.  Instead, the users must be able to block advertising cookies and still use the relevant service.  Internet users can withdraw their consent at any time, and cookies have a [...]


Wearable Technologies Are Here To Stay: Here’s How the Workplace Can Prepare

More than a decade ago, “dual use” devices (i.e., one device used for both work and personal reasons) began creeping into workplaces around the globe.  Some employees insisted on bringing fancy new smartphones from home to replace the company-issued clunker, and while many employers resisted at first, dual use devices quickly became so popular that allowing them became inevitable, or at least necessary for employee recruitment and retention, not to mention the cost savings that could be achieved by having employees buy their own devices.  Because of that early resistance, however, many HR and IT professionals found themselves scrambling in a reactive fashion to address the issues these devices can raise in the workplace after the devices were already prevalent.  Today, most companies have robust policies and procedures to address the risks presented by dual use devices, setting clear rules for addressing privacy, security, protection of trade secrets, records retention and legal holds, as well as for preventing harassment, complying with the National Labor Relations Act (NLRA), protecting the company’s relationships and reputation, and more.

In 2014, a new trend is developing in the workplace: wearable technologies.  The lesson to be learned from the dual use device experience of the past decade is that companies should consider taking proactive steps now to identify the risks presented by allowing wearables at work, and develop a strategy to integrate them into the workplace in a way that maximizes employee engagement but minimizes corporate risk.

An effective integration strategy will depend on the particular industry, business needs, geographic location and corporate culture, of course.  The basic rule of thumb from a legal standpoint, however, is that although wearables present a new technology frontier, the old rules still apply.  This means that companies will need to consider issues of privacy, security, protection of trade secrets, records retention, legal holds and workplace laws like the NLRA, the Fair Labor Standards Act, laws prohibiting harassment and discrimination, and more.

Employers evaluating use of these technologies should consider two angles.  First, some companies may want to introduce wearables into the workplace for their own legitimate business purposes, such as monitoring fatigue of workers in safety-sensitive positions, facilitating productivity or creating efficiencies that make business operations run more smoothly.  Second, some companies may want to consider allowing “dual use” or even just “personal use” wearables in the workplace.

In either case, companies should consider the following as part of an integration plan:

  • Identify a specific business-use case;
  • Consider the potential for any related privacy and security risks;
  • Identify how to mitigate those risks;
  • Consider incidental impacts and compliance issues – for instance, how the technologies impact the existing policies on records retention, anti-harassment, labor relations and more;
  • Build policies that clearly define the rules of the road;
  • Train employees on the policies;
  • Deploy the technology; and
  • Review the program after six or 12 months to confirm the original purpose is being served and whether any issues have emerged that should be addressed.

In other words, employers will need to run through [...]


Processing Personal Data in Russia? Consider These Changes to Russian Law and How They May Impact Your Business

Changes Impacting Businesses that Process Personal Data in Russia

On July 21, 2014, a new law, Federal Law No. 242-FZ (the Database Law), was adopted in Russia, introducing amendments to the existing Federal Law “On personal data” and to the existing Federal Law “On information, information technologies and protection of information.”  The Database Law requires companies to store and process the personal data of Russian nationals in databases located in Russia.  At a minimum, its practical effect is that companies operating in Russia that collect, receive, store or transmit (“process”) personal data of natural persons in Russia will be required to place servers in Russia if they plan to continue doing business in that market.  This would include, for example, retailers, restaurants, cloud service providers, social networks and companies operating in the transportation, banking and health care spheres.  Importantly, while the Database Law is not scheduled to come into force until September 1, 2016, a new bill was introduced on September 1, 2014, to move that date up to January 1, 2015.  The transition period is designed to give companies time to adjust to the new Database Law and decide whether to build up local infrastructure in Russia, find a partner with such infrastructure in Russia, or cease processing the information of Russian nationals.  If the bill filed on September 1 becomes law, however, that transition period will be substantially shortened, and businesses operating in Russia will need to act fast to comply by January 1.

Some mass media in Russia have interpreted provisions of the Database Law as banning the processing of Russian nationals’ personal data abroad.  However, this is not written explicitly into the law, and until such an interpretation is confirmed by the competent Russian authorities, it will remain an open question.  There is hope that the lawmakers’ intent was to give a much-needed boost to the Russian IT and telecom industry, rather than to prohibit the processing of personal data abroad.  If that hope is confirmed, then so long as companies operating in Russia ensure that they process the personal data of Russian nationals in databases physically located in Russia, they should also be able to process this information abroad, subject to compliance with cross-border transfer requirements.

The other novelty of the Database Law is that it grants the Russian data protection authority (DPA) the power to block access to information resources that process information in breach of Russian laws.  Importantly, the Database Law provides that the blocking authority applies irrespective of the location of the offending company or whether it is registered in Russia.  However, the DPA can initiate the blocking procedure only on the basis of a court judgment.  Based on that judgment, the DPA will then be able to require a hosting provider to undertake steps to eliminate the infringements.  For example, the hosting provider must inform the owner of the information resource that it must eliminate the infringement, or the hosting [...]

