Cybersecurity

Court of Justice of the European Union Says Safe Harbor Is No Longer Safe

Earlier today, the Court of Justice of the European Union (CJEU) announced its determination that the U.S.-EU Safe Harbor program is no longer a “safe” (i.e., legally valid) means for transferring personal data of EU residents from the European Union to the United States.

The CJEU determined that the European Commission’s 2000 decision (Safe Harbor Decision) validating the Safe Harbor program did not and “cannot eliminate or even reduce the powers” available to the data protection authority (DPA) of each EU member country. Specifically, the CJEU opinion states that a DPA can determine for itself whether the Safe Harbor program provides an “adequate” level of personal data protection (i.e., “a level of protection of fundamental rights and freedoms that is essentially equivalent to that guaranteed within the European Union” as required by the EU Data Protection Directive (95/46/EC)).

The CJEU based its decision invalidating the Safe Harbor Decision in part on its determination that the U.S. government conducts “indiscriminate surveillance and interception carried out … on a large scale.”

The plaintiff in the case that gave rise to the CJEU opinion, Maximilian Schrems (see background below), issued his first public statement praising the CJEU for a decision that “clarifies that mass surveillance violates our fundamental rights.”

Schrems also made reference to the need for “reasonable legal redress,” referring to the U.S. Congress’ Judicial Redress Act of 2015. The Judicial Redress Act, which has bipartisan support, would allow EU residents to bring civil actions in U.S. courts to address “unlawful disclosures of records maintained by an [U.S. government] agency.”

Edward Snowden also hit the Twittersphere with “Congratulations, @MaxSchrems. You’ve changed the world for the better.”

Background

Today’s CJEU opinion invalidating the Safe Harbor program follows the September 23, 2015, opinion from the advocate general (AG) to the CJEU in Schrems v. Data Protection Commissioner.

In June 2013, Maximilian Schrems, an Austrian student, filed a complaint with the Irish DPA. Schrems’ complaint related to the transfer of his personal data collected through his use of Facebook. Schrems’ Facebook data was transferred by Facebook Ireland to Facebook USA under the Safe Harbor program. The core claim in Schrems’ complaint is that the Safe Harbor program did not adequately protect his personal data, because Facebook USA is subject to U.S. government surveillance under the PRISM program.

The Irish DPA rejected Schrems’ complaint because Facebook was certified under the Safe Harbor Program. Schrems appealed to the High Court of Ireland, arguing that the Irish (or any other country’s) DPA has a duty to protect EU citizens against privacy violations, like access to their personal data as part of U.S. government surveillance. Since Schrems’ appeal relates to EU law (not solely Irish law), the Irish High Court referred Schrems’ appeal [...]

Continue Reading





Amendment to the Personal Information Protection Act Passed in the National Assembly July 6, 2015

On July 6, 2015, the Korean National Assembly passed a bill containing several amendments to the Personal Information Protection Act (PIPA). This bill (the Amendment Bill) combines major provisions from nine earlier bills – one introduced in 2013 and eight proposed in 2014 following the massive data breach at three major credit card companies in January 2014 (the Credit Card Company Data Breach). Although the amended version of the PIPA (the Amended Act) will take effect upon its promulgation (date yet to be determined), most of the provisions that significantly affect the obligations and responsibilities of data handlers are scheduled to take effect either one year after the Amended Act’s promulgation or on January 1, 2016. For timely compliance, companies processing customer or employee data should monitor the respective effective dates of the provisions of the Amended Act that apply to them.

1. Significance of the Amendment

The PIPA was adopted in 2011 to, among other things, protect the privacy of individuals and their personal information from unlawful collection, leakage, appropriation and misuse. Even after the PIPA’s enactment, however, large-scale data breaches remained common, and the Credit Card Company Data Breach in 2014 was the final straw that prompted calls for stricter data protection and privacy regulations across the board, raising awareness of the significance of data protection and security and of the potential for serious risks. The Amendment Bill keeps pace with the stricter rules of the recently amended Utilization and Protection of Credit Information Act.

More specifically, the Amendment Bill extends stronger protections to individuals affected by data breaches by providing for punitive damages and statutory damages. Further, heavier penalties are imposed on those who violate certain provisions of the PIPA, and illegal proceeds generated from such violations are subject to forfeiture and collection. Whereas the current version of the PIPA provides for the recovery of damages where an individual’s personal information is stolen, lost, leaked, falsified or damaged, the Amendment Bill explicitly adds “fabrication” of personal information as a further type of data breach, so that affected individuals will also be able to claim damages if their personal information is fabricated. The Amendment Bill also grants broader authority to the Personal Information Protection Committee (PIPC) to address loopholes in the PIPA relating to the PIPC’s practical operation, and provides the legal grounds for designating institutions for data protection certification. Overall, the Amendment Bill increases the level of penalties imposed on violators.

Some of the key changes to the PIPA pursuant to this amendment are summarized below.

2. Adoption of Punitive Damages and Statutory Damages Provisions

The Amendment Bill deletes Article 39(2) of the PIPA which sets forth the mitigating circumstances of a data handler’s liability for damages incurred by a data subject whose personal information is mishandled. Furthermore, under the Amendment Bill, if a person suffers [...]

Continue Reading





‘Right to Be Forgotten’ in Russian Data Protection Law Has Passed All Stages of Approval

On July 14, 2015, Vladimir Putin, the president of the Russian Federation, signed the law implementing the “right to be forgotten” (the Law). The Law comes into force on January 1, 2016.

1. New obligations imposed on Internet search engines

The right to be forgotten applies to information disseminated by operators of Internet search engines that distribute advertisements aimed at attracting the attention of Russian consumers, in the following cases:

  • The information was disseminated in violation of legislative requirements;
  • The information is inaccurate;
  • The information is accurate but no longer relevant due to subsequent developments or actions of the data subject (with some exceptions).

2. How will the right to be forgotten be exercised?

The delisting request submitted by a data subject (applicant) must contain certain information prescribed by the Law (e.g., the applicant’s full name, passport data and contact information; the specific information that should be forgotten; the reasons for delisting; a reference to the website containing the information to be delisted; and consent to the processing of the applicant’s personal data).

It is important to note that the right to be forgotten may be exercised only by individuals, not legal entities.

Within 10 business days of receiving a delisting request, the search engine must take one of the following actions:

  • Delist the search results relating to the applicant’s personal information, where those results were obtained via search queries that included the applicant’s first name and/or surname; or
  • Provide the applicant with a substantiated written refusal to delist the results.

If the applicant does not agree with the search engine’s decision, he or she is entitled to file a claim with the competent court.

The search engine must keep confidential the fact that the applicant has filed a delisting request.

3. Liability for non-compliance

Alongside the finally approved Law, another initiative was submitted to the State Duma on May 29, 2015, and may be considered in the autumn session this year. If passed, the new initiative would institute an administrative fine of RUR 100,000 (approximately EUR 1,580) for a search engine’s unlawful failure to delist links relating to a data subject’s personal information upon his or her request, or of RUR 3 million (approximately EUR 47,619) for the search engine’s failure to comply with a court decision requiring delisting of such links.





With No Federal Law in Sight, States Continue to Refine Their Own Data Privacy Laws

With no Congressional consensus to adopt a federal data privacy and breach notification statute, states are updating and refining their already-existing laws to enact more stringent requirements for companies.  Two states recently passed updated data privacy laws with significant changes.

Rhode Island

The Rhode Island Identity Theft Protection Act (Rhode Island Data Law), an update to Rhode Island’s already-existing data security and breach notification law, introduces several new requirements for companies that store, collect, process, use or license personal identifying information (PII) about Rhode Island residents.

A few of these provisions are particularly noteworthy.  First, the new law requires entities to “implement and maintain a risk-based information security program which contains reasonable security procedures and practices,” scaled to the size of the entity and the type of personal information in its possession.  Second, the Rhode Island Data Law requires that any entity that discloses PII to a third party have a written contract with the third party pursuant to which the third party will also implement and maintain an information security program to protect the personal information.  Third, the Rhode Island Data Law requires any entity that experiences a data breach of personal information to notify affected residents within 45 calendar days after it knows that a breach has occurred.  (Rhode Island also required this under its previous law, but there was no precise time frame.)  Among other information, the notification must now contain information about data protection services to be offered to the resident, as well as information about how the resident can request a security credit freeze.

Under both the old and new laws, a health care provider, insurer or covered entity that follows the medical privacy and security rules established by the federal government pursuant to the Health Insurance Portability and Accountability Act (HIPAA) is deemed compliant with the law’s requirements.  The Rhode Island Data Law will become effective June 26, 2016.

Connecticut

The Connecticut Act Improving Data Security and Effectiveness (Connecticut Data Law) similarly updates Connecticut’s existing law and introduces more stringent requirements for entities that store, collect, process, use or license PII about Connecticut residents.

Perhaps most noteworthy, the Connecticut Data Law puts in place important new requirements for notification following a data breach.  Unlike the older Connecticut breach notification law, the Connecticut Data Law now requires an entity to notify affected individuals of a data breach within a set period of 90 days.  In addition, if the breach involves disclosure of Social Security numbers, the entity must provide free credit monitoring services to affected individuals for one year.  Many companies already voluntarily provide credit monitoring at no cost to customers affected by a data breach; laws like Connecticut’s, however, make credit monitoring a mandatory part of a company’s response.

Additionally, the Connecticut Data Law imposes significant new requirements on insurers and state contractors that handle PII.  Health insurers are required to develop and follow a written data security program, and to certify annually to [...]

Continue Reading





Don’t Miss the Upcoming Privacy + Security Forum

McDermott partners Heather Egan Sussman and Jennifer Geetter are scheduled to speak at the upcoming Privacy + Security Forum in Washington, D.C. on October 21–23, 2015. The Forum is an exciting new annual event, organized by Professors Daniel Solove and Paul Schwartz, that will bring together many of the biggest names in privacy and security to (1) break down the silos between privacy and security; and (2) bring more rigor to conferences so that participants gain useful practical knowledge. Ms. Sussman and Ms. Geetter have been invited to share their knowledge and experience in helping multi-national companies build highly successful and functional privacy and security programs.

Held in Washington, D.C., the Forum’s pre-conference workshops are on Wednesday, October 21, and the conference runs from Thursday, October 22, through Friday, October 23. More than 100 speakers have been confirmed, with more to be announced soon.

Want to attend? Contact Ms. Sussman or Ms. Geetter to receive the McDermott discount: 25 percent off the registration fee.





Start with Security

On June 30, 2015, the Federal Trade Commission (FTC) published “Start with Security: A Guide for Businesses” (the Guide).

The Guide is based on 10 “lessons learned” from the FTC’s more than 50 data-security settlements. In the Guide, the FTC discusses specific settlements that help illustrate the 10 lessons:

  1. Start with security;
  2. Control access to data sensibly;
  3. Require secure passwords and authentication;
  4. Store sensitive personal information securely and protect it during transmission;
  5. Segment networks and monitor anyone trying to get in and out of them;
  6. Secure remote network access;
  7. Apply sound security practices when developing new products that collect personal information;
  8. Ensure that service providers implement reasonable security measures;
  9. Implement procedures to help ensure that security practices are current and address vulnerabilities; and
  10. Secure paper, physical media and devices that contain personal information.

The FTC also offers an online tutorial titled “Protecting Personal Information.”

We expect that the 10 lessons in the Guide will become the FTC’s road map for handling future enforcement actions, making the Guide required reading for any business that processes personal information.





Canadian Government Amends and Strengthens PIPEDA, Adding Breach Notification Requirement and Filling Other Gaps

Just prior to recessing for the summer, the Canadian government enacted the Digital Privacy Act. It includes a number of targeted amendments to strengthen existing provisions of the Personal Information Protection and Electronic Documents Act (PIPEDA), but falls short of providing the Privacy Commissioner of Canada (Commissioner) with direct enforcement powers, as some stakeholders—including the former Commissioner—had proposed.

The Digital Privacy Act was introduced in April 2014 as part of the government’s “Digital Canada 150” strategy. While it was touted as providing new protections for Canadians when they surf the web and shop online, there is nothing that is particularly “digital” about the bill, which will equally affect the bricks and mortar, paper-based world.

Of particular note, the Digital Privacy Act creates a duty to report data breaches to both the Privacy Commissioner and to affected individuals “where it is reasonable in the circumstances to believe that the breach creates a real risk of significant harm to the individual.” Failure to report data breaches in the prescribed manner could result in fines of up to $100,000 for non-compliant organizations. While the majority of the new law is currently in force, the provisions relating to breach notification have yet to be proclaimed in force by the government.

Once in force, the mandatory breach-reporting regime will bring the federal law into alignment with many international laws, as well as with Alberta’s own Personal Information Protection Act, which has had a breach notification provision since 2009. However, unlike the Alberta law, the Digital Privacy Act would also require organizations to maintain records of all data breaches involving personal information under their control—even if they do not require reporting to the Commissioner or to affected individuals—and to provide these records to the Commissioner on request. Failure to comply with these requirements could also result in a fine of up to $100,000.

The law also creates an explicit authority enabling the federal Privacy Commissioner to enter into a compliance agreement with an organization where the Commissioner believes on reasonable grounds that the organization has contravened, or is about to contravene, the Act.  If such an agreement is later contravened, the Commissioner will be able to apply to the Federal Court of Canada for a remedial order, even if the original limitation period for such an application has lapsed. The law also extends the limitation period for an application to the Federal Court for damages or injunctive relief to one year after the Commissioner issues a report of findings or otherwise discontinues an investigation. Previously, such applications had to be brought by either the Commissioner or a complainant within 45 days of a report of findings or discontinuation.

The Digital Privacy Act also imposes new requirements on the form of consent that the Act requires from individuals respecting the handling of their personal information. Going forward, any consent will be valid only if an individual to whom an organization’s activities are directed would understand the nature, purpose and consequences of the collection, use and disclosure of [...]

Continue Reading





CNIL Announces Inspection Program—Focus Will Be on BCR Compliance and Treatment of Psychosocial Data, Among Others

The mission of the French data protection authority—the Commission Nationale Informatique et Libertés (CNIL)—is “to protect personal data, support innovation, [and] preserve individual liberties.”

In addition to its general inspections, every year the CNIL establishes a different targeted-inspection program. This program identifies the specific areas that CNIL’s controls will concentrate on for the following year. The 2014 inspection program was focused on everyday life devices, such as online payment, online tax payment and dating websites, among other things.

On May 25, 2015, the CNIL announced its 2015 inspection program, identifying six areas of focus in particular: contactless payment; the National Driving Licenses File (Le Fichier National des Permis de Conduire); “well-being and health” connected devices; tools used to monitor attendance in public places; the processing of personal data during the evaluation of psychosocial risks; and Binding Corporate Rules.

The last two issues caught our attention:

  • Treatment of personal data during evaluation of psychosocial risks: Since 2008, many companies have been investigating psychosocial risks within the workplace in order to provide a less stressful environment. This practice, however, raises issues concerning an employee’s right not to share private information with the employer. The CNIL will try to identify which past investigations may have jeopardized (or may still be jeopardizing) employees’ rights to privacy.
  • Binding Corporate Rules: Companies seeking to export data outside of the European Union (EU) may adopt a voluntary set of data-protection rules within their corporate group called Binding Corporate Rules (BCR). These BCRs are intended to provide a level of privacy and data protection within the entire corporate group equivalent to the one found under EU law. So far, 68 companies have adopted BCRs. Through its 2015 inspection program, the CNIL wants to give the BCRs a closer look, making sure that the means and devices used are in compliance with French law.

In addition to focusing its 2015 inspection program on BCR compliance, the CNIL also announced, earlier this year, the simplification of intra-group data transfers. Prior to simplification, companies whose BCRs had been approved by the CNIL were also required to obtain the CNIL’s approval for each new type of transfer. The CNIL has since declared that a new, personalized “single decision” will be given to companies with approved BCRs. In return, the companies must keep an internal record of all transfers detailing certain information (the general purpose of each transfer based on the BCR; the category of data subjects concerned by the transfer; the categories of personal data transferred; and information on each data recipient) in accordance with the terms of the single decision issued.

With respect to its targeted inspection program, the question remains: How many inspections will the CNIL conduct in 2015? In 2014, the CNIL performed a total of 421 inspections. The CNIL has declared that its objective for 2015 is 550 inspections. However, only 28 percent of the CNIL’s inspections typically result from the annual inspection program. Forty percent are initiated by the [...]

Continue Reading





Should We Hack Back?

“No,” says U.S. Assistant Attorney General Leslie R. Caldwell.  At the most recent Cybersecurity Law Institute held at Georgetown University Law Center in late May, the head of the U.S. Department of Justice’s (DOJ) Criminal Division offered guidance to attendees on how to prevent and combat cybercrime. She also spoke about significant victories that the Criminal Division had achieved with the help of private sector and foreign collaboration. In the last year or so alone, the U.S. government extradited about a dozen high-level cybercriminals from around the world.

In her speech, Caldwell urged the private sector to work more closely with the government, explaining that “the Criminal Division is better positioned than ever before” to help organizations bring intruders to justice, defend networks and prevent cybercrimes from happening in the first place. Among other things, she reported that the new DOJ Cybersecurity Unit has broken new ground, including by recently releasing the well-received guidance “Best Practices for Victim Response and Reporting of Cyber Incidents,” which we discussed on this blog earlier this month. She also made the case for why businesses should not take defensive measures such as “hacking back” against attackers in an effort to punish them or to retrieve or delete stolen data.

Caldwell summed up the Division’s legal position on hacking back: “based on a simple, plain-text reading of the Computer Fraud and Abuse Act, such conduct is generally unlawful.” If that were not reason enough, she explained, businesses should still avoid hacking back for these legal, policy and practical reasons:

  1. Hacking back tactics pose a significant threat to innocent third parties, whose infrastructure may be hijacked by cybercriminals in order to more easily commit crimes and to mask the hackers’ identities during subsequent investigations;
  2. Hacking back can interfere with and irreparably harm ongoing government investigations;
  3. Hacking back carries the danger of dramatic escalation against unknown and potentially sophisticated adversaries who may have powerful and destructive technical capabilities;
  4. Such activities may be illegal in foreign jurisdictions;
  5. Hacking back may have serious effects on international relations and could have foreign policy consequences; and
  6. There is a low likelihood that such activities would be beneficial and yield anything other than the momentary pleasure that comes with taking action.

Caldwell’s points are well taken. From our perspective, one of the best ways for a company to prevent, detect, respond to, remediate, survive and even thrive following a cyberattack is to have in place an effective Incident Response Plan that has been tested, adapted and improved over time to reflect changing technology, business circumstances and emerging threats to the organization. Companies that want to incorporate strategies for hacking back into their plans should carefully consider the legal and practical risks and consult with legal counsel prior to taking any action.





Federal Agents Lacked Authority to Search Airplane Passenger’s Laptop, Court Says

A federal court this month found that federal agents lacked authority to conduct a warrantless search of a defendant’s laptop seized at an airport, rejecting the government’s argument that it has unfettered authority to search containers at the border to protect the homeland.  The court distinguished laptops from handbags due to their “vast storage capacity” and found that there was little or no reason to suspect that “criminal activity was afoot” at the time the defendant was about to cross the border.  Rather, agents confiscated the laptop before the defendant boarded his plane at Los Angeles International Airport as part of a pre-existing investigation into the defendant for violation of export control laws.  The agents then sent the laptop to San Diego for extensive forensic imaging and searches over an indefinite period of time.  The court held that this amounted to an unreasonable invasion of the defendant’s right to privacy.

The court relied in part on the U.S. Supreme Court’s recent decision in Riley v. California, 134 S. Ct. 2473 (2014), explaining that Riley “made it clear that the breadth and volume of data stored on computers and other smart devices make today’s technology different in ways that have serious implications for the Fourth Amendment analysis . . . ”

It would not be surprising for the government to appeal the ruling in view of the importance of the border exception to the Fourth Amendment’s search warrant requirement.

Although the decision is grounded in the Fourth Amendment and therefore generally applicable to searches conducted by the government, courts consider Fourth Amendment precedent when evaluating searches by private corporations acting as instruments or agents of the government.  See, e.g., Skinner v. Ry. Labor Executives Ass’n, 489 U.S. 602, 614 (1989) (Fourth Amendment applied to drug and alcohol testing required by private railroads in reliance on federal regulations); United States v. Ziegler, 474 F.3d 1184, 1190 (9th Cir. 2007) (Information Technology department representatives for private company who worked with Federal Bureau of Investigation and seized copies of employee’s hard drive acted as “de facto government agents,” thereby implicating the Fourth Amendment); United States v. Reed, 15 F.3d 928 (9th Cir. 1994) (Fourth Amendment applied to hotel employee’s warrantless search of defendant’s room in light of the presence of police lookouts and the employee’s intent to help police gather proof of narcotics trafficking).  Therefore, companies should take notice of this decision and evaluate the extent to which the court’s rationale may be applied in the private employer context.

The case is United States v. Jae Shik Kim, et al., No. 1:13-cr-00100-ABJ (D.D.C. 2013).  The decision is at Docket Entry 42.




