Consumer Protection

FCC Updates Contest Rule to Provide Flexibility to Meet Disclosure Requirements

Radio and television stations, as well as their audiences, have reason to celebrate. Last week, the Federal Communications Commission (FCC) announced significant updates to its regulations governing the disclosure of material terms for promotional contests and sweepstakes conducted by television and radio broadcast stations. Since 1976, Section 73.1216 of the FCC rules (the Contest Rule) has required broadcast stations that advertise their contests and sweepstakes to the general public to disclose the material terms on air. These on-air disclosures have typically taken the form of very rapidly recited terms at the end of the broadcast announcing the contest, or extremely small print at the bottom of the television screen. In an effort to give broadcasters more flexibility in meeting their disclosure obligations and to adapt to changing consumer expectations in the Internet age, the FCC has updated the Contest Rule to allow broadcast stations to disclose material contest and sweepstakes terms on a readily accessible public website.

Under the revised Contest Rule, television and radio broadcasters that choose to disclose material terms of contests and sweepstakes through a website must do the following:

  • Provide the terms on a “publicly accessible” website (i.e., one designed to be available to the public 24/7, free of charge, with no registration requirement);
  • Broadcast the relevant website address periodically on air, providing sufficient information for a consumer to find the terms easily. Broadcasters can meet this requirement by mechanically reciting the website address as it appears in a browser (e.g., “http-colon-slash,” etc.) or by using simple instructions (e.g., “for contest rules go to kxyz.com and then click on the contest tab”);
  • Provide a conspicuous link or tab on the broadcaster’s home page, labeled in a way that makes clear its relation to contest or sweepstakes information;
  • Maintain the material terms on the website for at least 30 days after the contest or sweepstakes has ended;
  • Where the material terms of a contest have changed, announce on air within 24 hours, and periodically thereafter, that the terms have changed, and direct participants to the website to review the changes;
  • Ensure that the material terms disclosed on the website conform in all substantive respects to the contest or sweepstakes terms broadcast over the air.

While discussing the updated regulations, the FCC affirmed its commitment to the core principles of the Contest Rule and reminded broadcasters that regardless of the medium of disclosure, broadcasters must provide complete, accurate and timely information about the contests they conduct, ensure that such information is not false, misleading or deceptive, and conduct their contests substantially as announced or advertised.





Amendment to the Personal Information Protection Act Passed in the National Assembly July 6, 2015

On July 6, 2015, the Korean National Assembly passed a bill containing several amendments to the Personal Information Protection Act (PIPA). This bill (the Amendment Bill) combines major provisions from nine different previous bills – one introduced in 2013 and eight proposed in 2014 following the massive data breach at three major credit card companies in January 2014 (the Credit Card Company Data Breach). Although the amended version of the PIPA (the Amended Act) will take effect upon its promulgation (the date of which is yet to be determined), most of the provisions that will significantly affect the obligations and responsibilities of data handlers are scheduled to take effect either a year after the Amended Act’s promulgation or on January 1, 2016. To comply with the amended law in a timely manner, companies processing customer or employee data should keep an eye on the respective effective dates of the provisions of the Amended Act that apply to them.

1. Significance of the Amendment

The PIPA was adopted in 2011 to, among other things, protect the privacy of individuals and their personal information from unlawful collection, leakage, appropriation and misuse. Even after the PIPA’s enactment, however, large-scale data breaches remained common, and last year’s Credit Card Company Data Breach was the final straw that prompted calls for stricter data protection and privacy regulations across the board, as well as greater awareness of the significance of data protection and security and of the serious risks involved. The Amendment Bill keeps pace with the stricter rules of the recently amended Utilization and Protection of Credit Information Act.

More specifically, the Amendment Bill extends stronger protection to individuals affected by data breaches by providing for punitive damages and statutory damages. Further, heavier penalties are imposed on those who violate certain provisions of the PIPA, and illegal proceeds generated from such violations are subject to forfeiture and collection. Whereas the current version of the PIPA provides for the recovery of damages in the event an individual’s personal information is stolen, lost, leaked, falsified or damaged, the Amendment Bill explicitly prescribes “fabrication” of personal information as an additional type of data breach, so that affected individuals will also be able to claim damages if their personal information is fabricated. The Amendment Bill also grants broader authority to the Personal Information Protection Committee (PIPC) to address loopholes in the PIPA relating to the PIPC’s practical operation, and provides the legal grounds for the designation of data protection certification institutions. Overall, the Amendment Bill contains provisions that increase the level of penalties imposed on violators.

Some of the key changes to the PIPA pursuant to this amendment are summarized below.

2. Adoption of Punitive Damages and Statutory Damages Provisions

The Amendment Bill deletes Article 39(2) of the PIPA, which sets forth circumstances mitigating a data handler’s liability for damages incurred by a data subject whose personal information is mishandled. Furthermore, under the Amendment Bill, if a person suffers [...]

Continue Reading





The Connected Car and Keeping YOU in the Driver’s Seat

Remember KITT? KITT (the Knight Industries Two Thousand) was the self-directed, self-driving, supercomputer hero of the popular 1980s television show Knight Rider. Knight Rider was a science fiction fantasy profiling the “car of the future.” The self-directed car is science fiction no more. The future is now and, in fact, we’ve seen a lot of press this year about self-driving or driverless cars.

Driverless cars, equipped with a wide variety of connected systems including cameras, radar, sonar and LiDAR (light detection and ranging), are expected on the road within the next few years. They can sense road conditions, identify hazards and negotiate traffic, all from a remote command center. Just as with most connected devices in the age of the Internet of Things (IoT), these ultra-connected devices claim to improve efficiency and performance, and enhance safety.

Though not quite driverless yet, connected vehicles are already on the market and on the road. Like many IoT “things,” ultra-connected vehicle systems may be vulnerable to hacker attacks.

Christopher Valasek and Charlie Miller, two computer security industry leaders, have presented on this topic at various events, including the 2014 Black Hat USA security conference. They analyzed the information security vulnerabilities of various car makes and models, rating the vehicles on three specific criteria: (1) the size of their wireless “attack surface” (i.e., how many data-incorporating features, such as Bluetooth, Wi-Fi, keyless entry and automated tire monitoring systems, the vehicle includes); (2) access to the vehicle’s network through those data points; and (3) the vehicle’s “cyberphysical” features (i.e., connected features such as parking assist, automated braking and other technological driving aids). This last category of features, combined with access through the data points outlined in items (1) and (2), presented a composite risk profile of each vehicle make’s hackability. Their conclusions were startling: radios, brakes and steering systems were all found to be accessible.

Miller and Valasek claim that their intent was to encourage car manufacturers to consider security in vehicle system connectivity and cyberphysical attributes. They approached vehicle manufacturers and shared their report with the Department of Transportation and the Society of Automotive Engineers. Some manufacturers promised to investigate their vehicle systems and correct the deficiencies; others seemingly ignored the report altogether. The report did, however, catch the attention of Senators Ed Markey (D-MA) and Richard Blumenthal (D-CT). On July 21, 2015, Senators Markey and Blumenthal introduced legislation that would direct the National Highway Traffic Safety Administration (NHTSA) and the Federal Trade Commission (FTC) to establish federal standards to secure vehicles and protect drivers’ privacy. The Security and Privacy in Your Car Act, aptly coined “the SPY Car Act,” would also require manufacturers to establish a “cyber dashboard” that rates vehicle security, informing consumers of the security performance of their vehicles.

As proposed, the SPY Car Act would require that all motor vehicles manufactured in the U.S. be “equipped with reasonable measures to protect against hacking attacks.” All “entry points” are to be protected through “reasonable” measures against hacking. Internal networks are to [...]

Continue Reading





‘Right to Be Forgotten’ in Russian Data Protection Law Has Passed All Stages of Approval

On July 14, 2015, Vladimir Putin, the president of the Russian Federation, signed the law implementing the “right to be forgotten” (the Law). The Law comes into force on January 1, 2016.

1. New obligations imposed on search engines on the Internet

The right to be forgotten applies to information disseminated by operators of Internet search engines that distribute advertisements aimed at attracting the attention of Russian consumers, in the following cases:

  • Information that was disseminated in violation of legislative requirements;
  • Information that is inaccurate;
  • Information that is accurate but no longer relevant due to subsequent developments or actions of the data subject (with some exceptions).

2. How will the right to be forgotten be exercised?

A delisting request submitted by a data subject (applicant) must contain certain information prescribed by the Law (e.g., the applicant’s full name, passport data and contact information; the specific information that should be forgotten; the reasons for delisting; a reference to the website containing the information to be delisted; and consent to the processing of the applicant’s personal data).

It is important to note that the right to be forgotten may be exercised only by individuals, and not legal entities.

Within 10 business days of receiving a delisting request, the search engine must take one of the following actions:

  • Delist the search results relating to the applicant’s personal information, where those results were returned for search queries that included the applicant’s name and/or surname; or
  • Provide the applicant with a substantiated written refusal to delist the results.

If the applicant does not agree with the search engine’s decision, he or she is entitled to file a claim with the competent court.

The fact that an applicant has filed a delisting request must be kept confidential by the search engine.

3. Liability for non-compliance

Along with the now-approved Law, another initiative was submitted to the State Duma on May 29, 2015, and may be considered in this year’s autumn session. If passed, the new initiative would institute an administrative fine of RUR 100,000 (approximately EUR 1,580) for a search engine’s unlawful failure to delist links relating to a data subject’s personal information upon his or her request, or of RUR 3 million (approximately EUR 47,619) for a search engine’s failure to comply with a court decision requiring the delisting of such links.





With No Federal Law in Sight, States Continue to Refine Their Own Data Privacy Laws

With no Congressional consensus to adopt a federal data privacy and breach notification statute, states are updating and refining their already-existing laws to enact more stringent requirements for companies.  Two states recently passed updated data privacy laws with significant changes.

Rhode Island

The Rhode Island Identity Theft Protection Act (Rhode Island Data Law), an update to Rhode Island’s already-existing data security and breach notification law, introduces several new requirements for companies that store, collect, process, use or license personal identifying information (PII) about Rhode Island residents.

A few of these provisions are particularly noteworthy.  First, the new law requires entities to “implement and maintain a risk-based information security program which contains reasonable security procedures and practices,” scaled to the size of the entity and the type of personal information in its possession.  Second, the Rhode Island Data Law requires that any entity that discloses PII to a third party have a written contract with the third party pursuant to which the third party will also implement and maintain an information security program to protect the personal information.  Third, the Rhode Island Data Law requires any entity that experiences a data breach of personal information to notify affected residents within 45 calendar days after it knows that a breach has occurred.  (Rhode Island also required this under its previous law, but there was no precise time frame.)  Among other information, the notification must now contain information about data protection services to be offered to the resident, as well as information about how the resident can request a security credit freeze.

Under both the old and new laws, a health care provider, insurer or covered entity that follows the medical privacy and security rules established by the federal government pursuant to the Health Insurance Portability and Accountability Act (HIPAA) is deemed compliant with the law’s requirements.  The Rhode Island Data Law will become effective June 26, 2016.

Connecticut

The Connecticut Act Improving Data Security and Effectiveness (Connecticut Data Law) similarly updates Connecticut’s existing law and introduces more stringent requirements for entities that store, collect, process, use or license PII about Connecticut residents.

Perhaps most noteworthy, the Connecticut Data Law puts in place important new requirements for notification following a data breach.  Unlike the older Connecticut breach notification law, the Connecticut Data Law now requires an entity to notify affected individuals of a data breach within 90 days.  In addition, if the breach involves disclosure of Social Security numbers, the entity must also provide affected individuals with free credit monitoring services for one year.  Many companies already voluntarily provide credit monitoring at no cost to customers affected by a data breach; laws like Connecticut’s, however, make credit monitoring a mandatory part of a company’s response.

Additionally, the Connecticut Data Law imposes significant new requirements on insurers and state contractors that handle PII.  Health insurers are required to develop and follow a written data security program, and to certify annually to [...]

Continue Reading





FCC Releases Order Clarifying TCPA

Last Friday, July 10, 2015, the Federal Communications Commission (FCC) released Declaratory Ruling and Order 15-72 (“Order 15-72”) to address more than 20 requests for clarity on FCC interpretations of the Telephone Consumer Protection Act (TCPA). The release of Order 15-72 follows a June 18th open meeting at which the FCC adopted the rulings now reflected in Order 15-72 that are intended to “close loopholes and strengthen consumer protections already on the books.”

Key rulings in Order 15-72 include:

  • Confirming that text messages are “calls” subject to the TCPA;
  • Clarifying that consumers may revoke their consent to receive robocalls (i.e., telemarketing calls or text messages from an automated system or with a prerecorded or artificial voice) “at any time and through any reasonable means”;
  • Making telemarketers liable for robocalls made to reassigned wireless telephone numbers without consent from the current account holder, subject to “a limited, one-call exception for cases in which the caller does not have actual or constructive knowledge of the reassignment”;
  • Requiring consent for internet-to-phone text messages;
  • Clarifying that “nothing … prohibits” implementation of technology that helps consumers block unwanted robocalls;
  • Allowing certain parties an 89-day (after July 10, 2015) window to update consumer consent to “prior express written consent” as the result of an ambiguous provision in the 2012 FCC Order that established the “prior express written consent” requirement; and
  • Exempting from the consent requirement certain free “pro-consumer financial- and healthcare-related messages”.

We are reviewing the more than 135 pages of Order 15-72, as well as the separate statements of FCC Commissioners Wheeler, Clyburn, Rosenworcel (dissenting in part), Pai (dissenting) and O’Rielly (dissenting in part). Please check back soon for more information and analysis.





Start with Security

On June 30, 2015, the Federal Trade Commission (FTC) published “Start with Security: A Guide for Businesses” (the Guide).

The Guide is based on 10 “lessons learned” from the FTC’s more than 50 data-security settlements. In the Guide, the FTC discusses specific settlements that help clarify each of the 10 lessons:

  1. Start with security;
  2. Control access to data sensibly;
  3. Require secure passwords and authentication;
  4. Store sensitive personal information securely and protect it during transmission;
  5. Segment networks and monitor anyone trying to get in and out of them;
  6. Secure remote network access;
  7. Apply sound security practices when developing new products that collect personal information;
  8. Ensure that service providers implement reasonable security measures;
  9. Implement procedures to help ensure that security practices are current and address vulnerabilities; and
  10. Secure paper, physical media and devices that contain personal information.

The FTC also offers an online tutorial titled “Protecting Personal Information.”

We expect that the 10 lessons in the Guide will become the FTC’s road map for handling future enforcement actions, making the Guide required reading for any business that processes personal information.





Canadian Government Amends and Strengthens PIPEDA, Adding Breach Notification Requirement and Filling Other Gaps

Just prior to recessing for the summer, the Canadian government enacted the Digital Privacy Act. It includes a number of targeted amendments to strengthen existing provisions of the Personal Information Protection and Electronic Documents Act (PIPEDA), but falls short of providing the Privacy Commissioner of Canada (Commissioner) with direct enforcement powers, as some stakeholders—including the former Commissioner—had proposed.

The Digital Privacy Act was introduced in April 2014 as part of the government’s “Digital Canada 150” strategy. While it was touted as providing new protections for Canadians when they surf the web and shop online, there is nothing that is particularly “digital” about the bill, which will equally affect the bricks and mortar, paper-based world.

Of particular note, the Digital Privacy Act creates a duty to report data breaches to both the Privacy Commissioner and to affected individuals “where it is reasonable in the circumstances to believe that the breach creates a real risk of significant harm to the individual.” Failure to report data breaches in the prescribed manner could result in fines of up to $100,000 for non-compliant organizations. While the majority of the new law is currently in force, the provisions relating to breach notification have yet to be proclaimed in force by the government.

Once in force, the mandatory breach-reporting regime will bring the federal law into alignment with many international laws, as well as with Alberta’s own Personal Information Protection Act, which has had a breach notification provision since 2009. However, unlike the Alberta law, the Digital Privacy Act would also require organizations to maintain records of all data breaches involving personal information under their control—even if they do not require reporting to the Commissioner or to affected individuals—and to provide these records to the Commissioner on request. Failure to comply with these requirements could also result in a fine of up to $100,000.

The law also creates an explicit authority to enable the federal Privacy Commissioner to enter into a compliance agreement with an organization, where the Commissioner believes on reasonable grounds that the organization has, or is about to, contravene the Act.  If such an agreement is later contravened, the Commissioner will be able to apply to the Federal Court of Canada for a remedial order, even if the original limitation period for such an application has lapsed. The law also extends the limitation period for an application to the Federal Court for damages or injunctive relief to one year after the Commissioner issues a report of findings or otherwise discontinues an investigation. Previously, such applications had to be brought by either the Commissioner or a complainant within 45 days of a report of findings or discontinuation.

The Digital Privacy Act also imposes new requirements on the form of consent that the Act requires from individuals respecting the handling of their personal information. Going forward, any consent will be valid only if an individual to whom an organization’s activities are directed would understand the nature, purpose and consequences of the collection, use and disclosure of [...]

Continue Reading





CNIL Announces Inspection Program—Focus Will Be on BCR Compliance and Treatment of Psychosocial Data, Among Others

The mission of the French data protection authority—the Commission Nationale de l’Informatique et des Libertés (CNIL)—is “to protect personal data, support innovation, [and] preserve individual liberties.”

In addition to its general inspections, every year the CNIL establishes a different targeted-inspection program. This program identifies the specific areas that CNIL’s controls will concentrate on for the following year. The 2014 inspection program was focused on everyday life devices, such as online payment, online tax payment and dating websites, among other things.

On May 25, 2015, the CNIL announced its 2015 inspection program, identifying six areas of particular focus: contactless payment; the National Driving Licenses File (Le Fichier National des Permis de Conduire); “well-being and health” connected devices; tools for monitoring attendance in public places; the treatment of personal data during the evaluation of psychosocial risks; and Binding Corporate Rules.

The last two issues caught our attention:

  • Treatment of personal data during evaluation of psychosocial risks: Since 2008, many companies have been investigating psychosocial risks within the workplace in order to provide a less stressful environment. This practice, however, raises issues concerning employees’ right not to share private information with their employer. The CNIL will try to identify which prior investigations may have jeopardized (or may still be jeopardizing) employees’ right to privacy.
  • Binding Corporate Rules: Companies seeking to export data outside of the European Union (EU) may adopt a voluntary set of data-protection rules within their corporate group called Binding Corporate Rules (BCR). These BCRs are intended to provide a level of privacy and data protection within the entire corporate group equivalent to the one found under EU law. So far, 68 companies have adopted BCRs. Through its 2015 inspection program, the CNIL wants to give the BCRs a closer look, making sure that the means and devices used are in compliance with French law.

In addition to focusing its 2015 inspection program on BCR compliance, the CNIL also announced, earlier this year, the simplification of intra-group data transfers. Prior to simplification, companies whose BCRs had been approved by the CNIL were also required to obtain the CNIL’s approval for each new type of transfer. The CNIL has since declared that a new, personalized “single decision” will be given to companies with approved BCRs. In return, the companies must keep an internal record of all transfers detailing certain information (the general purpose of each transfer based on the BCR; the category of data subjects concerned by the transfer; the categories of personal data transferred; and information on each data recipient) in accordance with the terms of the single decision issued.

With respect to its targeted inspection program, the question remains: How many inspections will the CNIL conduct in 2015? In 2014, the CNIL performed a total of 421 inspections, and its stated objective for 2015 is 550. However, only 28 percent of the CNIL’s inspections typically result from the annual inspection program. Forty percent are initiated by the [...]

Continue Reading





Federal Agents Lacked Authority to Search Airplane Passenger’s Laptop, Court Says

A federal court this month found that federal agents lacked authority to conduct a warrantless search of a defendant’s laptop seized at an airport, rejecting the government’s argument that it has unfettered authority to search containers at the border to protect the homeland.  The court distinguished laptops from handbags due to their “vast storage capacity” and found that there was little or no reason to suspect that “criminal activity was afoot” at the time the defendant was about to cross the border.  Rather, agents confiscated the laptop before the defendant boarded his plane at Los Angeles International Airport as part of a pre-existing investigation into the defendant for violation of export control laws.  The agents then sent the laptop to San Diego for extensive forensic imaging and searches over an indefinite period of time.  The court held that this amounted to an unreasonable invasion of the defendant’s right to privacy.

The court relied in part on the U.S. Supreme Court’s recent decision in Riley v. California, 134 S. Ct. 2473 (2014), explaining that Riley “made it clear that the breadth and volume of data stored on computers and other smart devices make today’s technology different in ways that have serious implications for the Fourth Amendment analysis . . . ”

It would not be surprising for the government to appeal the ruling in view of the importance of the border exception to the Fourth Amendment’s search warrant requirement.

Although the decision is grounded in the Fourth Amendment and therefore generally applicable to searches conducted by the government, courts consider Fourth Amendment precedent when evaluating searches by private corporations acting as instruments or agents of the government.  See, e.g., Skinner v. Ry. Labor Executives Ass’n, 489 U.S. 602, 614 (1989) (Fourth Amendment applied to drug and alcohol testing required by private railroads in reliance on federal regulations); United States v. Ziegler, 474 F.3d 1184, 1190 (9th Cir. 2007) (Information Technology department representatives for private company who worked with Federal Bureau of Investigation and seized copies of employee’s hard drive acted as “de facto government agents,” thereby implicating the Fourth Amendment); United States v. Reed, 15 F.3d 928 (9th Cir. 1994) (Fourth Amendment applied to hotel employee’s warrantless search of defendant’s room in light of the presence of police lookouts and the employee’s intent to help police gather proof of narcotics trafficking).  Therefore, companies should take notice of this decision and evaluate the extent to which the court’s rationale may be applied in the private employer context.

The case is United States v. Jae Shik Kim, et al., No. 1:13-cr-00100-ABJ (D.D.C. 2013).  The decision is at Docket Entry 42.




