On February 27, 2015, the Obama White House released an “Administration Discussion Draft” of its Consumer Privacy Bill of Rights Act of 2015 (Proposed Consumer Privacy Act).
The Proposed Consumer Privacy Act revises and builds on the “Consumer Privacy Bill of Rights” that the Obama White House released in its 2012 Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy report.
As described during President Obama’s January 12 visit to the Federal Trade Commission (FTC), the Proposed Consumer Privacy Act identifies seven “basic principles to both protect personal privacy and ensure that industry can keep innovating.” These seven principles are:
- Transparency (§101): Transparency is a principle frequently cited in guidance from the FTC, as well as in self-regulatory frameworks, such as the Digital Advertising Alliance’s cross-industry code for interest-based advertising. The Proposed Consumer Privacy Act describes transparency as “concise and easily understandable language, accurate, clear, timely, and conspicuous notice about privacy and security practices.” The notice required from an entity subject to the Proposed Consumer Privacy Act (defined as a “covered entity” (CE)) must describe the entity’s collection, use, disclosure, retention, destruction and security practices.
- Individual Control (§102): The Individual Control principle means offering consumers a “reasonable means to control the processing (i.e., taking any action regarding) personal data about them in proportion to the privacy risk to the individual and consistent with context.” An individual must have a way either to withdraw consent related to his or her personal data by a means “reasonably comparable” to the means by which consent was initially granted, or to request that the CE “de-identify” (as defined in the Proposed Consumer Privacy Act) his or her personal data.
- Respect for Context (§103): Under the Respect for Context principle, a CE must process personal data reasonably “in light of context.” If the processing is not reasonable, the CE must undertake a “privacy risk analysis” to identify and take reasonable steps to mitigate privacy-related risk, including “heightened transparency and individual control,” such as just-in-time notices. Reasonableness is presumed when a CE’s personal data processing “fulfills an individual’s request.”
- Focused Collection and Responsible Use (§104): The Focused Collection and Responsible Use principle requires that a CE limit its collection, retention and use of personal data to a “manner that is reasonable in light of context.” The CE also must “delete, destroy, or de-identify” personal data within a “reasonable time” after the original purpose for its collection, retention, or use has been fulfilled.
- Security (§105): Under the Security principle, a CE must: identify internal and external “risks to privacy and security” of personal data; implement and maintain safeguards “reasonably designed” to secure personal data; regularly assess the efficacy of the safeguards; and adjust the safeguards to reflect material changes to business practices or “any other circumstances that create a material impact on the privacy or security” of personal data under the CE’s control. The four factors presented for evaluating the reasonableness of security safeguards are: (i) degree of privacy risk; (ii) foreseeability of risks; (iii) “widely accepted practices”; and (iv) cost.
- Access and Accuracy (§106): The Access principle requires a CE to give an individual “reasonable access to, or accurate representation of” that individual’s personal data under the CE’s control, with limitations on access for (among others) legally privileged information, law enforcement, national security or frivolous requests. The Accuracy principle also requires a CE to establish a procedure for an individual to ensure that his or her personal data held by the CE is accurate. A CE does not need to correct personal data obtained “directly from the individual” or from certain governmental databases. The CE may decline to correct the personal data but must destroy or delete it upon request.
- Accountability (§107): Accountability under the Proposed Consumer Privacy Act means that a CE must (among other measures) provide employee training, conduct evaluations of privacy protections, adopt a “privacy by design” approach to its systems and practices, and “bind” downstream users of its personal data both to the CE’s commitments to the individuals from whom the personal data was collected and to the requirements of the Proposed Consumer Privacy Act.
The Proposed Consumer Privacy Act does not cover much new ground, but its significance rests in the knitting together of the existing guidelines from the FTC and other federal regulators (e.g., National Telecommunications and Information Administration) and industry self-regulatory codes, as well as its intention to begin to provide a more certain outline to practices that will be deemed unfair or deceptive. The Act also buttresses the FTC’s role as “enforcer in chief” for consumer privacy, which has significance given challenges to the scope of the FTC’s authority to regulate data security.
Some of the provisions that we found notable in the Proposed Consumer Privacy Act include:
- The “context” (a defined term) of personal data collection is a prominent factor in determining whether consent has been obtained from individuals (see, e.g., Subsection 4(k)(3)). This is notable for businesses struggling with how to manage consumer expectations about how their personal data is used and disclosed.
- A CE that complies with an FTC-approved code of conduct for processing personal data has safe harbor protection under the Proposed Consumer Privacy Act (§301). Federal regulators have consistently shown support for industry codes of conduct as a means to police data privacy and protection. In its 2012 White House Report, the Obama White House noted the importance of self-regulatory guidance as a framework for regulating consumer privacy and security on the Internet. Also in 2012, the U.S. Department of Commerce recommended legislative authority for the FTC to provide input about and directly enforce “industry codes of conduct.” Further, the FTC has indicated that publicizing participation in a code of conduct but failing to adhere to the code can be a deceptive trade practice. The Proposed Consumer Privacy Act expressly prohibits a state or local government from enforcing any personal data processing law if the CE is entitled to safe harbor protection through compliance with an approved industry code.
- Enforcement is carried out through federal and state regulators and does not include a private right of action, consistent with the Federal Trade Commission Act, the Children’s Online Privacy Protection Act (COPPA) and HIPAA.
- On the federal level, the FTC can treat violations as unfair and deceptive trade practices pursuant to Section 5 of the Federal Trade Commission Act. The FTC is not allowed to undertake enforcement against a CE during its first 18 months of processing personal data. Presumably, this grace period is intended to fulfill the stated goal of not stifling innovation. Note, too, that the FTC is expressly prohibited from requiring a CE to deploy or use specific products or technologies.
- On the state level, a state Attorney General can bring a civil action for a violation of the Proposed Consumer Privacy Act that “caused or is causing” harm to “a substantial number” of its state’s residents. The only remedy available is injunctive relief unless the FTC intervenes.
- The Proposed Consumer Privacy Act would preempt “any provision of a statute, regulation, or rule of a State or local government” that “imposes requirements on covered entities with respect to personal data processing” but not state consumer protection or data breach notification laws, or state or local laws that address the “processing of health information or financial information.”
We will continue to assess and monitor the Proposed Consumer Privacy Act, as well as the White House’s other proposed privacy legislation, such as the Student Digital Privacy Act, against the backdrop of the wide range of privacy, security, breach and data utility initiatives underway at the state and federal levels.