HOUSE BILL REPORT

2SSB 5376

This analysis was prepared by non-partisan legislative staff for the use of legislative members in their deliberations. This analysis is not a part of the legislation nor does it constitute a statement of legislative intent.

As Reported by House Committee On:

Innovation, Technology & Economic Development

Appropriations

Title: An act relating to the management and oversight of personal data.

Brief Description: Protecting consumer data.

Sponsors: Senate Committee on Ways & Means (originally sponsored by Senators Carlyle, Palumbo, Wellman, Mullet, Pedersen, Billig, Hunt, Liias, Rolfes, Saldaña, Hasegawa and Keiser).

Brief History:

Committee Activity:

Innovation, Technology & Economic Development: 3/22/19, 4/3/19 [DPA];

Appropriations: 4/6/19, 4/9/19 [DPA(APP w/o ITED)].

Brief Summary of Second Substitute Bill

(As Amended by Committee)

  • The Legislature declares its intent regarding consumer data privacy.

HOUSE COMMITTEE ON INNOVATION, TECHNOLOGY & ECONOMIC DEVELOPMENT

Majority Report: Do pass as amended. Signed by 5 members: Representatives Hudgins, Chair; Kloba, Vice Chair; Slatter, Tarleton and Wylie.

Minority Report: Do not pass. Signed by 4 members: Representatives Smith, Ranking Minority Member; Boehnke, Assistant Ranking Minority Member; Morris and Van Werven.

Staff: Yelena Baker (786-7301).

Background:

Personal information and privacy interests are protected under various provisions of state law. The Washington State Constitution provides that no person shall be disturbed in his private affairs without authority of law. The Public Records Act protects a person's right to privacy under certain circumstances if disclosure of personal information would be highly offensive and is not of legitimate concern to the public.

The Consumer Protection Act (CPA) prohibits unfair methods of competition and unfair or deceptive practices in the conduct of any trade or commerce. The Attorney General may investigate and prosecute claims under the CPA on behalf of the state or individuals in the state.

In 2016 the Office of Privacy and Data Protection (OPDP) was created to serve as a central point of contact for state agencies on policy matters involving data privacy and data protection. The primary duties of the OPDP with respect to state agencies include conducting privacy reviews and trainings, coordinating data protection, and articulating privacy principles and best practices.

–––––––––––––––––––––––––––––––––

Summary of Amended Bill:

Key Definitions.

"Controller" means the natural or legal person which, along or jointly with others, determines the purposes and means of the processing of personal data.

"Processor" means a natural or legal person that processes personal data on behalf of the controller.

"Consumer" means a natural person who is a Washington resident acting only in an individual or household context and does not include a natural person acting in a commercial or employment context.

"Personal data" means any information that is linked or reasonably linkable to an identified or identifiable natural person. Personal data includes reidentified data and does not include deindentified data.

Controller and Processor Obligations.

Specific obligations related to personal data are created for legal entities that conduct business in Washington or intentionally target their products or services to Washington residents.

Responsibility According to Role.

Controllers are responsible for meeting the obligations set forth in the bill. Processors must adhere to instructions of the controller and assist controllers in meeting set obligations. Processing by a processor is governed by a contract between the controller and the processor.

Third parties are responsible for assisting controllers and processors in meeting their obligations with regard to personal data third parties receive from controllers or processors. Third parties must comply with consumer requests made known to them by a controller.

Consumer Rights.

A consumer retains ownership in the consumer's personal data processed by a controller, a processor, or a third party, and may exercise certain rights with regard to the consumer's personal data by submitting a request to a controller.

Upon receiving a verified consumer request, a controller must:

A controller must communicate any correction, deletion, or restriction of processing carried out pursuant to a consumer's verified request to each third-party recipient to whom the controller knows the personal data has been disclosed, including through a sale, within one year preceding the verified request, unless this proves functionally impractical, technically infeasible, or involves disproportionate effort, or the controller knows or is informed by the third party that the third party is not continuing to use the personal data.

A controller may request additional information needed to confirm the identity of the consumer making a request to exercise a consumer right and may refuse to act on manifestly unfounded or excessive requests.

A controller must respond to received requests within 30 days, unless certain circumstances permit an extension of up to 60 additional days. Within 30 days of receiving a consumer request, a controller must provide the consumer with information about any action taken on a request, any extension, the reasons for the delay or for not taking action, and information about the process for internal review of the controller's decision.

Transparency.

Controllers must be transparent and accountable for their processing of personal data by making available a clear privacy notice that includes certain information, such as the categories of personal data collected and the purposes for which the categories of personal data are used and disclosed to third parties.

Controllers that sell personal data to data brokers must disclose such sales and provide consumers with information on how they may object to such sales.

Compliance and Risk Assessments.

Controllers must develop, implement, and make publicly available an annual plan for complying with the obligations under the bill, and may report metrics on their public website to demonstrate and corroborate their compliance with these obligations.

Controllers must conduct and document risk assessments on at least an annual basis and prior to processing personal data whenever a change in processing materially impacts the risk to individuals.

Risk assessments must take into account the type of personal data to be processed and must identify and weigh the benefits of processing against the potential risks to the rights of the consumer associated with the processing. If the risk assessment determines that the potential risks of privacy harm outweigh the interests of the controller, consumer, and the public, the controller may only engage in such processing with the consumer's consent.

Processing for a business purpose shall be presumed to be permissible unless:

Risk assessments must be made available to the Attorney General upon request and are exempt from public inspection under the Public Records Act.

Deidentified Data.

A controller or processor that uses, sells, or shares deidentified data must:

Exemptions.

Local and state governments, municipal corporations, and institutions of higher education are exempt from the provisions of the act. In addition, the bill does not apply to the following information:

The obligations imposed on controllers or processors do not restrict a controller's or a processor's ability to conduct a number of specified activities, such as: complying with federal, state, or local laws; complying with a civil inquiry or a criminal investigation; establishing or defending legal claims; protecting against malicious or illegal activity; or processing personal data of a consumer where the consumer has consented to such processing.

The Office of Privacy and Data Protection (OPDP) may grant controllers one-year waivers to permit processing that is necessary for reasons of public health, for archiving purposes, to safeguard intellectual property rights, or to protect the vital interests of a consumer or another natural person.

A controller may not sell any personal data that the controller processes under one of the exemptions or pursuant to a waiver issued by the OPDP.

Controllers and processors are not required to reidentify deidentified data, or to retain or link personal data that would not otherwise be retained or linked.

Facial Recognition Technology.

Prior to using facial recognition technology, controllers and processors must verify, through independent testing, that no variation occurs in the accuracy of the technology on the basis of race, skin tone, ethnicity, gender, or age of an individual. Controllers, processors, and providers of facial recognition technology must notify consumers if an automated decision system makes decisions that produce legal effects or affect legal rights of any Washington resident.

Controllers that use facial recognition technology:

Processors that provide facial recognition services must:

State and local government agencies are prohibited from using facial recognition technology to engage in surveillance in public spaces unless in support of law enforcement activities and either: (1) a court-issued warrant targeting an individual and permitting the use of facial recognition services for that specific, individualized surveillance during a specified limited time frame has been obtained; or (2) there is an emergency involving imminent danger or risk of death to a person, in which case facial recognition may be used for the limited duration of the emergency.

Liability and Enforcement.

Violations of these provisions are enforceable under the Consumer Protection Act.

The Office of Privacy and Data Protection.

The OPDP, in consultation with the Attorney General, must clarify definitions as necessary. The OPDP may create rules for granting controllers one-year exemption waivers to continue processing for specified purposes.

Additionally, the OPDP must:

Amended Bill Compared to Second Substitute Bill:

The amended bill:

  1. sets forth the principle that consumers retain an ownership interest in their personal data, including personal data that undergoes processing, and enumerates specific consumer rights with regard to processing of personal data;

  2. modifies several key definitions, including "business purpose", "deidentified data", and "facial recognition", and creates new definitions, such as "privacy harm";

  3. eliminates the thresholds that a legal entity must meet in order for the obligations set forth in the bill to apply to that entity;

  4. exempts certain entities and information subject to enumerated federal and state laws from the provisions of the bill;

  5. modifies responsibilities of the controllers and processors, and specifies responsibilities of third parties that receive data from controllers or processors;

  6. modifies the provisions related to consumer rights by specifying when controllers must comply with consumer requests to exercise rights, when certain exemptions apply, and what controllers may take into consideration when taking action on consumer requests;

  7. adds to the list of information that a privacy notice must contain, such as a statement that the controller processes personal data of a consumer only pursuant to the consumer's consent and solely for the purposes disclosed to the consumer under the privacy notice;

  8. requires controllers to develop, implement, and make publicly available an annual plan for complying with the obligations under the bill, and authorizes controllers to report compliance metrics on their public websites;

  9. modifies the provisions related to risk assessments, such as by including additional circumstances when processing data for business purposes is not presumed permissible;

  10. modifies the provisions related to deidentified data, such as by requiring controllers or processors to make a public commitment not to reidentify deidentified data;

  11. modifies the exemptions provisions and authorizes the OPDP to grant one-year waivers to permit processing for certain purposes;

  12. prohibits controllers from selling any personal data processed pursuant to an exemption or a waiver;

  13. sets forth additional requirements and prohibitions for controllers and processors that use or provide facial recognition services, such as by prohibiting the use of facial recognition for profiling or to make decisions that have legal effects;

  14. modifies the enforcement provisions by removing the prohibition on a private cause of action; and

  15. modifies the rule-making authorization for the OPDP and directs the OPDP to conduct several studies on topics related to consumer data privacy and to report its findings to the Legislature.

–––––––––––––––––––––––––––––––––

Appropriation: None.

Fiscal Note: Available. New fiscal note requested on April 4, 2019.

Effective Date of Amended Bill: The bill takes effect July 30, 2020, except for section 15, which takes effect 90 days after adjournment of the session in which the bill is passed.

Staff Summary of Public Testimony:

(In support) The underlying Senate bill presents a thoughtful and deliberate approach by establishing consumer rights and placing affirmative obligations on companies to be responsible stewards of personal data. The exemptions listed in the underlying Senate bill recognize that this bill will be implemented on top of the already-existing legal framework; the bill is also flexible enough to recognize that sometimes there are compelling reasons to continue retaining data.

The underlying Senate bill strikes the right balance for allowing the use of facial recognition technology with proper guidelines; the striking amendment, by contrast, sets up an impossible standard by requiring consent beyond public notice.

The underlying Senate bill is a good compromise document. The private cause of action provided for in the striking amendment is of concern.

(Opposed) Neither the underlying Senate bill nor the striking amendment present a meaningful step forward in terms of privacy protections. As evidenced by academic papers and recent court decisions, there is an expectation of privacy in a public place. There should be a complete moratorium on the use of facial recognition technology, which poses unique civil rights concerns. The bias problems with facial recognition are well-known. This technology provides unprecedented surveillance capabilities, and this bill continues to lay the groundwork for a government surveillance infrastructure that produces biased outcomes and unfairly targets vulnerable communities. Setting a low standard with this bill will set a low standard for the whole country.

This bill is weaker than both the European Union's General Data Protection Regulation and California's Consumer Privacy Act and would set a bad precedent. The bill serves the interests of powerful data collectors rather than consumers.

The facial recognition provisions create an expectation of privacy in a public space, which has implications for the "plain view" doctrine. Consumers would be best served by a federal law that applies across industry sectors rather than a patchwork of state laws. The Attorney General should have the sole authority to enforce the state privacy law.

Requiring health care information to be in compliance with the applicable federal and state laws in order to be exempt sets an impossible standard. Compliance is a process and not as black and white as some would like it to be; health information may be out of compliance without the covered entities being aware of it. The conditional language in the striking amendment does not recognize the existing regulatory structure for medical entities and health care information.

(Other) The striking amendment improves the enforcement mechanisms by adding a private cause of action. Controllers should not use facial recognition to make decisions that produce legal effects, even if a human review is involved. The definition of "sensitive data" should include information about a person's citizenship or immigration status.

The striking amendment removes jurisdictional thresholds, which may create an administrative burden on small businesses. There should be a temporal limitation on the personal data a business processes before that business is subject to the obligations of this bill. Alternatively, there needs to be a limited exemption for land title and insurance companies. Publicly available data should be excluded from the provisions of the bill. "Safe harbor" language should be added so that a reasonable mistake does not lead to strict liability.

Agencies that use facial recognition software in public safety efforts may suffer setbacks because seeking a warrant every time would be impractical.

The underlying Senate bill provides clarity for consumers and clear directions for businesses, and sets forth a reasonable enforcement mechanism. The tiered private right of action in the striking amendment is of concern. Providing for a private cause of action will not improve compliance but will have a chilling effect on the industries' relationship with regulators.

Persons Testifying: (In support) Senator Carlyle, prime sponsor; Michael Parham, RealNetworks; Ryan Harkins, Microsoft; Julia Gorton, Washington Hospitality Association; Alex Alben, Washington State Office of Privacy and Data Protection; and Mike Hoover, TechNet.

(Opposed) James McMahan, Washington Association of Sheriffs and Police Chiefs; Mark Johnson, Washington Retail; John Christiansen, Christiansen Information Technology Law; Zosia Stanley, Washington State Hospital Association; Shankar Narayan, American Civil Liberties Union of Washington; Samroz Jakvani, Muslim Association of Puget Sound; Eli Goss, OneAmerica; Elise Orlick, WashPIRG; Maureen Mahoney, Consumer Reports; Jevan Hutson, University of Washington Law; Geoff Froh, Densho; Anna Lauren Hoffmann, University of Washington Information School; and Russell Brown, Washington Association of Prosecuting Attorneys.

(Other) Trent House, Washington Bankers Association and United Financial Lobby; Stuart Halsan and Sean Holland, Washington Land Title Association; Michael Transue, Axon; Rose Feliciano, Internet Association; Brad Tower, Toy Association; Bob Battles, Association of Washington Business; Diana Carlen, Relx, Inc; and Emilia Jones, Attorney General's Office.

Persons Signed In To Testify But Not Testifying: None.

HOUSE COMMITTEE ON APPROPRIATIONS

Majority Report: Do pass as amended by Committee on Appropriations and without amendment by Committee on Innovation, Technology & Economic Development. Signed by 19 members: Representatives Ormsby, Chair; Bergquist, 2nd Vice Chair; Robinson, 1st Vice Chair; Cody, Dolan, Fitzgibbon, Hansen, Hudgins, Jinkins, Macri, Pettigrew, Pollet, Ryu, Senn, Springer, Stanford, Sullivan, Tarleton and Tharinger.

Minority Report: Do not pass. Signed by 14 members: Representatives Stokesbary, Ranking Minority Member; MacEwen, Assistant Ranking Minority Member; Rude, Assistant Ranking Minority Member; Caldier, Chandler, Dye, Hoff, Kraft, Mosbrucker, Schmick, Steele, Sutherland, Volz and Ybarra.

Staff: Meghan Morris (786-7119) and Yelena Baker (786-7301).

Summary of Recommendation of Committee On Appropriations Compared to Recommendation of Committee On Innovation, Technology & Economic Development:

All substantive provisions of the bill are removed and replaced with legislative intent regarding consumer privacy.

Appropriation: None.

Fiscal Note: Available for the Substitute Senate Bill.

Effective Date of Amended Bill: The bill takes effect 90 days after adjournment of the session in which the bill is passed.

Staff Summary of Public Testimony:

(In support) Consumer data privacy is more important than ever. Many diverse stakeholders came together to craft a strong policy, so the underlying Senate bill, with some additional modifications, is preferable.

Technology is not inherently good or bad; there are good applications of technology and bad, or questionable, applications of technology. Facial recognition technology is ubiquitous. The underlying Senate bill requires disclosure that facial recognition technology is being used and allows consumers to opt out from the use of that technology. The underlying Senate bill also requires all platforms, such as Amazon, Google, and Facebook, to provide access to their core platforms in a neutral way so that independent third parties can assess the efficacy and the quality of each core platform. The underlying Senate bill provides that there has to be a meaningful human review before a facial recognition system makes a decision that impacts a person's life.

(Opposed) The bill as amended by the Committee on Innovation, Technology, and Economic Development (ITED) is better than the underlying Senate bill, which is a nonstarter. But even the ITED bill sets up a permissive regime and gives companies, not consumers, control of personal data. Neither bill comes close to either the European Union's or California's privacy law, which is why national consumer privacy groups are opposed to both the underlying Senate bill and the ITED bill.

The provisions related to the use of facial recognition by state and local government agencies are too permissive and would allow law enforcement to use facial surveillance without a meaningful discussion of the proper place of this technology in our democracy. Many people, from workers to civil society to tech researchers, are strongly opposed to selling this technology to or condoning its use by law enforcement. Facial recognition technology has been shown to be biased; we should declare a moratorium on this technology rather than greenlight it. Simply putting up a sign to notify people that this technology is in use does not constitute an effective opt-out because it is impossible to leave one's face at home.

Provisions related to consumer rights purport to give consumers certain rights, but are riddled with loopholes, such as subjecting consumer requests to verification by companies, and allowing companies to continue processing data if they can demonstrate a legitimate ground. This bill serves the interests of powerful data collectors and undermines individual privacy rights.

The ITED committee amendment would impose significant burdens on consumers; its compliance requirements for companies are extensive and difficult to manage. The underlying Senate bill better balances consumer control of information and operational workability for businesses. Consumers would be better served by a national law that would apply across industry sectors and across state lines. This would protect consumers from multiple conflicting standards that undermine consumers' expectations and trust.

The Attorney General (AG) is in the best position to help everyone understand the law. The bill should not leave it to individual judges to declare legislative intent and to fill in the gaps through expensive litigation of private actions. A private right of action would undermine the privacy protections in the bill because companies facing significant data privacy compliance requirements need to rely on guidance from the AG and the Office of Privacy and Data Protection to guide their actions. Allowing a private right of action will undermine the effectiveness of that framework.

Facial recognition provisions limit law enforcement's ability to use this technology in the interests of public safety. Facial recognition is not a perfect technology and law enforcement cannot use it by itself to establish probable cause. The constitution provides a reasonable expectation of privacy. The ITED committee amendment would effectively ban facial recognition technology, a technology that has many beneficial uses. The underlying Senate bill imposes more appropriate and balanced levels of regulation and allows facial recognition technology to continue being used in a variety of beneficial ways, such as finding missing children and providing security at large events, while imposing rigorous requirements on industry to address concerns regarding bias, accuracy, and discrimination.

(Other) If this bill does not pass, it would not be much of a loss because it is a weak bill. If the bill were to pass, there must be a private right of action in it.

Persons Testifying: (In support) Senator Carlyle, prime sponsor.

(Opposed) Shankar Narayan, American Civil Liberties Union of Washington; Elise Orlick, Washington Public Interest Research Group; Jevan Hutson, University of Washington School of Law; Mark Johnson, Washington Retail Association; Diana Carlen, Relx Incorporated; James McMahan, Washington Association of Sheriffs and Police Chiefs; Irene Plenefisch, Microsoft; Tom McBride, CompTIA; Michael Transue, Axon; Julia Gorton, Washington Hospitality Association; Mel Sorensen, American Property Casualty Insurance Association, American Council of Life Insurance, and Consumer Data Industry Association; Bob Battles, Association of Washington Business; and Russell Brown, Washington Association of Prosecuting Attorneys.

(Other) Sheila Dean.

Persons Signed In To Testify But Not Testifying: None.