HOUSE BILL REPORT

HB 1655

This analysis was prepared by non-partisan legislative staff for the use of legislative members in their deliberations. This analysis is not a part of the legislation nor does it constitute a statement of legislative intent.

As Reported by House Committee On:

Innovation, Technology & Economic Development

Title: An act relating to establishing guidelines for government procurement and use of automated decision systems in order to protect consumers, improve transparency, and create more market predictability.

Brief Description: Establishing guidelines for government procurement and use of automated decision systems in order to protect consumers, improve transparency, and create more market predictability.

Sponsors: Representatives Hudgins, Shea, Morris, Kloba and Valdez.

Brief History:

Committee Activity:

Innovation, Technology & Economic Development: 2/6/19, 2/22/19 [DPS].

Brief Summary of Substitute Bill

  • Directs the Office of the Chief Information Officer to report to the Legislature with an algorithmic impact inventory that provides certain information about automated decision systems used by state agencies.

  • Requires the Chief Privacy Officer to adopt rules regarding the use of automated decision systems by state agencies.

  • Expands Washington's Law Against Discrimination to prohibit discrimination by automated decision systems.

HOUSE COMMITTEE ON INNOVATION, TECHNOLOGY & ECONOMIC DEVELOPMENT

Majority Report: The substitute bill be substituted therefor and the substitute bill do pass. Signed by 8 members: Representatives Hudgins, Chair; Kloba, Vice Chair; Smith, Ranking Minority Member; Boehnke, Assistant Ranking Minority Member; Slatter, Tarleton, Van Werven and Wylie.

Staff: Yelena Baker (786-7301).

Background:

Automated decision systems are data-driven algorithmic tools that are used to analyze and support decisionmaking in a variety of government settings, including policing, criminal sentencing, business management and risk assessment, and administration of public programs. Government use of automated decision systems is not regulated by any specific state or federal laws.

The Office of the Chief Information Officer (OCIO) is housed within the Consolidated Technology Services Agency. The OCIO prepares and leads the implementation of a strategic direction and enterprise structure for information technology (IT) for state government. The OCIO also establishes standards and policies for the consistent and efficient operation of IT services throughout state government. In addition, the OCIO is required to establish security standards and policies to ensure the confidentiality and integrity of information transacted, stored, or processed in the state's IT systems and infrastructure. Each state agency must adhere to the OCIO's security standards and policies.

In 2016 the Office of Privacy and Data Protection (OPDP) was created in the OCIO to serve as a central point of contact for state agencies on policy matters involving data privacy and data protection. The Chief Privacy Officer serves as the director of the OPDP. The primary duties of the OPDP with respect to state agencies include conducting privacy reviews and trainings, coordinating data protection, and articulating privacy principles and best practices.

Washington's Law Against Discrimination prohibits discrimination based on race, creed, color, national origin, sex, marital status, and other enumerated factors. Unfair (discriminatory) practices are prohibited in the areas of employment, commerce, labor union membership, credit and insurance transactions, access to public places, and real property transactions.

–––––––––––––––––––––––––––––––––

Summary of Substitute Bill:

"Automated decision system" is defined to mean any algorithm, including one incorporating machine learning or other artificial intelligence techniques, that uses data-based analytics to make or support government decisions, judgments, or conclusions.

The Office of the Chief Information Officer (OCIO) must review and inventory all automated decision systems that are being used, developed, or procured by state agencies. The OCIO must report to the Legislature annually on the progress of the review and inventory process until an algorithmic impact inventory is completed.

By December 1, 2020, the OCIO must report to the Legislature with the algorithmic impact inventory and provide certain information regarding each automated decision system, including a description of the automated decision system's general and reasonably foreseeable capabilities; the types of data used by the system and how that data is collected and processed; whether the system makes decisions affecting constitutional or legal rights; and whether the system provides notice to the individuals affected by its decisions that the automated decision system is in use.

The Chief Privacy Officer is directed to adopt rules, by January 1, 2020, regarding the development, procurement, and use of automated decision systems by state agencies. The rules must address any issues of bias identified in the algorithmic impact inventory.

Washington's Law Against Discrimination is expanded to prohibit discrimination by automated decision systems against any individual on the basis of one or more factors enumerated in the Law Against Discrimination.

Substitute Bill Compared to Original Bill:

The substitute bill:

–––––––––––––––––––––––––––––––––

Appropriation: None.

Fiscal Note: Preliminary fiscal note available.

Effective Date of Substitute Bill: The bill takes effect 90 days after adjournment of the session in which the bill is passed.

Staff Summary of Public Testimony:

(In support) Algorithms either make or assist in many important government decisions, but there is no transparency with regard to these systems. These algorithms produce or assist in consequential decisions that, in some cases, no human being reviews. The public is rarely notified that these algorithms even exist and are being used to make government decisions. Vendors include aggressive nondisclosure provisions in their contracts and use litigation to prevent the public from understanding how these algorithms work.

There are many examples of bias in these algorithms, such as faulty risk assessments that recommend different sentences for the same crime or assess people of color as a higher risk than other people. These algorithms rely on historical data that are often skewed as a result of historic discrimination. For example, predictive policing systems are supposed to determine where law enforcement should deploy resources, but the decisions of these systems largely reflect historic over-policing of neighborhoods of color.

These systems are often untested or poorly designed and carry substantial risk of error in important government decisions, such as access to healthcare, disability benefits, and teacher employment decisions. In one example from Texas, not a single employee of a school district could explain the determinations of an algorithmic decision system, and the teachers who sought to contest the determinations were told that the decision system was simply to be believed and could not be questioned.

The Legislature has in the past stepped forward to create reasonable rules for technology to protect Washingtonians while also ensuring that innovation can thrive. This bill creates those reasonable rules for automated decision systems and responds to a number of problems that have been recognized by scholars in this field.

(Opposed) None.

(Other) It is not clear how this bill may impact public safety. Humans should be the ones making the decisions, and that is what currently happens in law enforcement. There are no known examples of a law enforcement agency using the output of an algorithm as a call to action; at best, it is an investigatory lead. A scoring tool called Static-99R is used to conduct risk assessments of sex offenders; some agencies have automated this process, and it is unclear whether this automated process would fall under the scope of this bill.

Persons Testifying: (In support) Representative Hudgins, prime sponsor; Shankar Narayan, American Civil Liberties Union of Washington; Masih Fouladi, Council on American-Islamic Relations Washington; Jevan Hutson, University of Washington School of Law; and Katherine Pratt, University of Washington Electrical and Computer Engineering.

(Other) James McMahan, Washington Association of Sheriffs and Police Chiefs.

Persons Signed In To Testify But Not Testifying: None.