HOUSE BILL 1655
State of Washington
2019 Regular Session
By Representatives Hudgins, Shea, Morris, Kloba, and Valdez
Read first time 01/25/19. Referred to Committee on Innovation, Technology & Economic Development.
AN ACT Relating to establishing guidelines for government procurement and use of automated decision systems in order to protect consumers, improve transparency, and create more market predictability; adding a new section to chapter 49.60 RCW; and adding a new chapter to Title 43 RCW.
BE IT ENACTED BY THE LEGISLATURE OF THE STATE OF WASHINGTON:
NEW SECTION. Sec. 1. The legislature finds that:
(1) Washington is a technology leader on a national and global level and holds a distinctive position in creating frameworks around technology that enhance innovation while protecting consumers and promoting fairness, accountability, and transparency for all Washingtonians.
(2) Automated decision systems are rapidly being adopted to make or assist in core decisions in a variety of government and business functions, including criminal justice, health care, education, employment, public benefits, insurance, and commerce.
(3) These automated decision systems are often deployed without public knowledge, are unregulated, and vendors selling the systems may require restrictive contractual provisions that undermine government transparency and accountability.
(4) The use of these systems to make core government and business decisions raises concerns around due process, fairness, accountability, and transparency, as well as other civil rights and liberties.
(5) Reliance on automated decision systems without adequate transparency, oversight, or safeguards can undermine market predictability, harm consumers, and deny historically disadvantaged or vulnerable groups the full measure of their civil rights and liberties.
(6) In order to enhance innovation and ensure the use of these systems in ways that benefit Washington residents, the legislature intends to ensure the fair, transparent, and accountable use of automated decision systems.
NEW SECTION. Sec. 2. The definitions in this section apply throughout this chapter unless the context clearly requires otherwise.
(1) "Algorithm" means a computerized procedure consisting of a set of steps used to accomplish a determined task.
(2) "Algorithmic accountability report" means the report with content enumerated in section 5(2) of this act.
(3) "Automated decision system" means any algorithm, including one incorporating machine learning or other artificial intelligence techniques, that uses data-based analytics to make or support government decisions, judgments, or conclusions.
(4) "Automated final decision system" means an automated decision system that makes final decisions, judgments, or conclusions without human intervention.
(5) "Automated support decision system" means an automated decision system that provides information to inform the final decision, judgment, or conclusion of a human decision maker.
NEW SECTION. Sec. 3. By January 1, 2020, the chief privacy officer appointed in RCW 43.105.369 shall adopt rules pursuant to chapter 34.05 RCW regarding the development, procurement, and use of automated decision systems by a public agency. These rules must incorporate the minimum standards and procedures set forth in sections 4 and 5 of this act with respect to automated decision systems.
NEW SECTION. Sec. 4. The following provisions apply to a public agency's development, procurement, or use of an automated decision system:
(1) A public agency may not develop, procure, or use an automated decision system that discriminates against an individual, or treats an individual less favorably than another, in whole or in part, on the basis of one or more factors enumerated in RCW 49.60.010. A public agency may not develop, procure, or use an automated final decision system to make a decision impacting the constitutional or legal rights, duties, or privileges of any Washington resident, or to deploy or trigger any weapon.
(2) A public agency shall develop, procure, or use an automated decision system only after the public agency first completes an algorithmic accountability report and that report is approved by the chief privacy officer appointed in RCW 43.105.369, as set forth in section 5 of this act.
(3) A public agency that develops, procures, or uses an automated decision system must follow any conditions set forth in the relevant approved algorithmic accountability report. In addition, the public agency must, at a minimum:
(a) Give clear notice to an individual impacted by the automated decision system of the fact that the system is in use; the system's name, vendor, and version; what decision or decisions it will be used to make or support; whether it is a final or support decision system; what policies and guidelines apply to its deployment; and how the individual can contest any decision made involving the system;
(b) Ensure the automated decision system and the data used in the system are made freely available by the vendor before, during, and after deployment for agency or independent third-party testing, auditing, or research to understand its impacts, including potential bias, inaccuracy, or disparate impacts;
(c) Ensure that any decision made or informed by the automated decision system is subject to appeal, immediate suspension if a legal right, duty, or privilege is impacted by the decision, and potential reversal by a human decision maker through a timely process clearly described and accessible to an individual impacted by the decision; and
(d) Ensure the agency can explain the basis for its decision to any impacted individual in terms understandable to a layperson including, without limitation, by requiring the vendor to create such an explanation.
(4) A procurement contract for an automated decision system entered into by a public agency must ensure the minimum standards set forth in this section can be effectuated without impairment, including requiring the vendor to waive any legal claims that may impair these minimum standards. Such a contract may not contain nondisclosure or other provisions that prohibit or impair these minimum standards.
NEW SECTION. Sec. 5. (1) A public agency intending to develop, procure, or use an automated decision system must produce an algorithmic accountability report for that system, and that report must be approved by the chief privacy officer appointed in RCW 43.105.369 prior to deployment. The agency must submit the algorithmic accountability report to the chief privacy officer. The chief privacy officer must post the algorithmic accountability report on the chief privacy officer's office's public web site and invite public comment on the algorithmic accountability report for a period of no less than thirty days. After receiving public comment, the chief privacy officer must determine whether the intended use of the automated decision system meets the minimum standards set forth in section 4 of this act. On the basis of that determination, the chief privacy officer may approve the algorithmic accountability report, deny it, or make changes to it prior to approval.
(2) Each algorithmic accountability report must include clear and understandable statements of the following:
(a) The automated decision system's name, vendor, and version; a description of its general capabilities, including reasonably foreseeable capabilities outside the scope of the agency's proposed use;
(b) The type or types of data inputs that the technology uses; how that data is generated, collected, and processed; and the type or types of data the system is reasonably likely to generate;
(c) A description of the purpose and proposed use of the automated decision system, including what decision or decisions it will be used to make or support; whether it is a final or support decision system; and its intended benefits, including any data or research demonstrating those benefits;
(d) A description of how the agency plans to comply with each requirement set forth in section 4 of this act;
(e) A clear use and data management policy, including protocols for the following:
(i) How and when the automated decision system will be deployed or used and by whom, including but not limited to: The factors that will be used to determine where, when, and how the technology is deployed; and other relevant information, such as whether the technology will be operated continuously or used only under specific circumstances. If the automated decision system will be operated or used by another entity on the agency's behalf, the algorithmic accountability report must explicitly include a description of the other entity's access and any applicable protocols;
(ii) Any additional rules that will govern use of the automated decision system and what processes will be required prior to each use of the automated decision system;
(iii) How automated decision system data will be securely stored and accessed, and whether an agency intends to share access to the automated decision system or the data from that automated decision system with any other entity, and why; and
(iv) How the agency will ensure that all personnel who operate the automated decision system or access its data are knowledgeable about and able to ensure compliance with the use and data management policy prior to use of the automated decision system;
(f) A description of any public or community engagement held and any future public or community engagement plans in connection with the automated decision system;
(g) A description of any potential impacts of the automated decision system on civil rights and liberties and potential disparate impacts on marginalized communities, and a mitigation plan; and
(h) A description of the fiscal impact of the automated decision system, including: Initial acquisition costs; ongoing operating costs such as maintenance, licensing, personnel, legal compliance, use auditing, data retention, and security costs; any cost savings that would be achieved through the use of the technology; and any current or potential sources of funding, including any subsidies or free products being offered by vendors or governmental entities.
NEW SECTION. Sec. 6. Any person who is injured by a material violation of this chapter, including denial of any government benefit on the basis of an automated decision system that does not meet the standards set forth in this chapter, may institute proceedings against the public agency deploying the automated decision system in a court of competent jurisdiction for injunctive relief, including restoration of the government benefit in question, declaratory relief, or a writ of mandate to enforce this chapter.
NEW SECTION. Sec. 7. A new section is added to chapter 49.60 RCW to read as follows:
(1) It is an unfair practice for any automated decision system to discriminate against an individual, or to treat an individual less favorably than another, in whole or in part, on the basis of one or more factors enumerated in RCW 49.60.010.
(2) For the purposes of this section, "automated decision system" has the same meaning as defined in section 2 of this act.
NEW SECTION. Sec. 8. Sections 1 through 6 of this act constitute a new chapter in Title 43 RCW.