Automated Decision Systems in General. Automated decision systems (ADS) are data-driven algorithmic tools that may be used to analyze and support decision-making in a variety of settings.
Office of the Chief Information Officer. The Office of the Chief Information Officer (OCIO) is housed within the Consolidated Technology Services agency, commonly referred to as WaTech. The director of the agency serves as the state chief information officer. The OCIO has certain primary duties related to state government information technology, which include establishing statewide enterprise architecture, and standards for consistent and efficient operation.
Washington's Law Against Discrimination. Washington's Law Against Discrimination (WLAD) prohibits discrimination based on race, creed, color, national origin, sex, marital status, and other enumerated factors. Discriminatory practices are prohibited in the areas of employment, commerce, credit and insurance transactions, access to public places, and real property transactions.
Automated Decision Systems Work Group Report. In 2021 the Legislature, in SB 5092, directed WaTech to convene a work group to examine how ADS can best be reviewed before adoption and while in operation, and periodically audited, to ensure that such systems are fair, transparent, and accountable, and do not improperly advantage or disadvantage Washington residents. The work group report was distributed on December 1, 2021.
The bill as referred to committee was not considered.
Automated Decision Systems. ADS is defined as any computerized procedure consisting of a set of steps to accomplish a determined task, including one incorporating machine learning or other artificial intelligence techniques, that uses data-based analysis or calculations to make or support government decisions, judgments, or conclusions that cause a Washington resident or business to be treated differently than another Washington resident or business, or that result in statistically significant disparities with other classes of persons or businesses in the nature or amount of governmental interaction with that individual or business.
Minimum Standards. Subject to the staged review provisions, the following minimum standards should apply to an agency's development, procurement, or use of an ADS:
A person injured by a public agency's material violations of these provisions may institute proceedings for injunctive or declaratory relief, or both, to compel compliance with these provisions.
Staged Review. Agencies already using an ADS must provide a list of ADS to the algorithmic accountability review board (board) by January 1, 2023, and use the prioritization framework to identify the order in which to complete an algorithmic accountability report by January 1, 2025. If an algorithmic accountability report is not completed by January 1, 2025, the agency must immediately cease use of the unevaluated ADS until the report is provided or an extension is granted by the board. The board will grant an extension for continued use of a system if the agency has established a reasonable timeline for completion and there is no apparent likelihood of bias regarding the system. The board must report annually on agency compliance and any extensions granted.
Agencies intending to develop or procure an ADS for use between the effective date and January 1, 2025, must, as a condition of use, file an algorithmic accountability report for the ADS with the OCIO at least one month prior to procurement or, if the system is internally developed, implementation.
Beginning January 1, 2025, public agencies intending to develop or procure an ADS must, as a condition of use, submit an algorithmic accountability report for that ADS and obtain a finding by the board. After the report has been available for public comment for at least 30 days, the board must conduct a review and issue a finding as to whether the agency's algorithmic accountability report reasonably shows the ADS meets the minimum standards. If the board finds that the agency fails to meet the minimum standards, it must provide a reasonably detailed description of the reasons to the agency. The agency may revise the information provided, the system, or the procedures for use of the system and submit a revised report for further review.
Each algorithmic accountability report must include certain statements such as a description and purpose of the proposed ADS and its use; types of decisions the ADS will make; information on whether the ADS has been reviewed for inaccuracies or bias; description of any community engagement regarding the system including whether and how people affected by the system can review and challenge system decisions; data management policy that includes protocols for deployment, security, and training; and fiscal impact.
Beginning January 1, 2024, agencies using ADS must publish online annual metrics regarding the number of requests for human review of a decision rendered by the ADS and the outcome of that review.
Beginning January 1, 2025, agencies must conduct annual audits on ADS that have legal effects on people and report to the board any findings. The audit must include whether agencies have complied with the terms of approved algorithmic accountability reports; descriptions of violations; any systematic issues raised by use of ADS; and any recommended revisions to the algorithmic accountability report.
Algorithmic Accountability Review Board. The board is created within the OCIO. The board will conduct:
Board Membership. The board must represent diverse stakeholders and consist of the following voting members:
Initial appointments must be made by January 1, 2023. After the initial appointments are made, members will serve three-year terms. Members will be reimbursed for travel expenses.
Office of the Chief Information Officer. OCIO must:
Washington's Law Against Discrimination. Except to the extent an ADS utilizes a criterion specifically authorized or mandated by state or federal law or regulation, it is an unfair practice for any ADS to discriminate against any individual on the basis of one or more factors enumerated in the WLAD.
The committee recommended a different version of the bill than what was heard. PRO: Racial disparities have risen to the forefront of public issues, and ADS or artificial intelligence are at the center of that, making decisions regarding how we do everyday things. Artificial intelligence is intended to streamline processes, but these tools impact vulnerable communities at an alarming rate. ADS cannot read social context. There is no shortage of data showing disparities in how these systems treat people of color. These programs are the definition of institutional racism due to embedded disparities or the internalized biases that writers may have when writing the programs or analyzing the datasets, and they can make even life or death decisions about people's lives. There are many cases across the United States of agencies adopting faulty and biased algorithms. This bill bans certain dangerous uses and provides transparency with regard to the use of these systems. This would be the first bill of its kind in the United States and would set a sound standard for transparency and fairness.
OTHER: Still trying to understand what systems would qualify as an ADS under the bill. There are many legitimate public uses for this technology, so there needs to be more conversation regarding what is included. Some examples of what might be included under the bill are the use of: polygraph machines for screening qualified applicants; crime lab DNA, fingerprint, and firearm analysis; crime reports that use algorithms to suggest where to allocate resources; red light cameras; speed zones; and employee screening data such as years of service. Agree with the goals of the legislation and the need, but want to make sure there are not unintended consequences for systems that are standard.