Generative Artificial Intelligence. Generative artificial intelligence (AI) refers to technology that can mimic the human ability to learn and create, based on underlying training data and guided by a user or prompt. Generative AI tools are therefore capable of performing complex decision-making or creative tasks typically performed by humans or with human oversight. Several forms of generative AI technology are currently widely accessible to consumers and can perform a wide range of functions.
Any party must disclose any generative AI large language model tool used to conduct legal research or to draft documents for filing with a Washington court. The disclosure must identify the specific AI tool used and the way it was used. A party must also certify that all citations to the law or the record have been verified as accurate.
The disclosure and verification requirements do not apply to the use of legal research products that include editorial content or annotations produced by the product vendor without the use of generative AI.
PRO: With the advance of AI, there is a growing problem of fake legal citations appearing in court filings and in research by law students. This problem has already occurred in our state in some appellate court filings. AI can do great good and cause great destruction, so prudent regulation is needed to secure the greatest benefits while avoiding the greatest harms. AI can imitate human communication to a remarkable degree and will continue to improve, but this raises serious dangers of deception. AI can be a significant boon to litigants, especially pro se litigants. While AI can help improve the legal system, there must be protections, and appropriate regulations will need to evolve over time. The regulation of the courts is not the exclusive prerogative of the judiciary; the Legislature works to ensure that the legal system is accessible to everyone. The Legislature and the judiciary should work together to maximize the benefits of AI and avoid unnecessary risks.
OTHER: The use of AI in court filings is best resolved at the judicial level. Statewide workgroups are convening on this issue and need time to review it and bring forward recommendations; this legislation should be paused to allow those processes to play out. A number of safeguards and internal mechanisms are already in place to prevent and catch the citation of fake cases, such as a lawyer's professional pride, court rules, and the rules of professional conduct. Judges review cases, and the adversarial judicial system helps catch fake cases cited by opposing parties. No new court rule has been proposed on this issue. The judiciary will look at this issue, but it does not yet have a full picture of whether there is a crisis in Washington regarding the use of AI. The courts can tackle this issue and assess the scope of the problem.