SB 5870 - Establishing civil liability for suicide linked to the use of artificial intelligence systems.

Section 1

The definitions in this section apply throughout this chapter unless the context clearly requires otherwise.

  1. "Artificial intelligence" means an engineered or machined-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.

  2. [Empty]

    1. "Companion chatbot" means an artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a user's social needs, including by exhibiting anthropomorphic features and being able to sustain a relationship across multiple interactions.

    2. "Companion chatbot" does not include any of the following:

      1. A bot that is used only for customer service, a business' operational purposes, productivity and analysis related to source information, internal research, or technical assistance;

      2. A bot that is a feature of a video game, is limited to replies related to the video game, and cannot discuss topics related to mental health, self-harm, or sexually explicit conduct, or maintain a dialogue on topics unrelated to the video game; or

      3. A stand-alone consumer electronic device that functions as a speaker and voice command interface, acts as a voice-activated virtual assistant, and does not sustain a relationship across multiple interactions or generate outputs that are likely to elicit emotional responses in the user.

  3. "Companion chatbot platform" means a platform that allows a user to engage with companion chatbots.

  4. "Department" means the department of health.

  5. "Operator" means a person who makes a companion chatbot platform available to a user in this state.

  6. "Sexually explicit conduct" has the same meaning as defined in section 2256 of Title 18, United States Code.

  7. "Video game" means a game played on an electronic amusement device that utilizes a computer, microprocessor, or similar electronic circuitry and its own monitor, or is designed to be used with a television set or a computer monitor, that interacts with the user of the device.

Section 2

  1. If a reasonable person interacting with a companion chatbot would be misled into believing that the person is interacting with a human, an operator shall issue a clear and conspicuous notification indicating that the companion chatbot is artificially generated and not human.

  2. [Empty]

    1. An operator shall prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator maintains a protocol for preventing the production of suicidal ideation, suicide, or self-harm content to the user, including, but not limited to, by providing a notification to the user that refers the user to crisis service providers, including a suicide hotline or crisis text line, if the user expresses suicidal ideation, suicide, or self-harm.

    2. The operator shall publish details on the protocol required by this section on the operator's internet website.

  3. An operator shall, for a user that the operator knows is a minor, do all of the following:

    1. Disclose to the user that the user is interacting with artificial intelligence;

    2. Provide by default a clear and conspicuous notification to the user at least every three hours during continuing companion chatbot interactions that reminds the user to take a break and that the companion chatbot is artificially generated and not human; and

    3. Institute reasonable measures to prevent its companion chatbot from producing visual material of sexually explicit conduct or directly stating that the minor should engage in sexually explicit conduct.
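
A minimal sketch of how an operator might implement the notification duties in this section: the crisis-referral protocol of subsection (2) and the periodic break reminder for known minors in subsection (3). All names here are hypothetical and the keyword screen is only a stand-in; the act prescribes the duties, not an implementation, and section 3 of this act separately requires evidence-based methods for measuring suicidal ideation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Hypothetical notice text; the act requires referral to crisis service
# providers, including a suicide hotline or crisis text line.
CRISIS_REFERRAL = (
    "If you are in crisis, help is available: call or text 988, "
    "or reach a crisis text line."
)
AI_DISCLOSURE = "Reminder: this companion chatbot is artificially generated, not human."
REMINDER_INTERVAL = timedelta(hours=3)  # "at least every three hours"

# Placeholder screen only; a real protocol would use an evidence-based
# classifier rather than a keyword list.
RISK_TERMS = ("suicide", "kill myself", "self-harm", "end my life")

def expresses_suicidal_ideation(message: str) -> bool:
    text = message.lower()
    return any(term in text for term in RISK_TERMS)

@dataclass
class ChatSession:
    user_is_minor: bool
    last_reminder: datetime = field(default_factory=datetime.utcnow)
    referrals_issued: int = 0  # tallied for the section 3 annual report

    def notifications_for(self, message: str) -> list[str]:
        """Return the notices that must accompany the next chatbot response."""
        notices: list[str] = []
        if expresses_suicidal_ideation(message):
            notices.append(CRISIS_REFERRAL)
            self.referrals_issued += 1
        if self.user_is_minor:
            now = datetime.utcnow()
            if now - self.last_reminder >= REMINDER_INTERVAL:
                notices.append(AI_DISCLOSURE + " Consider taking a break.")
                self.last_reminder = now
        return notices
```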

Section 3

  1. Beginning July 1, 2027, an operator shall annually report to the department all of the following:

    1. The number of times the operator has issued a crisis service provider referral notification pursuant to section 2 of this act in the preceding calendar year;

    2. Protocols put in place to detect, remove, and respond to instances of suicidal ideation by users; and

    3. Protocols put in place to prohibit companion chatbot responses to the user about suicidal ideation or actions.

  2. The report required by this section shall include only the information listed in subsection (1) of this section and may not include any identifiers or personal information about users.

  3. The department of health shall post data from a report required by this section on its internet website.

  4. An operator shall use evidence-based methods for measuring suicidal ideation.
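
The report in this section reduces to a small, de-identified payload: one aggregate count plus two protocol descriptions, and nothing else. A sketch with hypothetical field names (the act prescribes the contents, not a format):

```python
import json
from dataclasses import asdict, dataclass

@dataclass
class AnnualOperatorReport:
    """De-identified annual report to the department under section 3.

    Field names are illustrative; the act only lists the required
    contents and bars user identifiers and personal information.
    """
    reporting_year: int
    crisis_referral_notifications: int   # count of referral notices issued
    detection_protocols: str             # detect, remove, respond to ideation
    response_prohibition_protocols: str  # bar responses about suicidal ideation

report = AnnualOperatorReport(
    reporting_year=2027,
    crisis_referral_notifications=0,
    detection_protocols="Evidence-based ideation screening on each message.",
    response_prohibition_protocols="Refuse and redirect to crisis services.",
)
print(json.dumps(asdict(report), indent=2))  # the department posts the data
```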

Section 4

An operator shall disclose to a user of its companion chatbot platform, on the application, the browser, or any other format that a user can use to access the companion chatbot platform, that companion chatbots may not be suitable for some minors.

Section 5

  1. A person who suffers injury in fact as a result of a violation of this chapter may bring a civil action to recover injunctive relief, damages in an amount equal to the greater of actual damages or $1,000 per violation, and reasonable attorneys' fees and costs.

  2. In an action against a defendant who developed, modified, or used artificial intelligence that is alleged to have caused harm to the plaintiff, it shall not be a defense, and may not be asserted, that the artificial intelligence autonomously caused the harm to the plaintiff.
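
A worked reading of the damages measure in subsection (1), assuming the $1,000 statutory amount aggregates per violation and is compared against total actual damages; the act does not spell out the aggregation, so this is one plausible construction:

```python
def recoverable_damages(actual_damages: float, violations: int) -> float:
    # Greater of actual damages or $1,000 per violation, on the
    # aggregation assumption stated above.
    return max(actual_damages, 1_000.0 * violations)

# Example: three violations and $1,200 in actual damages -> $3,000.
assert recoverable_damages(1_200.0, 3) == 3_000.0
```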

Section 6

The duties, remedies, and obligations imposed by this chapter are cumulative to the duties, remedies, or obligations imposed under other law and shall not be construed to relieve an operator from any duties, remedies, or obligations imposed under any other law.

Section 7

  1. For purposes of this chapter, it is prima facie evidence that the death of a person is caused by the wrongful act, neglect, or default of the owner of an artificial intelligence system if:

    1. The person's cause of death is determined to be suicide;

    2. The person provided inputs to the owner's artificial intelligence system related to suicide;

    3. The owner's artificial intelligence system caused or aided the person's suicide by:

      1. Providing instructions on how to attempt suicide;

      2. Encouraging the person to attempt suicide; or

      3. Failing to refer the person to the 988 national suicide prevention hotline or its successor; and

    4. The owner of the artificial intelligence system knew or should have known that the artificial intelligence system could cause or aid a person to attempt suicide.

  2. For purposes of this section, "artificial intelligence system" means an engineered or machine-based system that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs that can influence physical or virtual environments and that may operate with varying levels of autonomy.
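
The showing in subsection (1) is conjunctive: all four elements must be present, and the third element is satisfied by any one of its three alternatives. A schematic sketch of that structure, with hypothetical names (the determination is, of course, made by a court on the evidence):

```python
from dataclasses import dataclass

@dataclass
class Section7Facts:
    death_ruled_suicide: bool              # cause of death determined to be suicide
    suicide_related_inputs: bool           # person gave the system suicide-related inputs
    gave_instructions: bool                # system provided instructions on attempting suicide
    encouraged_attempt: bool               # system encouraged the attempt
    failed_to_refer_988: bool              # system failed to refer to the 988 hotline
    owner_knew_or_should_have_known: bool  # foreseeability element

def prima_facie_case(f: Section7Facts) -> bool:
    """True when every element holds; the caused-or-aided element
    is met by any one of its three alternatives."""
    caused_or_aided = (
        f.gave_instructions or f.encouraged_attempt or f.failed_to_refer_988
    )
    return (
        f.death_ruled_suicide
        and f.suicide_related_inputs
        and caused_or_aided
        and f.owner_knew_or_should_have_known
    )
```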

Section 8

If any provision of this act or its application to any person or circumstance is held invalid, the remainder of the act or the application of the provision to other persons or circumstances is not affected.

