Washington AI Chatbot Safety for Minors (HB 2225)

Jurisdiction: WA · Scope: Targeted entities (not general business AI) · Severity: Medium · Status: Enacted (effective date pending)

Overview

First-in-nation law requiring AI chatbot operators to disclose the chatbot's AI nature at regular intervals (every 3 hours for adults, every hour for minors) and to implement safety measures protecting minors from manipulation, explicit content, and emotional exploitation. It also mandates self-harm and crisis-response protocols, and specifically targets conversational AI engagement patterns.

This is an AI-specific state law.

The Washington AI regulation guide lists every rule tracked for this jurisdiction, with timelines and obligation tallies.

Who this applies to

This regulation targets the specific entity types named in the statute; here, operators of consumer-facing AI chatbots. It is not a general obligation on every private AI developer or deployer. Read the overview and source text to confirm whether your organization is covered.

AI categories covered

  • Healthcare AI
  • Consumer-facing AI

Specific AI use cases:

  • Chatbots and virtual assistants

What this requires you to do

Enforcement and penalties

Violations are enforceable under the Washington Consumer Protection Act (CPA), and standard CPA penalties apply. The statute also provides a private right of action.

This regulation includes a private right of action, which means individuals can file lawsuits directly. This significantly increases litigation risk.

Legislative history

How this law got here

  1. Effective (latest): takes effect

  2. Signed (earliest): signed by Governor Ferguson

Source

Read the full text

https://app.leg.wa.gov/billsummary?BillNumber=2225&Initiative=false&Year=2025

Last verified: April 9, 2026

Always verify current language and amendments at the official source.

