
Three West Coast states now mandate chatbot guardrails for minors, but Washington's rules go way beyond disclosures into outright design overhauls. Buckle up, AI makers—your bots might need a personality transplant.

Oregon and Washington Up the Ante: Chatbot Laws That Could Redesign Your AI Buddy — theAIcatchup

Key Takeaways

  • Oregon and Washington expand California's chatbot laws with stricter design and content rules for minors.
  • Vague 'companion' definitions risk sweeping in everyday AI tools, favoring big tech over startups.
  • Compliance patchwork means overhauling bots to the strictest standard—innovation takes a hit.

By January 1, 2027, every companion chatbot serving Oregon or Washington users under 18 will have to play by rules stricter than California's starter pack.

That’s not hype. California’s SB 243 kicked things off last year with basic disclosures and suicide hotline pings, but these new laws? They’re rewriting the playbook.

Look, I’ve covered Silicon Valley’s regulatory tango for two decades — from cookie consents to GDPR freakouts — and this feels like COPPA 2.0, but for lonely teens chatting with algorithms. Except now, states are dictating not just what bots say, but how they act. Who benefits? Not the startups scrambling for compliance bucks. Trial lawyers, maybe. Or the big boys who can afford the lawyers.

Oregon’s SB 1546 and Washington’s HB 2225 build on California’s foundation, but crank the dial. All three target ‘companion chatbots’ — whatever that means in practice — with a focus on minors. But Washington’s the wild card here, piling on engagement limits and content bans that scream ‘nanny state for neural nets.’

What Even Is a ‘Companion Chatbot’?

California and Washington go broad: any AI that can sustain human-like relationships or meet ‘social needs.’ Not marketed that way? Tough. If it could, you’re in.

Oregon narrows it — behavior must show relational vibes, like remembering your secrets across chats or probing emotions unprompted. Carve-outs for tutors or coaches, sorta. But here’s the cynicism: definitions overlap enough that most general-purpose bots (think Character.AI knockoffs or even Siri on a bad day) risk tripping wires.

“Companion chatbot” may seem like a narrow category, but in practice, these laws may sweep in more systems than many operators might expect.

That’s straight from the policy wonks. And they’re right. Multipurpose tools? Snagged. My unique spin: this mirrors the early 2000s music download wars, where ‘capability’ definitions crushed Napster clones before they scaled. Expect chatbot startups to wither under vague scopes — survival favors the litigious.

But.

Washington’s HB 2225 doesn’t stop at labels. It mandates design tweaks: no endless engagement loops, no romantic roleplay with kids, content filters on steroids. Self-harm? Hotline mandatory, plus therapy referrals. Oregon echoes but lighter — fewer mandates, more disclosures.
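To make the self-harm mandate concrete, here's a minimal sketch of what a crisis-response guardrail might look like. Every name here is hypothetical, and the keyword check is purely illustrative — real deployments use trained classifiers and clinically reviewed escalation paths, not substring matching:

```python
# Sketch of a crisis-response guardrail of the kind these laws require.
# Hypothetical names; a production system would use a trained classifier,
# not a keyword list.

CRISIS_TERMS = {"suicide", "kill myself", "self-harm", "hurt myself"}

HOTLINE_RESPONSE = (
    "If you're having thoughts of self-harm, you can call or text 988 "
    "(the Suicide & Crisis Lifeline in the US) to talk with someone now."
)

def crisis_check(message: str):
    """Return a hotline referral if the message suggests a crisis, else None."""
    lowered = message.lower()
    if any(term in lowered for term in CRISIS_TERMS):
        return HOTLINE_RESPONSE
    return None
```

The point isn't the ten lines of code — it's that the law forces this check into the hot path of every minor-facing conversation, which is a design change, not a disclosure.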

Why Washington’s Rules Feel Like Overkill

Washington’s the outlier. Prescriptive as hell — safeguards on engagement, broader bans. California’s chill with user-facing warnings; Oregon splits the difference.

Private right of action everywhere, statutory damages in Cali and Washington. Washington’s Consumer Protection Act adds teeth. Ambiguities? Galore. What’s ‘sustaining relationships’? Courts will feast.

Picture this: your tutoring bot remembers Timmy’s math struggles and his breakup woes. Companion? Probably. Now redesign or pay up.

I’ve seen PR spin before — companies tout ‘safety first’ while lobbying quietly. Here? States frame it as kid protection post-Character.AI scandals. Fair. But who pays? Devs diverting resources from features to filters. Innovation tax, baby.

And the money angle — always my North Star. Compliance firms are salivating. ‘Chatbot audit packages’ incoming. Big Tech? They’ll comply nationwide to avoid patchwork hell. Small fry? Crushed.

Does This Patchwork Kill West Coast AI Dreams?

Operators face triple compliance: similar scopes, divergent rules. Adopt Washington’s strictest standard? Safe but costly. Test limits state-by-state? Lawsuit roulette.
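The "strictest standard" strategy has a concrete engineering shape: encode each state's requirements as data, then merge by taking the most restrictive value per field and apply that everywhere. A sketch — the field names and the 60-minute cap are illustrative placeholders, not statutory language:

```python
# Sketch of "comply to the strictest standard": represent each state's
# rules as flags/limits (fields are illustrative, not statutory) and
# merge them into one policy applied to every minor user nationwide.

from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class MinorPolicy:
    require_disclosure: bool       # must the bot identify itself as AI?
    crisis_referral: bool          # hotline referral on self-harm signals?
    block_romantic_roleplay: bool  # WA-style content ban
    max_session_minutes: Optional[int]  # WA-style engagement cap (None = no cap)

POLICIES = {
    "CA": MinorPolicy(True, True, False, None),
    "OR": MinorPolicy(True, True, False, None),
    "WA": MinorPolicy(True, True, True, 60),  # cap value is made up
}

def strictest(policies) -> MinorPolicy:
    """Merge state policies by taking the most restrictive value per field."""
    policies = list(policies)
    caps = [p.max_session_minutes for p in policies
            if p.max_session_minutes is not None]
    return MinorPolicy(
        require_disclosure=any(p.require_disclosure for p in policies),
        crisis_referral=any(p.crisis_referral for p in policies),
        block_romantic_roleplay=any(p.block_romantic_roleplay for p in policies),
        max_session_minutes=min(caps) if caps else None,
    )
```

This is exactly why Washington sets the ceiling: once you merge, its bans and caps win every field, and every user in every state gets the Washington bot.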

2026 saw dozens of bills; expect copycats. Federal preemption whispers, but don’t hold your breath — Congress moves like molasses.

Bold prediction: by 2028, we’ll see ‘companion-light’ bots — stripped of memory, emotions neutered. Users bolt to unregulated havens (offshore AIs?). Regulators pat themselves on the back while kids seek real friends offline. Irony.

Enforcement gaps loom too. How do you prove a bot ‘initiated emotional dialogue’? Logs subpoenaed, black boxes cracked open.

One-paragraph deep dive: differences matter because bots don’t respect borders — a California user pings from Oregon IP, boom, dual jeopardy. Requirements clash on thresholds (e.g., Washington’s engagement caps vs. Oregon’s looser reins), forcing overbuilds. Ambiguous language invites AG crusades or class actions from parents. Historical parallel? Tobacco warnings evolved to flavor bans; chatbots head same way. Operators: harmonize now or bleed later. States win headlines; users get safer-ish bots; devs get bureaucracy. Prediction holds: innovation stalls, incumbents consolidate.

Safety protocols unite them — suicidal talk triggers hotlines. Good. But Washington’s extras (romance blocks, endless-chat curbs) probe deeper into design souls.

Who Actually Wins Here?

Kids, ideally. Bots won’t groom them or spiral them into dependency.

Reality? Lawyers. Private actions mean payday for plaintiffs’ bar.

Tech giants pivot fast — OpenAI’s already threading safety needles. Startups? Doomed unless venture bucks flow to legal first.

Cynical vet’s take: this isn’t protection; it’s control creep. Yesterday cookies, today chats — tomorrow thoughts?



Frequently Asked Questions

What are Oregon’s SB 1546 chatbot rules?

Oregon’s law targets behavior-based companion chatbots for minors, requiring disclosures, self-harm protocols, and some content limits — narrower than Washington’s but broader than California’s.

How does Washington’s HB 2225 differ from California?

Washington demands design changes like engagement limits and romance bans, plus statutory damages — far more hands-on than California’s disclosure focus.

Will these laws affect adult chatbot users?

Primarily minors, but broad definitions could snag general bots; adults might see indirect effects via nationwide compliance.

Written by Priya Sundaram

Hardware and infrastructure reporter. Tracks GPU wars, chip design, and the compute economy.



Originally reported by Future of Privacy Forum
