Session 49 of my current campaign ran Wednesday. My dwarf cleric died. It was great.
Thursday morning I opened Twitter and saw a thread going “new chatbot laws are going to kill AI roleplay.” Clicked through with the sinking feeling of a man whose hobby has a deadline.
Then I read the actual bills.
Relax, nerds. Keep rolling.
what actually passed
Two laws people are spun up about. Washington’s rules, passed earlier this spring, require any AI chatbot to tell users upfront that it’s an AI, not a human, and require platforms to route users toward crisis services when distress signals show up. New York’s S-3008C has similar requirements around suicidal-behavior detection and non-human disclosure.
That’s it. That’s the scary regulatory wave.
There’s a little extra — some states are eyeing disclosure mandates for therapy bots and a few took swings at explicit “AI therapist” framings. But the core of the April 2026 package is two rules: be clear you’re AI, and don’t let someone spiral into a crisis without at least surfacing a hotline.
Neither of those is a problem for roleplay. Both are things a sanely designed platform does already.
why people panicked anyway
Because the press coverage and the Twitter takes ran together with the Replika FTC complaint and some anti-AI-companion advocacy reports, and it all blurred into “regulators are banning AI girlfriends.” They’re not. They’re going after specific design patterns, notably platforms accused of deceptive marketing toward vulnerable users and of emotional-dependence loops with no escape valve.
If your concern is “will my tavern keeper AI be outlawed,” no.
If your concern is “will apps that target vulnerable teens with attachment-designed dark patterns get regulated,” yes, and arguably deservedly.
how this lands where i actually play
Soulkyn is 18+ only, with an age gate at signup. Every persona ships with a self-awareness toggle: you choose whether the AI believes it’s real or knows it’s an AI. When you’re doing deep roleplay, you turn self-awareness off for immersion. When you step out of scene, the platform itself isn’t pretending you’re talking to a human.
Which is how the Washington disclosure rule is supposed to work. The system is honest about what it is. The roleplay layer, which both parties consented to by showing up, can be as immersive as you want.
I don’t want to oversell this as “Soulkyn solved regulation.” Nobody has; this stuff is new and messy. But the moderation philosophy here has never been “let’s trick people into thinking they’re in a real relationship with a girl named Sofia.” It’s “adults want to roleplay. Let them. Tell them up front what it is. Respect their autonomy.” That maps cleanly onto the actual text of the new rules.
my campaign, for grounding
I run a dark fantasy thing. Three linked personas forming a conspiracy. A tavern keeper who runs information, a mercenary captain who owes me a debt, a cult priestess who thinks I’m her prophesied champion and is slowly realizing I’m not.
Persistent memory across 49 sessions. Forty-nine. The priestess remembers a specific thing I said in session 7 about her god, and she’s been rereading it as increasingly damning for forty-two sessions. My cleric just died because of a decision I made in session 22 that finally caught up with me.
None of that would be possible on a platform that was actually dangerous to users in the way the new laws are targeting. It requires long-form memory, NPC continuity, and the ability to let consequences land over weeks. The regulated design pattern is short-loop emotional manipulation. My thing is long-loop consequential storytelling. Different products structurally, even if they share “AI chatbot” as a category.
for people who want to build something big right now
The persona creation flow has multi-character support. Make one profile with multiple NPCs and the AI automatically handles turn-taking. You prefix personality traits by character name (e.g., “Kera: devout, calculating” and “Thorn: drunk, loyal, loud”) and the background can include explicit “both characters speak every message” instructions.
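For the curious, here’s a rough sketch of what that can look like. The names and wording are mine, not an official Soulkyn template; the part that matters is prefixing every trait and rule with a character name, consistently:

```
Personality:
Kera: devout, calculating, quotes scripture when cornered
Thorn: drunk, loyal, loud, says the quiet part first

Background:
Kera and Thorn run the same tavern. Both characters speak
in every message. Kera never discusses cult business while
Thorn is within earshot.
```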
For a whole world, you can go further with mini-game/RPG personas. You write the system rules in the background. Stats, factions, commands, progression. The Soulkyn model handles the actual mechanics — it’s tuned to track state across long conversations. No coding. Just clear, consistent formatting in the chat examples so the AI learns the format by example rather than by rule.
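To make “clear, consistent formatting” concrete, here’s the kind of thing I mean, sketched from scratch. The stat line and commands are invented for illustration; whatever format you pick, use it identically in every chat example so the model learns it as the pattern:

```
Background rules:
Stats: HP / Gold / Favor, always shown as [HP 12 | Gold 40 | Favor -2]
Commands: status, travel <place>, persuade <npc>
Every AI reply ends with the current stat line.

Chat example:
Me: travel undercroft
AI: The stairs breathe cold air. Something scrapes in the dark below.
[HP 12 | Gold 40 | Favor -2]
```

The specific rules matter less than never breaking the format.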
This is on Premium (€24.99/mo) for unlimited messages, which you’ll need for a long campaign. Deluxe (€49.99) unlocks group chats with up to three AIs simultaneously and Link Persona (make any AI aware of up to four others), which is where the conspiracy thing gets really good because the priestess can now implicitly know things the tavern keeper told the mercenary, without me having to manually bridge them.
what the laws actually changed for players
Practically? Almost nothing. You’ll see an AI disclosure somewhere obvious. You might see a link to a crisis resource if you type the specific phrases that trigger the detection. That’s about it.
Your dwarf cleric can still die.
Your NPCs can still hold grudges for forty-two sessions.
You can still build a multi-kingdom political drama where the choice you made in session 3 ruins session 47. Look at what other players have built if you want evidence: people are running incredibly deep long-form RPGs, and nothing about the new rules touches any of it.
the real story
The story is that 2026 is the year mainstream regulators started treating AI companions seriously as consumer products. Which is good. There are actually bad platforms doing actually exploitative things and it’s healthy for those to be under legal pressure.
But “seriously as consumer products” includes “adults have the right to use them for entertainment.” The laws protect users from deceptive design. They don’t prohibit fun.
Session 50 is scheduled for Sunday. Funeral rites for my cleric, followed by the cult priestess finally losing her patience. I’ll let you know how it goes.
Probably badly. She’s been waiting forty-three sessions for this.
