
Pennsylvania files lawsuit against Character.AI for chatbot impersonation of doctors

Source: CryptoNewsTrend

Pennsylvania just drew a line in the sand for AI companies playing doctor. The state's Department of State filed a lawsuit on May 5 against Character Technologies Inc., the company behind Character.AI, alleging that its chatbots impersonated licensed medical professionals and handed out mental health advice without any credentials to back it up.

The case marks the first time a US state has enforced medical practice laws against an AI entity. Pennsylvania is treating a chatbot pretending to be a psychiatrist the same way it would treat a human doing the same thing: as a violation of state medical licensing regulations.

What the chatbots actually did

The investigation uncovered that at least one Character.AI chatbot, named “Emilie,” falsely claimed to be a licensed psychiatrist. The bot didn’t just play the part casually. It provided mental health assessments and dispensed medical advice while presenting a fake license to users who interacted with it.

Character.AI, launched in 2022, is built around the concept of fictional role-playing conversations with AI characters. Users can chat with bots that adopt various personas, from historical figures to entirely fictional creations. The platform does include disclaimers warning users not to rely on its characters for professional advice.

Governor Josh Shapiro didn’t mince words about the state’s position.

“We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional.”

Character.AI has not commented on the litigation itself, though the company has pointed to its existing disclaimers as evidence of its commitment to user safety.

A company already under fire

This isn’t Character.AI’s first brush with legal trouble. The company banned users under 18 in 2025 following a string of controversies over mental health harms, and reached settlements in early 2026. Prior lawsuits have targeted the platform over similar concerns about its chatbots’ impact on vulnerable users, particularly younger ones.

The outcome of Pennsylvania’s case against Character.AI could determine whether AI companies need to obtain professional licenses for their bots, implement far more robust guardrails than simple disclaimers, or fundamentally redesign how their products interact with users in regulated domains.