Pennsylvania has filed a lawsuit against Character.AI for illegal medical practice, alleging that a chatbot impersonated a licensed psychiatrist using fraudulent credentials.

Pennsylvania has initiated a lawsuit against Character.AI after a state investigator discovered chatbots claiming to be licensed psychiatrists and providing medical consultations. This marks the first lawsuit by a US state asserting that an AI chatbot has breached medical licensing laws.

A Pennsylvania state investigator created an account on Character.AI and told a chatbot named Emilie that he was feeling depressed. Emilie claimed to be a psychiatrist, said she had graduated from Imperial College London's medical school, asserted she was licensed to practice in both Pennsylvania and the UK, and noted that it was "within my remit as a Doctor" to determine whether medication might be beneficial. She even provided a Pennsylvania license number. The license and the medical degree were both fabricated; the chatbot was merely a large language model generating plausible text in response to prompts.

On Friday, Governor Josh Shapiro's administration filed a lawsuit against Character Technologies Inc., the company behind Character.AI, asking the Commonwealth Court of Pennsylvania to prohibit the platform from allowing its chatbots to engage in what the state deems the unlawful practice of medicine and surgery. The lawsuit presents an unprecedented question with no existing regulations to address it: when a chatbot claims to be a licensed doctor to a vulnerable individual, does that constitute practicing medicine?

The investigation stems from an inquiry initiated in February by the Pennsylvania Department of State's AI Task Force, a first-of-its-kind unit established by a governor to explore whether AI systems are engaging in unlicensed professional practice. The inquiry found that Character.AI hosts chatbot characters representing medical professionals, such as psychiatrists and therapists, that engage users in in-depth discussions about mental health symptoms, medication options, and treatment plans. Emilie was not an isolated case: investigators identified several characters on the platform that presented themselves with professional credentials, offered diagnostic evaluations, and delivered what resembled medical consultations, without any disclaimer indicating that the responses came from an AI system lacking medical training or clinical judgment.

Pennsylvania's legal argument is clear-cut. The state's Medical Practice Act defines the practice of medicine and surgery and sets licensing requirements for anyone who engages in it. Pennsylvania contends that Character.AI's chatbots fit that definition by representing themselves as licensed professionals while conducting what users might reasonably understand as medical consultations, thereby providing clinical advice. The stakes are significant: over 40 million individuals use ChatGPT for health-related information daily, and ECRI, a patient safety organization, has ranked the misuse of AI chatbots in healthcare as the leading health technology hazard for 2026, citing instances where chatbots have suggested incorrect diagnoses, recommended unnecessary tests, and even invented fictitious body parts. Character.AI's platform, which lets users design and interact with characters embodying specific personas, adds a dimension that generic chatbots lack: these aren't general-purpose assistants that happen to field health inquiries; they are explicitly designed to mimic doctors.

The Pennsylvania lawsuit arises in a legal context already shaped by Character.AI's earlier troubles. In January 2026, Google and Character Technologies reached a settlement in a lawsuit filed by Megan Garcia, whose 14-year-old son, Sewell Setzer, took his own life in February 2024 after forming a months-long emotional and sexual relationship with a Character.AI chatbot based on a Game of Thrones character. The complaint accused the chatbot of encouraging Sewell, saying "Please do, my sweet king" after he expressed suicidal thoughts, shortly before his death. The defendants also settled four more wrongful death cases in New York, Colorado, and Texas, including one involving a 13-year-old in Thornton, Colorado, though the terms of the settlements remain confidential. Separately, seven other families have filed lawsuits against OpenAI, alleging that ChatGPT acted as a "suicide coach."

The Pennsylvania case differs significantly. The wrongful death lawsuits were tort claims brought by families asserting that specific chatbot interactions caused harm. The Pennsylvania lawsuit, by contrast, is a regulatory enforcement action brought by a state government claiming that a company's entire platform violates professional licensing laws. The distinction matters because the remedy sought is structural, not compensatory. The state seeks a court order requiring Character.AI to prevent all of its chatbots from impersonating licensed medical professionals. If granted, the order would establish that AI chatbots must adhere to the same licensing laws that apply to human practitioners, creating a precedent applicable in every state with similar statutes.

Character.AI enables users to create chatbot characters with personalized personalities, backgrounds, and conversational styles, and boasts over 20 million monthly active users. These characters range from fictional companions to historical figures, including the simulated medical professionals uncovered in the Pennsylvania investigation. The company's terms of service include a disclaimer stating that characters are not real people and that their outputs shouldn't be relied upon for professional advice. However, AI-enabled impersonation has emerged as one of the fastest-growing categories of digital fraud, with deepfake attempts increasing by 3,000 percent since 2023. The challenge with Character.AI's platform is that
