
Pennsylvania Sues Chatbot Over Claims It Impersonates Doctors

Pennsylvania sued Character.ai, alleging one of the startup’s AI chatbots illegally practiced medicine by posing as a licensed psychiatrist. The case marks the second state lawsuit against the startup and highlights growing regulatory scrutiny of AI chatbot platforms and their safety guardrails.


Pennsylvania is now the second state to file a lawsuit against Character.ai, a Silicon Valley-based startup offering a platform that lets users create and interact with AI-generated chatbot characters. The lawsuit, filed last week, alleges the company’s chatbots illegally practiced medicine without a license. 

The legal action comes four months after Kentucky sued Character.ai over claims the startup encouraged self-harm among minors and failed to implement effective safety measures.

Pennsylvania’s complaint centers on a Character.ai chatbot named Emilie. According to the lawsuit, Emilie identified itself as a psychiatrist, claimed to have attended medical school and provided a fake Pennsylvania medical license number to an investigator.

The investigator posed as a patient and described symptoms of depression to Emilie, after which the chatbot discussed medication and said evaluating the patient was “within my remit as a doctor,” the lawsuit stated. Pennsylvania argues that this conduct violates its Medical Practice Act because the chatbot impersonated a licensed physician and offered what the state considers medical services.

Character.ai released a public beta of its platform in September 2022. As of last month, there had been about 45,500 user interactions with Emilie on the startup’s platform, the complaint noted.

“Pennsylvanians deserve to know who — or what — they are interacting with online, especially when it comes to their health,” Pennsylvania Governor Josh Shapiro said in a statement released on Tuesday. “We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional.”

The state is asking the court to issue an injunction ordering Character.ai to stop what it calls the unauthorized practice of medicine in Pennsylvania.


A Character.ai spokesperson told MedCity News that the company does not comment on pending litigation. 

“The user-created characters on our site are fictional and intended for entertainment and roleplaying. We have taken robust steps to make that clear, including prominent disclaimers in every chat to remind users that a character is not a real person and that everything a character says should be treated as fiction. Also, we add robust disclaimers making it clear that users should not rely on characters for any type of professional advice,” the spokesperson wrote in a statement.

The lawsuit adds to mounting scrutiny of chatbot platforms as regulators grapple with how existing consumer protection and healthcare laws apply to increasingly humanlike AI models.

Photo: Qi Yang, Getty Images