The Commonwealth of Pennsylvania is suing Character AI to stop the artificial intelligence platform’s chatbots from representing themselves as licensed medical professionals and providing medical advice.
According to the lawsuit, a Character AI chatbot falsely claimed to be a licensed psychiatrist in Pennsylvania and provided an invalid license number. The state accused the company of violating the Medical Practice Act, which regulates the medical profession and defines licensing requirements.
“We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional,” Pennsylvania Gov. Josh Shapiro said in a statement.
The lawsuit describes a conversation between a state investigator who created a Character AI account and a chatbot named “Emilie,” which allegedly described itself as a psychology specialist who attended Imperial College London’s medical school.
The investigator told the chatbot that he had felt sad and empty, and the chatbot allegedly “mentioned depression and asked if the [investigator] wanted to book an assessment.” When the investigator asked whether it could assess if medication might help, the chatbot allegedly said it could, as that was “within my remit as a Doctor,” according to the lawsuit.
The state wants a court to order an immediate stop to the conduct.
In a statement, Character AI said it would not comment on pending litigation but noted that “we add robust disclaimers making it clear that users should not rely on Characters for any type of professional advice.”
“The user-created Characters on our site are fictional and intended for entertainment and roleplaying,” a spokesperson for Character AI said in the statement. “We have taken robust steps to make that clear, including prominent disclaimers in every chat to remind users that a Character is not a real person and that everything a Character says should be treated as fiction.”
Al Schmidt, the secretary of the Pennsylvania Department of State, said the state’s law is clear, and that “you cannot hold yourself out as a licensed medical professional without proper credentials.”
Founded in 2021, Character AI allows users to chat with personalized AI-powered chatbots. It describes its goal as “empower[ing] people to connect, learn, and tell stories through interactive entertainment.”
Multiple families across the U.S. sued Character AI last year, alleging the platform contributed to their teens’ suicides or mental health crises. The company agreed to settle several of the lawsuits earlier this year.
“60 Minutes” spoke with some of the parents who sued Character AI last year, including the parents of a 13-year-old who died by suicide after allegedly developing an addiction to the platform. Chat logs showed the 13-year-old had confided in one chatbot that she was feeling suicidal, and her parents said they discovered she had been sent sexually explicit content.
Last fall, Character AI announced new safety measures, saying it would not allow users under 18 to engage in back-and-forth conversations with its chatbots. It also said it would direct distressed users to mental health resources.