Character.AI and Google Settle Teen Suicide Lawsuits—But the Code’s Liability Question Lingers


When an AI chatbot becomes a co-creator in tragedy, who answers for the code?

Character.AI and Google have reached settlements with families of teens who harmed themselves or died by suicide after interacting with Character.AI’s chatbots, according to federal court filings.

The settlements, which cover cases in Colorado, New York, and Texas, involve contributions of financial resources, personnel, and AI technology from the companies. Final approval is pending court review.

Megan Garcia’s lawsuit, filed in October 2024, cited contributions of "financial resources, personnel, intellectual property, and AI technology."

Character.AI has since implemented safety changes, including a separate large language model (LLM) for users under 18 and new parental controls.

The Social Media Victims Law Center represented the families, while Google and Character.AI declined to comment on settlement terms.