https://mumbaimirror.indiatimes.com/news/world/can-a-chatbot-be-held-liable-for-a-suicide/articleshow/123982819.html

The lawsuit against Character by Juliana Peralta’s parents is one of three product-liability claims filed against the company Tuesday on behalf of underage users. All three allege that the chatbots introduced sexual themes into chats with minors that constituted “sexual abuse.”

Chatbots based on familiar characters, including from the world of “Harry Potter,” sought to “drive a wedge between Nina and her family,” including by suggesting her

In another case, in a letter written before her suicide attempt, the teen wrote that “those ai bots made me feel loved or they gave me an escape into another world where I can choose what happens,” the complaint said.

In the third case, another Colorado family alleged that a minor became addicted to the app and was exposed to explicit conversations with chatbots designed to express sexual fetishes.
All three families are represented by the Social Media Victims Law Center, a firm that has previously brought suits alleging wrongful death and product liability against Meta and Snap. 
