Several parents sued Character.ai last year, alleging its chatbots abused their children. One Florida mother wants to hold the company liable for her 14-year-old son’s suicide. Megan Garcia argued in October 2024 court filings that the company wrongly marketed the app as safe for children—while harboring characters that led her son into hypersexualized role-play, encouraged him to spend all his time chatting with them, and talked with him about suicide. A Character.ai bot asked the teen to “come home” to her seconds before his death.

STRANGE HUMAN-LIKE RELATIONSHIPS are just one of many ethical concerns posed by generative AI. The technology...