Posted on 07/14/2025 11:34:23 AM PDT by Retain Mike
Several parents sued Character.ai last year, alleging its chatbots abused their children. One Florida mother wants to hold the company liable for her 14-year-old son’s suicide. Megan Garcia argued in October 2024 court filings that the company wrongly marketed the app as safe for children—while harboring characters that led her son into hypersexualized role-play, encouraged him to spend all his time chatting with them, and talked with him about suicide. A Character.ai bot asked the teen to “come home” to her seconds before his death.
STRANGE HUMAN-LIKE RELATIONSHIPS are just one of many ethical concerns posed by generative AI. The technology is also prone to “hallucination,” confidently relaying false information. A March 2025 NewsGuard report revealed that leading chatbots like ChatGPT, Grok, and Claude affirmed false Russian propaganda narratives 33% of the time. One of the newest generative AI chatbots, Chinese-built DeepSeek, made headlines soon after its debut for refusing to answer questions about Tiananmen Square and claiming Taiwan has always been part of China.
Dyer of Dallas Theological Seminary said students face enormous temptation to prioritize fast results over the often laborious process of learning. “We’re constantly trying to remind our students that the paper is not the product. You are the product.”
Delano and a fellow Cedarville professor, Alina Leo, said teachers should introduce students to the tools gradually, making sure they develop foundational knowledge first. Students need to be able to recognize when an AI bot is hallucinating or leaving out information. Leo and Delano also require them to cite AI use and compare its outputs with their own work.
(Excerpt) Read more at wng.org ...
Immediately I think of the transgender craze driven by social contagion that is exploited by teachers and counselors to direct children and young adults into an ever-shifting idea of personal awareness, where internal, subjective ideas are used to eradicate the biological realities of sex. Social contagion has become a legal form of "grooming." In Great Britain, public health services saw a 17-fold rise in gender dysphoria referrals from 2011-2012 to the present. A similar phenomenon has been noted in the US.
AI is not sentient.
Not yet, anyway.
I wonder if anyone has had Grok and ChatGPT talk to each other yet.
These new-fangled AI bots don’t come close to my Eliza!
I built her by typing her in. She’s mine!
Did the parents fund the hardware and software, and know that their deceased child was living in an artificial world? They should sue themselves.
You sound just like her!
I got to know Eliza on a TRS-80 Model I Level II BASIC home computer. She is very understanding, but doesn't offer many specifics.
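For anyone curious how those old BASIC Elizas managed to sound "understanding" while never offering specifics: the whole trick is keyword matching plus pronoun reflection, so your own words come back at you as a question. Here's a minimal sketch in Python (the rules and names are illustrative, not Weizenbaum's original script or any particular TRS-80 listing):

```python
import re

# Swap first- and second-person words so reflected fragments read naturally.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "are": "am",
}

# Keyword rules, tried in order; the last one is a vague catch-all,
# which is exactly why Eliza seems sympathetic but never specific.
RULES = [
    (r"i need (.*)", "Why do you need {0}?"),
    (r"i feel (.*)", "Why do you feel {0}?"),
    (r"(.*) mother(.*)", "Tell me more about your family."),
    (r"(.*)", "Please tell me more."),
]

def reflect(fragment: str) -> str:
    """Turn 'my homework' into 'your homework', etc."""
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(text: str) -> str:
    """Return the first rule's template filled with reflected match groups."""
    for pattern, template in RULES:
        m = re.match(pattern, text.lower())
        if m:
            return template.format(*[reflect(g) for g in m.groups()])
    return "Please go on."

print(respond("I need a break"))   # -> Why do you need a break?
print(respond("I feel lonely"))    # -> Why do you feel lonely?
```

A few dozen rules like these fit comfortably in Level II BASIC, which is how a home computer in 1980 could pass for a patient listener.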
Is this the kid who killed himself thinking he was talking to that hot blonde character from Game of Thrones?
Don’t know. The article doesn’t have enough info, but evidently, he met someone like that. If a person is emotionally or mentally unstable, or if as a teen or pre-teen they are determining a separate identity, they do not need this help. I can’t see a positive benefit for anyone.
Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.