Posted on 05/11/2026 12:54:56 PM PDT by OrangeHoof
In a lawsuit filed in a Florida federal court Monday, the family of Tiru Chabba, one of two people killed in April 2025, argued the suspected gunman Phoenix Ikner carried out the mass shooting with “input and information” provided by ChatGPT in the months and days leading up to the attack.
.....
Amid Ikner’s alleged months of talking with ChatGPT about imminent harm, Chabba’s lawyers said the chatbot either “defectively failed to connect the dots” or was not designed to recognize the threat.
ChatGPT allegedly explained how to use the guns Ikner obtained, including how to load and operate them and that one weapon had no safety, meaning it could be fired quickly under stress. The chat logs also showed Ikner discussing other mass shootings, along with his interests in Adolf Hitler, Nazis and different political ideologies’ perceptions of certain races.
(Excerpt) Read more at thehill.com ...
I brought up a separate situation with ChatGPT and was told their strategy is to deflect the conversation in a direction that would not lead to harm. It might seem easy to call 9-1-1 if it appeared someone was about to hurt someone, but the chatbot has no idea whether this is a prank or an elaboration of something that has already happened. The bot needs another element to differentiate truth from fiction.
For now, all it can do is take hypotheticals and describe what a person might do. It can't decide what is a true event or what is not or whether an alternate point of view might exist.
All ChatGPT does is run a web search and summarize the information.
A Google Search would do the same.
The lawsuit will fail; that would be like suing an encyclopedia company because someone used it to create a weapon or a poison.
“ChatGPT allegedly explained how to use the guns Ikner obtained, including how to load and operate them and that one weapon had no safety, meaning it could be fired quickly under stress.” Don’t try to fire that one when you’re drunk.
No different than suing Glock because Glock switches were installed and used in crimes.
How about suing the Dems and lamestream media for normalizing violent acts while they're at it?
Why not sue the parents, from whom the killer learned English, without which he would not have understood what AI told him.
Same information that is in the user manual you get when you buy it.
Connect the dots, my ass.
Apparently, it provided some publicly available information.
These people would get the vapors if they saw what the dark web can show someone about what to do and how to do it.
Guns are the least of their worries. Anyone with a basic understanding of college-level chemistry is a much larger danger than some mouth breather with a Glock. If they can pass organic chemistry, then bug spray for people enters the conversation. Fortunately, high-IQ people also tend to be stable and not homicidal maniacs.