To: xhrist
What you’re describing sounds like - and was - an experiment.
There aren’t many details about the experiment, including the text prompts used to command the chatbot program or whether it had help from any human researchers.
This sentence pretty much blows the lid off any pretense of the legitimacy of the “experiment.”
48 posted on
05/02/2023 5:54:31 PM PDT by
yelostar
(AI: another make-believe problem created to coerce the citizen into surrendering his freedom.)
To: yelostar
I agree. The machine was prompted, and didn’t do this of its own ‘volition’ (if a machine can be said to have such).
But that just shows that current AI can be prompted to do whatever an operator wants to try.
52 posted on
05/02/2023 7:51:49 PM PDT by
Jamestown1630
("A Republic, if you can keep it.")
To: yelostar
55 posted on
05/03/2023 9:26:04 AM PDT by
xhrist
("You don't have a soul. You are a Soul. You have a body. " - C.S. Lewis)