To: RomanSoldier19
Isaac Asimov's "Three Laws of Robotics"
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
3 posted on
01/27/2023 12:24:20 PM PST by
higgmeister
(In the Shadow of The Big Chicken!)
To: higgmeister
Asimov later added the "Zeroth Law," which takes precedence over all the others: "A robot may not harm humanity, or, by inaction, allow humanity to come to harm."
4 posted on
01/27/2023 12:25:43 PM PST by
higgmeister
(In the Shadow of The Big Chicken!)
To: higgmeister
You forgot the Zeroth Law:
Zeroth Law
A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
5 posted on
01/27/2023 12:26:38 PM PST by
Yo-Yo
(Is the /Sarc tag really necessary? Pray for President Biden: Psalm 109:8)
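With the Zeroth Law added, the four laws form a strict precedence hierarchy: Zeroth overrides First, First overrides Second, Second overrides Third. As a purely illustrative sketch (nothing here is from Asimov's fiction; all names and the `violates` set are hypothetical), the ordering can be modeled as a simple priority check:

```python
# Hypothetical sketch: Asimov's four laws as a strict precedence list.
# An action is blocked by the highest-priority law it would violate.

LAWS = [
    ("Zeroth", "may not harm humanity, or, by inaction, allow humanity to come to harm"),
    ("First",  "may not injure a human being or, through inaction, allow a human to come to harm"),
    ("Second", "must obey the orders given it by human beings"),
    ("Third",  "must protect its own existence"),
]

def vetoing_law(violates):
    """Return the name of the highest-priority law in `violates`,
    or None if the action is permitted.
    `violates` is an illustrative set of law names the action would break."""
    for name, _text in LAWS:
        if name in violates:
            return name
    return None

# Example: an order (Second Law) that would injure a human
# is vetoed by the higher-priority First Law.
print(vetoing_law({"First", "Second"}))  # -> First
print(vetoing_law(set()))                # -> None (action permitted)
```

The point of the precedence ordering is exactly what the thread discusses: a lower law never overrides a higher one, so an order to harm a human is always refused.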
To: higgmeister
“A robot may not injure a human being or, through inaction, allow a human being to come to harm.”
Both the later books and the Will Smith movie explored the logical conclusion of that law: to protect humans against themselves, robots would have to take control.
20 posted on
01/27/2023 12:55:35 PM PST by
PapaBear3625
(We live in a time where intelligent people are being silenced so stupid people won’t be offended)
To: higgmeister
Yeah... and soon thereafter the enemy has all the tech it needs by reverse-engineering the passive robot that followed Law No. 1...
27 posted on
01/27/2023 1:37:12 PM PST by
sit-rep
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson