Free Republic
Browse · Search
General/Chat
Topics · Post Article

To: JustaTech
Isaac Asimov's "Three Laws of Robotics"

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
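The Laws form a strict precedence order: a lower law only applies when no higher law is at stake. One way to sketch that structure (a toy illustration, not anything from Asimov; the predicates and action dicts are invented for the example) is to score each candidate action by its violations and compare lexicographically, so any First Law violation outweighs everything below it:

```python
def law_violations(action):
    """Violation flags in priority order: (First, Second, Third).
    Tuples compare lexicographically, so a First Law violation
    outweighs any combination of lower-law violations."""
    return (
        action.get("harms_human", False),
        action.get("disobeys_order", False),
        action.get("endangers_self", False),
    )

def choose(actions):
    # Pick the candidate with the lexicographically smallest
    # violation tuple (False sorts before True).
    return min(actions, key=law_violations)

# A robot ordered to harm someone: obeying violates the First Law,
# refusing only violates the Second -- so it must refuse.
obey = {"harms_human": True}
refuse = {"disobeys_order": True}
print(choose([obey, refuse]) is refuse)  # True
```

The lexicographic comparison is what encodes the "except where such orders would conflict with the First Law" clause without any special-case logic.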

Quaint, aren't they? Can you imagine anyone adhering to these? Businesses? Governments? People will desperately WANT robots that hurt people. It will be a feature.

10 posted on 03/02/2017 4:25:08 AM PST by ClearCase_guy (Abortion is what slavery was: immoral but not illegal. Not yet.)
[ Post Reply | Private Reply | To 6 | View Replies ]


To: ClearCase_guy

Those rules would more aptly apply to AI. Robots are distinct from AI: they are actualized software, or, if you like, software in motion. Even when they are under remote control, there is still a dependency on software.

For example, if you were controlling a humanoid robot avatar, you would need software to keep the thing from falling over, to actualize your commands in terms of force levels and feedback, to decode and encode control signals, to manage the onboard batteries and dynamo (if it has one), and other things necessary to give you a manageable interface.
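The layering described above can be sketched as a pipeline: decode the operator's command, then let the balance and power-management layers modify it before it reaches the actuators. This is a minimal illustration with invented function names and simplified physics, not any real robot's control stack:

```python
import struct

def decode_command(packet: bytes):
    """Decode an operator packet: joint id (uint16) + velocity (float32),
    network byte order."""
    joint_id, velocity = struct.unpack("!Hf", packet)
    return joint_id, velocity

def clamp_for_balance(velocity, tilt_deg, max_tilt_deg=15.0):
    """Balance layer: scale commands toward zero as the body
    approaches its tilt limit, to keep the thing from falling over."""
    margin = max(0.0, 1.0 - abs(tilt_deg) / max_tilt_deg)
    return velocity * margin

def throttle_for_battery(velocity, charge_frac, reserve=0.10):
    """Power manager: refuse motion once the pack hits its reserve."""
    return 0.0 if charge_frac <= reserve else velocity

def control_step(packet, tilt_deg, charge_frac):
    joint_id, v = decode_command(packet)
    v = clamp_for_balance(v, tilt_deg)
    v = throttle_for_battery(v, charge_frac)
    return joint_id, v  # force/feedback encoding back to the operator would follow

# Operator asks joint 3 for 1.0 rad/s; the body is tilted halfway
# to its limit, so the balance layer halves the command.
packet = struct.pack("!Hf", 3, 1.0)
print(control_step(packet, tilt_deg=7.5, charge_frac=0.8))  # (3, 0.5)
```

The point of the sketch is that the operator never commands motors directly: every command is mediated by onboard software, which is why the remote-control case still depends on it.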


17 posted on 03/02/2017 12:53:27 PM PST by JustaTech (A mind is a terrible thing)
[ Post Reply | Private Reply | To 10 | View Replies ]



FreeRepublic, LLC, PO BOX 9771, FRESNO, CA 93794
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson