Old 09-21-2012, 06:56 PM   #161
taustin
Wizard
taustin ought to be getting tired of karma fortunes by now.
 
Posts: 1,358
Karma: 5766642
Join Date: Aug 2010
Device: Nook
Quote:
Originally Posted by QuantumIguana
Asimov's three laws are well beyond our ability to implement. To obey the first or third laws requires that the robot be able to recognize when harm is taking place or is about to take place. It then requires the robot to recognize what steps to take to correct or prevent the harm.
And that doesn't even begin to address the issue of who gets to define harm in the first place. And, to put a Star Trek "Kirk hates computers" spin on it, how does a robot deal with a masochist? "Not harming me is harmful to me."

Quote:
Originally Posted by QuantumIguana
The second law: getting the robot to understand what you want when you give it an order is tricky enough as it is.
Lesson #1 in any programming class is the difference between the computer doing what you told it to do and doing what you want it to do.
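
A toy Python sketch of that gap (my own hypothetical example, nothing from Asimov): we tell the machine to stop when a running total equals exactly 1.0, and it obeys the letter of the instruction rather than the intent.

    # Told: "stop when the total equals 1.0."
    # Meant: "stop after ten steps of 0.1."
    total = 0.0
    steps = 0
    while total != 1.0:   # 0.1 has no exact binary representation,
        total += 0.1      # so the total never lands exactly on 1.0
        steps += 1
        if steps > 20:    # guard so the obedient loop can't run forever
            break
    print(steps, total)   # 21 steps, total ~2.1: what we said, not what we meant

The second law has the same shape: the robot executes the order as given, and the gap between the order and the intent is where all the trouble lives.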