Why would robots want to destroy all humans and then take over the world? It seems like it would be more trouble than it's worth. Could the robotic drive to take over the world be the result of faulty programming? Could humans avoid a robot takeover by giving each robot flawless programming, so that it could never develop the desire to take over the world in the first place?