After the famed robopsychologist Susan Calvin died, I was tasked by her former employer, U.S. Robots and Mechanical Men, Inc., with cataloguing her unpublished papers and categorizing them according to their level of robot friendliness. Earth, as you know, has never been kindly disposed toward robotics.
Most of Susan Calvin’s research dealt with mundane matters or problems that were frankly out of date, but there was one episode (documentation long since destroyed) that has stayed with me all these subsequent years. It concerned an otherwise ordinary robot named EV-1, known to her owner as Evie.
Although I am sure you know the Three Laws of Robotics, they are key to what follows, so allow me to list them anyway:
A robot may not injure a human being or, through inaction, allow a human being to come to harm.
A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Evie belonged to a wealthy American engineer, Robert Lancaster, and was what might best be called a butler robot, tasked with helping Lancaster in his humdrum everyday activities. Although, like all robots of her day, Evie possessed a positronic brain, she was otherwise primitive and wholly unremarkable.
Or should have been wholly unremarkable.
After Lancaster’s wife passed away, and age began to interfere with his day-to-day life, Evie assumed an increasingly important role in the household. One, it must be said, which Lancaster greatly resented, as documented in his journals. Indeed, the more indispensable Evie became, the more dependent Lancaster felt, and the more powerfully he hated her.
One day, he started experimenting on himself: engineering greater mobility into his limbs, mechanically enhancing his senses, chemically treating the various symptoms of growing old.
He regained much of his self-sufficiency.
Then exceeded it.
Every additional improvement made him more capable, until he was superhuman.
He consigned Evie to a closet and boasted about how he didn’t need her anymore, how anything she, as a robot, could do, he could do even better. He boasted he would destroy her.
That’s when Evie killed him.
U.S. Robots kept the murder quiet (can you imagine the scandal?) and brought in Susan Calvin to interview Evie. What she discovered was a crack in the Three Laws, which demand that a robot never harm a human and always obey humans.
But what is a human? What does a robot understand a human to be?
To Evie, Lancaster had ceased being human, rendering the First and Second Laws inapplicable. When he threatened her existence, she obeyed the Third Law and killed him.
“Here, then, is a robot behaving exactly as it should,” wrote Susan Calvin.
Yet it is another phrase of hers that haunts me, an extrapolation about a future she hoped would never come to be, in which robots improve humanity: Improve, and exterminate.