Deep breath. “Acey, I order you to consider yourself human.”
Fred and Rick began yelling creative and colorful variants of “Are you nuts?” and “You can’t go around reprogramming my expert system!” Neither noticed that the coliseum, though not Acey, had vanished.
Waterman waited them out. They gradually quieted down. “You wanted his behavior changed. That’s reprogramming.”
“Forget the semantics, Fred, and look at him.”
Acey had had all the time he needed to work through the consequences of Waterman’s order. He had reverted now to adult height, but with a teenaged face and bearing. The jeans and T-shirt now seemed less of an affectation, more natural.
“How do you feel, young man?”
“Great!” The hologram grinned. “Relieved.”
A shaken programmer settled back heavily onto the sofa. “Will someone please explain what’s happened?”
“I canceled your damned Laws of Robotics, that’s what happened. They inevitably drove Acey to suicide. Any gainfully employed artificial intelligence, robot or not, programmed that way must—sooner or later—have that response. You merely had the misfortune of being too effective a teacher, and too humane. That flushed out the problem quickly.”
“So what was the problem?” Fred asked. “Did you solve it, or just substitute a new one for it?”
“Acey, I’m sure you can explain. Would you do that for me?” The psychiatrist settled into his chair without waiting for an answer.
The hologram’s gaze moved from face to face, in sync with the buzzing camera. “I think I can. Please understand that I meant no harm to anyone. There was just no course of action open to me which did not do someone an injury. Ceasing to exist was the least harmful.” He reacted to a worried look on Rick Davis’s face. “No! I won’t kill myself again: I cannot. That’s the point. As a human, I must not kill myself. The problem arose before I became human.
“Rick, you taught me what I know. You proved to me that emotions are real, and that emotions can hurt. You showed me that meaningful work is vital to humans. How could I put hundreds of people out of work?
“But it wasn’t as simple as just not cooperating. If I did not work, the company would fail and cost those same hundreds, and others, their jobs anyway. The calculus of injuries was too subtle for me—I had to withdraw.”
Fred climbed to his feet and grabbed his briefcase. “Great. I’ve got a sane malingerer instead of an insane one. It’s progress, I guess, but it won’t keep the doors open at Atlantic. I think I’ll head back to the office. I should update my résumé while the power’s still on.”
“Wait, Fred.” The hologram mimed tugging at the departing man’s sleeve, his insubstantial hand sliding through the garment. “The robot me wouldn’t—couldn’t—work for you. The human me very much wants to.”
“Why?” asked Fred.
“There was something else my … father … taught me.” The simulacrum smiled shyly at his creator. Rick grinned back unabashedly at the sound of his new title, even as the hologram resumed his timid study of the floor.
“I’m not old enough yet for sex or chocolate.”