“It wasn’t chasing him.”
“An apparently driverless vehicle approaches him at high speed. Why shouldn’t he consider himself at risk?”
The machine intelligence puzzled over that. “But the cart would have missed him. You humans intuitively compute ballistic trajectories quickly enough to play baseball; surely he can see that the cart will pass safely behind him.”
“Unless it sped up, or swerved towards him.”
“But I wouldn’t have done those things. Either maneuver would have brought the cart into his safety zone.”
Rick pointed at the little figure. “Maybe he doesn’t know about safety zones. Maybe his mind is on something else completely, like that beer you have him whistling about. Maybe he’s already had a few of them. You should have slowed down the cart as soon as he appeared, then waited for him to cross the aisle.”
“Slowing down disagrees with my rule base. Within the specified safety limits, I am to maximize productivity. The efficiency rule requires that the cart operate at full speed unless a safety infraction would otherwise result.”
“Then you need additional rules. If you’re ever to produce a system to run a factory, it must not terrify the staff.” The programmer picked up a pencil and began fidgeting with it. “I used to read these Asimov robot stories when I was a kid. They were all based on the Three Laws of Robotics. I don’t know why I never thought before of building them into you.”
The Laws came to his mind in a moment—they were repeated ad nauseam throughout the series. “One: A robot may not injure a human being or, through inaction, allow a human being to come to harm. Two: A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law. Three: A robot must protect its own existence, as long as such protection does not conflict with the First or Second Law. Consider yourself a robot and add those to your rule base.”
The artificial intelligence was programmed to follow the direct orders of its creator. It dutifully inserted the new behaviors into its world view, then reexamined the man-and-cart scenario. “Rick, I still do not understand your concern.”
“Definition: a person’s expectation or fear of injury is itself harmful.”
“So I must do nothing which can cause a person to fear for his safety, however irrationally.”
The programmer nodded.
“I see. Second replay.” The display panned out to show the whole factory, then zoomed back into the warehouse. Once more the whistling janitor (now sporting a walrus mustache, a jaunty plaid cap, and a pack of cigarettes rolled into his sleeve) sauntered obliviously from between two brimming racks into the path of the oncoming cart. He looked up as the nearest public-address speaker awakened in a crackle of static. “Danger, Will Robinson. Please vacate the aisle so that the cart may proceed. We thank you for your support.”
Acey watched Rick through a visiphone camera while the programmer observed the display. His mentor’s face was crinkled in the manner which denoted amusement. “Did I provide adequate warning?”
“Verging on too much—you don’t want to intimidate him either. We’ll have to work a bit on the fine points. And Will Robinson, indeed. I’ll have to educate you more fully in the classics.”
The words were disapproving, but not the tone. Acey recalled a line from another old show. “You and what army?”
The smile on the man’s face grew wider.
* * *
The lapels were an inch too narrow and the tie at least two inches too wide, but at least Rick Davis had donned a suit for the occasion. Fine with Waterman—one wild programmer was too many.
The Atlantic Software programmer picked uneasily at his salad. “Sorry, Doc. I find it hard to talk about Acey.”
“Kevin,” he corrected for the third time. “Why do you find it hard?”
His lunch companion continued studying his food. “What about those Cubs?”
“We have to talk about Acey,” said Waterman gently.
“No free lunch, huh?” He shrugged. “I suppose not.” He stared a while longer into his bowl, but the greens provided little inspiration. “I’m still grieving, you know.”