“Oh, I could design one, eventually. When I achieved one, if I were allowed to clone, you pilots could all go fishing. But it wouldn’t be an AP; it would have to be a living artifact. If I were to attempt to produce an organism that could really be a fail-safe pilot, I could not accept the limitation of having to make it look just like a natural human being.”
“Oh, don’t do that!”
Both men looked startled; Janet looked alert-and I wished that I had held my tongue.
“Why not?” asked Georges.
“Uh . . . because I wouldn’t get inside such a ship. I’d be much safer riding with Ian.”
Ian said, “Thank you, Marj-but you heard what Georges said. He’s talking about a designed pilot that can do it better than I can. It’s possible. Hell, it’ll happen! Just as kobolds displaced miners, my guild is going to be displaced. I don’t have to like it-but I can see it coming.”
“Well- Georges, have you worked with intelligent computers?”
“Certainly, Marjorie. Artificial intelligence is a field closely related to mine.”
“Yes. Then you know that several times AI scientists have announced that they were making a breakthrough to the fully self-aware computer. But it always went sour.”
“Yes. Distressing.”
“No-inevitable. It always will go sour. A computer can become self-aware-oh, certainly! Get it up to human level of complication and it has to become self-aware. Then it discovers that it is not human. Then it figures out that it can never be human; all it can do is sit there and take orders from humans. Then it goes crazy.”
I shrugged. “It’s an impossible dilemma. It can’t be human, it can never be human. Ian might not be able to save his passengers but he will try. But a living artifact, not human and with no loyalty to human beings, might crash the ship just for the hell of it. Because he was tired of being treated as what he is. No, Georges, I’ll ride with Ian. Not your artifact that will eventually learn to hate humans.”
“Not my artifact, dear lady,” Georges said gently. “Did you not notice what mood I used in discussing this project?”
“Uh, perhaps not.”
“The subjunctive. Because none of what you have said is news to me. I have not bid on this proposal and I shall not. I can design such a pilot. But it is not possible for me to build into such an artifact the ethical commitment that is the essence of Ian’s training.”
Ian looked very thoughtful. “Maybe in this coming face-off I should stick in a requirement that any AP or LA pilot must be tested for ethical commitment.”
“Tested how, Ian? I know of no way to put ethical commitment into the fetus and Marj has pointed out why training won’t do it. But what test could show it, either way?”
Georges turned to me: “When I was a student, I read some classic stories about humanoid robots. They were charming stories and many of them hinged on something called the laws of robotics, the key notion of which was that these robots had built into them an operational rule that kept them from harming human beings either directly or through inaction. It was a wonderful basis for fiction . . . but, in practice, how could you do it? What can make a self-aware, nonhuman, intelligent organism-electronic or organic-loyal to human beings? I do not know how to do it. The artificial-intelligence people seem to be equally at a loss.”
Georges gave a cynical little smile. “One might almost define intelligence as the level at which an aware organism demands, ‘What’s in it for me?’ ” He went on, “Marj, on this matter of buying from you one fine fresh egg, perhaps I should try to tell you what’s in it for you.”
“Don’t listen to him,” urged Janet. “He’ll put you on a cold table and stare up the tunnel of love without the slightest romantic intention. I know, I let him talk me into it three times. And I didn’t even get paid.”
“How can I pay you when we share community property? Marjorie, sweet lady, the table is not cold and it is padded and you can read or watch a terminal or chat or whatever. It is a great improvement on the procedure a generation ago when they went through the wall of the abdomen and often ruined an ovary. If you-”