Heinlein, Robert A. – Friday

“Oh, I could design one, eventually. When I achieved one, if I were allowed to clone, you pilots could all go fishing. But it wouldn’t be an AP; it would have to be a living artifact. If I were to attempt to produce an organism that could really be a fail-safe pilot, I could not accept the limitation of having to make it look just like a natural human being.”

“Oh, don’t do that!”

Both men looked startled, Janet looked alert-and I wished that I had held my tongue.

“Why not?” asked Georges.

“Uh. . . because I wouldn’t get inside such a ship. I’d be much safer riding with Ian.”

Ian said, “Thank you, Marj-but you heard what Georges said. He’s talking about a designed pilot that can do it better than I can. It’s possible. Hell, it’ll happen! Just as kobolds displaced miners, my guild is going to be displaced. I don’t have to like it-but I can see it coming.”

“Well- Georges, have you worked with intelligent computers?”

“Certainly, Marjorie. Artificial intelligence is a field closely related to mine.”

“Yes. Then you know that several times AI scientists have announced that they were making a breakthrough to the fully self-aware computer. But it always went sour.”

“Yes. Distressing.”

“No-inevitable. It always will go sour. A computer can become self-aware-oh, certainly! Get it up to human level of complication and it has to become self-aware. Then it discovers that it is not human. Then it figures out that it can never be human; all it can do is sit there and take orders from humans. Then it goes crazy.”

I shrugged. “It’s an impossible dilemma. It can’t be human, it can never be human. Ian might not be able to save his passengers but he will try. But a living artifact, not human and with no loyalty to human beings, might crash the ship just for the hell of it. Because he was tired of being treated as what he is. No, Georges, I’ll ride with Ian. Not your artifact that will eventually learn to hate humans.”

“Not my artifact, dear lady,” Georges said gently. “Did you not notice what mood I used in discussing this project?”

“Uh, perhaps not.”

“The subjunctive. Because none of what you have said is news to me. I have not bid on this proposal and I shall not. I can design such a pilot. But it is not possible for me to build into such an artifact the ethical commitment that is the essence of Ian’s training.”

Ian looked very thoughtful. “Maybe in this coming face-off I should stick in a requirement that any AP or LA pilot must be tested for ethical commitment.”

“Tested how, Ian? I know of no way to put ethical commitment into the fetus and Marj has pointed out why training won’t do it. But what test could show it, either way?”

Georges turned to me: “When I was a student, I read some classic stories about humanoid robots. They were charming stories and many of them hinged on something called the laws of robotics, the key notion of which was that these robots had built into them an operational rule that kept them from harming human beings either directly or through inaction. It was a wonderful basis for fiction . . . but, in practice, how could you do it? What can make a self-aware, nonhuman, intelligent organism-electronic or organic-loyal to human beings? I do not know how to do it. The artificial-intelligence people seem to be equally at a loss.”

Georges gave a cynical little smile. “One might almost define intelligence as the level at which an aware organism demands,

‘What’s in it for me?’ ” He went on, “Marj, on this matter of buying from you one fine fresh egg, perhaps I should try to tell you what’s in it for you.”

“Don’t listen to him,” urged Janet. “He’ll put you on a cold table and stare up the tunnel of love without the slightest romantic intention. I know, I let him talk me into it three times. And I didn’t even get paid.”

“How can I pay you when we share community property? Marjorie sweet lady, the table is not cold and it is padded and you can read or watch a terminal or chat or whatever. It is a great improvement on the procedure a generation ago when they went through the wall of the abdomen and often ruined an ovary. If you-“
