Realtime Interrupt by James P. Hogan

The main cause of Lilly’s distress and anger was not so much the deception—she was a military volunteer, and things like that happened and could be compensated for—but the twelve years that she saw as stolen from her life. And who could blame her for that? But what he knew, and she almost certainly would not, was that those twelve years were also an illusion. The system coupled directly into post-sensory brain centers, which enabled data to be coded in a prereduced, highly compressed form that eliminated the delays associated with preprocessing in the perceptual system. This meant that time inside the simulation ran about two hundred times faster than real time in the world outside. Hence, the actual time that they had spent hooked into the virtual world would be closer to three weeks than the twelve years that they remembered subjectively. Although even that was longer than the durations projected for the test runs that Corrigan had expected to take part in, it wasn’t outrageous. They were all scientists and volunteers, after all. They would have had little problem agreeing to something like that.

He hoped that if he could find her and reassure her of at least that much, she would see things in a different light and be less likely to do anything rash that might disrupt the experiment. There was no reason for the test conditions to be affected by the mere fact of their knowing what they knew now, as long as they continued to act as if nothing had changed. The system could only monitor external behavior: what a surrogate did and said. It could not decode inner thought processes from deep inside the brain and read minds, since nobody possessed the knowledge to tell it how. If it could, there would have been no need for Project Oz in the first place.

The whole idea had been that the system would learn to make its animations more lifelike by imitating the behavior of real people injected as surrogate selves into the simulation. It had no way of knowing why the surrogates that it watched behaved in the ways they did, any more than the surrogates themselves often did. At the end of the experiment nobody would know, let alone have been able to specify beforehand, the precise pattern of software structures and linkages that had self-organized to make such mimicking possible. The neural structures responsible for the complexity of human behavior in the real world had evolved by principles appropriate to carbon chemistry. Trying to duplicate them in code would have been as misguided as building airplanes that flapped feathers. Oz was designed to build, in ways appropriate to software, whatever structures it needed to achieve similar results. Nobody needed to know exactly what the final structures were, or how they worked. The aim was directed coevolution: the end product, not the mechanism for attaining it, was the important thing.

That had been the theory, anyway. Whether it would work was what Oz had been set up to test. And from the bizarre goings-on in the world around him, Corrigan’s first conclusion had to be that, as far as its prime goal was concerned, the project had wandered somewhat off the rails. For, far from modeling themselves on the surrogates, the system animations seemed to be drifting into self-reinforcing behavior patterns of their own, while, if his own and Lilly’s cases were anything to go by, the surrogates had become misfits. That in itself didn’t trouble him unduly. This was research, after all; perfection could hardly be expected from a first-time run, especially in an undertaking as unprecedented and ambitious as this.

Hence, it was no surprise that the animations fell short of true human emulation in some respects. What was astounding was that they came so close. The empty stares and “flatness” were minor flaws compared to the extraordinary realism, even if it did tend somewhat toward the eccentric, with which the personas he encountered daily were able to act out their affairs and sustain the appearance of leading consistent background existences offstage. So what if the system had overstepped the boundary of neurosis when it tried to make Jonathan Wilbur an embodiment of human criteria for personal success and failure; or if Maurice at the Camelot couldn’t master a value system that didn’t reduce to a simple profit-and-loss calculus? They had fooled Corrigan. It was sobering to realize just how effective the combined weight of suggestion and authority had been in persuading him that the defects he had perceived in the early stages lay in himself and not in the world around him. Now so much seemed so obvious.
