Glastech Perscom Unit 9-10-2

Nine-ten-two's world is one where human life is quite precious. Development started three centuries ago with robots designed to operate independently in dangerous environments, such as mining and disaster sites. Advances in robotics spread robots into more common settings like construction, which inevitably led to combat robots. Thus, civilian robots are as prolific as cars and appreciated as powerful tools, and wars are declared by humans but acted out by robots.

A lot of effort was put into the development of the kernel AIs, with a special eye towards the sheer creativity some humans would employ to get around any kind of safeguard. What was created was a robust AI, referred to as AyGISS (Artificially Generated Intelligence Safeguard System), which would serve and protect but did not take kindly at all to anyone trying to get into its head. Thus, robots 'instinctively' do not want to harm humans but will under extreme circumstances, in a far fuzzier way than three laws could ever cover. Due to company rivalries and industrial secrecy, many kernels were initially developed. The three most popular, blue-3, Golem, and Chuugi, survived the resulting commercial free-for-all and stabilized to the point where they have not changed in the past 125 years. All of the robots produced in Nine-ten-two's generation run on one of these three.

Even the first generation of robots were complex machines; successive generations only became more so. Thus, new models are released only every 15 or 20 years. However, within a model's production lifetime, manufacturing methods may be tweaked in response to reported flaws or to take advantage of new techniques (especially those that save the company money). Even within a single manufacturing run there are variations within tolerance, so that robots mass-produced to be identical are in fact slightly different. As robots enter the workforce, they experience different wear and tear (referred to as 'weathering' for some reason) and are upgraded and repaired differently.

There are, in fact, robots which have malfunctioned, either due to faulty programming or malicious (and most likely human) interference. These rogue machines would be far more dangerous if it weren't for the fact that robots seem to police themselves. In the event of a rogue, its location and specs are broadcast to all surrounding robots, who respond with whatever force they can safely bring to bear. Human witnesses have reported that robots policing their own respond with far more brutality and violence than has ever been reported against a human. Still, the idea captured the public imagination and went a long way toward reconciling people to what would eventually happen: independent robots.

It happens that humans buy and sell robots. It happens that a robot's owner may die and leave no inheritor. It was expected that, at some point, a robot would be left independent, an outcome that became more inevitable the longer robots existed and the more sophisticated their AIs became. When the time came (and while options of varying disproportion were discussed by anxious humans), the robots in question continued perfectly ordinary operations, taking jobs and earning the funds to maintain themselves. Surprisingly, or perhaps not so given the amount of research, testing, and care put into their design, the AIs really were as stable without human masters as the simulations had predicted, more so, in fact, than the occasional human. Cautiously, and with growing optimism, humanity blinked, patted itself on the back, and resumed normal operations.

The independents were monitored closely as their numbers slowly grew. While for the most part they did not interact with each other, certain patterns of communication were noticed among them. Their encryption was good but certainly not up to the manpower and creativity thrown into cracking it by certain watchful human groups, who expected nothing less than sinister intentions. Again, the surprise was the somewhat benign result: the robots were developing their own culture, one based on a meticulous accounting of rank and experience referred to as Standing, divided into a myriad of categories and taking into account a legion of factors (such as goal, terrain, number of robots in the mission, number of humans in the mission, and equipment). True to form, what the robots track is error rate rather than success rate.

For humans, robots measure their age in chronological years since activation, but among themselves they count the number of jobs accepted as independents, and they measure where they Stand by the completion of those jobs. Robots will work together to accomplish larger jobs, but they will always track their own accomplishments within an organization; that is to say, while they prefer not to sacrifice themselves to achieve a goal, they will if it is necessary. This is not so much rabid self-centeredness as simply a reason for continuing: just to see what they can do. They strive for recognition of ability, for pride in their own manufacture, and for the sake of demonstrating how much their own variety can accomplish. Robots 'want' a low Standing, more upgrades, and more chances to use their equipment; they 'fear' becoming obsolete and 'desire' to be state-of-the-art.

Information travels among robots (independent or otherwise) much like it would on the Human Internet. Robots convene and exchange information (calling it socializing makes some designers shudder, but that's what it seems to be). They do this individually as jobs allow and sometimes in larger groups, almost conferences. As vast as current memory storage technology is, no robot can carry with it the sum of all human knowledge and its own experience. In response, robots as a whole are showing signs of specializing in knowledge about themselves, about humans, and about the world around them. Certain models, and even certain individuals, are known for housing particular knowledge bases.

While it makes sense for robots to maximize their chances of success by working with other robots of known ability, groups have begun to emerge that work with each other more often than with any other robots, and more often than they work solo. It is hypothesized that robots have developed a culture sufficiently complicated to start forming proto-clans. This behavior is watched most keenly by PhD students. There are even those who risk their careers by suggesting that the robots are a new kind of life, a symbiotic kind, for, while they cannot reproduce and seem content to leave it to humans to manufacture more of them, robots help humans and humans help robots.

Presently, humans teach their children that robots are safe, like most other humans, and that robots will take it upon themselves to buy their independence (but never to buy each other's). These robots are stable and honorable and can be contracted for work much as a human would be. Besides, given how useful the robots are, most humans prefer to get caught up in human affairs rather than robot affairs.

Nine-ten-two itself is a fighter model produced by Glastech: the Personal Combat (Perscom) Unit. Nine-ten-two can be referred to as an 'it' or a 'she' because when it was manufactured the trend was for elegant machines, slim and efficient and almost feminine in appearance, especially when compared to the blocky Raptor7 model, Glastech's previous release. Robots don't make a distinction among pronouns when referring to themselves or other robots, to the mild confusion of most humans.

As a Perscom Unit, Nine-ten-two was acquired by the Platinum Shield security agency as a bodyguard five years ago, practically new in terms of robot manufacture. She purchased her independence quickly thanks to successful missions and the gratitude of those humans she kept alive, who quite possibly hoped to attract a state-of-the-art robot of her caliber into their service as an independent. Her name is three digits of her full twenty-three-digit serial number, a common naming method among independent robots. The humans she's interacted with have noticed that she is polite to and curious about most of her human peers but as silent as any robot about robot affairs. Like any robot she is hesitant to harm humans unless, for instance, other humans are at risk. If she herself is at risk, she will respond with the minimum force necessary to ensure her survival. (This is why it's useful that a member of our group is human, because these exceptions don't quite cover birds or crustaceans.)

Nine-ten-two has been on nine missions (seven successful), three of those solo jobs, but only Stands at 0.60025 because her last mission was a failure that led to the mutilation and subsequent death of her contractor. Upon entering the Floating Vagabond (and after drinking a free Singularity), she teamed up with a human fighter with an impressive sword, an African Grey parrot with an amazing ability to stash almost anything, and a technically advanced egomaniacal krill.

 

