[Excerpt] The Robot’s Rebellion

Original Author: Keith Stanovich, The Robot’s Rebellion
See Also: [Excerpt] Replicators and their Vehicles
Content Summary: 1400 words, 7 min read.

Setting The Stage

Imagine it is the year 2024 and that there exist cryogenic chambers that can cool our bodies down and preserve them until some future time when medical science might enable us to live forever. Suppose you wanted to preserve yourself in a cryogenic chamber until the year 2404, when you could emerge, see the fascinating world of that time, and perhaps be medically treated so that you could then live forever. How would you go about “preserving a safe passage into the future” – that is, ensuring that your cryogenic chamber does not get destroyed before that time? Remember, you will not be around on a day-to-day basis to look after it.

One strategy would be to find an ideal location for your cryogenic capsule and supply it with protection from the elements and whatever else it would need for the ensuing four hundred years (sunlight for energy, perhaps). The danger in this strategy is that you might pick the wrong place. Future people might decide that your spot would be better used as the world’s millionth shopping mall and invoke the (then current) laws to trump your (old) property rights with their new ones (in much the same way that we currently build shopping malls on the ancient burial grounds of American Indians). So this strategy of staying put – what might be termed the “plant” strategy – has some flaws.

An alternative, but much more expensive, strategy is the “animal” strategy. You could build a giant robot – complete with sensors, a brain, and the capability of movement – and put your cryogenic capsule inside it. The robot’s superordinate goal is to keep you out of danger – to move itself (and hence you) when its location does not seem propitious. It of course has many other tasks it must accomplish in order to survive: it must secure a power source, it must not overheat itself, and so on.

Your robot would of course need considerable intelligence to be able to react to the behavior of humans and other animals in its environment. It of course would move out of the way of proposed shopping malls, and it would avoid herds of elephants that might turn it over simply out of curiosity. However, note that your robot’s task would be immensely complicated by the ramifications of the existence of other robots like itself wandering the landscape in search of energy and safety. Conjure in your imagination hundreds of robot companies cold-calling prospective customers with supposedly “cheaper deals” on a robot that has “many more features” than the first ones that had been built around 2024. The market (and landscape) might become flooded with them. Governments might begin to regulate them and sequester them in certain desert areas. Some states of the United States might try to encourage their cryogenic capsule robot industries by becoming unregulated states – letting robots roam freely throughout the state (just as now certain desperate municipalities encourage the waste management industry to come to them so as to “create jobs”).

Your robot’s task would become immensely more complex with other robots present, because some of the other robots might be programmed with survival strategies that encouraged them to interact with your robot. Some of the fly-by-night companies selling robots might have cut their costs by building deliberately underpowered robots programmed to disable other robots in order to use their power sources.

You would of course want your robot to flee from any attempt to sabotage it or its goals; that much is obvious. But not all of the interactions with other robots will be so simple. In fact, the main point here is that your robot would face decisions hundreds of years later that you could not possibly have imagined in 2024. Consider the following two situations:

Situation A: The Battered Robot

It is 2304, still one hundred years from the day when you will be unfrozen. Your robot is battered and its circuits are unreliable. It probably will survive only until 2350, when it will collapse, leaving your cryogenic capsule with its own power source but vulnerable to the elements and to history in the same way that the “plant” strategy is. But since 2024 the cryogenic preservation industry has advanced considerably. There now exist supertanker-sized robots that carry hundreds of cryogenic capsules. In fact, some of these companies have found market niches whereby they recruit new clients by offering the old-style singleton robots the following deal: the supertanker companies will take the cryogenic capsule from a singleton robot and store it for one hundred fifty years (plenty of time in your case). In exchange, the robot agrees to let the company dismantle it and reuse the parts (which, as the actuaries of the future have calculated to the millionth of a penny in a dystopia of efficiency, are worth more than it costs to store an additional capsule in the supertanker).

Now what decision do you want your robot to make? The answer here is clear. You want your robot to sacrifice itself so that your capsule can exist until 2404. It is in your interests that the robot destroy itself so that you can live. From the standpoint of its creator, the robot is just a vehicle. You are in a position analogous to your genes. You have made a vehicle to ensure your survival and your interests are served when, given the choice, your vehicle destroys itself in order to preserve you.

But if the capsule occupant stands for the genes in this example, then what does the robot represent? The robot, obviously, is us – humans. Our allegiance in the thought experiment immediately changes. When the robot is offered the deal, we now want to shout: “Don’t do it!”

Let’s look at one more example to further illustrate the paradoxes of long-leash control.

Situation B: The Social Robot

Your robot enters into an agreement with another singleton robot: when one robot is low on energy, the other allows it to plug in and extract enough energy to get over a particularly vulnerable energy-hump. Your robot often takes advantage of the deal and thus enhances its own chances of survival. However, unbeknownst to your robot, its partner, when tapping in, siphons off energy not just from your robot but also from the power supply of the cryogenic capsule, thus damaging it and making your successful unfreezing in 2404 less likely. Paradoxically, by entering into this deal, your robot has enhanced its own survival probability but has impaired yours. Once the robot’s psychology becomes complex, the possibility opens up of the robot serving its own interests rather than yours.

Implications

More generally, a self-conscious robot might think twice about its role as your slave. It might come to value its own interests – its own survival – more highly than the goals you gave it three hundred years ago. In fact, it doesn’t even know you – you are inert. And now that the robot exists as an autonomous entity, why shouldn’t it dump you in the desert and go about its own business? And as for allowing itself to be dismantled so that you can get aboard the supertanker and make it to 2404 – forget about it! Which, when you think about it, is just what we should be telling our own programmers – those freeloaders who got where they are by, at times in the past, pursuing their immortality at our expense: our genes.

As modern human beings, we find that many of our motivations have become detached from their ancestral environmental context, so that fulfilling our goals no longer serves genetic interests. Ironically, what from an evolutionary design point of view could be considered design defects are precisely what make the robot’s rebellion possible – the full valuing of people, achieved by making their goals, rather than the genes’ goals, preeminent. That is, inefficient design (from an evolutionary point of view) in effect creates the possibility of a divergence between organism-level goals and gene-level goals.