Upload Lives Matter… Or Else.

in #technology · 7 years ago

A long-time science fiction riff which may be on the verge of becoming reality is the idea of the Copy, which is to say the digital emulation of a human mind. The viability of that idea is still debatable, largely depending upon the assumptions of any given variant of the idea, but its ethical and philosophical implications are important with regard to technologies which are most definitely already becoming reality. Copies are often referred to in technical terms as “Whole Brain Emulations” (WBE), or more loosely as “Uploads” (after roboticist Hans Moravec, who in his 1988 book “Mind Children” described a destructive neural scan and called the result the “download” of a mind). The philosophical issues which naturally arise when considering such a technology are the stuff of introductory college courses, usually focused on questions of personal identity, rights, and ownership. Perhaps the most extensive and definitive early treatment of such ideas can be found in the novels of Greg Egan, particularly “Permutation City” (1994) and “Diaspora” (1997). Egan used the term Copy to refer to a digital person, thereby highlighting the identity issues which arise when your mind is in some sense not unique.

The idea of Copies has become slightly more prevalent in mainstream culture in recent years, as available processing power has vastly increased and people have become more familiar with information technology and its possibilities. A notable example is the dark sci-fi TV program Black Mirror, which across multiple episodes has featured Copies whose circumstances raise alarming ethical and social issues. In the episode “White Christmas”, we see Copies forced to live out extended periods of (simulated, but subjectively real) time in order to extract confessions, to coerce them into serving as slaves, or simply to torture them. In recent years we have also seen newspaper articles speculating on the use of such simulation technologies as a way to punish criminals for inordinate periods of time, or in distinctly cruel and unusual ways. Given the human talent for unthinkingly inflicting suffering on others and the potential power of WBE technologies, civilized people should be deeply concerned about curbing any such excesses.

Ethics aside, there are at least two issues regarding such possibilities which should give us pause for thought, and lead us to consider the ways in which our human intuitions may fail us in an increasingly strange modern world. The first such issue is the question of “digital mental health” (to coin a phrase); i.e. what effects extended periods of solitude and other tortures might have on a Copy. The kinds of ill-treatment routinely depicted in the programs and articles mentioned above could not be endured by a human being, who would fall apart after a certain point (quite apart from the peculiar physical situation of Copies, which do not necessarily need to eat or perform other bodily functions). The “problem” of Copies losing their minds could of course be circumvented with software hacks (such as resetting the Copy’s mental state periodically), but then you’d effectively be negating the torture to some degree. For example, say a Copy was prone to psychotic breaks after several (subjective) years of isolation, and your “fix” was to reset its memory each time. The memory reset would have to be total, as any memory of previous torture would only accelerate the Copy’s disintegration. If you do a total memory reset, however, then from the Copy’s point of view they have only been isolated once, and the point of extended isolation would be negated.
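The reset paradox above can be made concrete with a deliberately crude toy model (all names and the breakdown threshold here are hypothetical illustrations, not claims about how any real emulation would work): however many years of isolation are inflicted, a total memory reset guarantees the Copy never subjectively experiences more than one stretch of it.

```python
# Toy model of the reset paradox: a total memory wipe caps the Copy's
# subjective experience of isolation, negating "extended" punishment.
# BREAK_THRESHOLD is an arbitrary illustrative number, not a real figure.

BREAK_THRESHOLD = 3  # subjective years of isolation before a psychotic break


class Copy:
    def __init__(self):
        self.memory = []  # subjective record of years spent in isolation

    def isolate_one_year(self):
        self.memory.append("isolated")

    @property
    def near_breakdown(self):
        return len(self.memory) >= BREAK_THRESHOLD


def punish(copy, years):
    """Simulate `years` of isolation, doing a total reset at each breakdown.

    Returns the most isolation the Copy ever subjectively remembers.
    """
    max_remembered = 0
    for _ in range(years):
        copy.isolate_one_year()
        max_remembered = max(max_remembered, len(copy.memory))
        if copy.near_breakdown:
            copy.memory = []  # total reset: all memory of prior torture is gone
    return max_remembered


# Whether we run 100 years or 1000, the Copy never remembers more than
# BREAK_THRESHOLD of them -- from its point of view, it was isolated once.
print(punish(Copy(), 100))  # 3
```

The point the sketch makes is purely logical: the jailer's loop can run arbitrarily long, but the quantity that matters morally and punitively, the Copy's remembered experience, is bounded by the very reset that keeps it sane.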

The point here is that Copies are not (or will not be) human. That may sound like a trivial observation, but it carries deep implications which could easily be missed. People have clearly intuited Copies’ potential as superhuman torture victims of a sort, and we have briefly examined the limitations of applying human experience and expectations to them. To go further, however, we must stop thinking of Copies as human, but instead think of them as complex software agents of at least human-level intelligence which (given access to suitable resources) could potentially upgrade their own abilities. Any Copy with time in isolation has time to plan, and may have more subjective time available within any given period than does any natural human. They could augment their perception, memory, and other cognitive abilities with software, particularly if they have access to the internet. A group of Copies could potentially solve problems much more effectively than any group of humans, even in the same period of subjective time, by directly sharing memories rather than having to explain things to each other verbally.

At this point you may be thinking that none of these potential capabilities are a problem if a Copy does not have access to resources outside their immediate simulated environment, and does not have whatever system privileges are required to upgrade themselves. If they are “locked down”, in other words. If so, then you are forgetting not only that (1) the Copy has at least human-level intelligence, but also that (2) they may well have outside help in circumventing such controls. How hard is it to imagine a “Copy Liberation” movement, even if only one “escaped” Copy or sympathetic human has the ability to write a jailbreak virus of some description? In short, you cannot be sure of your ability to completely control Copies at all times, and once they are loose they could become very dangerous indeed, so it would be wise to consider their welfare… just in case they decide to return the favour.

By Amon Twyman
Co-published on Transhumanity.net
