<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
		>
<channel>
	<title>Comments on: LA Weekly &#8220;disses&#8221; Transhumanists</title>
	<atom:link href="http://www.foresight.org/nanodot/?feed=rss2&#038;p=396" rel="self" type="application/rss+xml" />
	<link>http://www.foresight.org/nanodot/?p=396</link>
	<description>examining transformative technology</description>
	<lastBuildDate>Wed, 03 Apr 2013 18:23:47 +0000</lastBuildDate>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=3.0.4</generator>
	<item>
		<title>By: kenbeal</title>
		<link>http://www.foresight.org/nanodot/?p=396#comment-1138</link>
		<dc:creator>kenbeal</dc:creator>
		<pubDate>Wed, 24 Dec 2003 06:07:11 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=396#comment-1138</guid>
		<description>&lt;p&gt;&lt;strong&gt;Re:Avoiding uncomfortable implications&lt;/strong&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;We already store a part of ourselves external to our body. For example, couldn&#039;t the written word be considered an externalized form of long-term memory?&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Nice. I remember back in my senior year of college (circa 1991) I told a college secretary that &quot;I like keeping parts of my brain outside my body&quot; when I wrote down the schedule. (Yes, I have always been somewhat geeky. Big deal now, but it cost me lots of beatings as a child.)&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Many people, if they think about the possibility of &quot;brain networks&quot; now, will find them strange and frightening. But such &quot;brain networks&quot; will evolve gradually over many years (decades probably). At each step, the advances will seem obviously helpful. &lt;strong&gt;A prosthetic eye for blind people?&lt;/strong&gt;&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;I cannot see from my left eye (coloboma (crater) in the center of the optic nerve, from birth). I very much look forward to the day I can replace that optic nerve and experience the world in startling 3D, which will allow me to see those funky optical illusions which teased me as a child (hold your fingers separated by an inch or so in front of your eyes, you see a sausage. I cannot see that, nor do I &lt;strong&gt;ever&lt;/strong&gt; see double, no matter how much I&#039;ve drunk, which I suppose makes me a better drunk driver ;-).&lt;/p&gt;
&lt;p&gt;Do you know about &lt;strong&gt;&lt;a href=&quot;http://www.aeiveos.com/~bradbury/MatrioshkaBrains/MatrioshkaBrains.html&quot;&gt;Matrioshka Brains?&lt;/a&gt;&lt;/strong&gt; That&#039;s something our society should strive to build, in the not-too-distant future. A solar system, performing computations to aid us in our domination of the universe.&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>Re:Avoiding uncomfortable implications</strong></p>
<blockquote>
<p><em>We already store a part of ourselves external to our body. For example, couldn&#39;t the written word be considered an externalized form of long-term memory?</em></p>
</blockquote>
<p>Nice. I remember back in my senior year of college (circa 1991) I told a college secretary that &quot;I like keeping parts of my brain outside my body&quot; when I wrote down the schedule. (Yes, I have always been somewhat geeky. Big deal now, but it cost me lots of beatings as a child.)</p>
<blockquote>
<p><em>Many people, if they think about the possibility of &quot;brain networks&quot; now, will find them strange and frightening. But such &quot;brain networks&quot; will evolve gradually over many years (decades probably). At each step, the advances will seem obviously helpful. <strong>A prosthetic eye for blind people?</strong></em></p>
</blockquote>
<p>I cannot see from my left eye (coloboma (crater) in the center of the optic nerve, from birth). I very much look forward to the day I can replace that optic nerve and experience the world in startling 3D, which will allow me to see those funky optical illusions which teased me as a child (hold your fingers separated by an inch or so in front of your eyes, you see a sausage. I cannot see that, nor do I <strong>ever</strong> see double, no matter how much I&#39;ve drunk, which I suppose makes me a better drunk driver <img src='http://www.foresight.org/nanodot/wp-includes/images/smilies/icon_wink.gif' alt=';-)' class='wp-smiley' />.</p>
<p>Do you know about <strong><a href="http://www.aeiveos.com/~bradbury/MatrioshkaBrains/MatrioshkaBrains.html">Matrioshka Brains?</a></strong> That&#39;s something our society should strive to build, in the not-too-distant future. A solar system, performing computations to aid us in our domination of the universe.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Anonymous Coward</title>
		<link>http://www.foresight.org/nanodot/?p=396#comment-1124</link>
		<dc:creator>Anonymous Coward</dc:creator>
		<pubDate>Wed, 23 Jul 2003 22:10:59 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=396#comment-1124</guid>
		<description>&lt;p&gt;&lt;strong&gt;Re:comes with the territory&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;I happened upon this commentary which caused me to laugh! It seems the person who critiqued the LA Weekly article, meaning Mark Gubrud, has certainly got a chip on his shoulder! First, I have never been in poverty (I own a home and drive a Benz); second, I have never denied my age or been afraid of my age (proud of being in my 50s); third, my art is certainly not &quot;awful&quot; as you put it, but perhaps you are edgy about something other than my work which causes you to be so disdainful ... The article, in fact, was fairly complimentary for a leftist, angry, deathist magazine. Natasha Vita-More&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>Re:comes with the territory</strong></p>
<p>I happened upon this commentary which caused me to laugh! It seems the person who critiqued the LA Weekly article, meaning Mark Gubrud, has certainly got a chip on his shoulder! First, I have never been in poverty (I own a home and drive a Benz); second, I have never denied my age or been afraid of my age (proud of being in my 50s); third, my art is certainly not &quot;awful&quot; as you put it, but perhaps you are edgy about something other than my work which causes you to be so disdainful &#8230; The article, in fact, was fairly complimentary for a leftist, angry, deathist magazine. Natasha Vita-More</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: MarkGubrud</title>
		<link>http://www.foresight.org/nanodot/?p=396#comment-1123</link>
		<dc:creator>MarkGubrud</dc:creator>
		<pubDate>Tue, 13 Feb 2001 03:56:18 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=396#comment-1123</guid>
		<description>&lt;p&gt;&lt;strong&gt;Re:Uploading As Migration&lt;/strong&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;I think anyone uploading someone else&#039;s mind without their express permission to do so is probably violating the fundamental rights of that individual.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Probably? How would you determine whether it is or not?&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Some of your comments seem to suggest that mind uploading is something employers would do to lower labor costs.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Economics and justice issues are another matter. Here let&#039;s stick to uploading. My point was that duplicating a person for the sake of using the duplicates might make some kind of sense, but having one&#039;s self killed for the utility of duplicates makes no sense, unless one is some kind of &quot;soldier&quot; willing to make &quot;the supreme sacrifice&quot;.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;I believe that &#039;destructive readout&#039; &amp; recreation vs. &#039;evolved relocation&#039; are fundamentally different&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;It was careless of me to assert that they are not; I don&#039;t want to get into arguments about which differences are &quot;fundamental&quot;. You are talking about different procedures. However, both destroy the person, while creating some kind of artifact which might be another person, a facsimile of the first, or a nonhuman simulation of some aspect of the person&#039;s constitution and behavior. In either case, the original person is dead.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;there may be significant temporal discontinuities with the destructive methods&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Yes, but, so what?&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;All one needs to do is assemble a complete genome for the individual and you know what the atomic structure should be for all of the molecules in the body.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Wrong, and irrelevant. Some molecules certainly have multiple states, including, very importantly, DNA, which is switched on and off by binding proteins. And it is not merely the atomic structure of individual molecules, but the arrangements of molecules and supramolecular assemblies in cells which store the information you are concerned about. You cannot reconstruct these from the genome.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;If you produce a wet brain, containing essentially the same molecules with essentially the same organized structure as the original brain, then I would say you have a brain and you should get back a reasonable recreation of the mind contained within it.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Your language, &quot;the mind contained within [the brain]&quot; is dualistic, and implies belief in the existence of non-physical entities. A brain is a brain. If it is an atomically-precise facsimile of another brain from which it was copied, then it can be called a copy of the original brain, but it &lt;em&gt;is not&lt;/em&gt; the original brain. In any case, there will never be any atomically-precise facsimiles! At best you might make some kind of approximation that perhaps no one would be able to tell from the original.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;If you are running a molecular simulation of the brain or a model of the neural network contained within the original brain, then I would say you have a &quot;brain&quot; (in quotes).&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;So, in your glossary, &quot;brain in quotes&quot; means a simulation of a brain. I suppose by extension a &quot;bomb&quot; is a simulation of a bomb, and a &quot;girl&quot; is a simulation of a girl. This raises a sidebar. I don&#039;t have a problem with people who want to be &quot;uploaded,&quot; as long as they only want to &quot;live&quot; in &quot;virtual reality&quot;. I might try to talk them out of committing suicide, but at least they aren&#039;t proposing to create any threats to the rest of humanity. As long as &quot;uploads&quot; will always be confined to ineffectual &quot;cyberspaces,&quot; with no hooks into the real world, no harm can be done by their electrons whirling around.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;I would argue that every individual has a fundamental right to &#039;migrate&#039; their mind onto whatever hardware he chooses so long as that does not interfere with anyone else&#039;s right to do the same.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Well, I would argue that humanoid artificial intelligences of superhuman capability would be a profound threat to the rights of all humans, and therefore should not be allowed to be created by any means, including &quot;uploading.&quot; If, however, they are completely isolated from any possibility of affecting the physical world outside a simulated &quot;reality&quot; (let it be of their own choosing), then such systems would be incapable of doing any mischief. But I don&#039;t know if society will ever give up the taboo against suicide, at least of healthy individuals.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;If you believe that an individual&#039;s mind is contained within the specific atoms of the brain of an individual,&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Then you are a dualist. I am not a dualist. I do not believe that the brain &quot;contains&quot; anything. It is. So is another brain. Two different brains are two different brains. If you believe anything different, you must believe in the existence of something extraphysical.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;[1] if you took all of the atoms in my brain apart and recorded their exact position with angstrom accuracy, then put all of those very same atoms back in their original locations [2] I know that that individual would be me&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;How does [2] modify [1]? You describe in [1] an (almost certainly impossible to carry out exactly) physical procedure. What does [2] claim that could be independently verified?&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;If it looks like a specific duck, walks like a specific duck and quacks like a specific duck, then the recreation is the specific duck, even if it isn&#039;t made out of the original atoms the duck is made out of.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;But if we could make one recreation, we could make two, four, a dozen, a billion. Which, then, is the &lt;em&gt;specific&lt;/em&gt; duck?&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;A recreation is me, it is just a me that I may not rather be.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;You seem to be appealing to some right of privacy, personal choice. That&#039;s a cop-out from the discussion, but, okay, no one can make you say what the reasons are why you&#039;d &quot;not rather&quot;. But we can infer that they exist, and in the absence of another plausible interpretation of your hesitancy, that you do not really fully believe your assertion that &quot;A recreation is me&quot;.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;The dictionary I&#039;ve got defines &quot;migrate&quot; as 1. to settle in another country or region; 2. to move to another region with the change in season, as many birds. So long as the mind being recreated is a reasonable facsimile [exact reproduction or copy], then I would consider it to have &quot;migrated&quot;.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;You quote a dictionary definition and then proceed to use the word in an inconsistent way!&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Since I know that given current brain structures, I am probably going to lose memories over time (due to neuronal cell loss and/or incomplete or inaccurate copying to new storage locations), the information losses mentioned go with the territory from my perspective.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;The inaccuracy of the brain does argue that an imperfect copy might still be indistinguishable from the original, but it does not change the fact that the copy &lt;em&gt;is not&lt;/em&gt; the original; it is a copy. They are two different things. I don&#039;t see how anyone can deny this tautological fact. The fact that any copy is going to be imperfect, most likely &lt;em&gt;very&lt;/em&gt; imperfect, only further highlights the fact that it is &lt;em&gt;a different thing&lt;/em&gt;.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Fingers, listening and speaking are very low bandwidth channels compared to the amount of information we now can access.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;But not compared with the amount we can handle. Having to formulate spoken or written sentences both forces and permits us to organize our thoughts; the stream of consciousness is of course a cacophony. Let advanced AI systems anticipate what we might be interested in, let them provide us with rapid, easily cueable access to information. But keep the boundaries, and keep us humans firmly in control. To do otherwise is to invite our annihilation.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;you have to ask would non-bionically enhanced individuals be considered &#039;disabled&#039; at some point.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;We really have to get beyond the notion that the future can hold nothing better than endless cutthroat competition with one another (and with machines). I believe it is clear that there is no need for any more powerful information technology than that which will be achievable without invasion of the body by machinery, without humanoid, self-conscious and self-interested artificial superintelligences, and without biotechnic reengineering of the brain (apart from health maintenance).&lt;/p&gt;
&lt;blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;Different parts of the brain exchange information, but it is extremely unlikely that there is a universal code that can simply be copied from one region to another.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;em&gt;Calvin seems to suggest that there is in at least some regions of the cortex. He suggests that some of the interesting idea &quot;combinations&quot; may occur when you copy a pattern from one area onto another area that is biased for running a different pattern or when two patterns run side-by-side and influence each other. I would tend to agree that long distance communications (e.g. memory fetching or physiological controls) are probably hardwired with their own unique codes. If Calvin&#039;s concept of &#039;thought patterns&#039; has some merit and nanobots can read them out (and fetch and restore them to the neurons in some way), then I believe one has thought transference.&lt;/em&gt;&lt;/blockquote&gt;
&lt;p&gt;I haven&#039;t read Calvin&#039;s work, so I don&#039;t know exactly what he&#039;s saying, but your account of it suggests precisely the kind of mystical, dualistic model of brain function that is almost certainly wrong -- the idea that one has certain conscious experiences as a &lt;em&gt;result&lt;/em&gt; of the production of certain &quot;firing patterns&quot; or whatever. So if some other &quot;hardware&quot; could produce these same &quot;patterns&quot;, then one would just as surely &quot;have the same experience&quot;. There are so many errors in this way of thinking that it is hard to know where to begin in criticizing it.&lt;/p&gt;
&lt;p&gt;I&#039;ll just say that the notion of a universal code, even one confined to a particular region of the brain, is implausible because each replication of this code wouldn&#039;t be doing anything different from the others. It would be a waste of neurons to have them all beating in synchrony for the benefit of some cosmic observer. This is not to say that different parts of the brain do not have their own local representation of the &lt;em&gt;same content&lt;/em&gt;, or that they do not take place in globally coordinated activity (e.g. &quot;consciousness&quot;) with each region registering its take on the &quot;thought&quot;. But this cannot be in a code which copies from one part to another, as the coding is &lt;em&gt;which neurons&lt;/em&gt; as well as &lt;em&gt;when&lt;/em&gt;, and &lt;em&gt;when&lt;/em&gt; might be copiable but &lt;em&gt;which&lt;/em&gt; most certainly is not.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;I would say that you have transferred some of your &#039;mind&#039; (static memories of experiences) into your diary.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;As a metaphor, you might get away with this, but if you insist you mean it literally, people will think you&#039;re going crazy. You&#039;re talking about writing in a book.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;If I&#039;ve offloaded more and more of my sub-conscious thoughts (such as those required for much of the sensory processing in &#039;driving&#039;) into the exo-computer then part of my mind has migrated. One could even view anti-lock brakes and automobile anti-collision radars as first steps along this path.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;These are automation features that make driving easier. There is still a clear boundary between you and the car that you operate with your hands and feet.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;There isn&#039;t a bionic interface yet, but I can recall a time or two when I wish I had one for my anti-lock brakes.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;You might end up braking for hallucinations. An auto-driver would be safer.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Neuronal supplementation will be a requirement for the extension of life for thousands of years&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Maintenance will require stimulating the regrowth of neural tissue at the replacement rate. What I don&#039;t see is why we need or are likely to choose a neural hypertrophy which would eventually distort us into no-longer-human creatures.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;We climbed on the slippery slope when we started cooking our food, is there any reason to stop now?&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Cooking our food is not the same as cooking ourselves. You can destroy the human race in any number of ways. &quot;Transhumanism&quot; is yet another, just as much an enemy of humanity as disease, war, pollution or invasion from outer space. If the end is annihilation, the means are to be avoided.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;you shouldn&#039;t be &#039;required&#039; to go bionic. At the same time disallowing it seems to be like the arguments for not educating women or allowing blacks to vote.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;I take it you are neither a woman nor a black, and I hope you would have the good sense not to make such a statement in mixed company, although you never know who might be reading.&lt;/p&gt;
&lt;p&gt;Actually, the case for preventing the creation of technological systems which are self-interested, self-aware, and modeled on humans, with human motivations and claiming human rights, is the same as the case for preventing the creation of any other menace to public safety and the rights of actual human beings.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;What makes current era human genomes or current era human intelligence the &#039;right&#039; place to stop?&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;It&#039;s what we are. What makes it &quot;right&quot; not to &quot;stop&quot;? Where are we supposed to be going? Why?&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;There will need to be limits on how many copies a person can make&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;One person is one person.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;and how they treat uncopied minds and the resources they require.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;The technology cult asserts that the future belongs to aggressive machines -- oh, but don&#039;t worry, there&#039;ll be some nice reservations set aside for those of you who prefer to &quot;remain human&quot; like flocks of sheep. I turn this around -- let those enamored of the idea of &quot;uploading&quot; commit suicide if they wish, and let their computer simulations spend the rest of subjective eternity enjoying whatever kind of simulated Valhalla they prefer, in some solar-powered reservation on an asteroid, maintained and watched by the human beings who will be busy low-impact colonizing and enjoying the spectacular natural beauties of the planets and the stars.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;While my destruction followed by the reactivation of a copy would not be my most desired path, I do take some comfort in knowing that part of me would continue to exist.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;What part? What is its mass, its color, its angular momentum? You say it exists, and would continue to exist even if your body was destroyed. What is it?&lt;/p&gt;
&lt;blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;So if I make a &quot;backup,&quot; and point a gun at you, you will have no fear?&lt;/p&gt;
&lt;/blockquote&gt;
&lt;em&gt;Not unless you suppress my adrenaline levels at the same time.&lt;/em&gt;&lt;/blockquote&gt;
&lt;p&gt;Why? Aren&#039;t you admitting here that you don&#039;t really believe what you&#039;re saying? If you do believe it, why should you not remain calm? Maybe you want to say that not all of you believes it. Your &quot;mind&quot; may claim to believe this stuff, but your body knows better.&lt;/p&gt;
&lt;blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;We can even make it painless if you like. But you are going to die. I am going to make a copy and activate it, but you are going to die, sucker.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;em&gt;If you make it sufficiently worth my while, and had proven to me that the copy would indeed have a full set of my experiences up until the point of death, I would have no objection to this.&lt;/em&gt;&lt;/blockquote&gt;
&lt;p&gt;Incredible. I assume by &quot;sufficiently worth my while&quot; you mean that I offer you, or your copy, a lot of money. Why would it need to be a lot, if you are so sure that &quot;A recreation is me&quot;? I promise a very faithful copy, no more than a few insignificant errors. You should be willing to take $10 for the few moments it will take. We could set up copying booths on subway platforms, and induce people to have their bodies disassembled for pocket change while they wait for the train.&lt;/p&gt;
&lt;p&gt;Okay, let&#039;s say I offer you a billion dollars, payable to the copy that I will create. Except I&#039;ll create him at some randomly-chosen point on the surface of the planet. &quot;You&quot; may find &quot;yourself&quot; in Kirghizstan or Argentina, but a billion dollars richer, so it ought to be worth your while. But I will also claim the right to make an additional copy, which I will assemble in Baghdad and which will immediately be subjected to excruciating torture, Saddam&#039;s personal torturemaster serving as subcontractor.&lt;/p&gt;
&lt;p&gt;There won&#039;t be any difference between the copy that wakes up a billionaire and the copy that wakes up in a torture chamber, but what the hell, you already decided you were going to wake up the billionaire, so who cares about the torture victim? Don&#039;t worry about the ethics of the deal; a political prisoner will be released to make it an even trade.&lt;/p&gt;
&lt;p&gt;Okay, how do you make sense out of that scenario? Do you look forward to getting a billion bucks, or to dying a slow, agonizing death in Baghdad?&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;For me to claim that I object to being replaced by a copy would require that I invalidate the concept that the copies are indeed identical.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Precisely my point about your &quot;unease&quot;.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Should I value the real Mona Lisa over the atomic resolution copy of the Mona Lisa? Only if I have some romantic attachment to the atoms that Da Vinci used to create the original.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Well, if you don&#039;t have any romantic attachments to anything, I suppose you might as well kill yourself. But in reality, you do have some romantic attachments, in particular to the idea of technological salvation and to certain people who have espoused this idea and whom you admire. But much of this ideology is nonsensical, as I have tried to show you, and a threat to humanity.&lt;/p&gt;
&lt;p&gt;I think that if you can&#039;t look at an original object, such as an old painting, and experience the thrill of what that object, uniquely, represents, i.e. the connection to the artist and the moment of creation, then you really are missing out on something.&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>Re:Uploading As Migration</strong></p>
<blockquote>
<p><em>I think anyone uploading someone else&#39;s mind without their express permission to do so is probably violating the fundamental rights of that individual.</em></p>
</blockquote>
<p>Probably? How would you determine whether it is or not?</p>
<blockquote>
<p><em>Some of your comments seem to suggest that mind uploading is something employers would do to lower labor costs.</em></p>
</blockquote>
<p>Economics and justice issues are another matter. Here let&#39;s stick to uploading. My point was that duplicating a person for the sake of using the duplicates might make some kind of sense, but having one&#39;s self killed for the utility of duplicates makes no sense, unless one is some kind of &quot;soldier&quot; willing to make &quot;the supreme sacrifice&quot;.</p>
<blockquote>
<p><em>I believe that &#39;destructive readout&#39; &amp; recreation vs. &#39;evolved relocation&#39; are fundamentally different</em></p>
</blockquote>
<p>It was careless of me to assert that they are not; I don&#39;t want to get into arguments about which differences are &quot;fundamental&quot;. You are talking about different procedures. However, both destroy the person, while creating some kind of artifact which might be another person, a facsimile of the first, or a nonhuman simulation of some aspect of the person&#39;s constitution and behavior. In either case, the original person is dead.</p>
<blockquote>
<p><em>there may be significant temporal discontinuities with the destructive methods</em></p>
</blockquote>
<p>Yes, but, so what?</p>
<blockquote>
<p><em>All one needs to do is assemble a complete genome for the individual and you know what the atomic structure should be for all of the molecules in the body.</em></p>
</blockquote>
<p>Wrong, and irrelevant. Some molecules certainly have multiple states, including, very importantly, DNA, which is switched on and off by binding proteins. And it is not merely the atomic structure of individual molecules, but the arrangements of molecules and supramolecular assemblies in cells which store the information you are concerned about. You cannot reconstruct these from the genome.</p>
<blockquote>
<p><em>If you produce a wet brain, containing essentially the same molecules with essentially the same organized structure as the original brain, then I would say you have a brain and you should get back a reasonable recreation of the mind contained within it.</em></p>
</blockquote>
<p>Your language, &quot;the mind contained within [the brain]&quot; is dualistic, and implies belief in the existence of non-physical entities. A brain is a brain. If it is an atomically-precise facsimile of another brain from which it was copied, then it can be called a copy of the original brain, but it <em>is not</em> the original brain. In any case, there will never be any atomically-precise facsimiles! At best you might make some kind of approximation that perhaps no one would be able to tell from the original.</p>
<blockquote>
<p><em>If you are running a molecular simulation of the brain or a model of the neural network contained within the original brain, then I would say you have a &quot;brain&quot; (in quotes).</em></p>
</blockquote>
<p>So, in your glossary, &quot;brain in quotes&quot; means a simulation of a brain. I suppose by extension a &quot;bomb&quot; is a simulation of a bomb, and a &quot;girl&quot; is a simulation of a girl. This raises a sidebar. I don&#39;t have a problem with people who want to be &quot;uploaded,&quot; as long as they only want to &quot;live&quot; in &quot;virtual reality&quot;. I might try to talk them out of committing suicide, but at least they aren&#39;t proposing to create any threats to the rest of humanity. As long as &quot;uploads&quot; will always be confined to ineffectual &quot;cyberspaces,&quot; with no hooks into the real world, no harm can be done by their electrons whirling around.</p>
<blockquote>
<p><em>I would argue that every individual has a fundamental right to &#39;migrate&#39; their mind onto whatever hardware he chooses so long as that does not interfere with anyone elses right to do the same.</em></p>
</blockquote>
<p>Well, I would argue that humanoid artificial intelligences of superhuman capability would be a profound threat to the rights of all humans, and therefore should not be allowed to be created by any means, including &quot;uploading.&quot; If, however, they are completely isolated from any possibility of affecting the physical world outside a simulated &quot;reality&quot; (let it be of their own choosing), then such systems would be incapable of doing any mischief. But I don&#39;t know if society will ever give up the taboo against suicide, at least of healthy individuals.</p>
<blockquote>
<p><em>If you believe that an individual&#39;s mind is contained within the specific atoms of the brain of an individual,</em></p>
</blockquote>
<p>Then you are a dualist. I am not a dualist. I do not believe that the brain &quot;contains&quot; anything. It is. So is another brain. Two different brains are two different brains. If you believe anything different, you must believe in the existence of something extraphysical.</p>
<blockquote>
<p><em>[1] if you took all of the atoms in my brain apart and recorded their exact position with angstrom accuracy, then put all of those very same atoms back in their original locations [2] I know that that individual would be me</em></p>
</blockquote>
<p>How does [2] modify [1]? You describe in [1] an (almost certainly impossible to carry out exactly) physical procedure. What does [2] claim that could be independently verified?</p>
<blockquote>
<p><em>If it looks like a specific duck, walks like a specific duck and quacks like a specific duck, then the recreation is the specific duck, even if it isn&#39;t made out of the original atoms the duck is made out of.</em></p>
</blockquote>
<p>But if we could make one recreation, we could make two, four, a dozen, a billion. Which, then, is the <em>specific</em> duck?</p>
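<p>The duplication point can be made concrete with a toy Python sketch (an illustration only; the dictionary standing in for a &quot;duck&quot; is my own device): several recreations can be indistinguishable by content while each remains a distinct object.</p>

```python
import copy

# A stand-in for any fully specified physical object.
duck = {"name": "specific duck", "walk": "waddle", "quack": "loud"}

# Make several "exact" recreations of it.
copies = [copy.deepcopy(duck) for _ in range(4)]

# Every recreation is indistinguishable from the original by content...
assert all(c == duck for c in copies)

# ...yet no recreation *is* the original: five distinct objects in all.
assert all(c is not duck for c in copies)
assert len({id(c) for c in copies} | {id(duck)}) == 5
```

<p>Equality (==) plays the role of &quot;looks, walks and quacks alike&quot;; identity (is) plays the role of being the one specific duck.</p>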
<blockquote>
<p><em>A recreation is me, it is just a me that I may not rather be.</em></p>
</blockquote>
<p>You seem to be appealing to some right of privacy, of personal choice. That&#39;s a cop-out from the discussion, but, okay, no one can make you state the reasons why you&#39;d &quot;not rather&quot; be the recreation. But we can infer that those reasons exist, and, in the absence of another plausible interpretation of your hesitancy, that you do not really fully believe your assertion that &quot;A recreation is me&quot;.</p>
<blockquote>
<p><em>The dictionary I&#39;ve got defines &quot;migrate&quot; as 1. to settle in another country or region; 2. to move to another region with the change in season, as many birds. So long as the mind being recreated is a reasonable facsimile [exact reproduction or copy], then I would consider it to have &quot;migrated&quot;.</em></p>
</blockquote>
<p>You quote a dictionary definition and then proceed to use the word in an inconsistent way!</p>
<blockquote>
<p><em>Since I know that, given current brain structures, I am probably going to lose memories over time (due to neuronal cell loss and/or incomplete or inaccurate copying to new storage locations), the information losses mentioned go with the territory from my perspective.</em></p>
</blockquote>
<p>The inaccuracy of the brain does argue that an imperfect copy might still be indistinguishable from the original, but it does not change the fact that the copy <em>is not</em> the original; it is a copy. They are two different things. I don&#39;t see how anyone can deny this tautological fact. The fact that any copy is going to be imperfect, most likely <em>very</em> imperfect, only further highlights the fact that it is <em>a different thing</em>.</p>
<blockquote>
<p><em>Fingers, listening and speaking are very low bandwidth channels compared to the amount of information we now can access.</em></p>
</blockquote>
<p>But not compared with the amount we can handle. Having to formulate spoken or written sentences both forces and permits us to organize our thoughts; the stream of consciousness is of course a cacophony. Let advanced AI systems anticipate what we might be interested in, let them provide us with rapid, easily cueable access to information. But keep the boundaries, and keep us humans firmly in control. To do otherwise is to invite our annihilation.</p>
<blockquote>
<p><em>you have to ask would non-bionically enhanced individuals be considered &#39;disabled&#39; at some point.</em></p>
</blockquote>
<p>We really have to get beyond the notion that the future can hold nothing better than endless cutthroat competition with one another (and with machines). I believe it is clear that there is no need for any more powerful information technology than that which will be achievable without invasion of the body by machinery, without humanoid, self-conscious and self-interested artificial superintelligences, and without biotechnic reengineering of the brain (apart from health maintenance).</p>
<blockquote>
<blockquote>
<p>Different parts of the brain exchange information, but it is extremely unlikely that there is a universal code that can simply be copied from one region to another.</p>
</blockquote>
<p><em>Calvin seems to suggest that there is in at least some regions of the cortex. He suggests that some of the interesting idea &quot;combinations&quot; may occur when you copy a pattern from one area onto another area that is biased for running a different pattern, or when two patterns run side-by-side and influence each other. I would tend to agree that long-distance communications (e.g. memory fetching or physiological controls) are probably hardwired with their own unique codes. If Calvin&#39;s concept of &#39;thought patterns&#39; has some merit and nanobots can read them out (and fetch and restore them to the neurons in some way), then I believe one has thought transference.</em></p></blockquote>
<p>I haven&#39;t read Calvin&#39;s work, so I don&#39;t know exactly what he&#39;s saying, but your account of it suggests precisely the kind of mystical, dualistic model of brain function that is almost certainly wrong &#8212; the idea that one has certain conscious experiences as a <em>result</em> of the production of certain &quot;firing patterns&quot; or whatever. So if some other &quot;hardware&quot; could produce these same &quot;patterns&quot;, then one would just as surely &quot;have the same experience&quot;. There are so many errors in this way of thinking that it is hard to know where to begin in criticizing it.</p>
<p>I&#39;ll just say that the notion of a universal code, even one confined to a particular region of the brain, is implausible because each replication of this code wouldn&#39;t be doing anything different from the others. It would be a waste of neurons to have them all beating in synchrony for the benefit of some cosmic observer. This is not to say that different parts of the brain do not have their own local representation of the <em>same content</em>, or that they do not take part in globally coordinated activity (e.g. &quot;consciousness&quot;) with each region registering its take on the &quot;thought&quot;. But this cannot be a code that simply copies from one part to another, as the coding is <em>which neurons</em> as well as <em>when</em>, and <em>when</em> might be copiable but <em>which</em> most certainly is not.</p>
<blockquote>
<p><em>I would say that you have transferred some of your &#39;mind&#39; (static memories of experiences) into your diary.</em></p>
</blockquote>
<p>As a metaphor, you might get away with this, but if you insist you mean it literally, people will think you&#39;re going crazy. You&#39;re talking about writing in a book.</p>
<blockquote>
<p><em>If I&#39;ve offloaded more and more of my sub-conscious thoughts (such as those required for much of the sensory processing in &#39;driving&#39;) into the exo-computer then part of my mind has migrated. One could even view anti-lock brakes and automobile anti-collision radars as first steps along this path.</em></p>
</blockquote>
<p>These are automation features that make driving easier. There is still a clear boundary between you and the car that you operate with your hands and feet.</p>
<blockquote>
<p><em>There isn&#39;t a bionic interface yet, but I can recall a time or two when I wished I had one for my anti-lock brakes.</em></p>
</blockquote>
<p>You might end up braking for hallucinations. An auto-driver would be safer.</p>
<blockquote>
<p><em>Neuronal supplementation will be a requirement for the extension of life for thousands of years</em></p>
</blockquote>
<p>Maintenance will require stimulating the regrowth of neural tissue at the replacement rate. What I don&#39;t see is why we need or are likely to choose a neural hypertrophy which would eventually distort us into no-longer-human creatures.</p>
<blockquote>
<p><em>We climbed on the slippery slope when we started cooking our food, is there any reason to stop now?</em></p>
</blockquote>
<p>Cooking our food is not the same as cooking ourselves. You can destroy the human race in any number of ways. &quot;Transhumanism&quot; is yet another, just as much an enemy of humanity as disease, war, pollution or invasion from outer space. If the end is annihilation, the means are to be avoided.</p>
<blockquote>
<p><em>you shouldn&#39;t be &#39;required&#39; to go bionic. At the same time disallowing it seems to be like the arguments for not educating women or allowing blacks to vote.</em></p>
</blockquote>
<p>I take it you are neither a woman nor a black, and I hope you would have the good sense not to make such a statement in mixed company, although you never know who might be reading.</p>
<p>Actually, the case for preventing the creation of technological systems which are self-interested, self-aware, and modeled on humans, with human motivations and claiming human rights, is the same as the case for preventing the creation of any other menace to public safety and the rights of actual human beings.</p>
<blockquote>
<p><em>What makes current era human genomes or current era human intelligence the &#39;right&#39; place to stop?</em></p>
</blockquote>
<p>It&#39;s what we are. What makes it &quot;right&quot; not to &quot;stop&quot;? Where are we supposed to be going? Why?</p>
<blockquote>
<p><em>There will need to be limits on how many copies a person can make</em></p>
</blockquote>
<p>One person is one person.</p>
<blockquote>
<p><em>and how they treat uncopied minds and the resources they require.</em></p>
</blockquote>
<p>The technology cult asserts that the future belongs to aggressive machines &#8212; oh, but don&#39;t worry, there&#39;ll be some nice reservations set aside for those of you who prefer to &quot;remain human&quot; like flocks of sheep. I turn this around &#8212; let those enamored of the idea of &quot;uploading&quot; commit suicide if they wish, and let their computer simulations spend the rest of subjective eternity enjoying whatever kind of simulated Valhalla they prefer, in some solar-powered reservation on an asteroid, maintained and watched by the human beings who will be busy low-impact colonizing and enjoying the spectacular natural beauties of the planets and the stars.</p>
<blockquote>
<p><em>While my destruction followed by the reactivation of a copy would not be my most desired path, I do take some comfort in knowing that part of me would continue to exist.</em></p>
</blockquote>
<p>What part? What is its mass, its color, its angular momentum? You say it exists, and would continue to exist even if your body was destroyed. What is it?</p>
<blockquote>
<blockquote>
<p>So if I make a &quot;backup,&quot; and point a gun at you, you will have no fear?</p>
</blockquote>
<p><em>Not unless you suppress my adrenaline levels at the same time.</em></p></blockquote>
<p>Why? Aren&#39;t you admitting here that you don&#39;t really believe what you&#39;re saying? If you do believe it, why should you not remain calm? Maybe you want to say that not all of you believes it. Your &quot;mind&quot; may claim to believe this stuff, but your body knows better.</p>
<blockquote>
<blockquote>
<p>We can even make it painless if you like. But you are going to die. I am going to make a copy and activate it, but you are going to die, sucker.</p>
</blockquote>
<p><em>If you make it sufficiently worth my while, and had proven to me that the copy would indeed have a full set of my experiences up until the point of death, I would have no objection to this.</em></p></blockquote>
<p>Incredible. I assume by &quot;sufficiently worth my while&quot; you mean that I offer you, or your copy, a lot of money. Why would it need to be a lot, if you are so sure that &quot;A recreation is me&quot;? I promise a very faithful copy, no more than a few insignificant errors. You should be willing to take $10 for the few moments it will take. We could set up copying booths on subway platforms, and induce people to have their bodies disassembled for pocket change while they wait for the train.</p>
<p>Okay, let&#39;s say I offer you a billion dollars, payable to the copy that I will create. Except I&#39;ll create him at some randomly-chosen point on the surface of the planet. &quot;You&quot; may find &quot;yourself&quot; in Kirghizstan or Argentina, but a billion dollars richer, so it ought to be worth your while. But I will also claim the right to make an additional copy, which I will assemble in Baghdad and which will immediately be subjected to excruciating torture, Saddam&#39;s personal torturemaster serving as subcontractor.</p>
<p>There won&#39;t be any difference between the copy that wakes up a billionaire and the copy that wakes up in a torture chamber, but what the hell, you already decided you were going to wake up the billionaire, so who cares about the torture victim? Don&#39;t worry about the ethics of the deal; a political prisoner will be released to make it an even trade.</p>
<p>Okay, how do you make sense out of that scenario? Do you look forward to getting a billion bucks, or to dying a slow, agonizing death in Baghdad?</p>
<blockquote>
<p><em>For me to claim that I object to being replaced by a copy would require that I invalidate the concept that the copies are indeed identical.</em></p>
</blockquote>
<p>Precisely my point about your &quot;unease&quot;.</p>
<blockquote>
<p><em>Should I value the real Mona Lisa over the atomic resolution copy of the Mona Lisa? Only if I have some romantic attachment to the atoms that Da Vinci used to create the original.</em></p>
</blockquote>
<p>Well, if you don&#39;t have any romantic attachments to anything, I suppose you might as well kill yourself. But in reality, you do have some romantic attachments, in particular to the idea of technological salvation and to certain people who have espoused this idea and whom you admire. But much of this ideology is nonsensical, as I have tried to show you, and a threat to humanity.</p>
<p>I think that if you can&#39;t look at an original object, such as an old painting, and experience the thrill of what that object, uniquely, represents, i.e. the connection to the artist and the moment of creation, then you really are missing out on something.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: RobertBradbury</title>
		<link>http://www.foresight.org/nanodot/?p=396#comment-1122</link>
		<dc:creator>RobertBradbury</dc:creator>
		<pubDate>Sat, 10 Feb 2001 15:18:50 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=396#comment-1122</guid>
		<description>&lt;p&gt;&lt;strong&gt;Re:Uploading As Migration&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    So you are saying that a person&#039;s brain could be discarded and another one&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    used for the same purpose(s). That might be okay if the purposes were those of&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    another person.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;I am perhaps not being completely accurate in matching my words with my meanings. I consider the person&#039;s &quot;mind&quot; to be what is being migrated. To me the brain is nothing more than the hardware the mind currently runs on. I will freely admit that minds are not &quot;currently&quot; software, and operate more at a &quot;firmware&quot; level, but I believe we can migrate the mind up to the software level given sufficiently advanced technologies.&lt;/p&gt;
&lt;p&gt;I think anyone uploading someone else&#039;s mind without their express permission to do so is probably violating the fundamental rights of that individual. But here we get into areas where human rights have not been defined, such as &quot;Do you have a right not to be born genetically enhanced?&quot;, or &quot;Do you have a right to prevent the use of your genetic material?&quot; (say someone wants to clone Madonna; getting her DNA is probably not particularly difficult).&lt;/p&gt;
&lt;p&gt;Some of your comments seem to suggest that mind uploading is something employers would do to lower labor costs. If I could &quot;sell&quot; my mind and collect royalties from it, that makes for some interesting scenarios. From my perspective, the technologies required for uploading are not strongly dissimilar from very advanced biotech or true molecular nanotech. I believe that such an environment eliminates the classical employer/employee relationship, because nobody will &quot;have&quot; to work to survive.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;     It seems to me that the second is not fundamentally different from the&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;     first, but is really only a kind of smoke-and-mirrors argument.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;I believe that &#039;destructive readout&#039; &amp; recreation vs. &#039;evolved relocation&#039; are fundamentally different because there may be significant temporal discontinuities with the destructive methods.  It may take years, from the point where your mind is stopped to the point when your mind may be restarted.  Evolved relocation, IMO, may not require long periods of down time.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    If molecules move or undergo significant changes of state in the freezing or&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    vitrification process, the DNA will be completely useless in reconstructing&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    their prior states at the time of death.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Unless the molecules are substantially reduced to atoms, the information required for their proper reconstruction is still there. All one needs to do is assemble a complete genome for the individual, and then you know what the atomic structure &lt;em&gt;should&lt;/em&gt; be for all of the molecules in the body. Assembling a complete genome from even highly broken pieces of DNA &lt;em&gt;is&lt;/em&gt; feasible, because that is exactly what companies like Celera do today. Once functional genomics works out all of the DNA and protein regulatory pathways, we should know where the molecules belong within the cells and their normal physical relationships to all of the other molecules. Keeping in mind that &lt;em&gt;many&lt;/em&gt; cell types can be frozen solid and revived today, I believe that all of this information, combined with nanoscale assistive machines or cleverly engineered drugs, should allow cryonic reanimation. That is not, however, &quot;uploading&quot;, which is going to require a &quot;readout&quot; in some way of the connection matrix and synapse strengths, and potentially even the concentrations of various molecules within the neurons, and maybe even gene activation state information. I do not believe that any of that information is &quot;substantially&quot; lost during freezing. It may however take a very big computer to figure out where it all was before the freezing process started.&lt;/p&gt;
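&lt;p&gt;The fragment-reassembly claim can be sketched as a greedy overlap merge, the core idea behind early shotgun-sequencing assemblers (a toy illustration that assumes error-free, repeat-free fragments; real pipelines such as Celera&#039;s must also handle sequencing errors, repeats and coverage statistics):&lt;/p&gt;

```python
def overlap(a, b):
    """Length of the longest suffix of a that is a prefix of b."""
    for k in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:k]):
            return k
    return 0

def assemble(fragments):
    """Greedily merge the pair of fragments with the largest overlap."""
    frags = list(fragments)
    while len(frags) != 1:
        # Pick the ordered pair (i, j) whose overlap is largest.
        i, j = max(
            ((i, j) for i in range(len(frags)) for j in range(len(frags)) if i != j),
            key=lambda ij: overlap(frags[ij[0]], frags[ij[1]]),
        )
        merged = frags[i] + frags[j][overlap(frags[i], frags[j]):]
        frags = [f for k, f in enumerate(frags) if k != i and k != j]
        frags.append(merged)
    return frags[0]

print(assemble(["GGTAAC", "AACCTT", "CTTGGA"]))  # prints GGTAACCTTGGA
```

&lt;p&gt;Repeats defeat this greedy scheme, which is why production assemblers add overlap graphs and statistical layout stages on top of the same basic overlap idea.&lt;/p&gt;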
&lt;p&gt;If you haven&#039;t already read it, I would recommend you read Ralph Merkle&#039;s paper:&lt;br /&gt;
    &lt;a href=&quot;http://www.merkle.com/cryo/techFeas.html&quot;&gt;The molecular repair of the brain&lt;/a&gt;&lt;br /&gt;
It is somewhat dated at this point, but it is a good place to start.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    Molecular-level details are almost certainly important; the naive&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    idea that memory and personality can be reduced to synaptic&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    connectivity is flatly contradicted by modern neuroscience.&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    It seems likely that much information will in fact be irretrievably lost,&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    although that would perhaps be equivalent to the effects of a major brain&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    trauma which typically causes loss of recent memory and some reversible&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    loss of competency.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;I would tend to disagree. Medical procedures that require the body temperature to be lowered to the point where the heart stops also lower the brain temperature to the point where neural electrical activity ceases. People are routinely brought back to life following these procedures. The &lt;a href=&quot;http://www.comarecovery.org&quot;&gt;Coma Recovery Association&lt;/a&gt; documents that to be declared brain dead, two EEGs must detect &lt;em&gt;no&lt;/em&gt; electrical activity over a 24-hour interval.&lt;/p&gt;
&lt;p&gt;The fact that they require two such scans seems to suggest that the detection of no electrical activity is insufficient to certify the individual is incapable of regaining consciousness. If you have no electrical activity in the brain, that seems to suggest your &#039;mind&#039; can be rebooted from the structural and molecular material alone.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    You put the word &quot;brain&quot; in quotes. Why? Perhaps because you recognize that&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    there is something phoney about it. If it were an atomic-level facsimile of&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    the original brain, I would have to agree that it was a brain, no quotes.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;If you produce a wet brain, containing essentially the same molecules with essentially the same organized structure as the original brain, then I would say you have a brain and you should get back a reasonable recreation of the mind contained within it. If you are running a molecular simulation of the brain or a model of the neural network contained within the original brain, then I would say you have a &quot;brain&quot; (in quotes).&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    In either case, the original person would have been destroyed, killed.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;I would argue that every individual has a fundamental right to &#039;migrate&#039; their mind onto whatever hardware he chooses, so long as that does not interfere with anyone else&#039;s right to do the same. If you believe that an individual&#039;s mind is contained within the specific atoms of the brain of an individual, then I would tend to agree that the original has been destroyed. But if you took all of the atoms in my brain apart and recorded their exact position with angstrom accuracy, then put all of those very same atoms back in their original locations, I &lt;em&gt;know&lt;/em&gt; that that individual would be me -- however, waking up after that process, I might find myself feeling a bit awkward knowing what had been done.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    Introducing new terminology at the last minute, you downgrade &quot;precisely a&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    migration&quot; to &quot;in essence a recreation&quot;. The latter seems to claim a lot&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    less than the former.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;If it looks like a &lt;em&gt;specific&lt;/em&gt; duck, walks like a &lt;em&gt;specific&lt;/em&gt; duck and quacks like a &lt;em&gt;specific&lt;/em&gt; duck, then the recreation &lt;em&gt;is&lt;/em&gt; the &lt;em&gt;specific&lt;/em&gt; duck, even if it isn&#039;t made out of the original atoms the duck is made out of. [This is based on my interpretation that &quot;I&quot; am my mental history &amp; patterning and not the atoms of my brain that are being recycled (with some gain and some loss) every day.]&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    Why not? You are admitting there is something wrong with the claim that it&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    amounts to your &quot;own personal &#039;indefinite longevity&#039;&quot;.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;A recreation is me, it is just a me that I may not rather be.&lt;/p&gt;
&lt;p&gt;    [snip -- discussion regarding brain structural and molecular complexity]&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    In short, the technical challenge of &quot;uploading&quot; has almost certainly been&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    underestimated by most enthusiasts and authors on the subject.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;I would agree that the complexity is not understood by most people who discuss the topic. I believe that the lowest level details you suggest may not be particularly significant, but it will take another 10-20 years of neuroscience before we are likely to know for sure. However, if and when we get to the point where we understand what causes comas and how to bring people out of them, I think we will be a long way towards understanding exactly what level information must be restored to recreate a mind.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    The important point is that the claim that such processes offer a way for&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    the individual to escape death and &quot;migrate to other hardware&quot; is&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    ontological nonsense.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;If the individual&#039;s mind has been recreated, I consider it a form of migration. The dictionary I&#039;ve got defines &quot;migrate&quot; as 1. to settle in another country or region; 2. to move to another region with the change in season, as many birds. So long as the mind being recreated is a reasonable facsimile [exact reproduction or copy], then I would consider it to have &quot;migrated&quot;. If one attaches the &#039;mind&#039; to the &#039;brain&#039; and then to the &#039;molecules&#039; and even &#039;atoms&#039;, then that would not be the case.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    Any estimates on the amount of disruption and displacement of brain&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    tissue by the required fiber optics, or the heating by the microwaves?&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Not at this time. This is where concrete work needs to be done. Nanobots are going to be more efficient at performing specific tasks, such as &#039;counting&#039; concentrations of neurotransmitter molecules or &#039;transmitting&#039; information on neuron firing frequencies, spacings and amplitudes. However, nanobots do have heat limits. They may have molecular granularity limits (a limit on the amount of sampling they can do without exceeding heat capacities). Even though they should be the size of mitochondria, they may not be able to go inside axons or dendrites if they interfere excessively with molecular trafficking. Remote operations, sensing, etc. will make information collection more difficult. As surgeons now routinely use fine-bore needles to deliver therapeutics to the brain (e.g. chemotherapies or cells for treating Parkinson&#039;s patients), the brain is probably &quot;fairly&quot; tolerant of the physical distortion of linkages. However, as head trauma injuries show, there are probably limits to this.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    But these are not requirements to be underestimated. And even so,&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    it is not clear that such an approach would ever succeed in teasing&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    out &quot;the meanings&quot; of all &quot;specific signals.&quot; There may be thoughts&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    that will occur to me only a few times in my life, tied to memories&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    that for the most part remain buried in the tangle. I think this kind&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    of approach might within a reasonable amount of time give an eavesdropper&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    some crude capability to &quot;read&quot; some of my mind, but it is not clear&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    that it would ever give him the ability to reconstruct a truly faithful copy.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Since I know that, given current brain structures, I am probably going to lose memories over time (due to neuronal cell loss and/or incomplete or inaccurate copying to new storage locations), the information losses mentioned go with the territory from my perspective.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    Now you are talking about interfacing, not &quot;uploading.&quot;&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Fine. It is part of the technology path I see to &#039;evolutionary uploading&#039;.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    How is this &quot;off-loading&quot; anything?&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Do you bother to &quot;remember&quot; anything now that you put in your notebook or PDA? Sure you may remember some of it, but if you know where you can store it and get it back easily you don&#039;t make a point of remembering it.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    The brain might learn to use an external tool through a bionic interface,&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    but this does not seem likely to be a seamless integration, much less an&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    &quot;off-loading&quot; of already established memory and personality, much less&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    any sort of &quot;migration&quot; of &quot;consciousness&quot;.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Memory is part of who we are. There are lots of cases of people who lost the parts of their brain essential for childhood memories or today&#039;s memories. They may be perfectly &#039;conscious&#039; but not be fully functional.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    And they can continue to do so, with no need for any bionic implants. As the&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    devices and software get better, they will put more and more information power&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    at our disposal without needing to invade our bodies.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;I would want the interface because I view it as a bandwidth issue. Fingers, listening and speaking are very low bandwidth channels compared to the amount of information we now can access. If you want the &quot;philosophical&quot; side, you have to ask would non-bionically enhanced individuals be considered &#039;disabled&#039; at some point.&lt;/p&gt;
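&lt;p&gt;The bandwidth claim can be put in rough numbers (every figure below is an illustrative assumption, not a measurement):&lt;/p&gt;

```python
# All figures are assumed, order-of-magnitude values.
BITS_PER_WORD = 5 * 8          # roughly 5 characters per word, 8 bits each, uncompressed

typing_wpm = 60                # a competent typist
typing_bps = typing_wpm * BITS_PER_WORD / 60.0

speech_wpm = 150               # a conversational speaking rate
speech_bps = speech_wpm * BITS_PER_WORD / 60.0

modem_bps = 56_000             # a 2001-era dial-up modem

print(f"typing: {typing_bps:.0f} bits/s")
print(f"speech: {speech_bps:.0f} bits/s")
print(f"modem is {modem_bps / speech_bps:.0f}x faster than speech")
```

&lt;p&gt;Even against a 2001-era modem, speech and typing lose by orders of magnitude, which is the asymmetry being argued above.&lt;/p&gt;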
&lt;p&gt;&lt;em&gt;    I find this news item quite dubious. Of course, people who rely on PDAs might&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    not bother to memorize phone numbers and so forth, but the same would be true&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    of people who used notebooks.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Perhaps. I find that I like to take in information, organize it, then put it someplace where I know where to find it (be it a book or web addresses or phone numbers), then I tend to remember where it is but not always what it is (at least not in any detail).&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    Different parts of the brain exchange information, but it is extremely unlikely&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    that there is a universal code that can simply be copied from one region to&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    another.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Calvin seems to suggest that there is in at least some regions of the cortex. He suggests that some of the interesting idea &quot;combinations&quot; may occur when you copy a pattern from one area onto another area that is biased for running a different pattern, or when two patterns run side-by-side and influence each other. I would tend to agree that long-distance communications (e.g. memory fetching or physiological controls) are probably hardwired with their own unique codes. If Calvin&#039;s concept of &#039;thought patterns&#039; has some merit and nanobots can read them out (and fetch and restore them to the neurons in some way), then I believe one has thought transference.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    Or would you admit that the diary was just an adjunct record which I could&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    consult if I wanted to recover some lost bit of ephemera from the past?&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;I think the diary would function similarly to the way good books do now: they recreate in your mind the images the author intended. A good author can probably create such rich mental images the first time you read something that when you re-read the book, the images created are similar to those recreated when you read your diary. I would say that you have transferred some of your &#039;mind&#039; (static memories of experiences) into your diary.&lt;/p&gt;
&lt;p&gt;Sounds neat, but again, I can authorize duplication of my diary, and it doesn&#039;t make me immortal, at least not literally (though perhaps literarily). And anyway, I allow that by some technology it might be possible to duplicate my brain (make a facsimile) any number of times... so what?&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    Here&#039;s where you attempt the sleight-of-hand maneuver -- but I caught you!&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    This &quot;majority of the mind&quot; you&#039;re talking about is just the computerized&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    PDA/diary/ajunct/whatever that the person was supposedly using to expand her&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    capabilties.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;:-) Well, I take the point of view that the human mind is a very sophisticated multi-processor (I can drive and think about a work problem at the same time). If I&#039;ve offloaded more and more of my sub-conscious thoughts (such as those required for much of the sensory processing in &#039;driving&#039;) into the exo-computer, then part of my mind has migrated. One could even view anti-lock brakes and automobile anti-collision radars as first steps along this path. They are offloading and/or improving on the subconscious processing your brain normally does. There isn&#039;t a bionic interface yet, but I can recall a time or two when I wished I had one for my anti-lock brakes.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    Now you say &quot;an &#039;accident&#039; happens&quot;, meaning, the person is dead.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;The person is brain-dead, yes.  How much of the mind is lost depends on the partitioning between their body and the exo-computer.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    End of that story. Perhaps there is another story here, about the &quot;disembodied&quot;&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    computer software. Such rogue software could indeed play havoc. Let&#039;s make&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    sure it can&#039;t, that any &quot;adjunct&quot; software created by a human individual dies&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    with that individual, or is frozen at least so that it cannot cause harm.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Well, since that software and the memories are potentially valuable parts of the estate, I don&#039;t think you want to &#039;dump&#039; them unless the individual has expressly requested this. Yes, I agree they need to be guaranteed as safe (presumably these run in some sort of virtual machine).&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    First of all, why will we &quot;likely&quot; do this?&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Because if one does not gradually replace neurons, which die at a greater rate than they are formed, then at some point after hundreds of years you are dead. Neuronal supplementation will be a requirement for the extension of life over thousands of years (which is what we will probably achieve by moderately advanced applications of biotechnology).&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    So perhaps some biotech interventions might be desirable. But not a&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    runaway cerebral hypertrophy that turns us into Mars creatures.&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    Anyone who wants that needs to have their head examined, not expanded.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;We climbed onto the slippery slope when we started cooking our food; is there any reason to stop now? We should do it carefully and responsibly and try to accommodate the desires of as many people as possible, e.g. you shouldn&#039;t be &#039;required&#039; to go bionic. At the same time, disallowing it seems akin to the arguments against educating women or allowing blacks to vote. What makes current era human genomes or current era human intelligence the &#039;right&#039; place to stop?&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    Has it never occurred to you that an unlimited &quot;mental evolution&quot; might&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    not be a good thing?&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Yes, that&#039;s why I say &#039;carefully&#039; and &#039;responsibly&#039;. If one reads Robin Hanson&#039;s paper &lt;a href=&quot;http://hanson.gmu.edu/uploads.html&quot;&gt;If Uploads Come First&lt;/a&gt;, then one realizes uploads are not to be taken lightly! There will need to be limits on how many copies a person can make, how they treat uncopied minds, and the resources they require.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    If a human brain dies naturally or is destroyed artificially, the person&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    is dead, no matter what kind of artifact has been created in the meantime.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;You seem to be attaching the person to his wet brain. My interest is in preserving the continuity of my consciousness and memories, which I view as my &#039;mind&#039;, and I am not convinced that it will always be limited to remaining in my wet brain.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    Are you nothing more than a &quot;mind-instance&quot;? And your destruction would be&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    okay as long as some other &quot;copy&quot; would be &quot;activated&quot; afterward?&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;&quot;I&quot; am nothing more than a mind-instance until I personally am shown some concrete scientific evidence that something more is involved. (After-death experiences are hearsay evidence in my book.) While my destruction followed by the reactivation of a copy would not be my most desired path, I do take some comfort in knowing that part of me would continue to exist.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    So if I make a &quot;backup,&quot; and point a gun at you, you will have no fear?&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Not unless you suppress my adrenaline levels at the same time.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    Suppose I even continually update the &quot;backup&quot;, so that I can promise&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    that your copy, when activated, will remember every experience, right&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    up to the penetration of the bullet and your slow bleeding to death.&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    Then you would have no problem with being shot?&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Of course I&#039;m going to have a problem with it, as it&#039;s going to hurt like hell.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;    We can even make it painless if you like. But you are going to die.&lt;/em&gt;&lt;br /&gt;
&lt;em&gt;    I am going to make a copy and activate it, but you are going to die, sucker.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;If you make it sufficiently worth my while, and have proven to me that the copy would indeed have a full set of my experiences up until the point of death, I would have no objection to this. The &quot;sufficiently worth my while&quot; would vary inversely with my degree of confidence in the accuracy of the copy. I would predict that if these technologies do become available, you will see many people devising interesting ways to kill themselves and getting paid a lot to have it shown on &quot;Extreme Uploadings&quot;.&lt;/p&gt;
&lt;p&gt;For me to claim that I object to being replaced by a copy would require that I invalidate the concept that the copies are indeed identical. Since I&#039;m fairly sure the copies can be made identical to a relatively insignificant degree of difference I cannot raise an objection. Should I value the &lt;em&gt;real&lt;/em&gt; Mona Lisa over the atomic resolution copy of the Mona Lisa? Only if I have some romantic attachment to the atoms that Da Vinci used to create the original.&lt;/p&gt;
&lt;p&gt;I found Moravec&#039;s discussion of topics related to this useful in creating my position.&lt;br /&gt;
The document is: &lt;a href=&quot;http://www.aeiveos.com/~bradbury/Authors/Computing/Moravec-H/HDPSF.html#TimeAndAlter%20nityByComputer&quot;&gt;HERE&lt;/a&gt;&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>Re:Uploading As Migration</strong></p>
<p><em>    So you are saying that a person&#39;s brain could be discarded and another one</em><br />
<em>    used for the same purpose(s). That might be okay if the purposes were those of</em><br />
<em>    another person.</em></p>
<p>I am perhaps not being completely accurate in matching my words with my meanings. I consider the person&#39;s &quot;mind&quot; to be what is being migrated. To me the brain is nothing more than the hardware the mind currently runs on. I will freely admit that minds are not &quot;currently&quot; software, and operate more at a &quot;firmware&quot; level, but I believe we can migrate the mind up to the software level given sufficiently advanced technologies.</p>
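The hardware/software framing above has a direct computing analogue: process checkpointing, where state is serialized on one machine and resumed on another. A minimal sketch in Python (the `Mind` class and its fields are purely illustrative assumptions, not a claim about real minds):

```python
import pickle

class Mind:
    """Toy 'mind': nothing but state (memories) plus behavior (recall)."""
    def __init__(self, memories):
        self.memories = list(memories)

    def recall(self):
        # Return the most recent memory.
        return self.memories[-1]

# "Platform A": run the mind, then checkpoint its state to bytes.
original = Mind(["first day of school", "college graduation"])
checkpoint = pickle.dumps(original)

# "Platform B": restore from the checkpoint and resume.
migrated = pickle.loads(checkpoint)

print(migrated.recall())     # behaves identically to the original
print(migrated is original)  # False: a distinct instance
```

Note that the analogy carries the objection with it: the restored process is a new instance with the same state, which is exactly the point under dispute in this exchange.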
<p>I think anyone uploading someone else&#39;s mind without their express permission is probably violating the fundamental rights of that individual. But here we get into areas where human rights have not been defined, such as &quot;Do you have a right not to be born genetically enhanced?&quot; or &quot;Do you have a right to prevent the use of your genetic material?&quot; (say someone wants to clone Madonna; getting her DNA is probably not particularly difficult).</p>
<p>Some of your comments seem to suggest that mind uploading is something employers would do to lower labor costs. If I could &quot;sell&quot; my mind and collect royalties from it, that makes for some interesting scenarios. From my perspective, the technologies required for uploading are not strongly dissimilar from very advanced biotech or true molecular nanotech. I believe that environment eliminates the classical employer/employee relationship, because nobody will &quot;have&quot; to work to survive.</p>
<p><em>     It seems to me that the second is not fundamentally different from the</em><br />
<em>     first, but is really only a kind of smoke-and-mirrors argument.</em></p>
<p>I believe that &#39;destructive readout&#39; &amp; recreation vs. &#39;evolved relocation&#39; are fundamentally different, because there may be significant temporal discontinuities with the destructive methods. It may take years from the point where your mind is stopped to the point when it may be restarted. Evolved relocation, IMO, may not require long periods of down time.</p>
<p><em>    If molecules move or undergo significant changes of state in the freezing or</em><br />
<em>    vitrification process, the DNA will be completely useless in reconstructing</em><br />
<em>    their prior states at the time of death.</em></p>
<p>Unless the molecules are substantially reduced to atoms, the information required for their proper reconstruction is still there. All one needs to do is assemble a complete genome for the individual, and you know what the atomic structure <em>should</em> be for all of the molecules in the body. Assembling a complete genome from even highly broken pieces of DNA <em>is</em> feasible, because that is exactly what companies like Celera do today. Once functional genomics works out all of the DNA and protein regulatory pathways, we should know where the molecules belong within the cells and their normal physical relationships to all of the other molecules. Keeping in mind that <em>many</em> cell types can be frozen solid and revived today, I believe that all of this information, combined with nanoscale assistive machines or cleverly engineered drugs, should allow cryonic reanimation. That is not, however, &quot;uploading&quot;, which is going to require a &quot;readout&quot; in some way of the connection matrix and synapse strengths, and potentially even the concentrations of various molecules within the neurons and maybe even gene activation state information. I do not believe that any of that information is &quot;substantially&quot; lost during freezing. It may however take a very big computer to figure out where it all was before the freezing process started.</p>
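The claim that a sequence can be rebuilt from broken, overlapping fragments is the idea behind shotgun sequencing. A toy greedy overlap-merge assembler sketches the principle; this is a simplification for illustration, not Celera's actual pipeline, which must also cope with read errors and repeated sequences:

```python
def overlap(a, b):
    """Length of the longest suffix of a that is a prefix of b."""
    for n in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:n]):
            return n
    return 0

def assemble(fragments):
    """Greedily merge the pair with the largest overlap until one string remains."""
    frags = list(fragments)
    while len(frags) > 1:
        best = (0, 0, 1)  # (overlap length, index of left frag, index of right frag)
        for i, a in enumerate(frags):
            for j, b in enumerate(frags):
                if i != j:
                    n = overlap(a, b)
                    if n > best[0]:
                        best = (n, i, j)
        n, i, j = best
        merged = frags[i] + frags[j][n:]
        frags = [f for k, f in enumerate(frags) if k not in (i, j)]
        frags.append(merged)
    return frags[0]

# Overlapping "reads" of the sequence ATGGCCTAA:
print(assemble(["ATGGC", "GGCCT", "CCTAA"]))  # ATGGCCTAA
```

Redundancy across many overlapping fragments is what makes the reconstruction robust, which is the same redundancy argument being made here for molecular repair.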
<p>If you haven&#39;t already read it, I would recommend you read Ralph Merkle&#39;s paper:<br />
    <a href="http://www.merkle.com/cryo/techFeas.html">The molecular repair of the brain</a><br />
It is somewhat dated at this point, but it is a good place to start.</p>
<p><em>    Molecular-level details are almost certainly important; the naive</em><br />
<em>    idea that memory and personality can be reduced to synaptic</em><br />
<em>    connectivity is flatly contradicted by modern neuroscience.</em><br />
<em>    It seems likely that much information will in fact be irretrievably lost,</em><br />
<em>    although that would perhaps be equivalent to the effects of a major brain</em><br />
<em>    trauma which typically causes loss of recent memory and some reversible</em><br />
<em>    loss of competency.</em></p>
<p>I would tend to disagree. Medical procedures that require the body temperature to be lowered to the point where the heart stops also lower the brain temperature to the point where neural electrical activity ceases. People are routinely brought back to life following these procedures. The <a href="http://www.comarecovery.org">Coma Recovery Association</a> documents that to be declared brain dead, two EEGs must detect <em>no</em> electrical activity over a 24-hour interval.</p>
<p>The fact that they require two such scans suggests that a single detection of no electrical activity is insufficient to certify that the individual is incapable of regaining consciousness. And if you can have no electrical activity in the brain yet still recover, that seems to suggest your &#39;mind&#39; can be rebooted from the structural and molecular material alone.</p>
<p><em>    You put the word &quot;brain&quot; in quotes. Why? Perhaps because you recognize that</em><br />
<em>    there is something phoney about it. If it were an atomic-level facsimilie of</em><br />
<em>    the original brain, I would have to agree that it was a brain, no quotes.</em></p>
<p>If you produce a wet brain, containing essentially the same molecules with essentially the same organized structure as the original brain, then I would say you have a brain and you should get back a reasonable recreation of the mind contained within it. If you are running a molecular simulation of the brain or a model of the neural network contained within the original brain, then I would say you have a &quot;brain&quot; (in quotes).</p>
<p><em>    In either case, the original person would have been destroyed, killed.</em></p>
<p>I would argue that every individual has a fundamental right to &#39;migrate&#39; their mind onto whatever hardware they choose, so long as that does not interfere with anyone else&#39;s right to do the same. If you believe that an individual&#39;s mind is contained within the specific atoms of that individual&#39;s brain, then I would tend to agree that the original has been destroyed. But if you took all of the atoms in my brain apart and recorded their exact positions with angstrom accuracy, then put all of those very same atoms back in their original locations, I <em>know</em> that that individual would be me. However, waking up after that process, I might find myself feeling a bit awkward knowing what had been done.</p>
<p><em>    Introducing new terminology at the last minute, you downgrade &quot;precisely a</em><br />
<em>    migration&quot; to &quot;in essence a recreation&quot;. The latter seems to claim a lot</em><br />
<em>    less than the former.</em></p>
<p>If it looks like a <em>specific</em> duck, walks like a <em>specific</em> duck and quacks like a <em>specific</em> duck, then the recreation <em>is</em> the <em>specific</em> duck, even if it isn&#39;t made out of the duck&#39;s original atoms. [This is based on my interpretation that &quot;I&quot; am my mental history &amp; patterning, and not the atoms of my brain, which are being recycled (with some gain and some loss) every day.]</p>
<p><em>    Why not? You are admitting there is something wrong with the claim that it</em><br />
<em>    amounts to your &quot;own personal &#39;indefinite longevity&#39;&quot;.</em></p>
<p>A recreation is me; it is just a me that I might rather not be.</p>
<p>    [snip -- discussion regarding brain structural and molecular complexity]</p>
<p><em>    In short, the technical challenge of &quot;uploading&quot; has almost certainly been</em><br />
<em>    underestimated by most enthusiasts and authors on the subject.</em></p>
<p>I would agree that the complexity is not understood by most people who discuss the topic. I believe that the lowest level details you suggest may not be particularly significant, but it will take another 10-20 years of neuroscience before we are likely to know for sure. However, if and when we get to the point where we understand what causes comas and how to bring people out of them, I think we will be a long way towards understanding exactly what level information must be restored to recreate a mind.</p>
<p><em>    The important point is that the claim that such processes offer a way for</em><br />
<em>    the individual to escape death and &quot;migrate to other hardware&quot; is</em><br />
<em>    ontological nonsense.</em></p>
<p>If the individual&#39;s mind has been recreated, I consider it a form of migration. The dictionary I&#39;ve got defines &quot;migrate&quot; as 1. to settle in another country or region; 2. to move to another region with the change in season, as many birds. So long as the mind being recreated is a reasonable facsimile [exact reproduction or copy], then I would consider it to have &quot;migrated&quot;. If one attaches the &#39;mind&#39; to the &#39;brain&#39; and then to the &#39;molecules&#39; and even &#39;atoms&#39;, then that would not be the case.</p>
<p><em>    Any estimates on the amount of disruption and displacement of brain</em><br />
<em>    tissue by the required fiber optics, or the heating by the microwaves?</em></p>
<p>Not at this time. This is where concrete work needs to be done. Nanobots are going to be more efficient at performing specific tasks, such as &#39;counting&#39; concentrations of neurotransmitter molecules or &#39;transmitting&#39; information on neuron firing frequencies, spacings and amplitudes. However, nanobots do have heat limits. They may have molecular granularity limits (a limit on the amount of sampling they can do without exceeding heat capacities). Even though they should be the size of mitochondria, they may not be able to go inside axons or dendrites if they interfere excessively with molecular trafficking. Remote operations, sensing, etc. will make information collection more difficult. As surgeons now routinely use fine bore needles to deliver therapeutics to the brain (e.g. chemotherapies, or cells for treating Parkinson&#39;s patients), the brain is probably &quot;fairly&quot; tolerant of physical distortion of linkages. However, as head trauma injuries show, there are probably limits to this.</p>
<p><em>    But these are not requirements to be underestimated. And even so,</em><br />
<em>    it is not clear that such an approach would ever succeed in teasing</em><br />
<em>    out &quot;the meanings&quot; of all &quot;specific signals.&quot; There may be thoughts</em><br />
<em>    that will occur to me only a few times in my life, tied to memories</em><br />
<em>    that for the most part remain buried in the tangle. I think this kind</em><br />
<em>    of approach might within a reasonable amount of time give an eavesdropper</em><br />
<em>    some crude capability to &quot;read&quot; some of my mind, but it is not clear</em><br />
<em>    that it would ever give him the ability to reconstruct a truly faithful copy.</em></p>
<p>Since I know that, given current brain structures, I am probably going to lose memories over time (due to neuronal cell loss and/or incomplete or inaccurate copying to new storage locations), the information losses mentioned go with the territory from my perspective.</p>
<p><em>    Now you are talking about interfacing, not &quot;uploading.&quot;</em></p>
<p>Fine. It is part of the technology path I see to &#39;evolutionary uploading&#39;.</p>
<p><em>    How is this &quot;off-loading&quot; anything?</em></p>
<p>Do you bother to &quot;remember&quot; anything now that you put in your notebook or PDA? Sure, you may remember some of it, but if you know you can store it and get it back easily, you don&#39;t make a point of remembering it.</p>
<p><em>    The brain might learn to use an external tool through a bionic interface,</em><br />
<em>    but this does not seem likely to be a seamless integration, much less an</em><br />
<em>    &quot;off-loading&quot; of already established memory and personality, much less</em><br />
<em>    any sort of &quot;migration&quot; of &quot;consciousness&quot;.</em></p>
<p>Memory is part of who we are. There are lots of cases of people who lost the parts of their brain essential for childhood memories or today&#39;s memories. They may be perfectly &#39;conscious&#39; but not fully functional.</p>
<p><em>    And they can continue to do so, with no need for any bionic implants. As the</em><br />
<em>    devices and software get better, they will put more and more information power</em><br />
<em>    at our disposal without needing to invade our bodies.</em></p>
<p>I would want the interface because I view it as a bandwidth issue. Fingers, listening and speaking are very low bandwidth channels compared to the amount of information we can now access. If you want the &quot;philosophical&quot; side, you have to ask whether non-bionically enhanced individuals would be considered &#39;disabled&#39; at some point.</p>
<p><em>    I find this news item quite dubious. Of course, people who rely on PDAs might</em><br />
<em>    not bother to memorize phone numbers and so forth, but the same would be true</em><br />
<em>    of people who used notebooks.</em></p>
<p>Perhaps. I find that I like to take in information, organize it, and then put it someplace where I know I can find it (be it a book, web addresses or phone numbers); I then tend to remember where it is, but not always what it is (at least not in any detail).</p>
<p><em>    Different parts of the brain exchange information, but it is extremely unlikely</em><br />
<em>    that there is a universal code that can simply be copied from one region to</em><br />
<em>    another.</em></p>
<p>Calvin seems to suggest that there is in at least some regions of the cortex. He suggests that some of the interesting idea &quot;combinations&quot; may occur when you copy a pattern from one area onto another area that is biased for running a different pattern, or when two patterns run side-by-side and influence each other. I would tend to agree that long distance communications (e.g. memory fetching or physiological controls) are probably hardwired with their own unique codes. If Calvin&#39;s concept of &#39;thought patterns&#39; has some merit and nanobots can read them out (and fetch and restore them to the neurons in some way), then I believe one has thought transference.</p>
<p><em>    Or would you admit that the diary was just an adjunct record which I could</em><br />
<em>    consult if I wanted to recover some lost bit of ephemera from the past?</em></p>
<p>I think the diary would function similarly to the way good books do now: they recreate in your mind the images the author intended. A good author can probably create such rich mental images the first time you read something that when you re-read the book, the images created are similar to those recreated when you read your diary. I would say that you have transferred some of your &#39;mind&#39; (static memories of experiences) into your diary.</p>
<p>Sounds neat, but again, I can authorize duplication of my diary, and it doesn&#39;t make me immortal, at least not literally (though perhaps literarily). And anyway, I allow that by some technology it might be possible to duplicate my brain (make a facsimile) any number of times&#8230; so what?</p>
<p><em>    Here&#39;s where you attempt the sleight-of-hand maneuver &#8212; but I caught you!</em><br />
<em>    This &quot;majority of the mind&quot; you&#39;re talking about is just the computerized</em><br />
<em>    PDA/diary/ajunct/whatever that the person was supposedly using to expand her</em><br />
<em>    capabilties.</em></p>
<p> <img src='http://www.foresight.org/nanodot/wp-includes/images/smilies/icon_smile.gif' alt=':-)' class='wp-smiley' />  Well, I take the point of view that the human mind is a very sophisticated multi-processor (I can drive and think about a work problem at the same time). If I&#39;ve offloaded more and more of my sub-conscious thoughts (such as those required for much of the sensory processing in &#39;driving&#39;) into the exo-computer, then part of my mind has migrated. One could even view anti-lock brakes and automobile anti-collision radars as first steps along this path. They are offloading and/or improving on the subconscious processing your brain normally does. There isn&#39;t a bionic interface yet, but I can recall a time or two when I wished I had one for my anti-lock brakes.</p>
<p><em>    Now you say &quot;an &#39;accident&#39; happens&quot;, meaning, the person is dead.</em></p>
<p>The person is brain-dead, yes.  How much of the mind is lost depends on the partitioning between their body and the exo-computer.</p>
<p><em>    End of that story. Perhaps there is another story here, about the &quot;disembodied&quot;</em><br />
<em>    computer software. Such rogue software could indeed play havoc. Let&#39;s make</em><br />
<em>    sure it can&#39;t, that any &quot;adjunct&quot; software created by a human individual dies</em><br />
<em>    with that individual, or is frozen at least so that it cannot cause harm.</em></p>
<p>Well, since that software and the memories are potentially valuable parts of the estate, I don&#39;t think you want to &#39;dump&#39; them unless the individual has expressly requested this. Yes, I agree they need to be guaranteed as safe (presumably these run in some sort of virtual machine).</p>
<p><em>    First of all, why will we &quot;likely&quot; do this?</em></p>
<p>Because if one does not gradually replace neurons, which die at a greater rate than they are formed, then at some point after hundreds of years you are dead. Neuronal supplementation will be a requirement for the extension of life over thousands of years (which is what we will probably achieve by moderately advanced applications of biotechnology).</p>
<p><em>    So perhaps some biotech interventions might be desirable. But not a</em><br />
<em>    runaway cerebral hypertrophy that turns us into Mars creatures.</em><br />
<em>    Anyone who wants that needs to have their head examined, not expanded.</em></p>
<p>We climbed onto the slippery slope when we started cooking our food; is there any reason to stop now? We should do it carefully and responsibly and try to accommodate the desires of as many people as possible, e.g. you shouldn&#39;t be &#39;required&#39; to go bionic. At the same time, disallowing it seems akin to the arguments against educating women or allowing blacks to vote. What makes current era human genomes or current era human intelligence the &#39;right&#39; place to stop?</p>
<p><em>    Has it never occurred to you that an unlimited &quot;mental evolution&quot; might</em><br />
<em>    not be a good thing?</em></p>
<p>Yes, that&#39;s why I say &#39;carefully&#39; and &#39;responsibly&#39;. If one reads Robin Hanson&#39;s paper <a href="http://hanson.gmu.edu/uploads.html">If Uploads Come First</a>, then one realizes uploads are not to be taken lightly! There will need to be limits on how many copies a person can make, how they treat uncopied minds, and the resources they require.</p>
<p><em>    If a human brain dies naturally or is destroyed artificially, the person</em><br />
<em>    is dead, no matter what kind of artifact has been created in the meantime.</em></p>
<p>You seem to be attaching the person to his wet brain. My interest is in preserving the continuity of my consciousness and memories, which I view as my &#39;mind&#39;, and I am not convinced that it will always be limited to remaining in my wet brain.</p>
<p><em>    Are you nothing more than a &quot;mind-instance&quot;? And your destruction would be</em><br />
<em>    okay as long as some other &quot;copy&quot; would be &quot;activated&quot; afterward?</em></p>
<p>&quot;I&quot; am nothing more than a mind-instance until I personally am shown some concrete scientific evidence that something more is involved. (After-death experiences are hearsay evidence in my book.) While my destruction followed by the reactivation of a copy would not be my most desired path, I do take some comfort in knowing that part of me would continue to exist.</p>
<p><em>    So if I make a &quot;backup,&quot; and point a gun at you, you will have no fear?</em></p>
<p>Not unless you suppress my adrenaline levels at the same time.</p>
<p><em>    Suppose I even continually update the &quot;backup&quot;, so that I can promise</em><br />
<em>    that your copy, when activated, will remember every experience, right</em><br />
<em>    up to the penetration of the bullet and your slow bleeding to death.</em><br />
<em>    Then you would have no problem with being shot?</em></p>
<p>Of course I&#39;m going to have a problem with it, as it&#39;s going to hurt like hell.</p>
<p><em>    We can even make it painless if you like. But you are going to die.</em><br />
<em>    I am going to make a copy and activate it, but you are going to die, sucker.</em></p>
<p>If you make it sufficiently worth my while, and have proven to me that the copy would indeed have a full set of my experiences up until the point of death, I would have no objection to this. The &quot;sufficiently worth my while&quot; would vary inversely with my degree of confidence in the accuracy of the copy. I would predict that if these technologies do become available, you will see many people devising interesting ways to kill themselves and getting paid a lot to have it shown on &quot;Extreme Uploadings&quot;.</p>
<p>For me to claim that I object to being replaced by a copy would require that I invalidate the concept that the copies are indeed identical. Since I&#39;m fairly sure the copies can be made identical to a relatively insignificant degree of difference I cannot raise an objection. Should I value the <em>real</em> Mona Lisa over the atomic resolution copy of the Mona Lisa? Only if I have some romantic attachment to the atoms that Da Vinci used to create the original.</p>
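The copy-versus-original question maps onto a distinction that programming languages make explicit: equality of content versus identity of instance. A small Python illustration (the `Painting` class and its `pixels` field are hypothetical stand-ins for "atomic resolution" content):

```python
import copy

class Painting:
    def __init__(self, pixels):
        self.pixels = pixels

    def __eq__(self, other):
        # "Identical" here means indistinguishable content, atom for atom.
        return self.pixels == other.pixels

original = Painting(pixels=[0, 17, 42])
facsimile = copy.deepcopy(original)

print(original == facsimile)   # True: identical in every measurable respect
print(original is facsimile)   # False: still two distinct instances
```

The whole dispute in this thread is, in effect, over whether `is` (and not just `==`) should matter for persons.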
<p>I found Moravec&#39;s discussion of topics related to this useful in creating my position.<br />
The document is: <a href="http://www.aeiveos.com/~bradbury/Authors/Computing/Moravec-H/HDPSF.html#TimeAndAlter%20nityByComputer">HERE</a></p>
]]></content:encoded>
	</item>
	<item>
		<title>By: MarkGubrud</title>
		<link>http://www.foresight.org/nanodot/?p=396#comment-1121</link>
		<dc:creator>MarkGubrud</dc:creator>
		<pubDate>Sat, 10 Feb 2001 03:40:44 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=396#comment-1121</guid>
		<description>&lt;p&gt;&lt;strong&gt;Re:Uploading As Migration&lt;/strong&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Mark is clearly an expert in quantum mechanics and related topics of physics.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;I have some professors who&#039;d chuckle at that, and some who&#039;d howl.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;I would agree completely with the idea of &#039;uploading&#039; as a &#039;migration&#039; of our minds onto a different hardware platform.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This language is clearly derived from usage in the software industry, where &quot;applications migrate from one platform to another&quot;. However, even there this is a metaphor. Birds migrate, people migrate; they are actual things that actually move from one place to another. But &quot;software migration&quot; means that you stop using one computer and start using another for the same purpose. So you are saying that a person&#039;s brain could be discarded and another one used for the same purpose(s). That might be okay if the purposes were those of another person. If I am an employer using a computer programmer to get a piece of code written, it might suit me just fine to &quot;migrate&quot; the programmer&#039;s &quot;software&quot; to another &quot;platform.&quot; It would not benefit the programmer, however. Certainly not if I killed her in the process.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&quot;destructive readout&quot; and &quot;mental evolution onto different hardware&quot;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;It seems to me that the second is not fundamentally different from the first, but is really only a kind of smoke-and-mirrors argument.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;While we cannot be absolutely certain to be able to &quot;backtrack&quot; the relocation of the molecules to their original states, I believe the largely redundant information in the DNA, protein structures and synaptic connections will allow a reconstruction of a largely equivalent molecular map.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;If molecules move or undergo significant changes of state in the freezing or vitrification process, the DNA will be completely useless in reconstructing their prior states at the time of death. Molecular-level details are almost certainly important; the naive idea that memory and personality can be reduced to synaptic connectivity is flatly contradicted by modern neuroscience. It seems likely that much information will in fact be irretrievably lost, although that would perhaps be equivalent to the effects of a major brain trauma which typically causes loss of recent memory and some reversible loss of competency.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;one either can reassemble a copy of the original brain (molecule by molecule) [or] simply run an atomic scale simulation of the brain on a very large supercomputer [but] even solar system sized supercomputers (Matrioshka Brains) currently seem to have insufficient capacity&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;It makes no sense to imagine using a computer to do a simulation if building the real thing would be a more efficient way to get a result. You can call it a computer if you like.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;provided the recreated &quot;brain&quot; is sufficiently accurate, I would argue that either it, or the simulation of it, is precisely a migration (in essence a recreation) of the original individual.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;You put the word &quot;brain&quot; in quotes. Why? Perhaps because you recognize that there is something phoney about it. If it were an atomic-level facsimile of the original brain, I would have to agree that it was a brain, no quotes. But it would not be the original brain. If it were a simulation, it would not even be a brain. In either case, the original person would have been destroyed, killed.&lt;/p&gt;
&lt;p&gt;Introducing new terminology at the last minute, you downgrade &quot;precisely a migration&quot; to &quot;in essence a recreation&quot;. The latter seems to claim a lot less than the former. But it is not entirely clear what either phrase is claiming. What is apparent is that after our hypothetical procedure we have at best some kind of &quot;recreation,&quot; perhaps in the same sense that Disneyland gives us recreations of the Old West and so on. It certainly is not the original, real thing, in this case the human being whose murder you imagined.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;I personally, do not like the taste this approach leaves in my mouth&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Why not? You are admitting there is something wrong with the claim that it amounts to your &quot;own personal &#039;indefinite longevity&#039;&quot;.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;I would not quibble that the recreation is effectively &quot;me&quot;.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;What do you mean by the word &quot;effectively&quot;? You are admitting, again, that there is something wrong with the claim that it would really be &quot;me&quot;, whatever this means.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;nanobots that are capable of mapping the neural structure of the brain to the fine detail level of the synaptic connections and the relative &quot;strengths&quot; of those connections (by determining how many receptors are in each synaptic cleft, the quantity of neurotransmitters released when the neuron &quot;fires&quot;, etc.).&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This is certainly not sufficient to reconstruct a functioning brain or brain simulation. At a minimum, you will also need the complete three-dimensional geometry of the dendritic, somatic and axonal membrane, including, very probably, the distribution if not the particular locations of protein complexes throughout. Otherwise you will lose all information about dendritic computation, timing in multineuron assemblies, and likely chemical and glia-mediated interactions between neighbor neurons. However, even that is not enough. Almost certainly there are internal neuron states involving non-membrane-bound proteins, the cytoskeleton, and nucleic acids, which would have an effect on personality and may even play a role in memory.&lt;/p&gt;
&lt;p&gt;In short, the technical challenge of &quot;uploading&quot; has almost certainly been underestimated by most enthusiasts and authors on the subject. However, this is perhaps only an incidental observation. The important point is that the claim that such processes offer a way for the individual to escape death and &quot;migrate to other hardware&quot; is ontological nonsense.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;whole brain network of nanobots is connected (via fiber optics or very high frequency microwave links)&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Any estimates on the amount of disruption and displacement of brain tissue by the required fiber optics, or the heating by the microwaves?&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;internal nanobot network would monitor all internal brain communications and gradually learn to interpret the meanings of specific signals (as neuroscientists currently do with NMR and PET scans localizing brain functions to various regions, but at a much finer resolution)&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;You evidently want &lt;em&gt;much&lt;/em&gt; &quot;finer resolution&quot;. We can&#039;t rule out that it might be possible, if, as you say, one has complete monitoring of all activity, and a sufficiently powerful computer. But these are not requirements to be underestimated. And even so, it is not clear that such an approach would ever succeed in teasing out &quot;the meanings&quot; of &lt;em&gt;all&lt;/em&gt; &quot;specific signals.&quot; There may be thoughts that will occur to me only a few times in my life, tied to memories that for the most part remain buried in the tangle. I think this kind of approach might within a reasonable amount of time give an eavesdropper some crude capability to &quot;read&quot; some of my mind, but it is not clear that it would ever give him the ability to reconstruct a truly faithful copy.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Our minds and the exocomputer would interact until an effective shorthand is developed between our minds and the computer that allows rapid two-way communication&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Now you are talking about interfacing, not &quot;uploading.&quot;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;One should easily be able to accept that our mind would &quot;program&quot; the exo-computer with agents to store and retrieve data (off-loading of our memory)&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;How is this &quot;off-loading&quot; anything? The brain might learn to use an external tool through a bionic interface, but this does not seem likely to be a seamless integration, much less an &quot;off-loading&quot; of already established memory and personality, much less any sort of &quot;migration&quot; of &quot;consciousness&quot;.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;people are doing this now with Personal Digital Assistant devices&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;And they can continue to do so, with no need for any bionic implants. As the devices and software get better, they will put more and more information power at our disposal without needing to invade our bodies.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;but the result is that their memories are becoming poorer, presumably due to a lack of exercise.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;I find this news item quite dubious. Of course, people who rely on PDAs might not bother to memorize phone numbers and so forth, but the same would be true of people who used notebooks.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&quot;thoughts&quot; are neuronal firing patterns that can be copied from one part of the brain to another,&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Different parts of the brain exchange information, but it is extremely unlikely that there is a universal code that can simply be copied from one region to another.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;then one can envision moving &quot;thoughts&quot; into the exo-computer as well.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;I cannot envision &quot;moving &#039;thoughts&#039;&quot; at all. I know how to move brains, but not thoughts.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;as time goes by, more and more of one&#039;s memory and &#039;mind&#039; ends up in the exo-computer.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;If I keep a diary very religiously, so that after thirty years or so there is much more detailed information about my life contained in the many volumes of my diary (and there are people who have done this) than I could ever recall, then would you say that my &quot;memory and &#039;mind&#039;&quot; (there you go again with the apologetic quote marks) ended up in the diary? Or would you admit that the diary was just an adjunct record which I could consult if I wanted to recover some lost bit of ephemera from the past?&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;At that point, because the exocomputer hardware would be designed to allow &quot;copying&quot;, you can relocate this part of your mind and execute it anywhere&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Sounds neat, but again, I can authorize duplication of my diary, and it doesn&#039;t make me immortal, at least not literally (though perhaps literarily), and anyway, I allow that by some technology it might be possible to duplicate my brain (make a facsimile) any number of times... so what?&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;an &#039;accident&#039; happens to the original body/brain, leaving the majority of the mind disembodied in the exo-computer&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Here&#039;s where you attempt the sleight-of-hand maneuver -- but I caught you! This &quot;majority of the mind&quot; you&#039;re talking about is just the computerized PDA/diary/adjunct/whatever that the person was supposedly using to expand her capabilities. Now you say &quot;an &#039;accident&#039; happens&quot;, meaning, the person is dead. End of that story. Perhaps there is another story here, about the &quot;disembodied&quot; computer software. Such rogue software could indeed play havoc. Let&#039;s make sure it can&#039;t, that any &quot;adjunct&quot; software created by a human individual dies with that individual, or is frozen at least so that it cannot cause harm.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;It is likely that the evolved mind would view the loss of the original body/brain as we humans currently view the loss of a finger&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;I could not have found words to more effectively express the monstrousness of what you are proposing.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;It is worth noting that this scenario is probably not much different from what goes on normally each day&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Uh....&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;neurons do die (in large numbers) so over time we are losing memory, or at least the accuracy of it, and we certainly do neuronal connection remodeling&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Yes, all this is true and is a normal part of human existence. It shows that, over long periods of time, we change substantially, and not only in the exchange of our fundamental particles. Our existence as distinct, unique individuals is local in space and time, and is extended over time only through the continuity of life. This is all part and parcel of the human condition, and it should inspire some humility rather than hubris.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;we will likely choose to enhance our brains to decrease the rate of neuron cell death and/or increase the rate of neuron replacement.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;First of all, why will we &quot;likely&quot; do this? To serve what purpose? Well, I suppose most people as they age are annoyed at the gradual loss of competence, and would like a return to the vigor of youth. So perhaps some biotech interventions might be desirable. But not a runaway cerebral hypertrophy that turns us into Mars creatures. Anyone who wants that needs to have their head examined, not expanded.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;This is likely to increase the pace of our own mental evolution. The nanobot to exo-computer approach simply utilizes more sophisticated hardware to further increase the rate at which this process occurs.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Has it never occurred to you that an unlimited &quot;mental evolution&quot; might not be a good thing?&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;There is probably a third approach that involves the gradual replacement of neurons with enhanced bio-engineered neurons (with I/O ports that can be more easily &#039;tapped&#039; by the nanobots) or even nanobot-based neurons themselves.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;If a human brain dies naturally or is destroyed artificially, the person is dead, no matter what kind of artifact has been created in the meantime.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&#039;true&#039; indefinite longevity, will require that you either learn to live with the idea that the death of a mind-instance (due to an accident) and the subsequent activation of a copy is still &#039;you&#039; (at least up to the last backup point)&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Are you nothing more than a &quot;mind-instance&quot;? And your destruction would be okay as long as some other &quot;copy&quot; would be &quot;activated&quot; afterward? So if I make a &quot;backup,&quot; and point a gun at you, you will have no fear? Suppose I even continually update the &quot;backup&quot;, so that I can promise that your copy, when activated, will remember every experience, right up to the penetration of the bullet and your slow bleeding to death. Then you would have no problem with being shot? We can even make it painless if you like. But you are going to die. I am going to make a copy and activate it, but you are going to die, sucker.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Given these various approaches I fail to see how uploading cannot be viewed as a migration&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Try harder to see it.&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>Re:Uploading As Migration</strong></p>
<blockquote>
<p>Mark is clearly an expert in quantum mechanics and related topics of physics.</p>
</blockquote>
<p>I have some professors who&#39;d chuckle at that, and some who&#39;d howl.</p>
<blockquote>
<p>I would agree completely with the idea of &#39;uploading&#39; as a &#39;migration&#39; of our minds onto a different hardware platform.</p>
</blockquote>
<p>This language is clearly derived from usage in the software industry, where &quot;applications migrate from one platform to another&quot;. However, even there this is a metaphor. Birds migrate, people migrate; they are actual things that actually move from one place to another. But &quot;software migration&quot; means that you stop using one computer and start using another for the same purpose. So you are saying that a person&#39;s brain could be discarded and another one used for the same purpose(s). That might be okay if the purposes were those of another person. If I am an employer using a computer programmer to get a piece of code written, it might suit me just fine to &quot;migrate&quot; the programmer&#39;s &quot;software&quot; to another &quot;platform.&quot; It would not benefit the programmer, however. Certainly not if I killed her in the process.</p>
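<p>The software sense of "migration" can be made concrete: when a file "moves" across filesystems, nothing actually travels; the operation is a copy followed by deletion of the original. The sketch below is a simplified, hypothetical version of that fallback (roughly what Python's shutil.move does when a plain rename is impossible); the name migrate is mine, not anything from the comment.</p>

```python
import os
import tempfile

def migrate(src_path, dst_path):
    # "Migration" in the software sense: duplicate the bytes elsewhere,
    # then destroy the original. Nothing ever travels.
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        dst.write(src.read())
    os.remove(src_path)

# Demo on a throwaway temp file.
tmp = tempfile.NamedTemporaryFile(delete=False, suffix=".txt")
tmp.write(b"the original")
tmp.close()

dst_path = tmp.name + ".moved"
migrate(tmp.name, dst_path)

assert not os.path.exists(tmp.name)   # the original is gone, not relocated
with open(dst_path, "rb") as f:
    assert f.read() == b"the original"
os.remove(dst_path)                   # clean up
```

<p>Whether the second file is "the same file" is a matter of convention for bytes on disk; the argument above is about whether the same convention can be applied to a person.</p>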
<blockquote>
<p>&quot;destructive readout&quot; and &quot;mental evolution onto different hardware&quot;</p>
</blockquote>
<p>It seems to me that the second is not fundamentally different from the first, but is really only a kind of smoke-and-mirrors argument.</p>
<blockquote>
<p>While we cannot be absolutely certain to be able to &quot;backtrack&quot; the relocation of the molecules to their original states, I believe the largely redundant information in the DNA, protein structures and synaptic connections will allow a reconstruction of a largely equivalent molecular map.</p>
</blockquote>
<p>If molecules move or undergo significant changes of state in the freezing or vitrification process, the DNA will be completely useless in reconstructing their prior states at the time of death. Molecular-level details are almost certainly important; the naive idea that memory and personality can be reduced to synaptic connectivity is flatly contradicted by modern neuroscience. It seems likely that much information will in fact be irretrievably lost, although that would perhaps be equivalent to the effects of a major brain trauma which typically causes loss of recent memory and some reversible loss of competency.</p>
<blockquote>
<p>one either can reassemble a copy of the original brain (molecule by molecule) [or] simply run an atomic scale simulation of the brain on a very large supercomputer [but] even solar system sized supercomputers (Matrioshka Brains) currently seem to have insufficient capacity</p>
</blockquote>
<p>It makes no sense to imagine using a computer to do a simulation if building the real thing would be a more efficient way to get a result. You can call it a computer if you like.</p>
<blockquote>
<p>provided the recreated &quot;brain&quot; is sufficiently accurate, I would argue that either it, or the simulation of it, is precisely a migration (in essence a recreation) of the original individual.</p>
</blockquote>
<p>You put the word &quot;brain&quot; in quotes. Why? Perhaps because you recognize that there is something phoney about it. If it were an atomic-level facsimile of the original brain, I would have to agree that it was a brain, no quotes. But it would not be the original brain. If it were a simulation, it would not even be a brain. In either case, the original person would have been destroyed, killed.</p>
<p>Introducing new terminology at the last minute, you downgrade &quot;precisely a migration&quot; to &quot;in essence a recreation&quot;. The latter seems to claim a lot less than the former. But it is not entirely clear what either phrase is claiming. What is apparent is that after our hypothetical procedure we have at best some kind of &quot;recreation,&quot; perhaps in the same sense that Disneyland gives us recreations of the Old West and so on. It certainly is not the original, real thing, in this case the human being whose murder you imagined.</p>
<blockquote>
<p>I personally, do not like the taste this approach leaves in my mouth</p>
</blockquote>
<p>Why not? You are admitting there is something wrong with the claim that it amounts to your &quot;own personal &#39;indefinite longevity&#39;&quot;.</p>
<blockquote>
<p>I would not quibble that the recreation is effectively &quot;me&quot;.</p>
</blockquote>
<p>What do you mean by the word &quot;effectively&quot;? You are admitting, again, that there is something wrong with the claim that it would really be &quot;me&quot;, whatever this means.</p>
<blockquote>
<p>nanobots that are capable of mapping the neural structure of the brain to the fine detail level of the synaptic connections and the relative &quot;strengths&quot; of those connections (by determining how many receptors are in each synaptic cleft, the quantity of neurotransmitters released when the neuron &quot;fires&quot;, etc.).</p>
</blockquote>
<p>This is certainly not sufficient to reconstruct a functioning brain or brain simulation. At a minimum, you will also need the complete three-dimensional geometry of the dendritic, somatic and axonal membrane, including, very probably, the distribution if not the particular locations of protein complexes throughout. Otherwise you will lose all information about dendritic computation, timing in multineuron assemblies, and likely chemical and glia-mediated interactions between neighbor neurons. However, even that is not enough. Almost certainly there are internal neuron states involving non-membrane-bound proteins, the cytoskeleton, and nucleic acids, which would have an effect on personality and may even play a role in memory.</p>
<p>In short, the technical challenge of &quot;uploading&quot; has almost certainly been underestimated by most enthusiasts and authors on the subject. However, this is perhaps only an incidental observation. The important point is that the claim that such processes offer a way for the individual to escape death and &quot;migrate to other hardware&quot; is ontological nonsense.</p>
<blockquote>
<p>whole brain network of nanobots is connected (via fiber optics or very high frequency microwave links)</p>
</blockquote>
<p>Any estimates on the amount of disruption and displacement of brain tissue by the required fiber optics, or the heating by the microwaves?</p>
<blockquote>
<p>internal nanobot network would monitor all internal brain communications and gradually learn to interpret the meanings of specific signals (as neuroscientists currently do with NMR and PET scans localizing brain functions to various regions, but at a much finer resolution)</p>
</blockquote>
<p>You evidently want <em>much</em> &quot;finer resolution&quot;. We can&#39;t rule out that it might be possible, if, as you say, one has complete monitoring of all activity, and a sufficiently powerful computer. But these are not requirements to be underestimated. And even so, it is not clear that such an approach would ever succeed in teasing out &quot;the meanings&quot; of <em>all</em> &quot;specific signals.&quot; There may be thoughts that will occur to me only a few times in my life, tied to memories that for the most part remain buried in the tangle. I think this kind of approach might within a reasonable amount of time give an eavesdropper some crude capability to &quot;read&quot; some of my mind, but it is not clear that it would ever give him the ability to reconstruct a truly faithful copy.</p>
<blockquote>
<p>Our minds and the exocomputer would interact until an effective shorthand is developed between our minds and the computer that allows rapid two-way communication</p>
</blockquote>
<p>Now you are talking about interfacing, not &quot;uploading.&quot;</p>
<blockquote>
<p>One should easily be able to accept that our mind would &quot;program&quot; the exo-computer with agents to store and retrieve data (off-loading of our memory)</p>
</blockquote>
<p>How is this &quot;off-loading&quot; anything? The brain might learn to use an external tool through a bionic interface, but this does not seem likely to be a seamless integration, much less an &quot;off-loading&quot; of already established memory and personality, much less any sort of &quot;migration&quot; of &quot;consciousness&quot;.</p>
<blockquote>
<p>people are doing this now with Personal Digital Assistant devices</p>
</blockquote>
<p>And they can continue to do so, with no need for any bionic implants. As the devices and software get better, they will put more and more information power at our disposal without needing to invade our bodies.</p>
<blockquote>
<p>but the result is that their memories are becoming poorer, presumably due to a lack of exercise.</p>
</blockquote>
<p>I find this news item quite dubious. Of course, people who rely on PDAs might not bother to memorize phone numbers and so forth, but the same would be true of people who used notebooks.</p>
<blockquote>
<p>&quot;thoughts&quot; are neuronal firing patterns that can be copied from one part of the brain to another,</p>
</blockquote>
<p>Different parts of the brain exchange information, but it is extremely unlikely that there is a universal code that can simply be copied from one region to another.</p>
<blockquote>
<p>then one can envision moving &quot;thoughts&quot; into the exo-computer as well.</p>
</blockquote>
<p>I cannot envision &quot;moving &#39;thoughts&#39;&quot; at all. I know how to move brains, but not thoughts.</p>
<blockquote>
<p>as time goes by, more and more of one&#39;s memory and &#39;mind&#39; ends up in the exo-computer.</p>
</blockquote>
<p>If I keep a diary very religiously, so that after thirty years or so there is much more detailed information about my life contained in the many volumes of my diary (and there are people who have done this) than I could ever recall, then would you say that my &quot;memory and &#39;mind&#39;&quot; (there you go again with the apologetic quote marks) ended up in the diary? Or would you admit that the diary was just an adjunct record which I could consult if I wanted to recover some lost bit of ephemera from the past?</p>
<blockquote>
<p>At that point, because the exocomputer hardware would be designed to allow &quot;copying&quot;, you can relocate this part of your mind and execute it anywhere</p>
</blockquote>
<p>Sounds neat, but again, I can authorize duplication of my diary, and it doesn&#39;t make me immortal, at least not literally (though perhaps literarily), and anyway, I allow that by some technology it might be possible to duplicate my brain (make a facsimile) any number of times&#8230; so what?</p>
<blockquote>
<p>an &#39;accident&#39; happens to the original body/brain, leaving the majority of the mind disembodied in the exo-computer</p>
</blockquote>
<p>Here&#39;s where you attempt the sleight-of-hand maneuver &#8212; but I caught you! This &quot;majority of the mind&quot; you&#39;re talking about is just the computerized PDA/diary/adjunct/whatever that the person was supposedly using to expand her capabilities. Now you say &quot;an &#39;accident&#39; happens&quot;, meaning, the person is dead. End of that story. Perhaps there is another story here, about the &quot;disembodied&quot; computer software. Such rogue software could indeed play havoc. Let&#39;s make sure it can&#39;t, that any &quot;adjunct&quot; software created by a human individual dies with that individual, or is frozen at least so that it cannot cause harm.</p>
<blockquote>
<p>It is likely that the evolved mind would view the loss of the original body/brain as we humans currently view the loss of a finger</p>
</blockquote>
<p>I could not have found words to more effectively express the monstrousness of what you are proposing.</p>
<blockquote>
<p>It is worth noting that this scenario is probably not much different from what goes on normally each day</p>
</blockquote>
<p>Uh&#8230;.</p>
<blockquote>
<p>neurons do die (in large numbers) so over time we are losing memory, or at least the accuracy of it, and we certainly do neuronal connection remodeling</p>
</blockquote>
<p>Yes, all this is true and is a normal part of human existence. It shows that, over long periods of time, we change substantially, and not only in the exchange of our fundamental particles. Our existence as distinct, unique individuals is local in space and time, and is extended over time only through the continuity of life. This is all part and parcel of the human condition, and it should inspire some humility rather than hubris.</p>
<blockquote>
<p>we will likely choose to enhance our brains to decrease the rate of neuron cell death and/or increase the rate of neuron replacement.</p>
</blockquote>
<p>First of all, why will we &quot;likely&quot; do this? To serve what purpose? Well, I suppose most people as they age are annoyed at the gradual loss of competence, and would like a return to the vigor of youth. So perhaps some biotech interventions might be desirable. But not a runaway cerebral hypertrophy that turns us into Mars creatures. Anyone who wants that needs to have their head examined, not expanded.</p>
<blockquote>
<p>This is likely to increase the pace of our own mental evolution. The nanobot to exo-computer approach simply utilizes more sophisticated hardware to further increase the rate at which this process occurs.</p>
</blockquote>
<p>Has it never occurred to you that an unlimited &quot;mental evolution&quot; might not be a good thing?</p>
<blockquote>
<p>There is probably a third approach that involves the gradual replacement of neurons with enhanced bio-engineered neurons (with I/O ports that can be more easily &#39;tapped&#39; by the nanobots) or even nanobot-based neurons themselves.</p>
</blockquote>
<p>If a human brain dies naturally or is destroyed artificially, the person is dead, no matter what kind of artifact has been created in the meantime.</p>
<blockquote>
<p>&#39;true&#39; indefinite longevity, will require that you either learn to live with the idea that the death of a mind-instance (due to an accident) and the subsequent activation of a copy is still &#39;you&#39; (at least up to the last backup point)</p>
</blockquote>
<p>Are you nothing more than a &quot;mind-instance&quot;? And your destruction would be okay as long as some other &quot;copy&quot; would be &quot;activated&quot; afterward? So if I make a &quot;backup,&quot; and point a gun at you, you will have no fear? Suppose I even continually update the &quot;backup&quot;, so that I can promise that your copy, when activated, will remember every experience, right up to the penetration of the bullet and your slow bleeding to death. Then you would have no problem with being shot? We can even make it painless if you like. But you are going to die. I am going to make a copy and activate it, but you are going to die, sucker.</p>
<blockquote>
<p>Given these various approaches I fail to see how uploading cannot be viewed as a migration</p>
</blockquote>
<p>Try harder to see it.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: RobertBradbury</title>
		<link>http://www.foresight.org/nanodot/?p=396#comment-1120</link>
		<dc:creator>RobertBradbury</dc:creator>
		<pubDate>Tue, 06 Feb 2001 19:56:26 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=396#comment-1120</guid>
		<description>&lt;p&gt;&lt;strong&gt;Uploading As Migration&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Mark is clearly an expert in quantum mechanics and related topics of physics. I notice that he &#039;disses&#039; Penrose&#039;s &#039;consciousness in the microtubules&#039; perspective and he seems to dislike the idea that there is anything &#039;mysterious&#039; or &#039;magical&#039; about the brain.&lt;/p&gt;
&lt;p&gt;If those statements are moderately accurate then I believe that he and I have similar perspectives regarding the physical basis of human minds. What I cannot understand is precisely what he is objecting to in the &#039;migration&#039; perspective Max proposes.&lt;/p&gt;
&lt;p&gt;Off and on I&#039;ve devoted a couple of years of thought to the ideas regarding uploading and I find that I would agree completely with the idea of &#039;uploading&#039; as a &#039;migration&#039; of our minds onto a different hardware platform.&lt;/p&gt;
&lt;p&gt;There are at least two forms of uploading that one can consider, &quot;destructive readout&quot; and &quot;mental evolution onto different hardware&quot;.&lt;/p&gt;
&lt;p&gt;The &quot;destructive readout&quot; method involves freezing the brain, then disassembling it atom by atom, taking careful note of where everything is. Once that is complete, a great deal of computer structural analysis is done to determine precisely what the state of the brain was before the freezing occurred. While we cannot be absolutely certain of being able to &quot;backtrack&quot; the relocation of the molecules to their original states, I believe the largely redundant information in the DNA, protein structures and synaptic connections will allow a reconstruction of a largely equivalent molecular map. Then one can either reassemble a copy of the original brain (molecule by molecule) using highly parallel AFM-directed methods, or simply run an atomic-scale simulation of the brain on a very large supercomputer. The only problem with the latter approach seems to be that even solar-system-sized supercomputers (&lt;a href=&quot;http://www.aeiveos.com/~bradbury/MatrioshkaBrains/&quot;&gt;Matrioshka Brains&lt;/a&gt;) currently seem to have insufficient capacity to perform this task given current molecular modeling methods. Most likely, cellular automata architectures highly specialized for running molecular simulations (perhaps derived from IBM&#039;s Blue Gene supercomputer) will be able to accomplish this.&lt;/p&gt;
&lt;p&gt;Now, provided the recreated &quot;brain&quot; is sufficiently accurate, I would argue that either it, or the simulation of it, is precisely a migration (in essence a recreation) of the original individual. While I personally do not like the taste this approach leaves in my mouth with regard to my own personal &quot;indefinite longevity&quot;, I would not quibble that the recreation is effectively &quot;me&quot;.&lt;/p&gt;
&lt;p&gt;An alternative to this is &quot;mental evolution onto different hardware&quot;. In this instance, one would administer to the individual nanobots that are capable of mapping the neural structure of the brain to the fine detail level of the synaptic connections and the relative &quot;strengths&quot; of those connections (by determining how many receptors are in each synaptic cleft, the quantity of neurotransmitters released when the neuron &quot;fires&quot;, etc.). These monitoring nanobots would then construct an integrated whole-brain communications network (using methods discussed in &lt;a href=&quot;http://www.nanomedicine.com/7.1.html&quot;&gt;Chapter 7&lt;/a&gt; of &lt;a href=&quot;http://www.nanomedicine.com&quot;&gt;Nanomedicine&lt;/a&gt;). The whole-brain network of nanobots is connected (via fiber optics or very high frequency microwave links) to exo-computers with significantly more capacity than the human brain. The internal nanobot network would monitor all internal brain communications and gradually learn to interpret the meanings of specific signals (as neuroscientists currently do with NMR and PET scans localizing brain functions to various regions, but at a much finer resolution). Our minds and the exo-computer would interact until an effective shorthand is developed between our minds and the computer that allows rapid two-way communication. One should easily be able to accept that our mind would &quot;program&quot; the exo-computer with agents to store and retrieve data (off-loading of our memory). [As an aside, I&#039;ll note that a recent &lt;a href=&quot;http://slashdot.org/article.pl?sid=01/02/05/1931236&amp;mode=thread&quot;&gt;news item&lt;/a&gt; suggested that people are doing this now with Personal Digital Assistant devices, but the result is that their memories are becoming poorer, presumably due to a lack of exercise.] 
If, as &lt;a href=&quot;http://www.williamcalvin.com/&quot;&gt;William Calvin&lt;/a&gt; suggests, our &quot;thoughts&quot; are neuronal firing patterns that can be copied from one part of the brain to another, then one can envision moving &quot;thoughts&quot; into the exo-computer as well. I would expect this to be a gradual evolutionary process where, as time goes by, more and more of one&#039;s memory and &#039;mind&#039; ends up in the exo-computer. At that point, because the exo-computer hardware would be designed to allow &quot;copying&quot;, you can relocate this part of your mind and execute it anywhere (allowing very lengthy longevities if you have copies in enough &#039;safe&#039; locations). Because the capacity of the exo-computer can be scaled up (a single 1 cm&lt;sup&gt;3&lt;/sup&gt; Drexlerian nanocomputer can probably run 100,000 human &#039;minds&#039;), people gradually put more and more of their &#039;minds&#039; into the computer. Eventually an &#039;accident&#039; happens to the original body/brain, leaving the majority of the mind disembodied in the exo-computer (or circulating around the net). Now, given molecular scale manufacturing capabilities, you could &#039;reconstruct&#039; the original mind, but its capacities would be so limited compared with the evolved (uploaded) mind that there doesn&#039;t seem to be much point in doing this. It is likely that the evolved mind would view the loss of the original body/brain as we humans currently view the loss of a finger, or a tooth, or perhaps the millions of cells we lose every day.&lt;/p&gt;
&lt;p&gt;It is worth noting that this scenario is probably not much different from what goes on normally each day: neurons do die (in large numbers), so over time we are losing memory, or at least the accuracy of it, and we certainly do neuronal connection remodeling and perhaps even gradual neuron replacement with new neurons derived from stem cells. As the pace of biotechnology picks up, we will likely choose to enhance our brains to decrease the rate of neuron cell death and/or increase the rate of neuron replacement. This is likely to increase the pace of our own mental evolution. The nanobot-to-exo-computer approach simply utilizes more sophisticated hardware to further increase the rate at which this process occurs.&lt;/p&gt;
&lt;p&gt;Now, the only &quot;catches&quot; I see to this scenario are whether or not the nanobots can monitor the brain at a sufficiently fine scale to get &quot;all&quot; of the information that is there and whether there will be sufficient bandwidth between the human brain and the exo-computer to allow effective integration of the minds (or whether you develop a split-personality disorder).&lt;/p&gt;
&lt;p&gt;There is probably a third approach that involves the gradual replacement of neurons with enhanced bio-engineered neurons (with I/O ports that can be more easily &#039;tapped&#039; by the nanobots) or even nanobot-based neurons themselves. However, &#039;true&#039; indefinite longevity &lt;em&gt;will&lt;/em&gt; require that you either learn to live with the idea that the death of a mind-instance (due to an accident) and the subsequent activation of a copy is still &#039;you&#039; (at least up to the last backup point), or else it will be necessary to distribute your mind over a very large volume of space such that local &#039;accidents&#039; only damage very limited parts of your mind (as a minor stroke might do to our current brains).&lt;/p&gt;
&lt;p&gt;Given these various approaches I fail to see how uploading &lt;em&gt;cannot&lt;/em&gt; be viewed as a migration across hardware platforms. It is worth noting that much of my career in the software industry involved taking programs and &#039;porting&#039; them onto different hardware platforms. Software &lt;em&gt;does&lt;/em&gt; migrate across platforms even if it isn&#039;t designed for it. The brain is the hardware that supports the human mind. The mind, to me, seems to be intimately enmeshed with the physical structure of the neurons and the molecular architecture of the synapses. Unless one says that it is &quot;impossible&quot; to replace those hardware parts with other effectively equivalent hardware (or software) parts, I fail to see how one can assert that the human mind cannot be migrated.&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>Uploading As Migration</strong></p>
<p>Mark is clearly an expert in quantum mechanics and related topics of physics. I notice that he &#39;disses&#39; Penrose&#39;s &#39;consciousness in the microtubules&#39; perspective and he seems to dislike the idea that there is anything &#39;mysterious&#39; or &#39;magical&#39; about the brain.</p>
<p>If those statements are moderately accurate then I believe that he and I have similar perspectives regarding the physical basis of human minds. What I cannot understand is precisely what he is objecting to in the &#39;migration&#39; perspective Max proposes.</p>
<p>Off and on I&#39;ve devoted a couple of years of thought to the ideas regarding uploading and I find that I would agree completely with the idea of &#39;uploading&#39; as a &#39;migration&#39; of our minds onto a different hardware platform.</p>
<p>There are at least two forms of uploading that one can consider, &quot;destructive readout&quot; and &quot;mental evolution onto different hardware&quot;.</p>
<p>The &quot;destructive readout&quot; method involves freezing the brain, then disassembling it atom by atom, taking careful note of where everything is. Once that is complete, a great deal of computer structural analysis is done to determine precisely what the state of the brain was before the freezing occurred. While we cannot be absolutely certain of being able to &quot;backtrack&quot; the relocation of the molecules to their original states, I believe the largely redundant information in the DNA, protein structures and synaptic connections will allow a reconstruction of a largely equivalent molecular map. Then one can either reassemble a copy of the original brain (molecule by molecule) using highly parallel AFM-directed methods, or simply run an atomic-scale simulation of the brain on a very large supercomputer. The only problem with the latter approach seems to be that even solar-system-sized supercomputers (<a href="http://www.aeiveos.com/~bradbury/MatrioshkaBrains/">Matrioshka Brains</a>) currently seem to have insufficient capacity to perform this task given current molecular modeling methods. Most likely, cellular automata architectures highly specialized for running molecular simulations (perhaps derived from IBM&#39;s Blue Gene supercomputer) will be able to accomplish this.</p>
<p>Now, provided the recreated &quot;brain&quot; is sufficiently accurate, I would argue that either it, or the simulation of it, is precisely a migration (in essence a recreation) of the original individual. While I personally do not like the taste this approach leaves in my mouth with regard to my own personal &quot;indefinite longevity&quot;, I would not quibble that the recreation is effectively &quot;me&quot;.</p>
<p>An alternative to this is &quot;mental evolution onto different hardware&quot;. In this instance, one would administer to the individual nanobots that are capable of mapping the neural structure of the brain to the fine detail level of the synaptic connections and the relative &quot;strengths&quot; of those connections (by determining how many receptors are in each synaptic cleft, the quantity of neurotransmitters released when the neuron &quot;fires&quot;, etc.). These monitoring nanobots would then construct an integrated whole-brain communications network (using methods discussed in <a href="http://www.nanomedicine.com/7.1.html">Chapter 7</a> of <a href="http://www.nanomedicine.com">Nanomedicine</a>). The whole-brain network of nanobots is connected (via fiber optics or very high frequency microwave links) to exo-computers with significantly more capacity than the human brain. The internal nanobot network would monitor all internal brain communications and gradually learn to interpret the meanings of specific signals (as neuroscientists currently do with NMR and PET scans localizing brain functions to various regions, but at a much finer resolution). Our minds and the exo-computer would interact until an effective shorthand is developed between our minds and the computer that allows rapid two-way communication. One should easily be able to accept that our mind would &quot;program&quot; the exo-computer with agents to store and retrieve data (off-loading of our memory). [As an aside, I&#39;ll note that a recent <a href="http://slashdot.org/article.pl?sid=01/02/05/1931236&amp;mode=thread">news item</a> suggested that people are doing this now with Personal Digital Assistant devices, but the result is that their memories are becoming poorer, presumably due to a lack of exercise.] 
If, as <a href="http://www.williamcalvin.com/">William Calvin</a> suggests, our &quot;thoughts&quot; are neuronal firing patterns that can be copied from one part of the brain to another, then one can envision moving &quot;thoughts&quot; into the exo-computer as well. I would expect this to be a gradual evolutionary process where, as time goes by, more and more of one&#39;s memory and &#39;mind&#39; ends up in the exo-computer. At that point, because the exo-computer hardware would be designed to allow &quot;copying&quot;, you can relocate this part of your mind and execute it anywhere (allowing very lengthy longevities if you have copies in enough &#39;safe&#39; locations). Because the capacity of the exo-computer can be scaled up (a single 1 cm<sup>3</sup> Drexlerian nanocomputer can probably run 100,000 human &#39;minds&#39;), people gradually put more and more of their &#39;minds&#39; into the computer. Eventually an &#39;accident&#39; happens to the original body/brain, leaving the majority of the mind disembodied in the exo-computer (or circulating around the net). Now, given molecular scale manufacturing capabilities, you could &#39;reconstruct&#39; the original mind, but its capacities would be so limited compared with the evolved (uploaded) mind that there doesn&#39;t seem to be much point in doing this. It is likely that the evolved mind would view the loss of the original body/brain as we humans currently view the loss of a finger, or a tooth, or perhaps the millions of cells we lose every day.</p>
<p>It is worth noting that this scenario is probably not much different from what goes on normally each day: neurons do die (in large numbers), so over time we are losing memory, or at least the accuracy of it, and we certainly do neuronal connection remodeling and perhaps even gradual neuron replacement with new neurons derived from stem cells. As the pace of biotechnology picks up, we will likely choose to enhance our brains to decrease the rate of neuron cell death and/or increase the rate of neuron replacement. This is likely to increase the pace of our own mental evolution. The nanobot-to-exo-computer approach simply utilizes more sophisticated hardware to further increase the rate at which this process occurs.</p>
<p>Now, the only &quot;catches&quot; I see to this scenario are whether or not the nanobots can monitor the brain at a sufficiently fine scale to get &quot;all&quot; of the information that is there and whether there will be sufficient bandwidth between the human brain and the exo-computer to allow effective integration of the minds (or whether you develop a split-personality disorder).</p>
<p>There is probably a third approach that involves the gradual replacement of neurons with enhanced bio-engineered neurons (with I/O ports that can be more easily &#39;tapped&#39; by the nanobots) or even nanobot-based neurons themselves. However, &#39;true&#39; indefinite longevity <em>will</em> require that you either learn to live with the idea that the death of a mind-instance (due to an accident) and the subsequent activation of a copy is still &#39;you&#39; (at least up to the last backup point), or else it will be necessary to distribute your mind over a very large volume of space such that local &#39;accidents&#39; only damage very limited parts of your mind (as a minor stroke might do to our current brains).</p>
<p>Given these various approaches I fail to see how uploading <em>cannot</em> be viewed as a migration across hardware platforms. It is worth noting that much of my career in the software industry involved taking programs and &#39;porting&#39; them onto different hardware platforms. Software <em>does</em> migrate across platforms even if it isn&#39;t designed for it. The brain is the hardware that supports the human mind. The mind, to me, seems to be intimately enmeshed with the physical structure of the neurons and the molecular architecture of the synapses. Unless one says that it is &quot;impossible&quot; to replace those hardware parts with other effectively equivalent hardware (or software) parts, I fail to see how one can assert that the human mind cannot be migrated.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Iron Sun</title>
		<link>http://www.foresight.org/nanodot/?p=396#comment-1142</link>
		<dc:creator>Iron Sun</dc:creator>
		<pubDate>Mon, 29 Jan 2001 08:47:56 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=396#comment-1142</guid>
		<description>&lt;p&gt;&lt;strong&gt;Re:Color red, religion-like, and mountaineering&lt;/strong&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;My comment about explaining the color red to a blind person is not religious-like at all&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Yes it is. It implies a special knowledge, enlightenment or state of grace that must be believed or directly experienced rather than be subjected to rational debate.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;A case in point.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;A case against point: You have chosen a daring and yet not beyond the pale activity to illustrate, or perhaps obfuscate, your position. Allow me to present a different angle.&lt;/p&gt;
&lt;p&gt;In my misspent youth, I came into contact with people involved in what is commonly known as the drug culture. I was mainly a spectator, and I certainly never became addicted. But I knew people, several of whom are no longer with us, who were. Young heroin users who are in the initial honeymoon period of using the drug feel good about it for a number of reasons. They think that because they didn&#039;t drop dead of an overdose the first time they shot up, smack must be a lot less dangerous than all of the drug war propaganda claims. They feel the illicit thrill of sticking one to the man, and of being part of a subculture. They are convinced that they aren&#039;t going to get addicted, and that they can stop whenever they want if they do. They feel they are initiated into a way of life that non-users cannot understand, and they feel contempt or even pity for anyone who tries to talk them out of it. Older, more desperate users will offer naive non-users the drug in an effort to get them addicted in order to support their own habit. Some younger users offer it to their friends because they honestly believe that it is a wonderful experience that should be shared.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Likewise, with life-extension. You&#039;re either &quot;into it&quot;, or you&#039;re not. Being &quot;into&quot; transhumanism is no more a religion than being &quot;into&quot; mountaineering.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Mountaineering is an activity, not a lifestyle. It may be a major, defining part of your existence, but it is not a philosophy like transhumanism.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;It&#039;s simply a personal choice.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;So, as I have pointed out, is heroin use.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Likewise, to &quot;object&quot; to me being into transhumanism is as silly as to &quot;object&quot; to me going mountain climbing.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;I do not object to your philosophy in the sense of wishing to &lt;em&gt;compel&lt;/em&gt; a change; I simply take issue with a number of your assumptions and try to point out some of the flaws and pitfalls in your reasoning. Again, a hallmark of religious-style closed-mindedness in the face of argument is to retreat to a position of &quot;Well, I don&#039;t care what you say, I&#039;m going to believe what makes me happy, bleeah.&quot;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;It&#039;s simply not an issue.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&#039;Nuff said.&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>Re:Color red, religion-like, and mountaineering</strong></p>
<blockquote>
<p><em>My comment about explaining the color red to a blind person is not religious-like at all</em></p>
</blockquote>
<p>Yes it is. It implies a special knowledge, enlightenment or state of grace that must be believed or directly experienced rather than be subjected to rational debate.</p>
<blockquote>
<p><em>A case in point.</em></p>
</blockquote>
<p>A case against point: You have chosen a daring and yet not beyond the pale activity to illustrate, or perhaps obfuscate, your position. Allow me to present a different angle.</p>
<p>In my misspent youth, I came into contact with people involved in what is commonly known as the drug culture. I was mainly a spectator, and I certainly never became addicted. But I knew people, several of whom are no longer with us, who were. Young heroin users who are in the initial honeymoon period of using the drug feel good about it for a number of reasons. They think that because they didn&#39;t drop dead of an overdose the first time they shot up that smack is a lot less dangerous that all of the drug war propaganda. They feel the illicit thrill of sticking one to the man, and of being part of a subculture. They are convinced that they aren&#39;t going to get addicted, and that they can stop whenever they want if they do. They feel they are initiated into a way of life that non-users cannot understand, and they feel contempt or even pity for anyone who tries to talk them out of it. Older, more desperate users will offer naive non-users the drug in an effort to get them addicted in order to support their own habit. Some younger users offer it to their friends because they honestly believe that it is a wonderful experience that should be shared.</p>
<blockquote>
<p><em>Likewise, with life-extension. You&#39;re either &quot;into it&quot;, or you&#39;re not. Being &quot;into&quot; transhumanism is no more a religion than being &quot;into&quot; mountaineering.</em></p>
</blockquote>
<p>Mountaineering is an activity, not a lifestyle. It may be a major, defining part of your existence, but it is not a philosophy like transhumanism.</p>
<blockquote>
<p><em>It&#39;s simply a personal choice.</em></p>
</blockquote>
<p>So, as I have pointed out, is heroin use.</p>
<blockquote>
<p><em>Likewise, to &quot;object&quot; to me being into transhumanism is as silly as to &quot;object&quot; to me going mountain climbing.</em></p>
</blockquote>
<p>I do not object to your philosophy in the sense of wishing to <em>compel</em> a change; I simply take issue with a number of your assumptions and try to point out some of the flaws and pitfalls in your reasoning. Again, a hallmark of religious-style closed-mindedness in the face of argument is to retreat to a position of &quot;Well, I don&#39;t care what you say, I&#39;m going to believe what makes me happy, bleeah.&quot;</p>
<blockquote>
<p><em>It&#39;s simply not an issue.</em></p>
</blockquote>
<p>&#39;Nuff said.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: kurt2100</title>
		<link>http://www.foresight.org/nanodot/?p=396#comment-1141</link>
		<dc:creator>kurt2100</dc:creator>
		<pubDate>Mon, 29 Jan 2001 07:52:10 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=396#comment-1141</guid>
		<description>&lt;p&gt;&lt;strong&gt;Color red, religion-like, and mountaineering&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;My comment about explaining the color red to a blind person is not religious-like at all, and is entirely appropriate to the issue of living forever. A case in point. I occasionally climb mountains. I have occasionally been asked by people who are not climbers why I like to climb mountains. Some of these people have told me that they think mountain climbing is dangerous, difficult, and a stupid thing to do (especially since I am into life-extension) and they do not understand why I do it. I tell them that I climb because I enjoy it and that I am &quot;into it&quot;. I tell them that I cannot explain to them why I like to climb, just that it has to be experienced in order to be understood. You see, they cannot understand why mountain climbing is enjoyable any more than someone who is blind can understand the color red. If you like mountain climbing, you do it. If you don&#039;t like it, you don&#039;t do it.&lt;br /&gt;
&lt;br /&gt;
Likewise, with life-extension. You&#039;re either &quot;into it&quot;, or you&#039;re not. Being &quot;into&quot; transhumanism is no more a religion than being &quot;into&quot; mountaineering. It&#039;s simply a personal choice. Likewise, to &quot;object&quot; to me being into transhumanism is as silly as to &quot;object&quot; to me going mountain climbing. It&#039;s simply not an issue.&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>Color red, religion-like, and mountaineering</strong></p>
<p>My comment about explaining the color red to a blind person is not religious-like at all, and is entirely appropriate to the issue of living forever. A case in point. I occasionally climb mountains. I have occasionally been asked by people who are not climbers why I like to climb mountains. Some of these people have told me that they think mountain climbing is dangerous, difficult, and a stupid thing to do (especially since I am into life-extension) and they do not understand why I do it. I tell them that I climb because I enjoy it and that I am &quot;into it&quot;. I tell them that I cannot explain to them why I like to climb, just that it has to be experienced in order to be understood. You see, they cannot understand why mountain climbing is enjoyable any more than someone who is blind can understand the color red. If you like mountain climbing, you do it. If you don&#39;t like it, you don&#39;t do it.</p>
<p>Likewise, with life-extension. You&#39;re either &quot;into it&quot;, or you&#39;re not. Being &quot;into&quot; transhumanism is no more a religion than being &quot;into&quot; mountaineering. It&#39;s simply a personal choice. Likewise, to &quot;object&quot; to me being into transhumanism is as silly as to &quot;object&quot; to me going mountain climbing. It&#39;s simply not an issue.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Iron Sun</title>
		<link>http://www.foresight.org/nanodot/?p=396#comment-1137</link>
		<dc:creator>Iron Sun</dc:creator>
		<pubDate>Sun, 28 Jan 2001 22:26:07 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=396#comment-1137</guid>
		<description>&lt;p&gt;&lt;strong&gt;Re:Avoiding uncomfortable implications&lt;/strong&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Assume that you were addressing a skeptical, possibly hostile audience. Would you immediately jump to the most radical implications of your position?&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Of course not. If I was a KKK Grand Dragon recruiting a new member I would start by playing on feelings of hostility and alienation before I burned any crosses.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Yet a single careless driver, a fall in the bathtub, a stray bit of fat in the wrong artery and...poof!...in an instant, all of that accumulated knowledge is gone. This seems quite wasteful to me.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Perhaps it is that imminent sense of potential loss that lets us savour each moment. In order to truly live, we must accept the &lt;em&gt;total inevitability&lt;/em&gt; that one way or another all that we have will pass away. Anything else devalues the human experience far more than you seem willing to admit to yourself at this point.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;At each step, the advances will seem obviously helpful. A prosthetic eye for blind people? Of course. Brain/computer interfaces for the paralyzed? Seems very helpful. Nanometer scale medical imaging? Marvelous. By itself, no single innovation will suffice, but each advance will build upon the others.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;And because it is gradual, that makes it acceptable? Have you ever heard the one about &quot;first they came for the Jews and I did nothing... then they came for the etc etc&quot;? The slow erosion of civil rights, the rise of Nazism, someone becoming addicted to prescription painkillers and sliding into heroin use - all of these things can happen incrementally with justifications each step of the way. The problem is that many of the steps may make perfect sense, such as prosthetics for the disabled. But, like the prescription painkillers, some people find the dangerous side effects attractive even when they do not need the therapeutic aspects. And before you say &quot;why can&#039;t I enhance myself in whatever way I choose&quot; let me give you this analogy: If I live on a farm and need to get rid of vermin, I can get a shotgun. I can also take it down to the shooting range and take some potshots at clay pigeons for fun. That doesn&#039;t mean that I need to have a gun grafted to my arm. To make a useful or fun faculty part of my self changes the way that we look at it and the way it affects our behaviour.&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>Re:Avoiding uncomfortable implications</strong></p>
<blockquote>
<p><em>Assume that you were addressing a skeptical, possibly hostile audience. Would you immediately jump to the most radical implications of your position?</em></p>
</blockquote>
<p>Of course not. If I was a KKK Grand Dragon recruiting a new member I would start by playing on feelings of hostility and alienation before I burned any crosses.</p>
<blockquote>
<p><em>Yet a single careless driver, a fall in the bathtub, a stray bit of fat in the wrong artery and&#8230;poof!&#8230;in an instant, all of that accumulated knowledge is gone. This seems quite wasteful to me.</em></p>
</blockquote>
<p>Perhaps it is that imminent sense of potential loss that lets us savour each moment. In order to truly live, we must accept the <em>total inevitability</em> that one way or another all that we have will pass away. Anything else devalues the human experience far more than you seem willing to admit to yourself at this point.</p>
<blockquote>
<p><em>At each step, the advances will seem obviously helpful. A prosthetic eye for blind people? Of course. Brain/computer interfaces for the paralyzed? Seems very helpful. Nanometer scale medical imaging? Marvelous. By itself, no single innovation will suffice, but each advance will build upon the others.</em></p>
</blockquote>
<p>And because it is gradual, that makes it acceptable? Have you ever heard the one about &quot;first they came for the Jews and I did nothing&#8230; then they came for the etc etc&quot;? The slow erosion of civil rights, the rise of Nazism, someone becoming addicted to prescription painkillers and sliding into heroin use &#8211; all of these things can happen incrementally with justifications each step of the way. The problem is that many of the steps may make perfect sense, such as prosthetics for the disabled. But, like the prescription painkillers, some people find the dangerous side effects attractive even when they do not need the therapeutic aspects. And before you say &quot;why can&#39;t I enhance myself in whatever way I choose&quot; let me give you this analogy: If I live on a farm and need to get rid of vermin, I can get a shotgun. I can also take it down to the shooting range and take some potshots at clay pigeons for fun. That doesn&#39;t mean that I need to have a gun grafted to my arm. To make a useful or fun faculty part of my self changes the way that we look at it and the way it affects our behaviour.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: transhuman57</title>
		<link>http://www.foresight.org/nanodot/?p=396#comment-1147</link>
		<dc:creator>transhuman57</dc:creator>
		<pubDate>Sun, 28 Jan 2001 10:31:55 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=396#comment-1147</guid>
		<description>&lt;p&gt;&lt;strong&gt;Those Wacky Transhumanists!&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Brendan Bernhard, we owe you a big debt of gratitude for your probing and insightful article about transhumanists Max More and Natasha Vita-More. Rather than adopt the usual sardonic and patronizing tone that seems to permeate the rest of the L.A. Weekly, you have bravely taken a different approach in exposing this self-proclaimed &quot;movement&quot; as just another California fad that deserves no more than a passing guffaw. I did notice your appreciation of Natasha&#039;s physique, as well as the &quot;long legs of the interviewer&#039;s 20-something assistant, languorously stretched out on the other side of the living room&quot;. You also state that &quot;Natasha calls herself an artist, but she might more accurately be viewed as a symptom.&quot; What astounding journalism! You sure put these misguided people in their place. Thank you for also pointing out that &quot;Death is awful, but an endlessly prolonged life span seems unimaginable, even monstrous.&quot; How deluded I had been to desire an unlimited lifespan! Now I can degrade, die and decay in peace. Long live regular humanity! Well, not too long...&lt;br /&gt;&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>Those Wacky Transhumanists!</strong></p>
<p>Brendan Bernhard, we owe you a big debt of gratitude for your probing and insightful article about transhumanists Max More and Natasha Vita-More. Rather than adopt the usual sardonic and patronizing tone that seems to permeate the rest of the L.A. Weekly, you have bravely taken a different approach in exposing this self-proclaimed &quot;movement&quot; as just another California fad that deserves no more than a passing guffaw. I did notice your appreciation of Natasha&#39;s physique, as well as the &quot;long legs of the interviewer&#39;s 20-something assistant, languorously stretched out on the other side of the living room&quot;. You also state that &quot;Natasha calls herself an artist, but she might more accurately be viewed as a symptom.&quot; What astounding journalism! You sure put these misguided people in their place. Thank you for also pointing out that &quot;Death is awful, but an endlessly prolonged life span seems unimaginable, even monstrous.&quot; How deluded I had been to desire an unlimited lifespan! Now I can degrade, die and decay in peace. Long live regular humanity! Well, not too long&#8230;</p>
]]></content:encoded>
	</item>
</channel>
</rss>