<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
		>
<channel>
	<title>Comments on: MIT psychologist vs. frightening predictions</title>
	<atom:link href="http://www.foresight.org/nanodot/?feed=rss2&#038;p=194" rel="self" type="application/rss+xml" />
	<link>http://www.foresight.org/nanodot/?p=194</link>
	<description>examining transformative technology</description>
	<lastBuildDate>Wed, 03 Apr 2013 18:23:47 +0000</lastBuildDate>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=3.0.4</generator>
	<item>
		<title>By: Jeffrey Soreff</title>
		<link>http://www.foresight.org/nanodot/?p=194#comment-368</link>
		<dc:creator>Jeffrey Soreff</dc:creator>
		<pubDate>Thu, 31 Aug 2000 15:44:49 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=194#comment-368</guid>
		<description>&lt;p&gt;&lt;strong&gt;Re:and I don&#039;t even read extropians&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;It bothers me that Pinker&#039;s entire set of conclusions &lt;strong&gt;hinges&lt;/strong&gt; on the rejection of human genetic engineering. For good or ill, is this really plausible for a thousand years??? Cell manipulation requires devices of modest size (and, over the long term, cost). That makes it a lot harder to regulate than nuclear reactors or ICBMs. Many parents are highly motivated to give their children any advantage that they can get their hands on. Genetic technology is immature enough today that I can easily believe that &quot;designer babies&quot; are more than a decade away, but... a &lt;strong&gt;millennium&lt;/strong&gt;???&lt;/p&gt;
&lt;p&gt;Consider also that our technology already tweaks human nature, albeit in small ways. Consider Valium and Prozac... These don&#039;t rewire the human brain, of course, but even now, even without genetic engineering, let alone MNT, they make the average concentrations of neurotransmitters in our population a bit different than they were a century ago. Just normal biomedical progress (even without MNT) is going to yield a wider range of more specific CNS drugs. Even minor progress in drug delivery systems will probably allow dribbling the drugs into &lt;strong&gt;specific areas&lt;/strong&gt; of patients&#039; brains, which considerably widens the possible useful effects one could get. For good or ill, I&#039;d expect that at least one of these options is going to be useful enough to become common, and will alter &quot;human nature&quot; in some significant way.&lt;/p&gt;
&lt;p&gt;Short of a full stop to medical progress, I find Pinker&#039;s projection of a substantially unchanged human nature in 3000 AD very implausible.&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>Re:and I don&#39;t even read extropians</strong></p>
<p>It bothers me that Pinker&#39;s entire set of conclusions <strong>hinges</strong> on the rejection of human genetic engineering. For good or ill, is this really plausible for a thousand years??? Cell manipulation requires devices of modest size (and, over the long term, cost). That makes it a lot harder to regulate than nuclear reactors or ICBMs. Many parents are highly motivated to give their children any advantage that they can get their hands on. Genetic technology is immature enough today that I can easily believe that &quot;designer babies&quot; are more than a decade away, but&#8230; a <strong>millennium</strong>???</p>
<p>Consider also that our technology already tweaks human nature, albeit in small ways. Consider Valium and Prozac&#8230; These don&#39;t rewire the human brain, of course, but even now, even without genetic engineering, let alone MNT, they make the average concentrations of neurotransmitters in our population a bit different than they were a century ago. Just normal biomedical progress (even without MNT) is going to yield a wider range of more specific CNS drugs. Even minor progress in drug delivery systems will probably allow dribbling the drugs into <strong>specific areas</strong> of patients&#39; brains, which considerably widens the possible useful effects one could get. For good or ill, I&#39;d expect that at least one of these options is going to be useful enough to become common, and will alter &quot;human nature&quot; in some significant way.</p>
<p>Short of a full stop to medical progress, I find Pinker&#39;s projection of a substantially unchanged human nature in 3000 AD very implausible.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: MarkGubrud</title>
		<link>http://www.foresight.org/nanodot/?p=194#comment-367</link>
		<dc:creator>MarkGubrud</dc:creator>
		<pubDate>Wed, 30 Aug 2000 23:25:00 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=194#comment-367</guid>
		<description>&lt;p&gt;&lt;strong&gt;and I don&#039;t even read extropians&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Pinker uses the word &quot;preposterous&quot; to dismiss a large and growing body of people and their ideas, but doesn&#039;t seem to have spent much time talking with or thinking about either.&lt;br /&gt;
&lt;br /&gt;
I read another piece where Pinker questioned the plausibility of &quot;uploading&quot; schemes. Fine, but what does he have to say about the growing readiness of people to accept the proposition that life as software is a potentially attractive alternative to death? That the ascendance of technology and decline of humanity, or at least the transformation of humanity into a race of &quot;augmented&quot; cyborgs is inevitable, and even desirable, as the next stage of &quot;evolution?&quot; What is behind this growing cult of technology? Could the famous psychologist shed some light on that?&lt;br /&gt;
&lt;br /&gt;
It is not enough to assert that such ideas, which Pinker only alludes to, not even discussing them in any depth, are &quot;scaring people.&quot; Sure, they worry a lot of people, myself for one. But what is scariest is the number of people around who appear to know a lot more and have given much more thought to these issues than Pinker, and who profess not to be worried.&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>and I don&#39;t even read extropians</strong></p>
<p>Pinker uses the word &quot;preposterous&quot; to dismiss a large and growing body of people and their ideas, but doesn&#39;t seem to have spent much time talking with or thinking about either.</p>
<p>I read another piece where Pinker questioned the plausibility of &quot;uploading&quot; schemes. Fine, but what does he have to say about the growing readiness of people to accept the proposition that life as software is a potentially attractive alternative to death? That the ascendance of technology and decline of humanity, or at least the transformation of humanity into a race of &quot;augmented&quot; cyborgs is inevitable, and even desirable, as the next stage of &quot;evolution?&quot; What is behind this growing cult of technology? Could the famous psychologist shed some light on that?</p>
<p>It is not enough to assert that such ideas, which Pinker only alludes to, not even discussing them in any depth, are &quot;scaring people.&quot; Sure, they worry a lot of people, myself for one. But what is scariest is the number of people around who appear to know a lot more and have given much more thought to these issues than Pinker, and who profess not to be worried.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: RobVirkus</title>
		<link>http://www.foresight.org/nanodot/?p=194#comment-360</link>
		<dc:creator>RobVirkus</dc:creator>
		<pubDate>Wed, 30 Aug 2000 22:15:33 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=194#comment-360</guid>
		<description>&lt;p&gt;&lt;strong&gt;Re:Pinker and most futurists are clueless&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;There is no data to show whether human nature can be changed by self-directed evolution. There is no data to suggest that human nature will be irrelevant by 2100 or sooner. It is merely an assumption that those who choose to self-evolve will turn out better or have better lives than those who don&#039;t. They may unwittingly evolve themselves out of existence. We can&#039;t know until something happens. Pinker is insightful rather than clueless.&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>Re:Pinker and most futurists are clueless</strong></p>
<p>There is no data to show whether human nature can be changed by self-directed evolution. There is no data to suggest that human nature will be irrelevant by 2100 or sooner. It is merely an assumption that those who choose to self-evolve will turn out better or have better lives than those who don&#39;t. They may unwittingly evolve themselves out of existence. We can&#39;t know until something happens. Pinker is insightful rather than clueless.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: jbash</title>
		<link>http://www.foresight.org/nanodot/?p=194#comment-355</link>
		<dc:creator>jbash</dc:creator>
		<pubDate>Wed, 30 Aug 2000 20:33:05 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=194#comment-355</guid>
		<description>&lt;p&gt;&lt;strong&gt;Re:Trying to put words in another&#039;s mouth&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Hmm. You have a point; I may have gone too far with some of that. I don&#039;t think I extrapolated as far as you seem to think (or indeed any further than you yourself are extrapolating), but too far nonetheless. For that, I apologize to all.&lt;/p&gt;
&lt;p&gt;... but what leads you to think I&#039;m not a gibbering wreck now?&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>Re:Trying to put words in another&#39;s mouth</strong></p>
<p>Hmm. You have a point; I may have gone too far with some of that. I don&#39;t think I extrapolated as far as you seem to think (or indeed any further than you yourself are extrapolating), but too far nonetheless. For that, I apologize to all.</p>
<p>&#8230; but what leads you to think I&#39;m not a gibbering wreck now?</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: fool</title>
		<link>http://www.foresight.org/nanodot/?p=194#comment-356</link>
		<dc:creator>fool</dc:creator>
		<pubDate>Wed, 30 Aug 2000 20:16:43 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=194#comment-356</guid>
		<description>&lt;p&gt;&lt;strong&gt;Have you even read Frankenstein?&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Because I believe that Shelley&#039;s premise was not technofear&lt;br /&gt;
but that technology merely expresses the motives of the tool wielders.&lt;br /&gt;
Indeed, she presented the Monster as a rather compassionate creature to begin with,&lt;br /&gt;
while Frankenstein was shown to have brought about his destruction&lt;br /&gt;
through his own driven arrogance.&lt;br /&gt;
&lt;br /&gt;
I think we&#039;d be a lot better off today if more people had&lt;br /&gt;
actually listened to what the woman was saying.&lt;br /&gt;
&lt;br /&gt;&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>Have you even read Frankenstein?</strong></p>
<p>Because I believe that Shelley&#39;s premise was not technofear<br />
but that technology merely expresses the motives of the tool wielders.<br />
Indeed, she presented the Monster as a rather compassionate creature to begin with,<br />
while Frankenstein was shown to have brought about his destruction<br />
through his own driven arrogance.</p>
<p>I think we&#39;d be a lot better off today if more people had<br />
actually listened to what the woman was saying.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: jbash</title>
		<link>http://www.foresight.org/nanodot/?p=194#comment-366</link>
		<dc:creator>jbash</dc:creator>
		<pubDate>Wed, 30 Aug 2000 20:00:55 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=194#comment-366</guid>
		<description>&lt;p&gt;&lt;strong&gt;Re:Pinker has some points.&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Although I agree that many of these speculative things (and I think, here, particularly of uploading) may never come to pass, or may not come to pass for a very long time, or may not ever have a significant number of users if they do come to pass, I &lt;em&gt;don&#039;t&lt;/em&gt; think that there&#039;s any doubt that a lot of very strange and disturbing capabilities will come our way.&lt;/p&gt;
&lt;p&gt;We may not know exactly which of these ideas will pan out, but it would appear almost certain that some of them will. Hell, there are things that are almost entirely in reach right now that have enormously disturbing implications.&lt;/p&gt;
&lt;p&gt;As for the more speculative stuff, it&#039;s true that there is a certain amount of unquestioning acceptance around. There&#039;s also a certain amount of very well-informed opinion. I&#039;ve tried to become well-informed, and things like strong AI and very serious human augmentation become &lt;em&gt;more&lt;/em&gt; plausible the more well-informed I become... although at the same time, the likely timetable seems to get longer the more I learn.&lt;/p&gt;
&lt;p&gt;... and nobody doubts that human nature will still be around for quite a while. That&#039;s one of the things we&#039;re all worried about, since part of human nature is not being trustworthy when tremendous power is involved.&lt;/p&gt;
&lt;p&gt;The questions are:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;What forces will be around in &lt;em&gt;addition&lt;/em&gt; to human nature? There&#039;s a very good chance that there may be players that aren&#039;t human. Those players may make humans and their nature largely irrelevant, depending on your view of what&#039;s important.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Perhaps more important, because the premise is more certain: what will human nature do with an enormous amount of power?&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;As far as I can see, the biggest piece of wishful thinking in this whole discussion is Pinker&#039;s original idea that, because most people find something distasteful, it will never be done. I see no support for that &lt;em&gt;anywhere&lt;/em&gt;. If something is possible, somebody &lt;em&gt;will&lt;/em&gt; do it. Depending on what it is, it may not matter if most people are scared of it... the effects still take place.&lt;/p&gt;
&lt;p&gt;Actually, that&#039;s the second-biggest piece of wishful thinking. The &lt;em&gt;biggest&lt;/em&gt; piece of wishful thinking was Pinker&#039;s implicit and unsupported claim that it had been disproven that human nature made war inevitable...&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>Re:Pinker has some points.</strong></p>
<p>Although I agree that many of these speculative things (and I think, here, particularly of uploading) may never come to pass, or may not come to pass for a very long time, or may not ever have a significant number of users if they do come to pass, I <em>don&#39;t</em> think that there&#39;s any doubt that a lot of very strange and disturbing capabilities will come our way.</p>
<p>We may not know exactly which of these ideas will pan out, but it would appear almost certain that some of them will. Hell, there are things that are almost entirely in reach right now that have enormously disturbing implications.</p>
<p>As for the more speculative stuff, it&#39;s true that there is a certain amount of unquestioning acceptance around. There&#39;s also a certain amount of very well-informed opinion. I&#39;ve tried to become well-informed, and things like strong AI and very serious human augmentation become <em>more</em> plausible the more well-informed I become&#8230; although at the same time, the likely timetable seems to get longer the more I learn.</p>
<p>&#8230; and nobody doubts that human nature will still be around for quite a while. That&#39;s one of the things we&#39;re all worried about, since part of human nature is not being trustworthy when tremendous power is involved.</p>
<p>The questions are:</p>
<ul>
<li>
<p>What forces will be around in <em>addition</em> to human nature? There&#39;s a very good chance that there may be players that aren&#39;t human. Those players may make humans and their nature largely irrelevant, depending on your view of what&#39;s important.</p>
</li>
<li>
<p>Perhaps more important, because the premise is more certain: what will human nature do with an enormous amount of power?</p>
</li>
</ul>
<p>As far as I can see, the biggest piece of wishful thinking in this whole discussion is Pinker&#39;s original idea that, because most people find something distasteful, it will never be done. I see no support for that <em>anywhere</em>. If something is possible, somebody <em>will</em> do it. Depending on what it is, it may not matter if most people are scared of it&#8230; the effects still take place.</p>
<p>Actually, that&#39;s the second-biggest piece of wishful thinking. The <em>biggest</em> piece of wishful thinking was Pinker&#39;s implicit and unsupported claim that it had been disproven that human nature made war inevitable&#8230;</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: fool</title>
		<link>http://www.foresight.org/nanodot/?p=194#comment-354</link>
		<dc:creator>fool</dc:creator>
		<pubDate>Wed, 30 Aug 2000 19:58:54 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=194#comment-354</guid>
		<description>&lt;p&gt;&lt;strong&gt;Trying to put words in another&#039;s mouth&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Each time you used the phrase &lt;em&gt;&quot;you seem to advocate that&quot;&lt;/em&gt;&lt;br /&gt;
you proceeded to spout something which bore little relation&lt;br /&gt;
to the post you were replying to.&lt;br /&gt;
&lt;br /&gt;
Rather than advocating technofear and a restriction of discussion,&lt;br /&gt;
Iron Sun seemed to be pointing out that being dismissive toward this fear&lt;br /&gt;
would not do much to alleviate it, and that this would be dangerous,&lt;br /&gt;
because mob fear could kill us all.&lt;br /&gt;
In this last point, you actually seem to be in agreement with Iron Sun,&lt;br /&gt;
which makes it strange that you felt inclined&lt;br /&gt;
to shove an opposing idea in their mouth.&lt;br /&gt;
&lt;br /&gt;
You display a rather helpless attitude to the future:&lt;br /&gt;
&quot;I&#039;m just one person, what can I do about it?&quot;&lt;br /&gt;
True, the future is not optional, but the form it takes &lt;em&gt;is.&lt;/em&gt;&lt;br /&gt;
We are about to go through the equivalent of discovering fire,&lt;br /&gt;
but will we build a useful campfire or start a destructive forest fire?&lt;br /&gt;
Iron Sun seemed to be advocating that the best way to get a useful result&lt;br /&gt;
would be to include &lt;em&gt;everybody&lt;/em&gt; in the process&lt;br /&gt;
rather than leaving the decision to a tiny elite,&lt;br /&gt;
which seemed to be the real intent behind the individuals/society comment,&lt;br /&gt;
and Iron Sun&#039;s main reason for rebuking Practical Transhuman&lt;br /&gt;
for being dismissive of technofear.&lt;br /&gt;
&lt;br /&gt;
I actually thought they both made very good points, while you&lt;br /&gt;
added little to the debate because of your dichotomistic attitude.&lt;br /&gt;
i.e., you disagreed with Pinker&#039;s article, and P.Transhuman disagreed with him,&lt;br /&gt;
therefore P.Transhuman was on &quot;your side&quot;, which meant I.Sun must be on the &quot;other side&quot;.&lt;br /&gt;
This rigid mindset will leave you a gibbering wreck in the years to come.&lt;br /&gt;
&lt;br /&gt;&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>Trying to put words in another&#39;s mouth</strong></p>
<p>Each time you used the phrase <em>&quot;you seem to advocate that&quot;</em><br />
you proceeded to spout something which bore little relation<br />
to the post you were replying to.</p>
<p>Rather than advocating technofear and a restriction of discussion,<br />
Iron Sun seemed to be pointing out that being dismissive toward this fear<br />
would not do much to alleviate it, and that this would be dangerous,<br />
because mob fear could kill us all.<br />
In this last point, you actually seem to be in agreement with Iron Sun,<br />
which makes it strange that you felt inclined<br />
to shove an opposing idea in their mouth.</p>
<p>You display a rather helpless attitude to the future:<br />
&quot;I&#39;m just one person, what can I do about it?&quot;<br />
True, the future is not optional, but the form it takes <em>is.</em><br />
We are about to go through the equivalent of discovering fire,<br />
but will we build a useful campfire or start a destructive forest fire?<br />
Iron Sun seemed to be advocating that the best way to get a useful result<br />
would be to include <em>everybody</em> in the process<br />
rather than leaving the decision to a tiny elite,<br />
which seemed to be the real intent behind the individuals/society comment,<br />
and Iron Sun&#39;s main reason for rebuking Practical Transhuman<br />
for being dismissive of technofear.</p>
<p>I actually thought they both made very good points, while you<br />
added little to the debate because of your dichotomistic attitude.<br />
i.e., you disagreed with Pinker&#39;s article, and P.Transhuman disagreed with him,<br />
therefore P.Transhuman was on &quot;your side&quot;, which meant I.Sun must be on the &quot;other side&quot;.<br />
This rigid mindset will leave you a gibbering wreck in the years to come.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: RobVirkus</title>
		<link>http://www.foresight.org/nanodot/?p=194#comment-365</link>
		<dc:creator>RobVirkus</dc:creator>
		<pubDate>Wed, 30 Aug 2000 17:24:03 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=194#comment-365</guid>
		<description>&lt;p&gt;&lt;strong&gt;Pinker has some points.&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;I am amazed at the increasing acceptance of unquestioned ideas growing up along with the field of Nanotechnology: rampant speculation about self-aware computers, uploading, transhumanism, etc. Pinker touched on some of my long-held suspicions that critical thinking is becoming as loose as our morals. Human nature is a force much underestimated. It will be the same for the indefinite future, and we will pay a heavy price if we ever think we have transcended it. Nanotechnology will come, and it is precisely because of human nature that the mission of organizations such as the Foresight Institute is so critical.&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>Pinker has some points.</strong></p>
<p>I am amazed at the increasing acceptance of unquestioned ideas growing up along with the field of Nanotechnology: rampant speculation about self-aware computers, uploading, transhumanism, etc. Pinker touched on some of my long-held suspicions that critical thinking is becoming as loose as our morals. Human nature is a force much underestimated. It will be the same for the indefinite future, and we will pay a heavy price if we ever think we have transcended it. Nanotechnology will come, and it is precisely because of human nature that the mission of organizations such as the Foresight Institute is so critical.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: jbash</title>
		<link>http://www.foresight.org/nanodot/?p=194#comment-353</link>
		<dc:creator>jbash</dc:creator>
		<pubDate>Wed, 30 Aug 2000 14:31:39 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=194#comment-353</guid>
		<description>&lt;p&gt;&lt;strong&gt;Re:&quot;Creepiness&quot; is a matter of opinion.&lt;/strong&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;For a start, organ transplants and blood transfusions have brought great benefits, sure, but they have also given us a new set of problems. HIV screening of blood donors and Chinese death row prisoners being organ harvested, for example. This isn&#039;t to say that these procedures that save so many lives should be discontinued, but it shows that these are complex issues that must be thoroughly examined, not by individuals, but by society.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Insofar as anything that can reasonably be called &quot;society&quot; exists, it does not have the cognitive capacity to examine anything. Individuals think. Groups do not think.&lt;/p&gt;
&lt;p&gt;Now, &lt;em&gt;organizations&lt;/em&gt; (like governments, which are &lt;em&gt;not&lt;/em&gt; the same thing as society, and shouldn&#039;t be allowed to get away with claiming to speak for it) may have processes for examining issues. However, those processes are so different from the internal workings of a mind that an organization&#039;s &quot;examining&quot; something is at best tenuously similar to an individual&#039;s &quot;examining&quot; it. Certainly the two shouldn&#039;t be presented in opposition.&lt;/p&gt;
&lt;p&gt;Furthermore, in my opinion and that of a lot of other people who&#039;ve thought a lot about it, groups (as opposed to the members of those groups) have no rights that an individual is bound to respect... only individuals have rights.&lt;/p&gt;
&lt;p&gt;Put another way, &quot;society&quot;, whatever the hell that means, can suck my weenie. But that&#039;s irrelevant to the main point...&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Prudence would seem to be called for, but it seems to me that a lot of the most vociferous advocates for a headlong rush toward such transformations are behaving like an impatient, tantrum-throwing toddler who doesn&#039;t understand why Mummy and Daddy won&#039;t let them have a go driving the car NOW.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;First of all, I don&#039;t know that anybody is advocating a headlong rush toward anything.&lt;/p&gt;
&lt;p&gt;The point is that things &lt;em&gt;will&lt;/em&gt; happen in their own time. The idea that you can significantly delay them is arrogant stupidity. The best you &lt;em&gt;may&lt;/em&gt; be able to do is to shape them to some degree as they happen. Hiding from them is not an option.&lt;/p&gt;
&lt;p&gt;Which brings us to this...&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;The important thing here is to realise that a lot of so-called &quot;regular&quot; people are scared shitless about all this talk of uploading brains and so on. To dismissively label their fears as irrational or unenlightened won&#039;t help change their minds.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Their fears, rational or irrational, enlightened or unenlightened, are irrelevant. I share many of those fears, but that&#039;s really just tough for me. The future is not optional, regardless of how afraid you are of it.&lt;/p&gt;
&lt;p&gt;Why should anybody want to change their minds? It might even be &lt;em&gt;good&lt;/em&gt; to have them afraid.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;You can believe in the rightness of your position all you want, but if a grassroots campaign to legislate against such technology is started, I don&#039;t think being arrogant or elitist will help.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;In the long term, such a campaign, whether successful or not, will not prevent the development or use of these technologies. Full stop. If they&#039;re possible (and many, many very &quot;scary&quot; technologies look possible), then they will be developed.&lt;/p&gt;
&lt;p&gt;Now, it&#039;s true that having these things outlawed is likely to cause them to be developed in such a way that we all get killed, or worse. That&#039;s a problem.&lt;/p&gt;
&lt;p&gt;However, let&#039;s be clear on what you&#039;re advocating. You seem to say that the predictions people are giving out about technology are likely to create a fear-based backlash.&lt;/p&gt;
&lt;p&gt;I think you think that backlash would be bad because it would prevent development of the technology. I think that backlash would be bad because, although it would &lt;em&gt;not&lt;/em&gt; prevent the development of the technology, it would shape the future in a potentially fatal way. It would appear that we both think it would be a bad thing.&lt;/p&gt;
&lt;p&gt;Now, you seem to advocate that, in order to prevent the backlash, we stop giving the public our best predictions about the things technology will make possible. Ignoring the fact that it&#039;s clearly impossible to make &lt;em&gt;everybody&lt;/em&gt; shut up about such things, isn&#039;t what you&#039;re advocating, really, that we should just lie to the public to keep them docile?&lt;/p&gt;
&lt;p&gt;If I thought that tactic would work, I might be persuaded to use it. I&#039;m that worried; I see this as a matter of survival. Since I &lt;em&gt;don&#039;t&lt;/em&gt; think the tactic will work, I don&#039;t have to deal with the very serious moral issues involved in lying to people about what&#039;s likely to happen to them. How do &lt;em&gt;you&lt;/em&gt; feel about those moral issues?&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>Re:&quot;Creepiness&quot; is a matter of opinion.</strong></p>
<blockquote>
<p>For a start, organ transplants and blood transfusions have brought great benefits, sure, but they have also given us a new set of problems. HIV screening of blood donors and Chinese death row prisoners being organ harvested, for example. This isn&#39;t to say that these procedures that save so many lives should be discontinued, but it shows that these are complex issues that must be thoroughly examined, not by individuals, but by society.</p>
</blockquote>
<p>Insofar as anything that can reasonably be called &quot;society&quot; exists, it does not have the cognitive capacity to examine anything. Individuals think. Groups do not think.</p>
<p>Now, <em>organizations</em> (like governments, which are <em>not</em> the same thing as society, and shouldn&#39;t be allowed to get away with claiming to speak for it) may have processes for examining issues. However, those processes are so different from the internal workings of a mind that an organization&#39;s &quot;examining&quot; something is at best tenuously similar to an individual&#39;s &quot;examining&quot; it. Certainly the two shouldn&#39;t be presented in opposition.</p>
<p>Furthermore, in my opinion and that of a lot of other people who&#39;ve thought a lot about it, groups (as opposed to the members of those groups) have no rights that an individual is bound to respect&#8230; only individuals have rights.</p>
<p>Put another way, &quot;society&quot;, whatever the hell that means, can suck my weenie. But that&#39;s irrelevant to the main point&#8230;</p>
<blockquote>
<p>Prudence would seem to be called for, but it seems to me that a lot of the most vociferous advocates for a headlong rush toward such transformations are behaving like an impatient, tantrum-throwing toddler who doesn&#39;t understand why Mummy and Daddy won&#39;t let them have a go driving the car NOW.</p>
</blockquote>
<p>First of all, I don&#39;t know that anybody is advocating a headlong rush toward anything.</p>
<p>The point is that things <em>will</em> happen in their own time. The idea that you can significantly delay them is arrogant stupidity. The best you <em>may</em> be able to do is to shape them to some degree as they happen. Hiding from them is not an option.</p>
<p>Which brings us to this&#8230;</p>
<blockquote>
<p>The important thing here is to realise that a lot of so-called &quot;regular&quot; people are scared shitless about all this talk of uploading brains and so on. To dismissively label their fears as irrational or unenlightened won&#39;t help change their minds.</p>
</blockquote>
<p>Their fears, rational or irrational, enlightened or unenlightened, are irrelevant. I share many of those fears, but that&#39;s really just tough for me. The future is not optional, regardless of how afraid you are of it.</p>
<p>Why should anybody want to change their minds? It might even be <em>good</em> to have them afraid.</p>
<blockquote>
<p>You can believe in the rightness of your position all you want, but if a grassroots campaign to legislate against such technology is started, I don&#39;t think being arrogant or elitist will help.</p>
</blockquote>
<p>In the long term, such a campaign, whether successful or not, will not prevent the development or use of these technologies. Full stop. If they&#39;re possible (and many, many very &quot;scary&quot; technologies look possible), then they will be developed.</p>
<p>Now, it&#39;s true that having these things outlawed is likely to cause them to be developed in such a way that we all get killed, or worse. That&#39;s a problem.</p>
<p>However, let&#39;s be clear on what you&#39;re advocating. You seem to say that the predictions people are giving out about technology are likely to create a fear-based backlash.</p>
<p>I think you think that backlash would be bad because it would prevent development of the technology. I think that backlash would be bad because, although it would <em>not</em> prevent the development of the technology, it would shape the future in a potentially fatal way. It would appear that we both think it would be a bad thing.</p>
<p>Now, you seem to advocate that, in order to prevent the backlash, we stop giving the public our best predictions about the things technology will make possible. Ignoring the fact that it&#39;s clearly impossible to make <em>everybody</em> shut up about such things, isn&#39;t what you&#39;re advocating, really, that we should just lie to the public to keep them docile?</p>
<p>If I thought that tactic would work, I might be persuaded to use it. I&#39;m that worried; I see this as a matter of survival. Since I <em>don&#39;t</em> think the tactic will work, I don&#39;t have to deal with the very serious moral issues involved in lying to people about what&#39;s likely to happen to them. How do <em>you</em> feel about those moral issues?</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Iron Sun</title>
		<link>http://www.foresight.org/nanodot/?p=194#comment-352</link>
		<dc:creator>Iron Sun</dc:creator>
		<pubDate>Wed, 30 Aug 2000 11:48:21 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=194#comment-352</guid>
		<description>&lt;p&gt;&lt;strong&gt;Re:&quot;Creepiness&quot; is a matter of opinion.&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;This post smacks of what I find to be disturbingly common transhuman Olympian arrogance. Or perhaps petulance.&lt;/p&gt;
&lt;p&gt;For a start, organ transplants and blood transfusions have brought great benefits, sure, but they have also given us a new set of problems. HIV screening of blood donors and Chinese death row prisoners being organ harvested, for example. This isn&#039;t to say that these procedures that save so many lives should be discontinued, but it shows that these are complex issues that must be thoroughly examined, not by individuals, but by society. It is far too easy to label an activity that one wishes to participate in as &quot;rational and enlightened&quot; as a way of stifling just such a debate on its merits.&lt;/p&gt;
&lt;p&gt;&quot;Squeamish&quot; is also a potentially short-sighted way of viewing misgivings about these issues. Atomic power may have brought many benefits, but few people extol the health benefits of radium-impregnated underpants anymore. We have absolutely no idea what sort of long-term effects these technologies will have on the human (yeah, yeah - or posthuman) psyche. Prudence would seem to be called for, but it seems to me that a lot of the most vociferous advocates for a headlong rush toward such transformations are behaving like an impatient, tantrum-throwing toddler who doesn&#039;t understand why Mummy and Daddy won&#039;t let them have a go driving the car NOW.&lt;/p&gt;
&lt;p&gt;Calling Mary Shelley &quot;foolish&quot; is just stupid. The urge to play God is a dark element of human nature that deserves to be written about in a way that will make people think about their actions. This example may sound like hysterical rhetoric, but Josef Mengele may well have used the same word to describe Frankenstein. If you try to tell me that we are different or more moral than the Nazis I will laugh in your face.&lt;/p&gt;
&lt;p&gt;The important thing here is to realise that a lot of so-called &quot;regular&quot; people are scared shitless about all this talk of uploading brains and so on. To dismissively label their fears as irrational or unenlightened won&#039;t help change their minds. You can believe in the rightness of your position all you want, but if a grassroots campaign to legislate against such technology is started, I don&#039;t think being arrogant or elitist will help.&lt;/p&gt;
&lt;p&gt;Disclaimer: I use the phrase &quot;playing God&quot; but I am not religious. I use the word &quot;prudence&quot; but I believe in progress. I think that these technologies will be liberating, that they will make possible wonderful, creative ways of life that I for one want to be a part of. I just think that we need to be careful.&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>Re:&quot;Creepiness&quot; is a matter of opinion.</strong></p>
<p>This post smacks of what I find to be disturbingly common transhuman Olympian arrogance. Or perhaps petulance.</p>
<p>For a start, organ transplants and blood transfusions have brought great benefits, sure, but they have also given us a new set of problems. HIV screening of blood donors and Chinese death row prisoners being organ harvested, for example. This isn&#39;t to say that these procedures that save so many lives should be discontinued, but it shows that these are complex issues that must be thoroughly examined, not by individuals, but by society. It is far too easy to label an activity that one wishes to participate in as &quot;rational and enlightened&quot; as a way of stifling just such a debate on its merits.</p>
<p>&quot;Squeamish&quot; is also a potentially short-sighted way of viewing misgivings about these issues. Atomic power may have brought many benefits, but few people extol the health benefits of radium-impregnated underpants anymore. We have absolutely no idea what sort of long-term effects these technologies will have on the human (yeah, yeah &#8211; or posthuman) psyche. Prudence would seem to be called for, but it seems to me that a lot of the most vociferous advocates for a headlong rush toward such transformations are behaving like an impatient, tantrum-throwing toddler who doesn&#39;t understand why Mummy and Daddy won&#39;t let them have a go driving the car NOW.</p>
<p>Calling Mary Shelley &quot;foolish&quot; is just stupid. The urge to play God is a dark element of human nature that deserves to be written about in a way that will make people think about their actions. This example may sound like hysterical rhetoric, but Josef Mengele may well have used the same word to describe Frankenstein. If you try to tell me that we are different or more moral than the Nazis I will laugh in your face.</p>
<p>The important thing here is to realise that a lot of so-called &quot;regular&quot; people are scared shitless about all this talk of uploading brains and so on. To dismissively label their fears as irrational or unenlightened won&#39;t help change their minds. You can believe in the rightness of your position all you want, but if a grassroots campaign to legislate against such technology is started, I don&#39;t think being arrogant or elitist will help.</p>
<p>Disclaimer: I use the phrase &quot;playing God&quot; but I am not religious. I use the word &quot;prudence&quot; but I believe in progress. I think that these technologies will be liberating, that they will make possible wonderful, creative ways of life that I for one want to be a part of. I just think that we need to be careful.</p>
]]></content:encoded>
	</item>
</channel>
</rss>