<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
		>
<channel>
	<title>Comments on: Report sparks technology utopia dialog</title>
	<atom:link href="http://www.foresight.org/nanodot/?feed=rss2&#038;p=1254" rel="self" type="application/rss+xml" />
	<link>http://www.foresight.org/nanodot/?p=1254</link>
	<description>examining transformative technology</description>
	<lastBuildDate>Wed, 03 Apr 2013 18:23:47 +0000</lastBuildDate>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=3.0.4</generator>
	<item>
		<title>By: bhoover</title>
		<link>http://www.foresight.org/nanodot/?p=1254#comment-2756</link>
		<dc:creator>bhoover</dc:creator>
		<pubDate>Sun, 18 Aug 2002 15:32:19 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=1254#comment-2756</guid>
		<description>&lt;p&gt;&lt;strong&gt;Trouble In Paradise?&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;I&#039;m sympathetic to Farber&#039;s concerns about conflict and human nature.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;We are still a society partly bent on self-destruction; our appetite for violence and reality TV remains intact. Advances over the next 20 years won&#039;t change human nature unless everyone gets gene therapy and we end up with a society of smiley faces. In fact, the report devotes an extensive chapter to the future of war and combat, which envisions a battlefield occupied by uninhabited combat vehicles and soldiers with enhanced physical and mental capacities.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Here&#039;s an attempt to narrow human conflict down to two general categories: conflicting goals, and the desire to control.&lt;/p&gt;
&lt;p&gt;1) &lt;strong&gt;conflicting goals&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;a) Under this category is the familiar &lt;strong&gt;competition for resources&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;b) Also under this category is conflict of the sort: &#039;I want to do such and such, which precludes another from doing this or that.&#039; For instance, &#039;the Euroamerican Gateway Committee could not agree on whether it would be more economically feasible to build a transatlantic tunnel, or invest in a new high-capacity teletransportation portal.&#039;&lt;/p&gt;
&lt;p&gt;2) &lt;strong&gt;control&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;a) &lt;strong&gt;predictability&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;We often have a desire to control one another, to make a person do as we would like them to, if for no other reason than for the sake of predictability. We humans are obviously real big on control in general - otherwise, why science :). But I&#039;m not sure this one alone is of war-scale caliber, so to speak - even Hitler&#039;s motives were (one would hope, at least) related to resource competition.&lt;/p&gt;
&lt;p&gt;b) &lt;strong&gt;power lust&lt;/strong&gt; (pathological?)&lt;/p&gt;
&lt;p&gt;Imagine a world where everyone has everything they need, and more - brought to you by Nanotechnology! Happy, happy, joy, joy! What could possibly be the problem? Oh, but wait, here comes the ugly specter, Mr. Pathological Powerluster! Oooo, you&#039;d better do as he says, or he&#039;ll take away your molecular french fry machine. This one&#039;s much like the control thing, except on a war or tyranny scale &#039;cause we&#039;re talking heads-of-state-level control here - &lt;em&gt;what did Johnny&#039;s parents &lt;strong&gt;do&lt;/strong&gt; to that poor child?&lt;/em&gt; There&#039;s nothing more satisfying than a good day behind the puppeteer&#039;s stage.&lt;/p&gt;
&lt;p&gt;c) &lt;strong&gt;ego&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Here, I was thinking in terms of hurt feelings, and revenge, aggression, that sort of thing.&lt;/p&gt;
&lt;p&gt;Theoretically, nanotechnology could eliminate competition-for-resources problems, and this is the really big one, and the most &quot;reasonable&quot; one (though still not an excuse for tyranny). But could we sidestep other goal-conflict problems?&lt;/p&gt;
&lt;p&gt;As for 2, the &quot;unreasonable&quot; ones, maybe gene therapy indeed? Would such tampering dilute the very attributes that brought humans to nanotechnology in the first place?&lt;/p&gt;
&lt;p&gt;But I would think the only serious threat to a nanotechnology utopia would be the power-lust problem. These are the kind of folks we need to worry about.&lt;/p&gt;
&lt;p&gt;I don&#039;t know. I took a shot at it. I think I&#039;m in the ballpark. At the very least, this is a beginning analysis of human conflict (not that this isn&#039;t already being done).&lt;/p&gt;
&lt;p&gt;The point is, there&#039;ll still be problems even with the elimination of competition for resources. I think mainly these will be along the lines of 1-b and 2-b, with 2-b being the only serious threat (aside from psychos).&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>Trouble In Paradise?</strong></p>
<p>I&#39;m sympathetic to Farber&#39;s concerns about conflict and human nature.</p>
<p><em>We are still a society partly bent on self-destruction; our appetite for violence and reality TV remains intact. Advances over the next 20 years won&#39;t change human nature unless everyone gets gene therapy and we end up with a society of smiley faces. In fact, the report devotes an extensive chapter to the future of war and combat, which envisions a battlefield occupied by uninhabited combat vehicles and soldiers with enhanced physical and mental capacities.</em></p>
<p>Here&#39;s an attempt to narrow human conflict down to two general categories: conflicting goals, and the desire to control.</p>
<p>1) <strong>conflicting goals</strong></p>
<p>a) Under this category is the familiar <strong>competition for resources</strong>.</p>
<p>b) Also under this category is conflict of the sort: &#39;I want to do such and such, which precludes another from doing this or that.&#39; For instance, &#39;the Euroamerican Gateway Committee could not agree on whether it would be more economically feasible to build a transatlantic tunnel, or invest in a new high-capacity teletransportation portal.&#39;</p>
<p>2) <strong>control</strong></p>
<p>a) <strong>predictability</strong></p>
<p>We often have a desire to control one another, to make a person do as we would like them to, if for no other reason than for the sake of predictability. We humans are obviously real big on control in general &#8211; otherwise, why science <img src='http://www.foresight.org/nanodot/wp-includes/images/smilies/icon_smile.gif' alt=':)' class='wp-smiley' /> . But I&#39;m not sure this one alone is of war-scale caliber, so to speak &#8211; even Hitler&#39;s motives were (one would hope, at least) related to resource competition.</p>
<p>b) <strong>power lust</strong> (pathological?)</p>
<p>Imagine a world where everyone has everything they need, and more &#8211; brought to you by Nanotechnology! Happy, happy, joy, joy! What could possibly be the problem? Oh, but wait, here comes the ugly specter, Mr. Pathological Powerluster! Oooo, you&#39;d better do as he says, or he&#39;ll take away your molecular french fry machine. This one&#39;s much like the control thing, except on a war or tyranny scale &#39;cause we&#39;re talking heads-of-state-level control here &#8211; <em>what did Johnny&#39;s parents <strong>do</strong> to that poor child?</em> There&#39;s nothing more satisfying than a good day behind the puppeteer&#39;s stage.</p>
<p>c) <strong>ego</strong></p>
<p>Here, I was thinking in terms of hurt feelings, and revenge, aggression, that sort of thing.</p>
<p>Theoretically, nanotechnology could eliminate competition-for-resources problems, and this is the really big one, and the most &quot;reasonable&quot; one (though still not an excuse for tyranny). But could we sidestep other goal-conflict problems?</p>
<p>As for 2, the &quot;unreasonable&quot; ones, maybe gene therapy indeed? Would such tampering dilute the very attributes that brought humans to nanotechnology in the first place?</p>
<p>But I would think the only serious threat to a nanotechnology utopia would be the power-lust problem. These are the kind of folks we need to worry about.</p>
<p>I don&#39;t know. I took a shot at it. I think I&#39;m in the ballpark. At the very least, this is a beginning analysis of human conflict (not that this isn&#39;t already being done).</p>
<p>The point is, there&#39;ll still be problems even with the elimination of competition for resources. I think mainly these will be along the lines of 1-b and 2-b, with 2-b being the only serious threat (aside from psychos).</p>
]]></content:encoded>
	</item>
</channel>
</rss>