<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
		>
<channel>
	<title>Comments on: Nanotechnology and The Ultimate Terrorists</title>
	<atom:link href="http://www.foresight.org/nanodot/?feed=rss2&#038;p=457" rel="self" type="application/rss+xml" />
	<link>http://www.foresight.org/nanodot/?p=457</link>
	<description>examining transformative technology</description>
	<lastBuildDate>Wed, 03 Apr 2013 18:23:47 +0000</lastBuildDate>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=3.0.4</generator>
	<item>
		<title>By: MarkGubrud</title>
		<link>http://www.foresight.org/nanodot/?p=457#comment-1341</link>
		<dc:creator>MarkGubrud</dc:creator>
		<pubDate>Wed, 28 Feb 2001 21:32:33 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=457#comment-1341</guid>
		<description>&lt;p&gt;&lt;strong&gt;Re:Does Joy have &quot;rebuttable&quot; arguments?&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Bill Joy makes sweeping but distinct and coherent arguments which I think we can all understand and reply to.&lt;/p&gt;
&lt;p&gt;By way of comparison, you also write in quite general terms:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Stern&#039;s analysis does imply &quot;relaxing concerns&quot; in the sense of somewhat lower priority concerning threats from terrorist use of weapons of mass destruction, compared to other issues. However she clearly argues that a threat exists and deserves significant attention. More important, perhaps, are better priorities about how to respond to those concerns, including more preparation for coping with attacks and minimizing their damage, as well as vigilance about not allowing overreaction to destroy civil liberties, either currently or in a crisis response to a real attack.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;I could not agree more with every one of these statements. I have long felt, and argued, that the issue of terrorism has been overemphasized, over the past decade, both in general discussions of military and security policy and specifically in the context of nanotech, while the danger of a new or renewed arms race has been gratuitously dismissed.&lt;/p&gt;
&lt;p&gt;At the same time, it would be wrong to gratuitously dismiss the growing danger that arises from the proliferation of mass-destruction weapons, both into the hands of more and more states and to potential non-state terrorists.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;The point about &quot;terrible empowerment of extreme individuals&quot; is that such people already could have access to powerful means, contrary to what seems implicit in much of the alarmist rhetoric about nanotech and biotech.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;It is important to realize that material impediments are not the only factor mitigating the frequency of very deadly terror attacks. But it is just as important to realize that impediments &lt;em&gt;are&lt;/em&gt; a factor. If you make it easy to produce and use mass-destruction weapons, then sooner rather than later, someone will do it, perhaps for coherent reasons, perhaps not.&lt;/p&gt;
&lt;p&gt;If we&#039;re talking about an end of the world scenario, if there is something that someone could do that would destroy the world, and you made it easy for anyone to do, then I think we would not have long to live. Realistically, though, you are talking about a spectrum of possible attacks ranging up to deaths in the thousands or even millions. What I am saying is that prevailing conditions will determine that such events occur with some frequency, and the frequency of such events will be higher the more accessible you make the means to cause them.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Joy employs effective rhetoric for getting public attention, and highlights relevant philosophical issues about principles. Beyond that he doesn&#039;t seem to help advance specific discussion about what might be appropriate or &quot;effective controls&quot; to regulate nanotech or other technologies.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;You are right about this, but everyone can&#039;t do everything. Hats off to Bill for doing what he&#039;s doing. We&#039;ve all been stimulated by it, so let&#039;s get on with the business of &quot;specific discussion.&quot;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;It is still not clear to me whether Joy is now talking about anything similar to what Kurzweil calls &quot;fine-grained&quot; relinquishment&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;If Bill Joy came out with a &quot;fine-grained&quot; plan for control of technology, it would be easier for him to be perceived, and portrayed, as a flake. In his position, it works better to be a bit vague, to speak in generalities and to invite more specific discussion by others.&lt;/p&gt;
&lt;p&gt;On the other hand, I think Kurzweil is on target with the suggestion of &quot;fine-grained relinquishment.&quot; Ray is not really such a bad guy, when he isn&#039;t promoting evil.&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>Re:Does Joy have &quot;rebuttable&quot; arguments?</strong></p>
<p>Bill Joy makes sweeping but distinct and coherent arguments which I think we can all understand and reply to.</p>
<p>By way of comparison, you also write in quite general terms:</p>
<blockquote>
<p>Stern&#39;s analysis does imply &quot;relaxing concerns&quot; in the sense of somewhat lower priority concerning threats from terrorist use of weapons of mass destruction, compared to other issues. However she clearly argues that a threat exists and deserves significant attention. More important, perhaps, are better priorities about how to respond to those concerns, including more preparation for coping with attacks and minimizing their damage, as well as vigilance about not allowing overreaction to destroy civil liberties, either currently or in a crisis response to a real attack.</p>
</blockquote>
<p>I could not agree more with every one of these statements. I have long felt, and argued, that the issue of terrorism has been overemphasized, over the past decade, both in general discussions of military and security policy and specifically in the context of nanotech, while the danger of a new or renewed arms race has been gratuitously dismissed.</p>
<p>At the same time, it would be wrong to gratuitously dismiss the growing danger that arises from the proliferation of mass-destruction weapons, both into the hands of more and more states and to potential non-state terrorists.</p>
<blockquote>
<p>The point about &quot;terrible empowerment of extreme individuals&quot; is that such people already could have access to powerful means, contrary to what seems implicit in much of the alarmist rhetoric about nanotech and biotech.</p>
</blockquote>
<p>It is important to realize that material impediments are not the only factor mitigating the frequency of very deadly terror attacks. But it is just as important to realize that impediments <em>are</em> a factor. If you make it easy to produce and use mass-destruction weapons, then sooner rather than later, someone will do it, perhaps for coherent reasons, perhaps not.</p>
<p>If we&#39;re talking about an end of the world scenario, if there is something that someone could do that would destroy the world, and you made it easy for anyone to do, then I think we would not have long to live. Realistically, though, you are talking about a spectrum of possible attacks ranging up to deaths in the thousands or even millions. What I am saying is that prevailing conditions will determine that such events occur with some frequency, and the frequency of such events will be higher the more accessible you make the means to cause them.</p>
<blockquote>
<p>Joy employs effective rhetoric for getting public attention, and highlights relevant philosophical issues about principles. Beyond that he doesn&#39;t seem to help advance specific discussion about what might be appropriate or &quot;effective controls&quot; to regulate nanotech or other technologies.</p>
</blockquote>
<p>You are right about this, but everyone can&#39;t do everything. Hats off to Bill for doing what he&#39;s doing. We&#39;ve all been stimulated by it, so let&#39;s get on with the business of &quot;specific discussion.&quot;</p>
<blockquote>
<p>It is still not clear to me whether Joy is now talking about anything similar to what Kurzweil calls &quot;fine-grained&quot; relinquishment</p>
</blockquote>
<p>If Bill Joy came out with a &quot;fine-grained&quot; plan for control of technology, it would be easier for him to be perceived, and portrayed, as a flake. In his position, it works better to be a bit vague, to speak in generalities and to invite more specific discussion by others.</p>
<p>On the other hand, I think Kurzweil is on target with the suggestion of &quot;fine-grained relinquishment.&quot; Ray is not really such a bad guy, when he isn&#39;t promoting evil.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: PaulKrieger000</title>
		<link>http://www.foresight.org/nanodot/?p=457#comment-1343</link>
		<dc:creator>PaulKrieger000</dc:creator>
		<pubDate>Mon, 26 Feb 2001 21:17:16 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=457#comment-1343</guid>
		<description>&lt;p&gt;&lt;strong&gt;Re:so you want to blow up a city...&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;1) What is the deal with the one solution approach?&lt;br /&gt;
&lt;br /&gt;
2) Why not take the benefits of substantial regulation with defense measures?&lt;br /&gt;
&lt;br /&gt;
3) Why not throw everything we have at this?&lt;br /&gt;
&lt;br /&gt;
I have a solution scenario but I do not want to go ntech public with it yet. It needs fine-tuning, but it is specific, unlike Joy&#039;s warnings. I suggest that everyone start studying up on strategy games. (Not Chess, the mental kind, persuasion perhaps.) I like to watch Aeon FLUX. Do you?&lt;br /&gt;
&lt;br /&gt;
I&#039;ve found the best way to play strategy, when the games are complex, is to treat it like a story. Characters or major players, motivations, plot, setting...etc. I&#039;m sure that most of my posts look stupid or simple; they are not, if you can read between the lines.&lt;br /&gt;
&lt;br /&gt;
When posing the character of the technicians, be brutally honest with yourself. The greatest men in history saw something powerful and knew that if they could rally the support of the people they would be kings. It&#039;s political, it&#039;s public and it&#039;s....&lt;br /&gt;
&lt;br /&gt;
&quot;Suspicion breeds confidence.&quot; - GOOD MOVIE.&lt;br /&gt;
I know I am risking flamebait with this one, but I had to; it was killing me.&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>Re:so you want to blow up a city&#8230;</strong></p>
<p>1) What is the deal with the one solution approach?</p>
<p>2) Why not take the benefits of substantial regulation with defense measures?</p>
<p>3) Why not throw everything we have at this?</p>
<p>I have a solution scenario but I do not want to go ntech public with it yet. It needs fine-tuning, but it is specific, unlike Joy&#39;s warnings. I suggest that everyone start studying up on strategy games. (Not Chess, the mental kind, persuasion perhaps.) I like to watch Aeon FLUX. Do you?</p>
<p>I&#39;ve found the best way to play strategy, when the games are complex, is to treat it like a story. Characters or major players, motivations, plot, setting&#8230;etc. I&#39;m sure that most of my posts look stupid or simple; they are not, if you can read between the lines.</p>
<p>When posing the character of the technicians, be brutally honest with yourself. The greatest men in history saw something powerful and knew that if they could rally the support of the people they would be kings. It&#39;s political, it&#39;s public and it&#39;s&#8230;.</p>
<p>&quot;Suspicion breeds confidence.&quot; &#8211; GOOD MOVIE.<br />
I know I am risking flamebait with this one, but I had to; it was killing me.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: RobertBradbury</title>
		<link>http://www.foresight.org/nanodot/?p=457#comment-1342</link>
		<dc:creator>RobertBradbury</dc:creator>
		<pubDate>Mon, 26 Feb 2001 08:53:57 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=457#comment-1342</guid>
		<description>&lt;p&gt;&lt;strong&gt;Re:so you want to blow up a city...&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;I&#039;m only going to make a brief comment. Bill was at the SA meeting last spring; I sat in on several conversations with him, Neil Jacobstein, Greg Burch, and others. I made the point, which Neil and even Bill seemed to accept, that &quot;The horse is already out of the barn.&quot; There are unregulated and uncontrolled &quot;matter compilers&quot; in the form of DNA and protein synthesizers scattered all around the world. Further, the development of such machines isn&#039;t particularly difficult, as the TIGR conference in Sept. 2000 had a poster from a group in China that had designed and constructed their own 96-channel DNA synthesizer.&lt;/p&gt;
&lt;p&gt;Unless you want to make an argument for ubiquitous surveillance and preemptive strikes, the regulation of foreign powers or rogue states &lt;em&gt;isn&#039;t&lt;/em&gt; going to happen.&lt;/p&gt;
&lt;p&gt;So Bill confuses the issue by compressing the time frame, making it seem like the GNR-nightmare will rise up out of the swamp tomorrow. The U.S. intelligence community tries to impose stop-gap measures by limiting the export of &quot;encryption&quot; technologies, and we find that the terrorists are the key people taking advantage of these technologies (what does this say -- that sociopaths are smart enough to figure out that they should be using the things we advertise they should not be using... Duh....).&lt;/p&gt;
&lt;p&gt;My perspective -- regulation isn&#039;t going to work. Simple, end of discussion. The only thing that will work is defenses against the things that could destroy you.&lt;/p&gt;
&lt;p&gt;Along with detecting those things, we may have to learn to live with some pretty ubiquitous and detailed surveillance.&lt;/p&gt;
&lt;p&gt;The alternative is moving to enclaves that have much more relaxed security (and higher risk) and where &quot;buyer beware&quot; is the rule of the day.&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>Re:so you want to blow up a city&#8230;</strong></p>
<p>I&#39;m only going to make a brief comment. Bill was at the SA meeting last spring; I sat in on several conversations with him, Neil Jacobstein, Greg Burch, and others. I made the point, which Neil and even Bill seemed to accept, that &quot;The horse is already out of the barn.&quot; There are unregulated and uncontrolled &quot;matter compilers&quot; in the form of DNA and protein synthesizers scattered all around the world. Further, the development of such machines isn&#39;t particularly difficult, as the TIGR conference in Sept. 2000 had a poster from a group in China that had designed and constructed their own 96-channel DNA synthesizer.</p>
<p>Unless you want to make an argument for ubiquitous surveillance and preemptive strikes, the regulation of foreign powers or rogue states <em>isn&#39;t</em> going to happen.</p>
<p>So Bill confuses the issue by compressing the time frame, making it seem like the GNR-nightmare will rise up out of the swamp tomorrow. The U.S. intelligence community tries to impose stop-gap measures by limiting the export of &quot;encryption&quot; technologies, and we find that the terrorists are the key people taking advantage of these technologies (what does this say &#8212; that sociopaths are smart enough to figure out that they should be using the things we advertise they should not be using&#8230; Duh&#8230;.).</p>
<p>My perspective &#8212; regulation isn&#39;t going to work. Simple, end of discussion. The only thing that will work is defenses against the things that could destroy you.</p>
<p>Along with detecting those things, we may have to learn to live with some pretty ubiquitous and detailed surveillance.</p>
<p>The alternative is moving to enclaves that have much more relaxed security (and higher risk) and where &quot;buyer beware&quot; is the rule of the day.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: BryanBruns</title>
		<link>http://www.foresight.org/nanodot/?p=457#comment-1340</link>
		<dc:creator>BryanBruns</dc:creator>
		<pubDate>Mon, 26 Feb 2001 05:48:02 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=457#comment-1340</guid>
		<description>&lt;p&gt;&lt;strong&gt;Does Joy have &quot;rebuttable&quot; arguments?&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Thanks for the comments, Mark. I agree that avoiding nanotech arms races is a priority, which is why I think it&#039;s worth putting the different risks into perspective and looking at the ways they might be managed.&lt;/p&gt;
&lt;p&gt;Stern&#039;s analysis does imply &quot;relaxing concerns&quot; in the sense of somewhat lower priority concerning threats from terrorist use of weapons of mass destruction, compared to other issues. However she clearly argues that a threat exists and deserves significant attention. More important, perhaps, are better priorities about how to respond to those concerns, including more preparation for coping with attacks and minimizing their damage, as well as vigilance about not allowing overreaction to destroy civil liberties, either currently or in a crisis response to a real attack.&lt;/p&gt;
&lt;p&gt;The point about &quot;terrible empowerment of extreme individuals&quot; is that such people already could have access to powerful means, contrary to what seems implicit in much of the alarmist rhetoric about nanotech and biotech. In addition to the &quot;crude&quot; chemical and bioweapons that Stern discusses, one can also point to fire, a dangerous &quot;self-replicating&quot; technology. History, from medieval urban warfare to WW II firestorms to all kinds of arson, shows both the threats, and some of the factors that have constrained its use.&lt;/p&gt;
&lt;p&gt;More generally, I&#039;m starting to feel that Bill Joy&#039;s ideas are so vaguely expressed that they can&#039;t be &quot;rebutted.&quot; Joy employs effective rhetoric for getting public attention, and highlights relevant philosophical issues about principles. Beyond that he doesn&#039;t seem to help advance specific discussion about what might be appropriate or &quot;effective controls&quot; to regulate nanotech or other technologies.&lt;/p&gt;
&lt;p&gt;The original Wired essay, &lt;a href=&quot;http://www.wired.com/wired/archive/8.04/joy.html&quot;&gt;&quot;Why the Future Doesn&#039;t Need Us&quot;&lt;/a&gt; at one point refers to &lt;a href=&quot;http://www.wired.com/wired/archive/8.04/joy.html?pg=10&amp;topic=&amp;topic_set=&quot;&gt;&lt;em&gt;&quot;relinquishment of certain GNR technologies,&quot;&lt;/em&gt;&lt;/a&gt; i.e. a selective approach of some sort. In the &lt;a href=&quot;http://dw.kqed.org/2000/08/11/transcript.html&quot;&gt;Digital West interview&lt;/a&gt;, which was &lt;a href=&quot;http://nanodot.org/article.pl?sid=00/08/17/239222&amp;mode=nested&quot;&gt;mentioned on Nanodot&lt;/a&gt;, Joy says &lt;em&gt;&quot;We need international cooperation and collective sanity&quot;&lt;/em&gt; and &lt;em&gt;&quot;We have the Internet that we can organize people, we can exchange information. We have some help from technology, but we have to become more mature to live with these things, and we have to do that collectively. We have to avoid an arms race in these technologies.&quot;&lt;/em&gt; The Nanodot post &lt;a href=&quot;http://nanodot.org/article.pl?sid=00/11/03/2213253&amp;mode=nested&quot;&gt;&quot;Bill Joy advocates common sense, not banning research&quot;&lt;/a&gt; linked to &lt;a href=&quot;http://www.wired.com/news/print/0,1294,39864,00.html&quot;&gt;Wired&#039;s discussion&lt;/a&gt; of the Camden Technology Conference, mentioning Joy as talking about &quot;self-imposed limitations&quot; and &quot;self-regulation.&quot; &lt;em&gt;&quot;Although some have misinterpreted his arguments as calling for a ban on some types of research, Joy said he is simply calling for a return to common sense.&quot; ... &quot;Joy suggested assessing technologies to gauge their implicit dangers, as well as having scientists refuse to work on technologies that have the potential to cause harm.&quot;&lt;/em&gt; The most recent article mentioned &lt;a href=&quot;http://nanodot.org/article.pl?sid=01/02/21/0143259&amp;mode=nested&quot;&gt;on Nanodot&lt;/a&gt; [the original Mercury News URL seems no longer available] said he raised questions about limits on freedom of speech, but doesn&#039;t seem to offer any more clarification about specific measures.&lt;/p&gt;
&lt;p&gt;It is still not clear to me whether Joy is now talking about anything similar to what Kurzweil calls &lt;a href=&quot;http://www.extropy.org/opinions.htm&quot;&gt;&lt;em&gt;&quot;fine-grained&quot;&lt;/em&gt;&lt;/a&gt; relinquishment or &lt;a href=&quot;http://www.usatoday.com/life/cyber/tech/review/crh112.htm&quot;&gt;&lt;em&gt;&quot;relinquishment at the right level,&quot;&lt;/em&gt;&lt;/a&gt; such things as not using genetic algorithms to program nanomachines and keeping access to smallpox restricted, or whether he is still envisioning sweeping &quot;relinquishment&quot; which many interpreted to be his intent in the original Wired Article. There also seems to be ambiguity (or contradiction, or a change in his views) about whether he is mainly talking about voluntary self-restraint, or regulation imposed with force of law (censorship, outlawing some specific kinds of research).&lt;/p&gt;
&lt;p&gt;I&#039;d be interested if anyone can point to someplace where Bill Joy lays out his views more specifically. If Joy does not have any more specific proposals, and is not interested in working through specific examples, as appears to be the case, then I&#039;m inclined to conclude that he is just raising alarm, for better or worse, and provocatively pointing out philosophical questions, but not really making any further substantive contribution to thinking on these topics.&lt;/p&gt;
&lt;p&gt;BTW, while prowling through this issue again on the web, I enjoyed rereading Glenn Reynolds&#039; essay &lt;a href=&quot;http://www.nationalreview.com/comment/commentprint070500c.html&quot;&gt;&quot;Wait a nanosecond: Crushing Nanotechnology would be a terrible thing&quot;&lt;/a&gt; and liked Arkuat&#039;s humorous account of the Spiritual Machines conference (go &lt;a href=&quot;http://crackmonkey.org/pipermail/crackmonkey/2000q2.txt&quot;&gt;here and search on &quot;relinquish&quot;&lt;/a&gt;).&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>Does Joy have &quot;rebuttable&quot; arguments?</strong></p>
<p>Thanks for the comments, Mark. I agree that avoiding nanotech arms races is a priority, which is why I think it&#39;s worth putting the different risks into perspective and looking at the ways they might be managed.</p>
<p>Stern&#39;s analysis does imply &quot;relaxing concerns&quot; in the sense of somewhat lower priority concerning threats from terrorist use of weapons of mass destruction, compared to other issues. However she clearly argues that a threat exists and deserves significant attention. More important, perhaps, are better priorities about how to respond to those concerns, including more preparation for coping with attacks and minimizing their damage, as well as vigilance about not allowing overreaction to destroy civil liberties, either currently or in a crisis response to a real attack.</p>
<p>The point about &quot;terrible empowerment of extreme individuals&quot; is that such people already could have access to powerful means, contrary to what seems implicit in much of the alarmist rhetoric about nanotech and biotech. In addition to the &quot;crude&quot; chemical and bioweapons that Stern discusses, one can also point to fire, a dangerous &quot;self-replicating&quot; technology. History, from medieval urban warfare to WW II firestorms to all kinds of arson, shows both the threats, and some of the factors that have constrained its use.</p>
<p>More generally, I&#39;m starting to feel that Bill Joy&#39;s ideas are so vaguely expressed that they can&#39;t be &quot;rebutted.&quot; Joy employs effective rhetoric for getting public attention, and highlights relevant philosophical issues about principles. Beyond that he doesn&#39;t seem to help advance specific discussion about what might be appropriate or &quot;effective controls&quot; to regulate nanotech or other technologies.</p>
<p>The original Wired essay, <a href="http://www.wired.com/wired/archive/8.04/joy.html">&quot;Why the Future Doesn&#39;t Need Us&quot;</a> at one point refers to <a href="http://www.wired.com/wired/archive/8.04/joy.html?pg=10&amp;topic=&amp;topic_set="><em>&quot;relinquishment of certain GNR technologies,&quot;</em></a> i.e. a selective approach of some sort. In the <a href="http://dw.kqed.org/2000/08/11/transcript.html">Digital West interview</a>, which was <a href="http://nanodot.org/article.pl?sid=00/08/17/239222&amp;mode=nested">mentioned on Nanodot</a>, Joy says <em>&quot;We need international cooperation and collective sanity&quot;</em> and <em>&quot;We have the Internet that we can organize people, we can exchange information. We have some help from technology, but we have to become more mature to live with these things, and we have to do that collectively. We have to avoid an arms race in these technologies.&quot;</em> The Nanodot post <a href="http://nanodot.org/article.pl?sid=00/11/03/2213253&amp;mode=nested">&quot;Bill Joy advocates common sense, not banning research&quot;</a> linked to <a href="http://www.wired.com/news/print/0,1294,39864,00.html">Wired&#39;s discussion</a> of the Camden Technology Conference, mentioning Joy as talking about &quot;self-imposed limitations&quot; and &quot;self-regulation.&quot; <em>&quot;Although some have misinterpreted his arguments as calling for a ban on some types of research, Joy said he is simply calling for a return to common sense.&quot; &#8230; &quot;Joy suggested assessing technologies to gauge their implicit dangers, as well as having scientists refuse to work on technologies that have the potential to cause harm.&quot;</em> The most recent article mentioned <a href="http://nanodot.org/article.pl?sid=01/02/21/0143259&amp;mode=nested">on Nanodot</a> [the original Mercury News URL seems no longer available] said he raised questions about limits on freedom of speech, but doesn&#39;t seem to offer any more clarification about specific measures.</p>
<p>It is still not clear to me whether Joy is now talking about anything similar to what Kurzweil calls <a href="http://www.extropy.org/opinions.htm"><em>&quot;fine-grained&quot;</em></a> relinquishment or <a href="http://www.usatoday.com/life/cyber/tech/review/crh112.htm"><em>&quot;relinquishment at the right level,&quot;</em></a> such things as not using genetic algorithms to program nanomachines and keeping access to smallpox restricted, or whether he is still envisioning sweeping &quot;relinquishment&quot; which many interpreted to be his intent in the original Wired Article. There also seems to be ambiguity (or contradiction, or a change in his views) about whether he is mainly talking about voluntary self-restraint, or regulation imposed with force of law (censorship, outlawing some specific kinds of research).</p>
<p>I&#39;d be interested if anyone can point to someplace where Bill Joy lays out his views more specifically. If Joy does not have any more specific proposals, and is not interested in working through specific examples, as appears to be the case, then I&#39;m inclined to conclude that he is just raising alarm, for better or worse, and provocatively pointing out philosophical questions, but not really making any further substantive contribution to thinking on these topics.</p>
<p>BTW, while prowling through this issue again on the web, I enjoyed rereading Glenn Reynolds&#39; essay <a href="http://www.nationalreview.com/comment/commentprint070500c.html">&quot;Wait a nanosecond: Crushing Nanotechnology would be a terrible thing&quot;</a> and liked Arkuat&#39;s humorous account of the Spiritual Machines conference (go <a href="http://crackmonkey.org/pipermail/crackmonkey/2000q2.txt">here and search on &quot;relinquish&quot;</a>).</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: MarkGubrud</title>
		<link>http://www.foresight.org/nanodot/?p=457#comment-1339</link>
		<dc:creator>MarkGubrud</dc:creator>
		<pubDate>Mon, 26 Feb 2001 00:55:32 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=457#comment-1339</guid>
		<description>&lt;p&gt;&lt;strong&gt;so you want to blow up a city...&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Nice review, Bryan; sounds like a reasonable book. The basic arguments you cite...&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;substantial technical and political obstacles that dissuade most groups from using weapons of mass destruction, including shortage of specialized skills, difficulties in obtaining or making weapons, uncertainties in delivery, and risk of political backlash.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;...have been made for many years, and I have long felt that they are persuasive in some contexts, but not conclusive, and dubious in other contexts.&lt;/p&gt;
&lt;p&gt;Certainly if a group is fighting for a goal that needs public sympathy, such as the liberation or independence of a people or region, it is not advisable to commit unforgivable acts of violence against innocent people. And it is generally not a good idea for a weaker power to provoke a stronger one.&lt;/p&gt;
&lt;p&gt;But consider Osama Bin Laden. He has incited or directed several very deadly and dramatic attacks on US diplomatic and military targets. Or consider the group that bombed the World Trade Center; their hope was to topple the building and kill thousands of people. These and other groups are engaged in what they view as a holy war against the United States, not out of irrational hatred, but outrage at the US role in the Middle East. We prop up the corrupt oil kingdoms, help to repress the discontent of the disenfranchised, impose sanctions and bomb Iraq, and support Israel without conditions. Why should they shy from killing Americans? Our sanctions have killed on the order of one million Iraqis, our bombing has killed tens of thousands, and Israel has killed hundreds of Palestinians in just the past few months. More to the point, if Bin Laden or one of his pals could pull off a truly damaging attack on US national territory, would that not serve their cause, which is to get us to pack up and go home?&lt;/p&gt;
&lt;p&gt;Suppose they could plant a nuke in Washington, D.C. and pick an opportune moment to detonate it. That would kill a lot of innocent people, but it would leave the US military and foreign policy apparatus in such disarray that it would almost certainly result in a US retreat from overseas engagement. Or at least, such an argument seems very plausible, and might be persuasive to a holy warrior if he happened to have a nuke on hand.&lt;/p&gt;
&lt;p&gt;Note that plausible chem or bio attacks would not have such a drastic or certain effect on US ability or will to act. Still, a campaign of such attacks, coupled with a clear message about the reasons for it, might be expected to have a strongly demoralizing effect over the long term.&lt;/p&gt;
&lt;p&gt;The ease of obtaining and using mass destruction weapons has often been overstated, but that does not constitute an argument for making it easier. In the case of nukes which might be on the loose today, it may be easier for a loose nuke to find a user than for a would-be user to find a loose nuke. I don&#039;t think we can draw any conclusions in favor of relaxing concerns about trends that make it easier to make or acquire WMD.&lt;/p&gt;
&lt;p&gt;You imply that Stern&#039;s analysis rebuts Bill Joy&#039;s concern about the dissemination of nanotechnology capabilities resulting in a &quot;terrible empowerment of extreme individuals.&quot; I don&#039;t think this implication is warranted. We can rebut hysteria and nihilism about the inevitability of Doom or the uncontrollability of tomorrow&#039;s nanohackers, but Joy is absolutely right to warn about the danger and to raise a call for effective controls.&lt;/p&gt;
&lt;p&gt;I am generally in agreement with your other comments about possible approaches to managing and reducing threats of terrorism and interstate war. I would not underplay the latter danger, though; it remains the principal and most critical threat to the survival of our civilization as it passes through this apparent technological singularity.&lt;/p&gt;

</description>
		<content:encoded><![CDATA[<p><strong>so you want to blow up a city&#8230;</strong></p>
<p>Nice review, Bryan; sounds like a reasonable book. The basic arguments you cite&#8230;</p>
<blockquote>
<p>substantial technical and political obstacles that dissuade most groups from using weapons of mass destruction, including shortage of specialized skills, difficulties in obtaining or making weapons, uncertainties in delivery, and risk of political backlash.</p>
</blockquote>
<p>&#8230;have been made for many years, and I have long felt that they are persuasive in some contexts, but not conclusive, and dubious in other contexts.</p>
<p>Certainly if a group is fighting for a goal that needs public sympathy, such as the liberation or independence of a people or region, it is not advisable to commit unforgivable acts of violence against innocent people. And it is generally not a good idea for a weaker power to provoke a stronger one.</p>
<p>But consider Osama Bin Laden. He has incited or directed several very deadly and dramatic attacks on US diplomatic and military targets. Or consider the group that bombed the World Trade Center; their hope was to topple the building and kill thousands of people. These and other groups are engaged in what they view as a holy war against the United States, not out of irrational hatred, but outrage at the US role in the Middle East. We prop up the corrupt oil kingdoms, help to repress the discontent of the disenfranchised, impose sanctions and bomb Iraq, and support Israel without conditions. Why should they shy away from killing Americans? Our sanctions have killed on the order of one million Iraqis, our bombing has killed tens of thousands, and Israel has killed hundreds of Palestinians in just the past few months. More to the point, if Bin Laden or one of his pals could pull off a truly damaging attack on US national territory, would that not serve their cause, which is to get us to pack up and go home?</p>
<p>Suppose they could plant a nuke in Washington, D.C. and pick an opportune moment to detonate it. That would kill a lot of innocent people, but it would leave the US military and foreign policy apparatus in such disarray that it would almost certainly result in a US retreat from overseas engagement. Or at least, such an argument seems very plausible, and might be persuasive to a holy warrior if he happened to have a nuke on hand.</p>
<p>Note that plausible chem or bio attacks would not have such a drastic or certain effect on US ability or will to act. Still, a campaign of such attacks, coupled with a clear message about the reasons for it, might be expected to have a strongly demoralizing effect over the long term.</p>
<p>The ease of obtaining and using mass destruction weapons has often been overstated, but that does not constitute an argument for making it easier. In the case of nukes which might be on the loose today, it may be easier for a loose nuke to find a user than for a would-be user to find a loose nuke. I don&#39;t think we can draw any conclusions in favor of relaxing concerns about trends that make it easier to make or acquire WMD.</p>
<p>You imply that Stern&#39;s analysis rebuts Bill Joy&#39;s concern about the dissemination of nanotechnology capabilities resulting in a &quot;terrible empowerment of extreme individuals.&quot; I don&#39;t think this implication is warranted. We can rebut hysteria and nihilism about the inevitability of Doom or the uncontrollability of tomorrow&#39;s nanohackers, but Joy is absolutely right to warn about the danger and to raise a call for effective controls.</p>
<p>I am generally in agreement with your other comments about possible approaches to managing and reducing threats of terrorism and interstate war. I would not underplay the latter danger, though; it remains the principal and most critical threat to the survival of our civilization as it passes through this apparent technological singularity.</p>
]]></content:encoded>
	</item>
</channel>
</rss>