
Bill Joy debate on “terrible empowerment of extreme individuals”

from the glad-I-wasn't-there dept.
An article in the Feb 17 San Jose Mercury News' religion and ethics section entitled Guiding Science covered a debate between Bill Joy and various others including nanotechnologist James Heath: "Bill Joy is once more trumpeting the dangers of technology run amok…" Joy quoted a rabbi: "Zalman said, 'Maybe we should declare that nanotechnology isn't kosher; and maybe the pope should declare it a mortal sin'. I said, 'That's an interesting perspective. Most of the people in my company [Sun Microsystems] don't think like that.' " Heath is quoted as saying that "nanobots" are "science fiction". CP: Sigh — let's have some higher-quality debate on this topic. We'll try at the April 20-22 Foresight meeting.

14 Responses to “Bill Joy debate on “terrible empowerment of extreme individuals””

  1. GReynolds Says:

    Phoney Bravery

    Have you noticed how people who propose sacrificing the individual for the benefit of the group always act like there's something noble and selfless — and groundbreaking — in arguing for that approach? James C. Scott's book "Seeing Like a State" examines this in more detail. It's not a pretty picture.

  2. MarkGubrud Says:

    I am called Plow Man

    Bill Joy is breaking ground, and he's digging the plow in a bit deeper than I would, but I'm still glad someone's doing it, someone who can get some goddam press. People who condemn Joy's most extreme positions (or more often, misrepresentations of them), while ignoring the more difficult issues that he raises, are not engaging in "high-quality debate."

    What is technology for? What are people for? Do people create technology to serve human needs and desires, or do people exist only to create technology? Are we letting technology determine its own destiny, and ours? Is all knowledge always good? Is all technological "progress" always good? Do we have to empower individuals to the point of collective insecurity?

    A lot of people get angry just hearing such questions asked. That's why we need people like Bill Joy (and myself) to keep asking them.

  3. Adam Burke Says:

    Re:I am called Irrigation Man

    I suspect most of western society would benefit from deeper analysis of pretty much everything. Analysis, that is, not post-modern rhetorical castles in the air.

    An analogy occurs to me. Search and replace nanotechnology with irrigation.

    Irrigation powered the flowering of multiple major ancient civilisations, but because they didn't think through the consequences carefully enough, it eventually destroyed them. Salinity left them floundering in the desert.

    A clumsy metaphor? Perhaps … feel free to vivisect it :)

  4. Saturngraphix Says:

    Re:I am called Plow Man

    First Problem
    "Perhaps science should stop manipulating genes, he suggested, even if new gene therapies might save a child from incurable cancer."

    Sure, Bill, we will allow children to die because the pace of technology makes you and a few others nervous.

    This guy takes the cake… really…

    I think that safeguarding technology and coming up with counters for misuse is necessary; however, I don't see how stopping progress in a field is a good thing. It appears that some education is in order here.

    The only good thing about this is that while Bill is going off about his fears, he also makes people interested enough to take a look at what he is talking about… and usually an educated person can make up their own mind about what's going on. Bill may be wielding a double-edged sword.

    Saturn

  5. Iron Sun Says:

    The middle path

    Participants in this debate, as in so many other areas of contention in society, must avoid the temptation to fall into either side of a mutually antagonistic dualism. Black and white, good and evil, mad scientist and luddite. The truth, as always, lies somewhere in between.

    There are so many possibilities that a mature MNT will make possible that scare me witless. We will have to do what we can to ensure that potential Unabombers don't commit atrocities on a scale previously limited to states, not individuals or small groups. The ethical considerations, questions of what it means to be human, are going to vex the greatest philosophers of our time. There are unlikely to be any easy answers, and solutions that satisfy everyone will prove to be as elusive and tenuous as Middle East peace agreements. But we have to consider the benefits as well as the risks.

    To choose a historical example, a nineteenth century U.S. President-to-be (I can't remember which one, I'm not American :-) wrote to the incumbent requesting that he not devote state resources to the development of railways, because of the negative effect it would have on the economy. Saddlers, carriage makers and others would find their livelihoods rendered unviable. People would be killed in accidents. Also, Nature itself obviously did not intend for people to move at such breakneck speeds.

    To go back further, the chief advisor of King Zimri-Lim of Mari pleaded with his master not to popularise the use of horses and chariots because of the effect it would have on the people.

    To anyone who would say that these examples are specious because the changes we face are far more profound, keep in mind that the contemporaries of those mentioned above would have felt that their world was changing just as fundamentally as we feel ours is. We view the changes made in their lifetimes with the benefits of hindsight. Every generation believes that it exists at a crucial moment in history, and that the decisions it must make are the most profound that will ever face humanity.

    Few of the changes wrought by these new technologies were universally beneficial, of course. The Hurrians, with their chariots, overthrew empires that had lasted for centuries and brought untold misery to countless people. Trains allowed the 'opening up' of the American West, with all that meant for the indigenous people. They also allowed people to die in high-speed accidents far more efficiently than carriages ever did. Mistakes were made, atrocities were committed. Should we give these technologies up entirely, or learn from our errors?

    On the other hand, enthusiasts about new technologies need to be wary of viewing the future through rose coloured glasses. Boosters of television claimed that it would be an unparalleled educational tool, and that people wouldn't watch it obsessively. Early promoters of the wonders of radioactivity extolled the health benefits of radium-impregnated underpants. Technologies seldom pan out as their most enthusiastic supporters hope.

    I don't see the extremist position advocated by Bill Joy as being one likely to work. In the article in question, for example, he suggests we may need to "set limits on free speech to stop the spread of dangerous technological knowledge. Not all information, not all computer coding, should be shared with just anyone anymore". Unless you dismantle the entire Net, that just ain't gonna happen, and the draconian, or even totalitarian, enforcement that would be necessary to make it stick would not be popular, to put it mildly. Also, we already have non-disclosure agreements and official secrets acts. Perhaps we simply need to modify these concepts, rather than institute an Inquisitional suppression of forbidden knowledge. I hope that he did not intend for that suggestion to be taken seriously, but was making a sort of ambit claim in order to provoke vigorous debate.

    We can't go back to the way we lived in the past. We can't stay where we are. We need to find a way forward that maximises human potential and happiness, all the while keeping in mind the lessons we should have learned from our past mistakes. How we're going to do that, I don't know.

    Any ideas?

  6. kurt Says:

    Bill Joy

    Bill Joy is a very intelligent man who has had a very creative career (he helped create the Solaris operating system). However, like other intelligent men such as Fred Hoyle and Timothy Leary, he is beginning to go off the deep end of the pool.

  7. CurtisShenton Says:

    Bill Joy-Necessary Evil

    Personally, I find the end Bill Joy seems to be working towards abhorrent. Perhaps I lack the proper empathy for the group, but Bill Joy's ideal vision for the future seems to be a totalitarian world government that suppresses the open dialogue of ideas and goes beyond conservatism to a completely static society. Presumably this society would persist up to the point the Sun swells into a red giant and the human race ends. I must admit I would really like to hear Bill Joy explain what his vision of the future would be like if everyone suddenly agreed with him; I suspect he hasn't thought through all the long-term consequences of his ideas. It seems that even in a Bill Joy paradise humanity will have to either develop a mature molecular-level technology (and beyond), with all the risks that implies, or become extinct.

    Even so, I do find Bill Joy's recent comments useful. He has created and expanded debates in the mainstream media, debates that were not otherwise taking place, on the risks we face from current and near-future technological developments. This is a good thing. While I am of the opinion that the only real defense against bio- or nano-terrorism is better bio or nano solutions already in place, the more people involved in discussing how new technologies are developed, what the priorities of development should be, and who should control them, the better.

    It was inevitable, as nanotechnology migrates from being dismissed as science fiction to actually producing results, that the field would attract at least one Jeremy Rifkin-like detractor who warns of doom and wants to stop everything. If someone is going to be chosen by the media as the opposing force in the debate (as part of a balanced presentation of the issues), I can hardly think the field could have done better than Bill Joy. He is smart, well versed in technological development, and becoming more and more educated about nano- and biotech development and its implications, and while extreme in his views he at least still seems rational.

  8. ChrisWeider Says:

    Collaborating on threat models?

    We're all sort of bouncing around here proposing different threat models, usually from quite different perspectives. Can we apply Drexler's Science Court ideas to a specific starting point (say the transformational technologies work that was done here a while back) and attempt to identify which threats are most credible in what time frames? For example, I would expect biologically-based NT threats to be much more likely in the short term than general assembler threats. I could propose a strawman along with a set of mechanisms for adding to it, if people wish… Chris

  9. G-Man Says:

    Odd man out

    I share some concerns (maybe not to the extent of Bill's) about the possible use of nanotech for harmful purposes. People still haven't made the leap to the fact that assemblers will allow individuals and corporations to make most of the things we have now, only cheaper and faster; it still seems like sci-fi. Hopefully our perspectives will change when an assembler is actually made.

  10. DavidMasterson Says:

    Re:I am called Irrigation Man

    In the past, societies that made a devastating mistake (like your destruction through irrigation) could always pack up and go elsewhere to try again. In a world of shrinking resources and greater potential (good and bad) consequences for our mistakes, is there anywhere left to go to "try again"?

  11. RobertBradbury Says:

    Re: Collaborating on threat models?

    Chris, you are correct that the biotech-based threats are more serious and do currently exist. (One can look at most of Nature as a biotech-based threat — just ask someone with HIV or malaria.) If one looks carefully at the entire development of the biotechnology industry, it provides a clear precedent for nanotechnology. The mid-1970s debates over whether recombinant DNA labs should be built in cities like Cambridge provide an interesting mirror to the discussions we will have regarding where assembler factories should be built, who should operate them, what kind of safety requirements they should have, etc.

    The best place for people to start is to educate themselves on what has been done in this area in the past — for example, the Russian bioweapons program as documented in Biohazard. Another source might be Weapons, Culture, and Self-Interest: Soviet Defense Managers in the New Russia. Using the keywords "biological weapons" in a book search at Amazon turns up a number of interesting titles. If we do not educate ourselves about the risks, we cannot have an educated discussion.

    Biotechnology also serves as an example of a dangerous technology we have managed relatively safely for 26+ years. As the Iraq situation has shown, however, the availability of the equipment and knowledge can allow states (or individuals) to rapidly develop rather nasty capabilities.

    As the ability of the "bad guys" to develop bad technologies grows, the "good guys" must develop more robust abilities to defeat those technologies: for example, rapid DNA sequencing capabilities, rapid protein structure determination, rapid drug development, rapid vaccine manufacturing capability, etc. I have a business plan circulating that addresses some of these areas, but it certainly would help to have others join the discussion and lobby for progress in related areas.

  12. RobertBradbury Says:

    What Jim Heath really thinks

    I had some correspondence with Jim Heath about his comment and he indicated that it was taken out of context (what else is new?). Jim's comment that nanobots were 'science fiction' was directed at nanobots as envisioned by Bill Joy.

    The vision that Bill has of nanobots is of the advanced, self-replicating kind (perhaps he's seen one too many Star Trek episodes where the Borg's self-replicating nanobots make them impossible to defeat…). I've pointed out to Jim that existing bacteria fit this description quite nicely (and that is one of the reasons they are used in bioweapons). I've also noted that it is entirely unnecessary for biobots or nanobots to be self-replicating, so the dangers of this aspect of nanotechnology can be minimized (noting the possible exception of the activities of stupid people or maniacs). Since the design of self-replicating nanobots is likely to be non-trivial (bacteria require ~350 parts to do this even in resource-rich environments), I would tend to agree that the label of 'science fiction' is perhaps appropriate for diamondoid self-replicating nanobots for now. If, on the other hand, Nano@Home were to prove feasible (so computers can design the parts) and one started to see the accumulation of self-replication components in the parts library, it might be time to see who was dedicating the computer time to the design efforts and consider restricting access to such parts on a need-to-know basis.

  13. ChrisWeider Says:

    Re:Collaborating on threat models?

    Request for information: Aren't most substances required for genetic manipulation restricted? If a major part of the regulatory regime is restricted access, isn't it possible to build atomic manipulators at home using unrestricted components? Thanks! Chris

  14. BryanBruns Says:

    Crosslink to comment and query

    In response to Mark Gubrud's comments on my review of The Ultimate Terrorist, I've posted some comments and a query: "Does Joy have 'rebuttable' arguments?"
