
Report sparks technology utopia dialog

from the staking-out-the-future dept.
"What utopia can technology deliver?", a Tech Update article by Dan Farber (August 9, 2002), continues the dialog sparked by the NSF/DOC report Converging Technologies for Improving Human Performance. While recommending the report as an important document for considering what future technologies will bring, Farber finds some of the report's suggestions "hard to buy."

Although very much intrigued by the report, Farber has more trouble with its more extreme suggestions:

Solving all the world's problems through this convergence of science and technology requires a large stretch of the imagination, as do many of the concepts in the report. While I am fascinated by the science, the notion that "humanity would become like a single, transcendent nervous system, an interconnected 'brain' based in new core pathways of society," as suggested by the report's primary authors, is hard to buy.

Farber quotes one of the report's authors, Dr. James Canton:

"This is an optimal view of the future, not necessarily a Realpolitik of the future," Canton said. "We are trying to envision what we want to design for the future world of the next 20 years. If we don't put a stake in the ground and take innovation leadership, we can't work toward it. If we have a vision, we have the ability to transform large social systems. We are at the beginning of the process, but we want to give people a heads-up with the report and invite them to come along for the ride."

In wrapping up his take on the report, Farber echoes Foresight's theme on the importance of informed debate in preparing for future technologies:

It appears the goal of the report is to create an international initiative and funding to exploit the concepts in the report. It might be hyperbolic to say, as the report does, that nothing less than the future of humanity is at stake. But if you look at the potential that technology has to improve our lives, establishing a focus and framework for dialog is an essential step. Numerous complex ethical, legal, and policy issues will need to be resolved. The more those issues are anticipated and debated, the better chance for a successful resolution.

Attached to the article is a TalkBack forum where more than a dozen readers have given their views on the report. The Converging Technologies report was the subject of Nanodot posts on July 9, 2002, July 13, July 25, and August 7.

One Response to “Report sparks technology utopia dialog”

  1. bhoover Says:

    Trouble In Paradise?

    I'm sympathetic to Farber's concerns about conflict and human nature.

    We are still a society partly bent on self-destruction; our appetite for violence and reality TV remains intact. Advances over the next 20 years won't change human nature unless everyone gets gene therapy and we end up with a society of smiley faces. In fact, the report devotes an extensive chapter to the future of war and combat, which envisions a battlefield occupied by uninhabited combat vehicles and soldiers with enhanced physical and mental capacities.

    Here's an attempt to narrow human conflict down to two general categories: conflicting goals, and the desire to control.

    1) conflicting goals

    a) Under this category is the familiar competition for resources.

    b) Also under this category is the sort: 'I want to do such and such, which precludes another from doing this or that.' For instance, 'the Euroamerican Gateway Committee could not agree on whether it would be more economically feasible to build a transatlantic tunnel, or invest in a new high capacity teletransportation portal.'

    2) control

    a) predictability

We often have a desire to control one another, to make a person do as we would like them to, if for no other reason than for the sake of predictability. We humans are obviously real big on control in general – otherwise, why science :) . But I'm not sure this one alone is of war-scale caliber, so to speak – even Hitler's motives were (one would hope at least) resource-competition related.

    b) power lust (pathological?)

Imagine a world where everyone has everything they need, and more – brought to you by Nanotechnology! Happy, happy, joy, joy! What could possibly be the problem? Oh, but wait, here comes the ugly specter, Mr. Pathological Powerluster! Oooo, you'd better do as he says, or he'll take away your molecular french fry machine. This one's much like the control thing, except on a war or tyranny scale 'cause we're talking heads-of-state level control here – what did Johnny's parents do to that poor child? There's nothing more satisfying than a good day behind the puppeteer's stage.

    c) ego

    Here, I was thinking in terms of hurt feelings, and revenge, aggression, that sort of thing.

Theoretically, nanotechnology could eliminate competition-for-resources problems, and this is the really big one, and the most "reasonable" (though still not an excuse for tyranny) one. But could we sidestep other goal-conflict problems?

As for 2, the "unreasonable" ones, maybe gene therapy indeed? Would such tampering dilute the very attributes that brought humans to nanotechnology in the first place?

    But I would think the only serious threat to a nanotechnology utopia, would be the power lust problem. These are the kind of folks we need to worry about.

    I don't know. I took a shot at it. I think I'm in the ballpark. At the very least, this is a beginning analysis of human conflict (not that this isn't already being done).

The point is, there'll still be problems even with the elimination of competition for resources. I think mainly these will be along the lines of 1-b and 2-b, with 2-b being the only serious threat (aside from psychos).
