

Foresight Update 3


A publication of the Foresight Institute



Nanometer Molecule-Zapping

Steps toward controlling single chemical reactions

by Ralph Merkle

"One may now reasonably ask if it is possible to move and alter matter predictably on an atomic scale…we have evidence that we can remove a portion of a pinned molecule, effectively performing transformations on single molecules using the tunneling microscope," say John S. Foster, Jane E. Frommer, and Patrick C. Arnett of IBM's Almaden Research Center in a recent article in Nature[1].

The scanning tunneling microscope, as most of you know, is conceptually quite simple. It uses a sharp, electrically-conductive needle to scan a surface. The position of the tip of the needle is controlled to within 0.1 ångstrom (less than the radius of a hydrogen atom) using a voltage-controlled piezo-electric drive. When the tip is within a few ångstroms of the surface and a small voltage is applied to the needle, a tunneling current flows from the tip to the surface. This tunneling current is then detected and amplified, and can be used to map the shape of the surface, much as a blind man's stick can reveal the shape of an object.
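The precision comes from the exponential sensitivity of the tunneling current to the width of the gap. The following minimal sketch (in Python; the barrier height, bias voltage, and prefactor are illustrative assumptions, not figures from the article) uses the standard one-dimensional tunneling estimate I ~ V exp(-2*kappa*d) to show that moving the tip a single ångstrom changes the current by roughly an order of magnitude, which is what lets the instrument resolve individual atoms:

    import math

    # One-dimensional tunneling estimate: I ~ V * exp(-2 * kappa * d),
    # where kappa is set by the effective barrier height (work function).
    HBAR = 1.0545718e-34   # Planck's constant / 2*pi, in J*s
    M_E = 9.10938e-31      # electron mass, in kg
    EV = 1.602177e-19      # joules per electron-volt

    def tunneling_current(gap_angstroms, barrier_ev=4.5, bias_volts=0.1):
        # Relative tunneling current across a vacuum gap (illustrative values).
        kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR   # decay constant, 1/m
        return bias_volts * math.exp(-2 * kappa * gap_angstroms * 1e-10)

    ratio = tunneling_current(4.0) / tunneling_current(5.0)
    print(f"Closing the gap by 1 angstrom raises the current about {ratio:.0f}-fold")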

In the new work, the surface is atomically smooth graphite with a drop of dimethyl phthalate (a liquid) on its surface. (The type of organic liquid does not seem critical; many other compounds have been used.) The needle is electrochemically etched tungsten, and is immersed in the liquid. Not only can the graphite surface be imaged in the normal way, but a voltage pulse applied to the needle (3.7 volts for 100 nanoseconds) can 'pin' one of the organic molecules to the surface, where it can be viewed in the normal fashion. A second voltage pulse applied at the same location can remove the pinned molecule (though it often randomly pins other molecules in an as-yet uncontrollable way). In some cases, the voltage pulse will remove only part of the pinned molecule, leaving behind a molecularly altered fragment.

The first application that comes to mind is a very high density memory. The minimum spot size demonstrated in the new work is 10 ångstroms, though a somewhat larger size might be required in practice. If we assume that a single bit can be read or written into a 10 ångstrom square, then a one square centimeter surface can hold 10^14 bits. That's one hundred terabits, or roughly twelve terabytes. The 100 nanosecond pulse time sets a 10 megabit/second maximum write rate, though this might be degraded for other reasons. At this rate, it would take several months to a year of constant writing to fill a one square centimeter memory. Access times will probably be limited by the time needed to move the needle--which might be a significant fraction of a second to travel one centimeter--giving access times similar to those on current disk drives. The manufacturing cost of such a system is unclear, but the basic components do not seem unduly expensive. It seems safe to predict that someone in the not-too-distant future is going to build a low-cost, very large capacity secondary storage device (disk replacement) based on this technology.
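Those figures follow from simple arithmetic on the numbers above; a quick back-of-the-envelope sketch (Python) makes the capacity and fill-time estimates explicit:

    # Back-of-the-envelope capacity and fill time from the figures quoted above.
    spot_nm = 1.0                          # 10 angstrom spot = 1 nanometer per bit
    nm2_per_cm2 = 1e14                     # one square centimeter in square nanometers
    bits = nm2_per_cm2 / spot_nm ** 2      # about 1e14 bits per square centimeter

    pulse_s = 100e-9                       # 100 nanosecond write pulse
    write_rate = 1.0 / pulse_s             # 10 million bits per second, at best

    fill_days = bits / write_rate / 86_400
    print(f"{bits:.0e} bits per cm^2, filled in about {fill_days:.0f} days of writing")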

The larger implication of this work, however, is that it may put us on the threshold of controlled molecular manipulation. While we can easily imagine more powerful techniques than poking at objects with a sharpened stick (we clearly want a pair of molecular-sized hands), the great virtue of this technique is that we need not imagine it at all--it is real and is being pursued in many laboratories. Even better, we can imagine incremental improvements in this technique that ought to be achievable--using, perhaps, two sharpened sticks (chopsticks, anyone?) and shaping the tip of the stick in a more refined and controlled way. The tip, viewed at the atomic scale, is rather rough, and there seems no reason why we cannot do better--perhaps by examining and modifying one stick with the other stick.

These larger implications have not been lost on the scientific community--in an editorial on atomic-scale engineering in the same issue of Nature, J. B. Pethica of the Oxford Department of Materials Science says that the scanning tunneling microscope has "…become one of the principal gedanken tools for nanotechnology--the proposed direct manipulation of matter, especially biological, on the atomic scale," and "The work of Foster et al.[1] represents a significant attempt at the much more important and difficult problem of the direct manipulation of the structure of biological materials."[2]

Dr. Ralph Merkle's interests range from neurophysiology to computer security. He currently works in the latter field at Xerox PARC.

References

  1. "Molecular Manipulation using a Tunnelling Microscope," by J. S. Foster, J. E. Frommer and P. C. Arnett; Nature, Vol. 331, No. 28, 28 Jan. 1988, pp. 324-326.
  2. "Atomic Scale Engineering," by J. B. Pethica, op. cit. p. 301.


Science Court Concept Abused

Report by Chris Peterson

In a recent "Science Court" cover story in OMNI magazine, writer Ed Regis asked prominent scientists to decide cases involving both scientific or technological questions and ethical issues--the sorts of issues normally settled by social norms, legislation, or a court of law. This mixing of issues violates the most basic premise of the science court (SC) procedure, developed by Arthur Kantrowitz, whose ideal is the separation of scientific and technological questions from legal, ethical, and emotional ones.

The SC goal is not to permit scientists to make pronouncements on public policy issues, but rather just the opposite: to enable society to extract from expert communities their best available understanding of scientific and technological facts, burdened by a minimum of personal opinion from the technical people involved. This technical understanding could then be used by legislators, judges, and other policymakers selected by society in the usual ways. Proponents of the idea readily admit that perfect separation of facts from values is not possible, but maintain that we as a society could get a clearer understanding of technical realities by means of the SC procedure than by means of media wars, secret committees, and congressional hearings.

The OMNI article features "decisions" from ten prominent scientists including physicist Stephen Hawking, MIT's Seymour Papert, Edward Teller, and computer scientist Joseph Weizenbaum. They were asked to make legal or ethical pronouncements on surrogate motherhood, genetic engineering in humans, alleged psychic powers, patenting genetically engineered animals, and ownership of ancient human bones.

One scientist, the late Richard Feynman, refused to participate on the excellent grounds that scientists have no special ability to solve legal and ethical issues: "Suppose I had one hundred percent access to the facts and one hundred percent knowledge of the laws of nature. None of this would tell me whether a surrogate mother should keep her baby or whether designer animals ought to be patented."

The other participants presumably were either unfamiliar with the original SC concept or were unable to resist the temptation to mix their personal ethical views with their scientific knowledge--a temptation the SC procedure is designed to circumvent.

The SC is not a new proposal; it has been endorsed by various presidential candidates in past elections. Originally seen as a function within government, the idea has evolved into a procedure which could be used in a decentralized way, for example at universities.

As stated by Arthur Kantrowitz (now a professor at Dartmouth) in his letter of correction to OMNI, "There have been exercises at Berkeley and Dartmouth which have helped in developing procedures. But the task is difficult partially because some scientists prefer high priests' robes to labcoats. Again some people prefer not having to stretch their minds enough to deal with the moral and ethical problems posed by a science-based technology which grows more and more powerful at an explosive rate. Those who would control this force ... must get the scientific facts from the scientific community. However, they must form their own moral and ethical judgments."

The confusion about the role of the science court is partly due to its name, which was given to Dr. Kantrowitz's idea by the media. It implies a similarity to a traditional court of law, which by its nature cannot confine itself to matters of technical fact. The name gives the impression that the SC could make public policy, just as today's courts of law effectively make public policy by determining how laws are interpreted. Here at FI we substitute the term "fact forum"; Dr. Kantrowitz now uses the term "scientific adversary procedure."

Another problem the SC meme has faced is the difficulty of arranging in-person meetings of busy technical people who, by definition, are adversaries on some issue. FI believes that the establishment of hypertext publishing systems will support online fact forum procedures. Meanwhile, existing software such as DocuForum is being investigated; readers with suggestions on this should contact FI.

For more on the SC/fact forum idea, see an account of the first three trial SC procedures in a university setting, to be published by Roger Masters and Arthur Kantrowitz in the upcoming book Technology and Politics (ed. Michael Kraft and Norman Vig, Duke University Press, in press). A basic explanation of the idea is available in Engines of Creation (K. Eric Drexler, Doubleday, 1986).--Editor

[Note: More recent information is available on the Web at "Twenty-Five Year Retrospective on the Science Court"]



Atoms, Bits, and Mechanisms

by K. Eric Drexler

When faced with something as novel as nanotechnology, it makes sense to look for familiar analogies. Previous publications have compared nanomachines to conventional macromachines, but in important ways nanomachines more closely resemble software systems. Consider the properties of software and conventional machines, then the parallels with assembler-built nanomachines.

Macromachines are made of parts which contain vast numbers of atoms in ill-defined patterns. Having so many atoms, these parts can be made in what amounts to a continuum of sizes and shapes, formed by continuous, analog techniques--molding, cutting, grinding, etching, and so forth. These parts are always imprecise. Machines are made by fitting parts together; in a good design, imprecisions won't add up to exceed overall tolerances. In operation, parts typically change shape slowly--they wear out and fail.

Software mechanisms differ radically. Their parts consist of discrete bits in defined patterns--they do not form a continuum. There is no need to make bits, as there is to make mechanical parts. The fabrication of bit-patterns is a precise, digital process; it is either entirely correct or clearly wrong, never "just a little off." The position of one bit with respect to another is as precise as the mathematical position of "two" with respect to "three."

The digital mechanisms which underlie this precision are made of imprecise devices, but these devices have distinct patterns of interconnection and distinct "on" and "off" states. Failures in the underlying devices can cause sporadic errors in memory and logic, yet if the devices operate within their design tolerances, errors (give or take an occasional cosmic ray) will be completely absent. Digital precision emerges from imperfect devices through a process like that of the automatic alignment found in many computer graphics programs: a device in any state that is nearly-right snaps into a neighboring state that is entirely-right. Each entirely-right state follows from a previous entirely-right state, with no buildup of small errors in, say, the size or alignment of the bits.
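A toy illustration of this restoring behavior (not from the article; the noise amplitude and threshold below are assumed purely for the sketch): each stage reads a noisy level, snaps it to the nearest valid state, and passes a clean value on, so errors never accumulate no matter how many stages the bit passes through:

    import random

    def snap(level):
        # Restore a nearly-right analog level to an entirely-right logic state.
        return 1 if level > 0.5 else 0

    def imperfect_stage(bit, noise=0.2):
        # One imperfect device: its output wanders a little, then gets snapped.
        analog = bit + random.uniform(-noise, noise)
        return snap(analog)

    bit = 1
    for _ in range(10_000):            # pass the bit through 10,000 sloppy stages
        bit = imperfect_stage(bit)
    print(bit)                         # still exactly 1; no buildup of small errors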

Nanomechanisms do have obvious similarities to conventional mechanisms. Unlike software, they will be made of parts having size, shape, mass, strength, stiffness, and so forth. They will often include gears, bearings, shafts, casings, motors, and other familiar sorts of devices designed in accord with familiar principles of mechanical engineering. In most respects, nanomechanical parts will resemble conventional parts, but made with far, far fewer atoms. They will little resemble the algorithms and data structures of software.

And yet their similarity to software and digital mechanisms will be profound. As software consists of discrete patterns of bits, so nanomechanisms will consist of discrete patterns of atoms. Atoms, like bits, need not be made; they are both flawless and available without need for manufacture. The parts of nanomechanisms will not form a continuum of shapes, built by inaccurate analog processes; they will instead be chosen from a discrete set of atom-patterns, and (like bit patterns) these patterns will be either entirely correct or clearly wrong. In stacking part on part there will be no buildup of small errors, as there is in conventional systems.

As in digital circuits and computer graphics programs, a principle of automatic alignment comes into play. When an assembler arm positions a reactive group against a workpiece, forcing a reaction, imprecision of the arm's alignment won't cause imprecision in the position of the added atoms. In making a well-bonded object, molecular forces will snap the atoms either into the proper position, or into a clearly wrong position. (As Marvin Minsky remarks, quantum mechanics doesn't always make things more uncertain--quantum states can be extraordinarily definite and precise.) Assembly can with high reliability yield a perfect result.
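A toy version of that positional snap (the site spacing and capture radius below are assumed for illustration, not taken from any real chemistry): the arm may be off by a fraction of an ångstrom, yet the deposited atom ends up on an exact site--either the intended one or a detectably wrong one, never "just a little off":

    # Illustrative model of snap-to-site placement during assembly.
    SITE_SPACING = 2.5      # angstroms between valid bonding sites (assumed)
    CAPTURE_RADIUS = 0.7    # tolerable arm error for a correct bond (assumed)

    def place_atom(target_site, arm_error):
        # The atom bonds at the nearest valid site: either the intended site
        # (a perfect placement) or a different one (a clearly wrong placement).
        arm_position = target_site * SITE_SPACING + arm_error
        return round(arm_position / SITE_SPACING)

    print(place_atom(10, arm_error=0.3))   # 10: snapped onto the intended site
    print(place_atom(10, arm_error=1.4))   # 11: a discrete, detectable error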

And again like software, nanomechanisms won't wear out. So long as all the atoms in a mechanism are present, properly bonded, and not in a distinct, excited state, the mechanism is perfect. If an atom is missing or displaced (say, by radiation damage) the mechanism isn't worn--it is broken.

In their shapes and functions, nanomechanisms will be much like ordinary machines. But in their discreteness of structure and associated perfection--to say nothing of their speed, accuracy, and replicability--nanomechanisms will share some of the fundamental virtues of software.




From Foresight Update 3, originally published 30 April 1988.


Foresight thanks Dave Kilbridge for converting Update 3 to html for this web page.


