Foresight Update 2
A publication of the Foresight Institute
Many readers have requested news and references on technical
progress related to nanotechnology. The brief summaries below
cover some recent advances in the three enabling technologies of
chemistry, micromanipulation, and protein engineering.
Although the devices are many orders of magnitude larger than
nanomachines, the micromechanical systems work of W. Trimmer and
K. Gabriel at Bell Labs explores the feasibility of small
electrostatic motors, a concept also useful on the nanometer
scale. Assuming the use of standard electronic materials (silicon
wafers, etc.), they present designs for electrostatic motors with
diameters as small as 1 mm (Sensors and Actuators, 11, p189).
Joined by first author M. Mehregany, they have fabricated
silicon gears down to a diameter of 300 microns and a
micro-turbine that turns at up to 24,000 rpm (draft, "Micro
Gears and Turbines Etched from Silicon").
On the micromanipulation path, R. Becker et al., also
of Bell Labs, have used a scanning tunneling microscope to make
atomic-scale modifications to a surface, albeit with limited
control; see discussion in
this issue (Nature, 29Jan87, p419).
On the chemical path to nanotechnology, J. Rebek of
U. Pittsburgh has found that synthetically-accessible small
molecules can be designed to "recognize" acids, bases,
amino acids, metal ions, and neutral substrates, abilities once
assumed to require complex biological macromolecules. It is
suggested that such small molecules can be designed and
efficiently assembled to recognize almost any small molecule or
ion, and that carbohydrates, peptides, and nucleotides will also
be recognizable in this way (Science, 20Mar87).
The protein design path is being pursued vigorously. Scripps
Clinic has received a $2.72 million NIH grant to study protein
folding. Principal investigator A. Olson
plans to combine NMR and X-ray crystallography data with computer
graphics techniques (Genetic Tech. News, Sep87, p8).
C. Pabo and E. Suchanek of Johns Hopkins School of Medicine have a
program, PROTEUS, which tests proposed modifications to a protein
and could be used in de novo design as well, perhaps by starting
with the desired backbone and then adding side chains (Biochem).
J. Ponder and F. Richards of Yale propose to develop templates
derived from the tertiary (i.e., high-level folding) structure of
known sequenced proteins. New sequences could be tested against
the templates to see whether they are likely to fold in a known
way (J Mol Biol, 193, p775). Both groups cite Drexler's 1981 PNAS
paper as a source for their general approach.
Blundell et al. of the University of London advocate
use of a knowledge-based approach for prediction of protein
structures and the design of novel molecules, using analogies
between the protein being modeled and known proteins. Subsequent
X-ray analysis of several modeled proteins showed good agreement
with the predicted shape (Nature, 26Mar87, p347).
J. Leszczynski and G. Rose of
Penn State have identified a new type of protein secondary
structure termed the omega loop, an omega-shaped structure
extending out from the protein surface. These loops could have
important roles in recognition and may serve as convenient
modules for use in protein design (Science, 14Nov86).
A. Napper et al. of Penn State and Scripps have used an
evolutionary approach based on variation and selection within an
organism's immune system, rather than deliberate design, to find
a new agent of catalysis for a given chemical reaction. They
obtained monoclonal antibodies elicited by an analog of the
reaction's transition state, then used the antibodies as
catalysts, facilitating formation of the transition state and
increasing the reaction rate by a factor of 170 (Science).
A. Lapedes and R. Farber at Los Alamos are using neural-net
simulation on a supercomputer to predict which short DNA
sequences code for proteins and which do not, with 80% accuracy
compared to 50% using conventional methods. Both they and, in
parallel, T. Sejnowski of Johns Hopkins are using similar
techniques to try to predict a protein's structure from its
sequence (Sci News, 1Aug87, p76).
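As a rough illustration of this approach (ours, not the Los Alamos
code; the window length, sequences, and labels below are invented
placeholders, and a single trained "neuron" stands in for their
multilayer network), one can encode short DNA windows as 0/1
vectors and fit a classifier that separates coding from non-coding
examples:

```python
# Illustrative sketch only -- not the Los Alamos model. A single
# trained "neuron" (logistic regression) stands in for a multilayer
# network; sequences and labels are invented placeholders.
import numpy as np

BASES = "ACGT"

def one_hot(seq):
    """Encode a DNA string as a flat 0/1 vector, 4 slots per base."""
    v = np.zeros(4 * len(seq))
    for i, base in enumerate(seq):
        v[4 * i + BASES.index(base)] = 1.0
    return v

# Placeholder 12-base windows: 1 = codes for protein, 0 = does not.
windows = ["ATGGCTGCTAAA", "ATGAAAGGCGCT", "TTTTTTATATAT", "TATATATTTTAA"]
labels = np.array([1.0, 1.0, 0.0, 0.0])
X = np.array([one_hot(w) for w in windows])

w = np.zeros(X.shape[1])
b = 0.0
for _ in range(500):                        # plain gradient descent
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probabilities
    grad = p - labels                       # error signal
    w -= 0.1 * (X.T @ grad) / len(labels)
    b -= 0.1 * grad.mean()

print(1.0 / (1.0 + np.exp(-(X @ w + b))))   # approaches [1, 1, 0, 0]
```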
O. Jardetzky and colleagues at Stanford maintain that knowing the behavior and
structure of proteins in solution, rather than in crystals, is of
critical interest. They use an expert system to determine which
structures are compatible with experimental data--e.g.
from NMR--on proteins in solution (J Biochem, 100).
J. Tiedje has found a novel anaerobic bacterium capable of
dechlorinating aromatic compounds, showing that natural molecular
machines can carry out unusual reactions needed to clean up toxic
waste (Science, 28Aug87, p975). Specially designed
nanomachines should be able to tackle even tougher cleanup tasks.
J. McBride of
Yale, et al., discuss the chemistry which occurs under
high stress within organic single crystals (Science,
14Nov86, p830). Such high-stress conditions, with spontaneous
mechanical forces equivalent to tens of thousands of atmospheres,
give some indication of the chemistry that could be done by an
assembler able to hold molecules firmly, and push.
A "hypertext system" can be anything from a
hyper-notepad to a hyper-Library-of-Congress
Interest in hypertext is exploding, for the time being. Dozens
of systems are in use; the University of North Carolina, the ACM,
and the IEEE have sponsored a conference; and Apple Computer has
massively promoted a new hypertext product for the Macintosh,
HyperCard. There have been hopes of a hypertext revolution
bringing an impact on the scale of the Gutenberg revolution. It
seems to have arrived.
Or has it? Words can mean many things. A "programming
language" can be anything from a system of detailed
instructions for pushing bits around inside a computer, to a
system of general rules for describing logical reasoning. A
"hypertext system" can be anything from a hyper-notepad
to a hyper-Library-of-Congress. Present systems are closer to the
notepad class. We shouldn't expect them to give library-class service.
Different hypertext systems have been built to serve different
goals, though some aim to serve several. One goal is to improve
personal filing systems by helping people connect information in
ways that reflect how they think about it. Another is to improve
educational publications by helping authors connect information
in rich, explorable networks. Many recent hypertext systems are
actually hypermedia systems in which authors can link
descriptions to pictures, video, and sound.
Filing systems on a single machine can serve a single user or a
small group. Teaching documents written on one machine can be
copied and distributed to other machines around the world. Both
these goals can be served by stand-alone systems on single
machines, such as HyperCard on the Macintosh. But both these
goals, though valuable, are peripheral to the goal of evolving
knowledge more rapidly and dependably, to improve our foresight.
An improved medium for evolving knowledge would aid the variation
and selection of ideas. To aid variation essentially means to
help people express themselves more rapidly, accurately, and
easily. To aid selection essentially means to help people
criticize and evaluate ideas more rapidly, effectively, and
easily. Several characteristics of a hypertext system are
important to these goals.
To help critical discussion work effectively, a hypertext system
must have full links, followable in both directions,
rather than just references followable in a single
direction. That is, the system must support full hypertext, not
just semi-hypertext. In a semi-hypertext system, a reader cannot
see what has been linked to a document, hence cannot see other
readers' annotations and criticisms. Many existing hypertext
systems lack full links.
To help express criticism, a hypertext system should be fine-grained.
In a fine-grained system, anything--not just a document, but a
paragraph, sentence, word, or link--can be the target of a link.
In a fine-grained hypertext system, if you wanted to disagree
with this article, you could express yourself by linking to the
objectionable part (perhaps the definition of fine-grained in the
previous sentence). In a coarse-grained system, you might have to
link to the article as a whole. Many existing hypertext systems are coarse-grained.
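To make "full" and "fine-grained" concrete, here is a minimal
sketch in Python (ours; the names and structure are invented for
illustration, not taken from any existing hypertext system). Every
link targets a character-level span, and the store indexes links by
both endpoints, so readers of a document can see the criticism
attached to it:

```python
# Minimal sketch of a full, fine-grained link store (invented names,
# not from any existing hypertext system).
from dataclasses import dataclass
from collections import defaultdict

@dataclass(frozen=True)
class Anchor:
    """A fine-grained endpoint: a character span within a document."""
    doc_id: str
    start: int  # offset of the first character
    end: int    # offset just past the last character

@dataclass(frozen=True)
class Link:
    """A full link: recorded so it can be followed from either end."""
    link_id: str
    source: Anchor
    target: Anchor
    comment: str

class LinkStore:
    """Indexes links by BOTH endpoints, so readers of any document can
    see what has been linked to it (full hypertext), not merely what
    it links to (semi-hypertext)."""
    def __init__(self):
        self._by_doc = defaultdict(list)

    def add(self, link: Link) -> None:
        self._by_doc[link.source.doc_id].append(link)
        self._by_doc[link.target.doc_id].append(link)

    def links_touching(self, doc_id: str) -> list:
        return list(self._by_doc[doc_id])

# A critic links a comment to one sentence (a character span) of an
# article; the article's readers can then see the inbound criticism.
store = LinkStore()
store.add(Link("L1",
               source=Anchor("critique-7", 0, 120),
               target=Anchor("update-2", 5210, 5298),
               comment="I disagree with this definition."))
print(store.links_touching("update-2"))
```

Indexing by target as well as source is the whole difference
between a one-way reference and a full link, and the span-valued
anchor is what lets a critic point at a single sentence rather
than the article as a whole.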
To make the system a useful medium of debate, it must be public.
This in turn requires suitable software, access policies, and
pricing policies (such as fee-for-service, rather than
free-to-an-elite). No hypertext system yet functions as a genuine
public medium; many cannot do so.
To work, a public hypertext system must support filtering
software. If readers automatically see all the links to a
document, the equivalent of a presidential speech or an Origin
of Species will become incredibly cluttered. Software
mechanisms can provide a flexible way to cut through the clutter,
enabling readers to be more choosy, seeing only (say) links that
other readers (editors, colleagues, etc.) have recommended. There
are subtleties to making filtering work well, but promising
approaches are known; readers would be free to use whichever
filters they think best at the moment, so filters would be free
to compete and improve.
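Continuing the sketch above (again with invented names), a filter
can be as simple as a reader-chosen predicate over links; here,
one that keeps only links endorsed by readers the user trusts:

```python
# Sketch continued (invented names): a filter is a reader-chosen
# predicate over links, so clutter is hidden by choice, not deleted.

def endorsed_by(endorsements, trusted):
    """Keep only links recommended by readers the user trusts.
    endorsements: dict of link_id -> set of endorsing reader ids."""
    def keep(link_id):
        return bool(endorsements.get(link_id, set()) & trusted)
    return keep

endorsements = {"L1": {"editor-a"}, "L2": set(), "L3": {"colleague-b"}}
my_filter = endorsed_by(endorsements, trusted={"editor-a", "colleague-b"})
print([lid for lid in ("L1", "L2", "L3") if my_filter(lid)])  # ['L1', 'L3']
```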
No existing hypertext system is full, fine-grained, filtered, and
public--yet all of these characteristics (with the possible
exception of "fine-grained") seem essential in a system
that can make a qualitative difference in the evolution of
knowledge. They are needed if we are to have a genuine hypertext publishing medium.
It is this sort of system--not "a hypertext system" but
a hypertext publishing system--that can make a real
difference to society's overall intellectual efficiency, and
overall grasp of complex issues. How great a difference? Even a
small improvement in something so fundamental to our civilization
would save billions of dollars, lengthen millions of lives, and
give us a better chance of surviving and prospering through the
coming technological revolutions. And there is reason to think
the improvement might not be small.
Enzymes show that a nanomachine needn't have gears and
bearings, but macroengineering shows how useful these parts can
be. Conventional gears and bearings are built to tight
tolerances--bumps a thousandth of an inch high on a one-inch part
would often be too large. Since an atomically smooth surface is
bumpy on a tenth-nanometer scale, it might seem that gears and
bearings couldn't be reduced below 100 nm or so. A complex
nanomachine using gears and bearings would then be huge--entire microns across.
A paper on "Nanomachinery: Atomically Precise Gears and
Bearings" (by K. Eric Drexler, to appear in the proceedings
of the November, 1987, IEEE Micro Robots and Teleoperators
Workshop) examines how to build these devices much smaller. The
essential insight is that an atom's surface is a soft, elastic
thing, helping to smooth interactions. Conventional gears need
precisely machined teeth if their hard surfaces are to mesh
smoothly, but nanogears can use round, single-atom teeth, relying
on atomic softness to aid smooth meshing.
- For an extension of the IEEE Workshop paper, see
section 10.4 of Nanosystems.
This principle can also be applied to bearings. In one
approach, two surfaces can slide on roller bearings. The bearings
can roll smoothly, despite atomic bumpiness, by having a pattern
of surface bumps that meshes smoothly, gear-fashion, with a
similar pattern of bumps on the bearing race.
Mathematical analysis shows that two surfaces (of a shaft in a
sleeve, for example) can slide smoothly over one another if their
bumps are spaced to systematically avoid meshing. In effect, the
bumps cancel out--while one is pushing back, another is pushing
forward. With a ring of six atoms sliding within a ring of 22,
for example, the friction force can be less than one billionth of
the force holding two atoms together in a molecule.
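A small numerical illustration of this cancellation (our sketch,
not from the paper; the harmonic amplitudes are arbitrary
placeholders): model the outer ring as a periodic potential and
sum the tangential forces on six evenly spaced inner atoms. With
22 outer bumps the leading harmonics cancel exactly and only a
tiny 66-fold residue survives, while with a meshing count such as
24 the forces add:

```python
# Our illustration, not from the paper: net tangential force on a
# ring of 6 atoms sliding inside a bumpy ring, modeling the outer
# ring as a periodic potential whose h-th harmonic has the arbitrary
# placeholder amplitude 0.1**h.
import math

def net_force(n_inner, n_outer, phi, harmonics=5, decay=0.1):
    """Sum the harmonic force terms over all inner atoms at rotation
    angle phi (radians)."""
    total = 0.0
    for k in range(n_inner):                      # each inner atom...
        theta = phi + 2 * math.pi * k / n_inner   # ...at its position
        for h in range(1, harmonics + 1):
            total += decay**h * math.sin(n_outer * h * theta)
    return total

angles = [i * 1e-3 for i in range(6284)]  # sweep one full turn
print(max(abs(net_force(6, 22, a)) for a in angles))  # ~0.006: only the
                                                      # 66-fold term survives
print(max(abs(net_force(6, 24, a)) for a in angles))  # ~0.66: 24 = 4 x 6,
                                                      # bumps mesh, forces add
```

The counts are what matter: with 22 outer bumps, no low harmonic of
the potential repeats with the 6-fold spacing of the inner atoms,
so while one atom is pushed back, another is pushed forward.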
Yet another class of bearing avoids atomic bumpiness by using a
single atom or bond as a bearing. A fraction of a nanometer
across, these bearings are as small as the moving parts in a
nanomachine can possibly be.
The proceedings volume from MEDIII (the Third International
Symposium on Molecular Electronic Devices) has been delayed, but
we can give a few notes on the meeting here. It was indeed
international, with attendees from Europe, Japan, and Argentina
as well as the US. A speaker from the USSR was scheduled, but
failed to appear.
Besides many papers on conducting polymers and thin films, there
were talks more directly relevant to nanotechnology. S. Staley of
Carnegie-Mellon's Center for Molecular Electronics discussed work
on an optically-switched molecular NAND gate. J. Milch of Eastman
Kodak proposed using a molecular crystal as a cellular automaton,
reminiscent of Conway's Game of Life, leading to a molecular computer.
J. Deisenhofer of the Max-Planck Institut discussed work on a
molecular electronic system found in living cells, the
light-driven charge separation process of photosynthesis in
purple bacteria. E.
Greenbaum of Oak Ridge National Lab presented experimental
results on connecting photosynthesizing molecular systems to
external electric circuits by means of colloidal platinum
directly contacting the molecules. E. Drexler presented the
nanocomputer rod logic work cited in our previous issue.
This young, chaotic field also made progress by starting to
standardize its terminology. The one-page handout distributed by
workshop organizers, "Molecular Electronics and Technology:
Some proposed distinctions and terms," is available from the organizers.
What is Nanotechnology?
Under the headline "Funds for Nanotechnology,"
Britain's IEE News (October 1987) reports that
"Funds are now available through the National Physical
Laboratory ... for the support of projects which will lead to the
commercial exploitation of nanotechnology techniques.
Nanotechnology covers the manufacture and measurement of devices
and products where dimensions or tolerances are in the range 0.1
to 100 nm..." This sounds exciting until one realizes that
this definition of "nanotechnology" covers everything
from memory chips to electron microscopy.
The use of the term "nanotechnology" for everything
smaller than 100 nanometers (0.1 micron) is apt to lead to
confusion. As used in recent years in the US (and in this
publication), "nanotechnology" implies a general
ability to build structures to complex, atomic specifications; it
refers to the technique used rather than to the size of the product.
We can see a parallel in the term "microtechnology":
the broad ability to shape bulk materials into surface patterns
having complex specifications on a scale of microns or less. This
term does not apply to all processes having micron-scale
products. Consider the case of a forest fire simulation
experiment for which micron-sized particles of smoke are
needed--the fire we set to produce these particles is not an
example of microtechnology. Like nanotechnology, the term refers
to a family of techniques and abilities, not to size and scale. In
the case of nanotechnology, this means structuring matter
atom-by-atom with precise control. Some products of
nanotechnology, such as fracture-tough diamond composite
spacecraft, will not be small.
Nanotechnology is qualitatively different from microtechnology,
being based on molecular operations rather than the
miniaturization of bulk processes. It will enable a cube 0.1
micron on a side to hold, not just a single device, but the
equivalent of an entire microprocessor. It will lead to far more
than just denser circuits, more precise machines, and so
forth--it will lead to self-replicating machines, cell repair
systems, and a broad, deep revolution in technology, economics, and society.
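A back-of-the-envelope check of that claim (our arithmetic, not
the article's, using the 176 atoms per cubic nanometer diamond
figure from the caption at the end of this issue and assuming a
1987-class microprocessor of roughly $10^5$ transistors):

$$
(100\,\mathrm{nm})^3 = 10^6\,\mathrm{nm}^3, \qquad
10^6\,\mathrm{nm}^3 \times 176\,\tfrac{\text{atoms}}{\mathrm{nm}^3}
\approx 1.8 \times 10^8\ \text{atoms},
$$
$$
\frac{1.8 \times 10^8\ \text{atoms}}{10^5\ \text{transistors}}
\approx 2 \times 10^3\ \text{atoms per transistor}.
$$

Thousands of atoms per device is cramped, but not absurd, which is
what makes the claim plausible.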
The advance of microtechnology into the submicron regime no more
calls for a change of prefix than did the similar advance of
microscopy--we do not speak of "electron nanoscopes."
If "nanotechnology" becomes a trendy term for submicron
technology, we are in for some confusing times and a lot of
wasted words in describing assembler-based technology. The IEE
News article holds no hint of real nanotechnology. Readers
are encouraged to state their opinions on this matter to editors
of publications which misuse the term.
Figure: One cubic nanometer of diamond, containing 176 atoms. A
cube 100 nm on a side would contain 176 million atoms.
From Foresight Update 2, originally
published 15 November 1987.