
Intel’s ‘Nano Inside’

HLovy writes "Intel says it's now a master of the 65-nanometer domain. But are these nanochips truly 'nanotechnology'? I was surprised when 'Engines of Creation' and 'Nanosystems' author Eric Drexler — whom I had assumed to be a molecular manufacturing purist — told me he thought they qualified.

"People sometimes perceive me as saying, 'Oh, you shouldn't use the term this new way,'" Drexler told me in October. "What I've actually been saying is we need to understand that it's being used in a new way … that has a certain relationship to the field."

The complete commentary can be found on Howard Lovy's NanoBot."

2 Responses to “Intel’s ‘Nano Inside’”

  1. Morgaine Says:

    Nanoscale bulk technologies

    There is little to be gained from drawing an arbitrary demarcation boundary between bulk processing and atomically precise molecular manufacturing at these scales; in the not-too-distant limit, the distinction becomes too blurred to be very useful.

    As long as engineers know what they're talking about at any given time, there usually isn't a problem with overly inclusive terminology and a degree of ambiguity. It's all part and parcel of the real world, so engineers are pretty used to dealing with it. :-)

    Furthermore, there are real human benefits to not being exclusionary, not least of which is that the more people there are in your camp, the greater the synergy. This may sound unimportant to some, but it is not. The key to achievement in our somewhat quirky civilization is cooperation, and social lubricants are powerful instruments for success.

    And finally, there are at least two reasons why bulk materials processing is relevant to molecular manufacturing: first, we are quite likely to want to fill in atomically precise structures with amorphous bulk material anyway; second, we will inevitably need to interface our atomically precise machinery with an existing world full of crude bulk objects. Add to this the areas of overlap, such as crystals and monomolecular films, and it becomes clear fairly quickly that pigeonholing can be less than helpful in some situations.

    Admittedly, we'll have even more explaining to do once the popular press starts describing bulk semiconductor fab processes as nanotechnology, but frankly, that won't be our hardest problem on the road ahead.

  2. RobertBradbury Says:

    Between 1 and 100 nm

    I would tend to agree with these comments and perspectives. The semiconductor industry is going to keep pushing to smaller scales for the simple reason that nobody wants to be the CEO or CTO of Intel, AMD, IBM, Motorola, etc. when "Moore's Law" breaks. How would that look on one's résumé — "I was on watch when Moore's Law broke"? So the fact that they are pushing into the middle of the NSF nanotechnology "region" (1-100 nm) would seem to be a good thing. Sooner or later they are going to have to deal with precision assembly of atomic structures; they are almost there now, with gate thicknesses in the 3-10 atom range (see the rough conversion sketched below). There will be a large bump in the road when they have to shift from current bulk atom-deposition methods to more precise ones, but I really doubt that will stop the industry from facing up to the requirements. They can see it coming, and they know what they have to do.
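    (A quick back-of-the-envelope check of that "3-10 atom" figure: a minimal sketch in Python, assuming one atomic layer is roughly the Si-Si bond length of about 0.235 nm, a textbook value rather than anything from the post.)

        # Rough conversion of a gate thickness in atomic layers to nanometers.
        # Assumption: one layer ~ the Si-Si bond length, about 0.235 nm.
        SI_LAYER_NM = 0.235

        def layers_to_nm(n_layers: int, layer_nm: float = SI_LAYER_NM) -> float:
            """Convert a thickness in atomic layers to nanometers."""
            return n_layers * layer_nm

        for layers in (3, 10):
            print(f"{layers} layers ~ {layers_to_nm(layers):.2f} nm")
        # Prints roughly 0.7 nm and 2.35 nm, i.e. at the very bottom of the
        # NSF 1-100 nm "nanotechnology" band.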

    So Eric's classification of the current methods as nanotechnology is reasonable: it falls under the current NSF classification and puts the big players on a slippery slope where they have to face up to the need for real MNT. The recent self-assembly of nano-transistors by the group at the Technion-Israel Institute of Technology has shown that one can approach the problem from the bottom up as well as from the top down, so there is going to be an interesting play-off between the technologies, perhaps in the 10-30 nm range.

    Robert
