
Quantum Dots may yield quantum changes in computer

from the nauseatingly-small dept.
As if nanotechnology weren't enough to deal with, it's looking increasingly as though quantum computers will play a big role in our future. Waldemar Perez writes "Interesting article on a patent from the University of Nebraska that could affect quantum dot-based electronics and non-linear optical devices for shielding satellites against laser attacks: http://www.unl.edu/pr/science/111400ascifi.html"

6 Responses to “Quantum Dots may yield quantum changes in computer”

  1. kurt2100 Says:

    Quantum Dots

    Quantum dots aren't really that new. In 1992 I integrated a control system for an MBE (molecular beam epitaxy) system that my customer was using to fabricate quantum dots. The problem at the time was getting them to work above cryogenic temperatures, which could be solved by making them smaller, say less than 5 nanometers across, but that couldn't be done in 1992. Now they probably can. The purpose of using quantum dots is to get around the quantum-effect limitations of CMOS logic. As you know, below the 70-nanometer design rule, quantum tunneling makes it difficult to confine electrons within the various device components (gate, source, drain). Quantum-dot logic is a new concept that lets you use the tunneling effect as the basis of your circuit; the difficulty is the fabrication technology itself.

  2. MarkGubrud Says:

    QC hype and reality

    Referring to quantum dots for quantum computation (QC), the article says:

    "In the next few decades, they could make binary computers obsolete"

    Wrong. There is no reason to believe QC will ever make binary computers obsolete.

    So far QC are known to be fundamentally more powerful than classical binary computers for only two applications: 1) The Shor factoring algorithm (and here it still has not been proven that no classical algorithm can do as well); 2) Simulation of quantum mechanical systems. Most of the funding for QC has been driven by the first application, which would be used to break the RSA cryptosystem. But the second application is potentially of more importance, particularly to nanotechnology, if large-scale quantum computers can in fact be built.
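    The division of labor in Shor's algorithm is worth spelling out: only the order-finding step needs a quantum computer; turning the order into a factor is classical number theory. A minimal sketch, with the quantum order-finding replaced by an exponential-time classical brute force (which is exactly the part a QC would speed up):

```python
from math import gcd

def order(a, n):
    """Brute-force the multiplicative order of a mod n: the smallest
    r with a**r == 1 (mod n). This is the step Shor's algorithm does
    efficiently on a quantum computer; classically it takes time
    exponential in the bit length of n."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_from_order(n, a):
    """Classical post-processing in Shor's algorithm: turn the order
    of a mod n into a nontrivial factor of n, or return None if this
    choice of a is unlucky and another should be tried."""
    g = gcd(a, n)
    if g != 1:
        return g                  # lucky: a already shares a factor with n
    r = order(a, n)
    if r % 2 != 0:
        return None               # odd order: retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None               # trivial square root of 1: retry
    return gcd(y - 1, n)

print(factor_from_order(15, 7))   # prints 3, a factor of 15
```

    The function names are mine for illustration; the RSA connection is that `n` would be a product of two large primes, and recovering either prime breaks the key.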

    The only other potentially useful QC algorithm, Grover's "search" algorithm, has recently been shown to be implementable in classical physics, and in any case gives only an algebraic speedup, which in computer science is not considered enough to make an "intractable" problem "tractable." Since classical computing power has been increasing exponentially for years, this is clearly not a killer app for QC.
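    The "algebraic speedup" here is quadratic: Grover finds a marked item among N with roughly pi/4 * sqrt(N) oracle queries instead of roughly N/2 classical probes. A small statevector simulation (pure Python, no quantum hardware assumed) makes the query count concrete:

```python
import math

def grover_success_prob(n_items, marked, iterations):
    """Simulate Grover amplitude amplification on a real statevector
    and return the probability of measuring the marked item."""
    amp = [1 / math.sqrt(n_items)] * n_items  # uniform superposition
    for _ in range(iterations):
        amp[marked] = -amp[marked]            # oracle: flip sign of marked item
        mean = sum(amp) / n_items
        amp = [2 * mean - a for a in amp]     # diffusion: reflect about the mean
    return amp[marked] ** 2                   # Born-rule probability

N = 1024
queries = round(math.pi / 4 * math.sqrt(N))  # ~25 queries vs ~512 classical probes
print(queries, grover_success_prob(N, 0, queries))
```

    After about 25 oracle calls the success probability exceeds 0.99 for N = 1024, a real but only quadratic gain, which is the point being made above.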

    QC is not likely to be of any utility for most computing applications. It is not likely to be important for AI, for example. It is potentially useful only for a limited number of highly structured problems where the input and output are low-bandwidth but the throughput is enormous. No doubt more "magic" algorithms like Shor's will be discovered, but they will most likely have only niche applications.

    On the other hand, the ability to efficiently simulate quantum mechanics would be of enormous technical interest, particularly for designing and modeling nanosystems. But that will require really large and robust QC systems. I think it is more likely that molecular nanotechnology will prove to be an enabler for QC than the other way around.

  3. kurt2100 Says:

    Re:QC hype and reality

    This is certainly true. And nano-fabrication such as molecular electronics or molecular nanotech will certainly be developed before QCs become available. However, the long-term picture may be different. Once you get molecular-scale electronics, say in 2020, the end of the road is reached. No further progress is possible. Even though this will give you an incredible amount of computing power, that will be the limit. The long-term promise of QC is to go beyond this limit, by being able to store more than one bit per atom or molecule. Whether or not this can be done has yet to be determined. QC technology is very embryonic at this point.

  4. Saturngraphix Says:

    Re:QC hype and reality

    So… what will be the next step for the PC then? DNA-based, a quantum/binary combo, quantum DNA? Just having to deal with our Pentium 30 at 9,000,000 MHz as the final speed for all time… Any theories? It seems our limitations are coming quickly, and the other avenues are filled with limitations of their own… I see a Frankenstein method of combining different trades to keep things progressing. Is that possible? (Sorry, I don't know much about the area, but I am interested in a summary of where it's going to head.) Saturn

  5. ChrisRoot Says:

    Re:QC hype and reality

    Slashdot had a story in September about University of Michigan professor Philip Bucksbaum's discovery of a way to phase-encode arbitrary strings of 0s and 1s along a single electron's continuously oscillating waveform. Here's the original article in EETimes.

  6. MarkGubrud Says:

    quantum vs. classical

    "Once you get molecular-scale electronics, say in 2020, the end of the road is reached. No further progress is possible."

    We might ask what you mean by "progress," but if you mean more computing power, don't worry, the end of the road won't be reached by 2020. There will be lots of room for improving and optimizing nanoelectronic systems, and building them up in three dimensions to ever-larger systems. There will also be lots of room for advance in other aspects of technology.

    "The long-term promise of QC is to go beyond this limit, by being able to store more than one bit per atom or molecule."

    You're thinking in terms of a greater amount of classical information. That is not QC.

    Of course there is no reason you can't store more than one classical bit per atom or molecule. Molecules have many degrees of freedom, and atoms have many energy levels. But it may not be effective in practice to try to do this, because you want a low error rate, so it's best to limit the number of information-bearing degrees of freedom and maximize the barriers between significant states.

    QC, on the other hand, would use a modest number of quantum bits to do an exponentially large amount of computation in one run. Such "quantum parallelism" is different from classical parallel computation in that you can only get one answer out. So you have to combine all the parallel computations in a final interference step. There are only a few algorithms known which can combine the exponentially large number of parallel computations in a way that harnesses the power of all of them together. In fact, there is only one such algorithm that is thought to be important.
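    Deutsch's algorithm is the textbook toy case of this interference step: one oracle query plus Hadamard interference decides whether f: {0,1} -> {0,1} is constant or balanced, where classically two queries are needed. A pure-Python statevector sketch (the 2-qubit encoding and helper names are mine, not from the discussion above):

```python
import math

def hadamard_on(state, qubit):
    """Apply a Hadamard gate to one qubit of a 2-qubit statevector;
    basis index encodes |x y> as 2*x + y (qubit 0 = x, qubit 1 = y)."""
    shift = 1 - qubit
    s = 1 / math.sqrt(2)
    new = [0.0] * 4
    for i, a in enumerate(state):
        b = (i >> shift) & 1
        new[i & ~(1 << shift)] += s * a              # component with qubit = 0
        new[i | (1 << shift)] += s * a * (-1) ** b   # component with qubit = 1
    return new

def deutsch(f):
    """Decide constant vs. balanced with a single oracle query."""
    state = [0.0, 1.0, 0.0, 0.0]          # start in |x=0, y=1>
    state = hadamard_on(state, 0)
    state = hadamard_on(state, 1)
    # Oracle U_f: |x, y> -> |x, y XOR f(x)> -- the one and only query
    queried = [0.0] * 4
    for i, a in enumerate(state):
        x, y = i >> 1, i & 1
        queried[(x << 1) | (y ^ f(x))] += a
    state = hadamard_on(queried, 0)       # interference combines both f evaluations
    p1 = state[2] ** 2 + state[3] ** 2    # probability x-qubit measures 1
    return "balanced" if p1 > 0.5 else "constant"

print(deutsch(lambda x: 0))   # prints constant
print(deutsch(lambda x: x))   # prints balanced
```

    Both values of f(x) are evaluated "in superposition," but only the single global property (constant vs. balanced) survives the final interference and measurement, which is exactly the limitation described above.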

    Quantum registers in a quantum computer store quantum information. It is true that it would take an exponentially large classical register to represent this quantum information, but that does not mean the quantum information is equivalent to an exponentially large amount of classical information.

    First, you can't copy the quantum information (no-cloning theorem). This means that, for example, if you want to put a huge classical database into a small quantum register, you have to read the entire classical database and go through all the steps of preparing each corresponding part of the quantum state each time you want another quantum copy. And you can use each copy only once.

    Second, you can't even read out the entire classical database from the quantum state. Reading out any small part of it will destroy the rest.
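    This readout limit is easy to see in a toy simulation: an n-qubit register in uniform superposition "holds" 2^n indices, but a Born-rule measurement returns only n classical bits and collapses the state, so the rest is unrecoverable. A minimal sketch (helper names are mine):

```python
import math
import random

def measure(amplitudes):
    """Born-rule measurement of a statevector: return one basis index,
    sampled with probability |amplitude|^2, plus the collapsed
    post-measurement state."""
    probs = [abs(a) ** 2 for a in amplitudes]
    outcome = random.choices(range(len(amplitudes)), weights=probs)[0]
    collapsed = [0.0] * len(amplitudes)
    collapsed[outcome] = 1.0
    return outcome, collapsed

# A 3-qubit register in equal superposition spans 8 "database" indices...
state = [1 / math.sqrt(8)] * 8
outcome, state = measure(state)     # ...but one measurement yields just one index
again, _ = measure(state)           # the superposition is gone:
assert again == outcome             # re-measuring repeats the same index
print(outcome)
```

    Only 3 classical bits come out per preparation, and by the no-cloning theorem the original superposition cannot be copied beforehand to try again.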
