
Rate of progress: slowing or speeding up?

from the US-News-vs-Reason dept.
Tony (asnapier) writes "Here are two articles that are very much worth reading: #1, 'The Slowing Rate of Progress,' and #2 [mentioned on Nanodot earlier], 'Nanotechnology and the Law of Accelerating Returns.' Which one is correct? The initial reaction is to say #2. I have come to the conclusion that geometric growth will only happen if strong AI is realized, which, of course, is the very definition of Vinge's singularity. What if strong AI does not emerge? Then the argument for slowing growth appears valid: human minds and mundane information systems would be the bottleneck to rapid advancement. (You have to read the articles for the proper context. Of course science and technology advance every day, but a new DVD player does not count as a fundamental improvement of the human condition.)" CP: The US News piece ends: "Perhaps another Thomas Edison is hard at work, using nanotechnology or bioengineering to invent new machines that are truly revolutionary and transforming. But he or she has not succeeded yet."

One Response to “Rate of progress: slowing or speeding up?”

  1. MarkGubrud Says:

    the more things change…

    Phillip Longman makes a good case that the technological advances of the past 50 years have not had a more dramatic effect on the world than those of the previous 50 or 100 years, and perhaps a less dramatic one. But this is, of course, a very subjective judgment. To my eyes, the world today looks very different from how it did in 1950 (as best I can judge, having been born in 1959). But you can make the case that the biggest part of that change has been growth, rather than technological revolution, and that the really big developments in technology took place earlier.

    1880–1930, for example, saw the widespread deployment of electrical machinery, assembly-line production, the automobile, aircraft, radio, the movies, sanitation and public health management, and something approximating modern medicine. How do the developments of the last half-century compare with those? It's hard to say exactly.

    Quantitative measures generally point to continuing exponential growth, but qualitative assessments may be more ambiguous.

    It once seemed to me, growing up in the 1970s, that technological progress had slowed down since the early part of the 20th century, and that history was being driven mostly by forces other than fundamental technical progress. That's part of the reason Engines of Creation made such an impression on me. By 1986, I had seen the personal computer go from a hobby toy to an important tool and a burgeoning industry, but although I'd heard a lot of hype about dramatic progress in high technology and how it was going to transform the world, the reality so far did not seem to live up to the talk. It was Eric Drexler's work that made me realize that all of this was going somewhere.

    After EoC, I began to see that technology probably was the most important force in human history, and was going to drive very dramatic developments in my own lifetime and beyond. It was this reasoning that eventually led me to go back to school to pursue studies in physics.

    I still believe Eric was (and is) basically on target about the potential of nanotechnology and the form it will ultimately take, but I do think some of the criticism has been justified. EoC made it seem as if all someone had to do was get an STM and start trying to pick up atoms, or else get creative with designer proteins, and we'd have assemblers in a decade or less. It was probably the simplistic imagery of tiny hands simply putting atoms where you want them (fat fingers, sticky fingers…), and the overall impression that it would all be so easy, that provoked so many groans, guffaws, and angry denunciations. But EoC was mostly about addressing the consequences of nanotechnology, not about fleshing out the details. Nanosystems was about the latter, and as far as I know there still has not been a serious critique of the pioneering work presented in that book.

    Ray Kurzweil, in his books and speeches, makes far more serious overstatements and errors, but his basic message, that computers are likely to match and then surpass the capacity of the human brain in the coming decades, and that powerful artificial intelligence systems will have an immense and transforming impact on our world, is hard to dispute.

    Kurzweil's biggest technical error is his underestimation of the complexity and subtlety of the brain, and of the difficulties that will likely be encountered in any effort to directly interface with it, let alone to "enhance" or "upload" it. But his most serious error is his proposal that AI systems and simulations of human beings ought to be regarded, and inevitably will be regarded, as morally equivalent, and in fact superior, to our species. This is a very different vision from the one presented in EoC, and one that takes us away from the dream of using technology to fulfill human desires and ambitions, and toward the nightmare of a takeover by technology and the destruction of our kind.
