Mathematics-driven future industrial system?
from the Looking-for-roadmaps dept.
larens imanyuel writes "In each phase of the Industrial Revolution a new industrial system has arisen on top of the previous one. Each has involved enabling technology, new organizational principles, and new major product lines. For instance, a century ago electrification with small motors allowed Henry Ford to design the modern assembly line to mass produce automobiles. Several decades ago silicon technology allowed the mass production of personal computers through an exponential refinement of technique, commonly known as Moore's law, that became the Semiconductor Roadmap. The question naturally arises as to what the equivalent industrial system will be for the next half century."
"I have come up with my own analysis, and am interested in hearing how readers think it compares with their own insights. My first observation is that all the leaders of nanotechnology initiatives see that development is going to be highly multidisciplinary and that finding a principle of unity is a major necessary step to produce some type of "roadmap". Mathematics is the obvious common element of the multiple disciplines. A unifying principle this time, however, must be more involved than Moore's law, because we are dealing with a more complex technology than photolithography."
"Economically the driving principle is that making highly automated equipment leads to miniaturization, because smaller machines operate at higher frequencies, particularly when using "dry" technology. This implies that with embedded controllers, power sources, and wireless technology, we are going to see a system of miniature robotization. This will be multiscale all the way from human-handleable scale down to molecular manufacturing. To shepherd these will require highly maneuverable one-person vehicles with a computer-human interface and power for tools. This will be the new transportation mode comparable to the railroad and automobile in previous epochs. In conjunction with accessories and high-speed ferrying systems, such as vacuum maglev and airfoil airships, these "robohorses" will define the new major consumer products of the technology."
"In this light the unifying mathematical principles will define two things:
1) Multiscale 3D design.
2) The computer-human interface and its 'virtual world'."
"I have some specific mathematics, which is too long for this posting, and want to work with other people on refining it."
–larens imanyuel



August 26th, 2002 at 2:19 AM
Getting ahead of yourself?
Actually, I'm expecting nanotech to usher in an Anti-industrial Revolution, by rapidly obviating the need for most factories at all.
Who decreed that there shall be a "roadmap"? Was there a "roadmap" for the Internet? (I mean for the way the Internet actually was to evolve, not the Gibsonian vision.) On what basis are you claiming that a unifying principle is a "necessary step"? For what?
August 26th, 2002 at 2:42 PM
Re:Getting ahead of yourself?
"Actually, I'm expecting nanotech to usher in an
Anti-industrial Revolution, by rapidly obviating
the need for most factories at all."
I unfortunately find that most projections of
nanotechnology are not grounded in economic
reality, so they project some type of Santa Claus
Machine scenario with a discontinuity in history.
The common fallacy is to overlook the fact that
most engineering designs don't work when they
reach the field and have to be heavily reworked
by the technicians. The speed with which the
human intervention time per machine can be
reduced sets the fundamental limit on how fast
the number of machines may increase (and their
average size decrease). The difficulty of making
fully automatic machines can be seen by looking
at the elaborate procedures used to design
spacecraft. The word "industrial" refers to the
specialization of labor described by Adam Smith.
Since we are going to have to use highly
elaborate labor processes as does NASA, we are
going to be in a "hyper-industrial" rather than
an "anti-industrial" epoch. Factories are likely
to be smaller, but with a much higher complexity.
Some continuing decoupling of economic function from the factory, as is happening
with information technology, is to be expected.
"Who decreed that there shall be a "roadmap"?
Was there a "roadmap" for the Internet? (I mean
for the way the Internet actually was to evolve,
not the Gibsonian vision.) On what basis are you
claiming that a unifying principle is
a "necessary step"? For what?"
The word "roadmap" has become the common word
to refer to formal goal-setting, after the
successful effort by the Semiconductor Industry
Association in the 1980's to institutionalize
Moore's Law. Without formal goal-setting there
won't be any breakthrough in molecular
manufacturing, because it is too complex to come
about spontaneously. It confounds the issue to
bring in the evolution of the Internet, because
the Internet is an application, not the base
technology. I assumed back in 1970, as did most
other futurists, that the massive development of
the Internet would require fiber optic cabling
to the home (or some other expensive broadband
solution). As it was, the exponential increase
in microprocessor power allowed real time
encoding of commercially acceptable graphics
over twisted wire pairs well before massive
broadband capability arrived. This was
mathematically foreseeable, but escaped our
attention. Even less foreseen was the rapid
development of essential Internet services, such
as search engines, provided free through advertising. None of these failures of analysis
of the application, however, would have even been
possible without the systematic undermining of
assumptions, at the level of the base technology,
by the Silicon Roadmap.
August 26th, 2002 at 3:48 PM
Mathdriven Nanotech future
Undoubtedly math will have to be the primary unifying principle of nanotechnology. Its very nature demands it. For one thing, the precision required for the creation of even the most trivial components makes it a prerequisite. The second concern is the possibility of restructuring/deconstructing macro objects, systems, and even biological organisms; the recent development of a synthetic polio virus indicates math's role. It also brings some other considerations in the interim. Succeeding at the difficult task of manipulating real-world structures at these precise levels brings concerns. First and foremost is the ethics of manipulating living things' biological architecture (humans in particular beg several questions). These ramifications are too critical to address in so casual a forum as this, yet let this message stimulate us to take some of them into consideration. Foremost, probably, is the absolute necessity for mass social information means, so that the greater majority of individuals can decide whether these things are in the general best interest as they come online and begin to exhibit widespread influence. I hope I have not been too vague; thank you for your patience.
August 26th, 2002 at 7:16 PM
Two possible roadmaps for nanotech theories
What about the idea that some industries develop according to theories? Chemistry is the best example of this type of product development, in which theories direct the focus of future research. Daniel Bell covered this in a 1986 paper called 'Communication Technology – for Better or for Worse'. In this paper Bell suggests that innovation and change derive from the codification of theoretical knowledge. Rather than labour, it is "knowledge and information which become the strategic and transforming resources of the society".
Bell also suggests IT creates "major consequences of a shift in the modalities of the infrastructure". Nanotechnology is similar to information technology in that products are derived from work in theoretical science. A second similarity between the two technologies is the economic change they bring, with some countries gaining competitive advantage. I think one of the key questions is what other arenas are going to change…the political sphere or health provision, etc. I think the next decade or two may provide some answers.
Are you familiar with Kessler's Grand Narrative? I don't have a copy myself, nor have I discovered a direct reference online; however, you can find a copy in the book 'Next: The Future Just Happened' by Michael Lewis. The BBC made a small series about this book, including RealAudio copies of the program specifically about the threat of new information technologies, like the internet. There is a book review here, [Guardian] and I have written a summary of some points from the book. [infoAnarchy]
The grand narrative describes a step-by-step cycle in which capital expenditure may be backing the smaller players with new technologies, who threaten the established industries and grab customers' attention with innovative and useful products. This allows the outsiders to divert revenue from the insiders, changing the status quo, in much the same way that the internet is posing threats to traditional firms, revenue streams, and those in authority. It is this source of conflict in the marketplace which repeats itself and poses a constant threat to business.
I am not familiar enough with the mathematical theories of nanotechnology to know if both of these ideas can be applied to nanotech, but I would, for starters, be looking for evidence of indicators showing an expansion in medical services, health products, and statistics such as disease, life expectancy, and their correlations with info-, bio-, and nanotech available to communities. Do any mathematical theories support the idea that this field is multidisciplinary? Are any related products reaching the market, and what political action, if any, is occurring in response?
"The result of a mathematical development should be continuously checked against one's own intuition about what constitutes reasonable biological behaviour." – Harvey J. Gold
August 27th, 2002 at 11:45 AM
Re:Two possible roadmaps for nanotech theories
Bell's "codification of theoretical knowledge"
just looks at the scientific labor process from
a different angle. Two productive factors get
multiplied together – labor time and the
codification of knowledge (embodied in training
and software). As the codification factor
increases, labor time on fixing problems
decreases, allowing an increase in the number
of machines and a decrease in their average size.
We are increasingly seeing AI-enhanced tools
working on highly codified scientific information
in support of designers and technicians.
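The throughput argument above can be sketched with a toy model (my own illustration, not from the post, with made-up numbers): a fixed pool of technician time caps how many machines can be kept in service, so the machine count can grow only as fast as the human intervention time per machine shrinks.

```python
# Toy model of the post's limit argument: a fixed pool of technician
# hours caps the number of machines that can be kept in the field.
# As codification of knowledge (training, software, diagnostics)
# reduces intervention time per machine, the supportable fleet grows.

def machines_supportable(tech_hours_total: int, hours_per_machine: int) -> int:
    """Number of machines a fixed technician pool can keep running."""
    return tech_hours_total // hours_per_machine

if __name__ == "__main__":
    POOL = 10_000  # hypothetical: technician-hours available per month
    # Falling intervention time per machine per month:
    for hours_per_machine in (100, 10, 1):
        n = machines_supportable(POOL, hours_per_machine)
        print(f"{hours_per_machine:>4} h/machine -> {n:>6} machines")
```

The point of the sketch is only that fleet growth is reciprocal in intervention time: a tenfold improvement in codification supports a tenfold larger (and, per the post, smaller-on-average) machine population.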
"The grand narrative" describes what happens with
every disruptive technology. New, smaller actors
successively challenge older ones. Some of these
new actors then grow very large to become the
large, older actors at the next stage of the game.
"Nanotech" needs to be divided into three
categories – nanoscale science, nanoscale
information technology, and molecular
manufacturing. Products of nanoscale science that
aren't refinements of older technology form a
small economic sector, as pointed out by the
Nanotechnology Opportunities Report. Thus, we are
not likely to soon see many significant economic
indicators correlated to this nanotechnology.
Nanoscale information technology will tend to follow
the Semiconductor Roadmap for about another
decade due to institutional inertia. Only then
will the large and disruptive sector of
molecular manufacturing be able to assert its
dynamics. The most likely alternative to the
math-driven roadmap that I have suggested is a
more business-as-usual science-driven one, where
the setting of "grand challenges" by the
National Nanotechnology Initiative and the
domination of the Semiconductor Roadmap create
too high an institutional barrier for a basically
new paradigm to enter.
A math-driven paradigm has the advantage of a low
entry cost, because it is just "information". It
still must overcome social barriers, such as
conservative opposition to new methods of
teaching mathematics in public schools, and
a commitment to existing math packages like Maple
and Mathematica. Since these are monopoly priced,
however, there is an entry pathway left open.
This requires developing a widely accessible
package with a superior "virtual world" and
visualization of mathematics, and good propaganda
to overcome the conservative opposition.
August 27th, 2002 at 1:30 PM
Re:Getting ahead of yourself?
I'm sorry, but this is just nonsense. Moore's law was (is) an empirical observation; the only things institutionalized, perhaps, are expectations that it will continue to hold, which are not based on any industry association decision but on the general tendency for people to expect (usually correctly) tomorrow to be like today.
Formal goal-setting by whom? And what basis do you have for this assertion anyway? I can't think of any historical examples of such an assertion ever being true, but I can think of numerous counterexamples. I think you need to back your thesis with something more than proclamations from on high.
August 28th, 2002 at 12:55 PM
Re:Getting ahead of yourself?
Before you whip out the pejoratives, Chip, I suggest that you read up on the last seven decades of industrial organization. Starting about 1930, government committees setting R&D goals have been the norm. In the U.S. this led to the large, well-known projects leading to nuclear energy, ballistic missiles, and space travel. Currently, the U.S. agency setting goals for nanotechnology is the National Nanotechnology Initiative organized by the National Science Foundation. In the last several decades there has been a general shift in form from centralized government project to industrial coalition. In the 1980's the Silicon Roadmap was created in the U.S., because the Japanese industrial coalition was threatening to seize the lead in chip manufacture. Having beaten the Japanese at their own game, it has now become the International Technology Roadmap for Semiconductors. The Roadmap is a classical case of "institutionalization", i.e., taking an informal process (Moore's Law) and turning it into a formal one with committees, goals, timelines, etc. It is true that at the beginning of new industrial epochs, the historically remembered events tend to be unexpected start-ups outside of the mainstream, e.g., Ford Motor in Detroit in 1903 or the silicon technology cluster in Palo Alto in 1958. They, however, existed within well-coordinated industrial systems that allowed them the capitalization to grow rapidly. Texas is trying to initiate a nanotechnology cluster around Dallas, but it is questionable whether it is out of the mainstream, since it is institutionally well integrated with the national U.S. nanotechnology effort.
August 28th, 2002 at 1:33 PM
Re:Getting ahead of yourself?
The world you are describing is a complete fantasy that has no relationship to reality. It is a world of Industrial Policy wonks who go to meetings with each other and write reports that nobody but themselves actually reads. But nobody in industry takes any of that stuff seriously. While the various institutions you describe certainly do exist, their economic significance has been modest-to-nonexistent. I suspect these organizations are tolerated mainly because they keep certain kinds of energetic fools safely distracted so the rest of us can get on with our business — a sort of stationary version of the B Ark. It sounds like you've been reading these organizations' own press releases and believing them.
August 28th, 2002 at 5:12 PM
Re:Getting ahead of yourself?
Do you have some evidence to back up your
assertions about Industrial Policy?
Have you spent years working on advanced
scientific instruments and hardware for
Silicon Valley like I have, or is your
experience almost totally restricted to doing
software for entertainment?
Is not your experience with Xanadu evidence
that life gets harder when you get away from
entertainment software into more practical
applications?
How would you control against your experience
in entertainment leading you down the
historical road to another false utopian view
of the speed of social change?
Would not considering that "most significant design problems ultimately come down to arguments about terminology" be a more constructive
approach than flaming?
August 28th, 2002 at 5:34 PM
Re:Getting ahead of yourself?
By anti-industrial, I definitely do not mean an end to specialization of labor, which has clearly continued and will continue to increase. What I do mean is a continuing erosion of the institution of big industry–the Dickensian, "military-industrial complex", Big Blue, or RIAA sense of industry, as has already been initiated through the continuing advancement and commoditization of modern digital information technology.
With regard to spacecraft/NASA: the enormous cost and "elaborate labor processes" are consequences of scale and complexity, but these are quite distinct from level of automation, as can be clearly seen by comparing the costs of manned (partially-automated) vs. unmanned (fully-automated) missions. It is also quite evident in NASA's "faster, better, cheaper" initiative of the last several years, which has emphasized the use of more autonomous probes and rovers as a major technique for reducing the time, cost, and complexity of missions, as successfully demonstrated with the Mars Pathfinder mission.
You are correct that "The speed with which the human intervention time per machine can be reduced sets the fundamental limit on how fast the number of machines may increase (and their average size decrease)." The question then becomes whether the prevailing approach to solving this problem will be the "brute force" approach currently taken with, e.g., silicon fab, whose cost has been growing geometrically over the past several decades, or whether instead such approach will prove increasingly impractical and spur the miniaturization and consequent commoditization of the manufacturing technology itself. My bet is on the latter, which would create a shift to an economic model approximating that of software, in which marginal costs are essentially zero, and an evolution of the market accordingly. Now, this in turn could simply produce even bigger Wintel-style "gorillas" than we've ever seen before, but I remain hopeful that the cumulative experience of the phenomena of the "IBM PC", the Internet, Unix/Linux, the World Wide Web, and Wi-Fi will lead us to take the non-proprietary path. But that would of course remain to be seen.
In the end I would agree with you that one key will probably be the development and/or adaptation of a powerful mathematical/logical paradigm, but any question of what paradigm to adopt begs the question of what it will be called upon to do. I think you would agree that a useful paradigm in your scenario and a useful paradigm in mine would quite likely be rather different from each other.
August 29th, 2002 at 11:13 AM
Re:Getting ahead of yourself?
I think our main differences are just semantic.
I also see large wafer fabs being undercut at
some point by much smaller and less expensive
equipment, and my mathematics addresses that
situation. I think, however, we need a social
strategy and not just "hope".
The move of manufacturing technology to a high
fixed cost/low marginal cost situation is
guaranteed to create the same type of economic
conflicts we see with information technology.
This means the emergence of large "gorillas"
similar to IBM and Wintel, with intense battles
over copying rights similar to the RIAA fight.
It also means the further concentration of wealth
and locking large numbers of people into poverty.
The military/industrial complex is very much
intact, with new technology being funneled
through DARPA and other military agencies. This
complex is needed by the aristocracy to maintain
an Orwellian surveillance of the society and
to keep the social dissidents down. The rise of
a more civic minded generation to moderate this
trend is foreseeable. In the meantime, however,
we have an imperial Presidency that is thumbing
its nose at the opposition.
September 4th, 2002 at 9:52 AM
Detached Unemotional Clinical Math Driven Future?
I don't think anybody would want to live in a society that has become detached, unemotional, clinical, and purely mathematics-driven. According to Gestalt psychology, "the whole is more than the sum of its parts."
A new Feature Article is now posted on Wonderful Life Foundation Web Site. Click the highlight title to read:
Wonderful Life Foundation
Ponders the Question
"Why Did God Kill the Dinosaurs?"
As Explored on the Web Site Portal
Belief Net.com
In case the link above doesn't work, the URL is: http://Wonderful-Life.cheapwebtricks.net/ponder.html
EXCERPT AND SUMMARY:
A major obstacle in regard to scientific progress leading toward establishment of a more Advanced Civilization is division and dichotomy pertaining to science and theology. Division and dichotomy pertaining to science and theology creates a major gulf, separation, and schism between science and religion.
Science and religion share major goals. The goal of becoming a more Advanced Civilization is to raise the global standard of living and better the lives of all people regardless of race, creed, gender, and ethnic origin.
I don't think an ideal model exists that is unequivocally, absolutely, and unanimously accepted as fact. Creationism is refuted. Evolution is refuted. Theological explanations of evolution that paint evolution as a Divine Process are refuted. Likewise, my theory of Accelerated and Decelerated Genesis has a few minor quirks and flaws.
I don't think any one model has complete predictive and explanatory power. In the final analysis, I think it comes down to what one chooses to believe. Perhaps God designed it this way on purpose, so that people will know that absolute knowledge is a Divine prerogative.
In order to have a close personal relationship with God, I think it is only natural to desire to use the most powerful tools and methodology available to understand God. I chose to embrace scientific methodology as one way to understand God.
It is difficult to reconcile literal Scripture accounts of creation with scientific evidence of evolution. Current theological explanations of evolution that paint evolution as a Divine Creative Process seem to deny literal Scripture Accounts of Creation.
Current theological explanations of evolution that paint evolution as a Divine Creative Process claim that each calendar day as used in the text of ancient scripture is actually a reference to a time frame of millions of years. The claim that each calendar day as used in Ancient Scripture text is actually a time period equivalent to millions of years seems to deny that God has the power to create the world in 6 days.
According to Gestalt psychology, "the whole is more than the sum of its parts." It is understandable that many people who adhere to religious viewpoints do not want to participate in a future that makes an unemotional clinical analysis of a human person as a laboratory specimen.
The concept of an Accelerated Genesis Process occurring before the Big Bang is validation, affirmation and celebration of the fact that God indeed has the Power to Create the World in 6 days as literally stated in Ancient Scripture Accounts of Creation.
The theoretical concepts of Accelerated Genesis and Decelerated Genesis I propose put forth the conjecture that one does not have to choose between literal Scripture Accounts and scientific theories of evolution. It is not a question of either / or, but both.
The literal Ancient Scripture Account of creation is true. Also, the scientific Darwinian account of evolution as a Divine Creative Process is true.
The archeological evidence supporting evolution is insurmountable and difficult to logically dismiss. The theoretical concept of Accelerated Genesis proposed by Wonderful Life Foundation is an acknowledgement that God indeed has the power to create the World in 6 days. The theoretical concept of Decelerated Genesis proposed by Wonderful Life Foundation is an acknowledgement that God has allowed creation to become susceptible to chaotic random processes of Darwinian evolution.
The existence of dinosaurs demonstrates the natural tendency of random natural law to select fierce brutal creatures as most perfectly adapted to survive. Dinosaurs demonstrate the concept of survival of the fittest.
The natural chaotic random tendency of natural law to select fierce brutal creatures to survive seems to preclude the possibility that a creature could arise who is attuned emotionally and spiritually to the universe in such a way that the creature's thoughts and actions reflect the image of God.
Gardeners and farmers can testify that natural selection tends to favor fierce brutal plants with thistles and thorns instead of beautiful flowering plants or crops bearing edible food. Acceptance of the natural tendency of random natural law to select fierce brutal creatures as most perfectly adapted to survive leads to acceptance of the familiar truism, "God helps those who help themselves."
A truism is a common familiar linguistic expression that has been validated by the passing of the truism from generation to generation and wide acceptance of the truism by large groups of people.
The truism "God helps those who help themselves" promotes scientific progress. The truism "God helps those who help themselves" points to the necessity for mankind to intervene in chaotic random natural processes.
A human being is created in the image of God and is blessed with the ability to think, create, and solve problems. The natural chaotic random tendency of natural law to select intelligent fierce brutal creatures to survive precludes the possibility that a human being reflecting the image of God could have arisen without God's help. Recently, scientists demonstrated that a crow has the intelligence to use tools to survive.
November 23rd, 2003 at 6:29 PM
Re:Mathdriven Nanotech future
Maybe Stephen Wolfram was right. CA modelling really will become important, and when the computer doing the CA is using physical bits, then we have some interesting convergence.
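For readers unfamiliar with the CA modelling referred to here, a minimal sketch (my own illustration, not from the comment): an elementary cellular automaton computes each cell's next state from its three-cell neighbourhood via an 8-entry rule table. Rule 110 is one such rule that Wolfram highlights.

```python
# Elementary cellular automaton: each cell's next state depends only on
# itself and its two neighbours, looked up in an 8-bit rule table.

def step(cells, rule):
    """Advance one generation of an elementary CA (wrap-around edges)."""
    n = len(cells)
    return [
        # Encode the (left, centre, right) neighbourhood as a 3-bit
        # index, then read that bit out of the rule number.
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

if __name__ == "__main__":
    # Rule 110, starting from a single live cell.
    cells = [0] * 15 + [1] + [0] * 15
    for _ in range(8):
        print("".join(".#"[c] for c in cells))
        cells = step(cells, 110)
```

Running this prints the characteristic growing triangle of Rule 110; the "physical bits" convergence the commenter imagines would amount to running such an update rule directly in hardware rather than in simulation.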