More on Limits to Growth

There was a gratifyingly large response to last Friday’s post Acolytes of neo-Malthusian Apocalypticism.

[Image: “Duty Calls”]

Several of the commenters seemed to think I was trying to refute the LtG model, but that would require a whole book instead of one blog post. I consider LtG to have been demolished in detail by people with a lot more expertise in economic modelling than I, more than three decades ago. My point was simply that Turner’s paper didn’t come close to showing what some people were claiming it did, and in particular did nothing to resuscitate LtG.

One commenter asks:

What’s the cheap replacement for cheap oil? Don’t you realize that spikes in oil prices have preceded the last 4 recessions?

Hmm. If this has happened 4 times, which one was because we ran out of oil?

Here are inflation-adjusted oil prices over the period:
[Chart: inflation-adjusted oil prices]
What struck me about the graph was that oil price shocks seem to correlate with gloom-and-doom fads. But most of the volatility is politically inflicted. After a spike to $150 last year it’s been hovering around $50 since. This is about twice the median of the previous 40 years, but the increase is due more to increased demand from Asian development than depletion.

On the other hand, it is quite true that commodities in general are counter-cyclical to general market indicators, as shown here:
[Chart: Dow vs. CRB commodity index]
I would conjecture that the sharp run-up in commodities in the early 70s was one reason people were so willing to listen to LtG, but then they went flat for 30 years and you got things like the Simon-Ehrlich wager.

All of [Ehrlich's] grim predictions had been decisively overturned by events. Ehrlich was wrong about higher natural resource prices, about “famines of unbelievable proportions” occurring by 1975, about “hundreds of millions of people starving to death” in the 1970s and ’80s, about the world “entering a genuine age of scarcity.” In 1990, for his having promoted “greater public understanding of environmental problems,” Ehrlich received a MacArthur Foundation Genius Award.

[Simon] always found it somewhat peculiar that neither the Science piece nor his public wager with Ehrlich nor anything else that he did, said, or wrote seemed to make much of a dent on the world at large. For some reason he could never comprehend, people were inclined to believe the very worst about anything and everything; they were immune to contrary evidence just as if they’d been medically vaccinated against the force of fact. Furthermore, there seemed to be a bizarre reverse-Cassandra effect operating in the universe: whereas the mythical Cassandra spoke the awful truth and was not believed, these days “experts” spoke awful falsehoods, and they were believed. Repeatedly being wrong actually seemed to be an advantage, conferring some sort of puzzling magic glow upon the speaker.

What’s the replacement for cheap oil? In the short run, efficiency and conservation (replace commuting with telecommuting) are the most important substitution effects. If higher oil prices are sustained, we’ll see major new forms of exploration and recovery, as in this post I linked to before:


More recently, companies such as Royal Dutch Shell have developed ways to tap the oil in situ, by drilling boreholes that are thousands of feet deep and feeding into them inch-thick cables that are heated using electrical resistance and that literally cook the surrounding rock. The kerogen liquefies and gradually pools around an extraction well, where the oil-like fluid can easily be pumped to the surface.
The process involves no mining, uses less water than other approaches, and doesn’t leave behind man-made mountains of kerogen-sapped shale. And according to a Rand Corporation study, it can also be done at a third of the cost of mining and surface processing.

An affordable way to extract oil from oil shale would make the United States into one of the best places to be when conventional worldwide oil production starts its final decline. The US has an amount of oil in shale equal to about 25 years of oil supply at the world’s current oil consumption rate.

Also, notice we aren’t drilling for oil in the deep oceans at all, but improved technology (and higher prices) would make this possible at some point. E.g. we aren’t even looking at over half the planet yet. The US is 2% of the Earth’s surface (10 out of 510 million sq km). Assuming an even distribution, that means total shale oil on Earth is over 1200 years’ supply at current rates. In any practical sense, the amount of oil on Earth is a function of the technology you have to extract it.
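
For reference, here’s the arithmetic behind that figure, as a quick sketch using only the numbers quoted above (the even-distribution assumption is, of course, purely for the sake of argument):

```python
# Back-of-envelope check of the "over 1200 years" figure, using only the
# numbers cited above; even distribution is assumed for argument's sake.
us_share_of_surface = 10 / 510      # US is ~2% of Earth's 510 million sq km
us_shale_years = 25                 # years of world supply in US shale (figure above)

world_shale_years = us_shale_years / us_share_of_surface
print(round(world_shale_years))     # ~1275, i.e. "over 1200 years" at current rates
```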

In the longer run, we have plenty of uranium and thorium (see below), and practically limitless space-based solar power is within our technological capability to harvest if we only try.

Many commenters seemed to take the “finite resources” assumptions of LtG pretty much for granted, though. One goes so far as to point out:

A simple calculation (first done publicly by Isaac Asimov, as far as I know) shows you that if population grows just one percent per year without constraint, all the matter in the known universe will be in human bodies before 11,000 AD. So there is a limit to the resources available to us.
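
For what it’s worth, the arithmetic behind the Asimov figure is easy to reproduce. Here is a rough sketch; the round-number masses are my own assumptions, not the commenter’s:

```python
import math

# Rough reproduction of the Asimov-style calculation quoted above.
# Assumed round numbers (mine): ~7 billion people at ~50 kg each, and
# ~1e53 kg for the mass of the observable universe.
human_biomass_kg = 7e9 * 50          # ~3.5e11 kg
universe_mass_kg = 1e53
annual_growth = 0.01                 # 1% per year, compounded

years = math.log(universe_mass_kg / human_biomass_kg) / math.log(1 + annual_growth)
print(round(years))                  # ~9,600 years, the same ballpark as "before 11,000 AD"
```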

But this is a very simplistic notion of what LtG was actually trying to say. It wasn’t “the universe is finite so we can’t grow exponentially forever.” It was “there will be a major collapse of modern civilization in the 21st century due to resource depletion.”
It’s also a pretty puzzling exegesis of what I was trying to say, which wasn’t “the human race should continue growing exponentially forever” but “there won’t be a major collapse of modern civilization in the 21st century due to resource depletion.”

This isn’t to say that resources won’t be depleted — they will. But the way they get depleted is much more complex than the simple LtG model, where we voraciously use them up at an increasing rate and then suddenly run out and starve to death. In reality, resources tend to have a graduated difficulty of extraction, leading to a more-or-less evenly rising supply curve. This means that the price will rise over time, encouraging efficiency, additional exploration, and hoarding among those who see that the stuff will be more valuable in the future. This leads to growth curves that more closely resemble the standard diminishing-returns sigmoid than LtG’s collapse model.
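
Here’s a toy illustration of the difference, as a sketch only: it is not the LtG World3 model or any serious resource model, just a finite stock drawn down with and without a price response.

```python
# Toy comparison: exponential consumption of a finite stock with no price
# feedback (cliff) vs. consumption that responds to scarcity-driven prices (taper).
STOCK0, YEARS = 1000.0, 200

def decline_years(series):
    """Years from peak output until output falls below 10% of peak."""
    peak = max(series)
    start = series.index(peak)
    for i in range(start, len(series)):
        if series[i] < 0.1 * peak:
            return i - start
    return None

# Case 1: demand grows 3%/yr regardless of scarcity.
stock, use, naive = STOCK0, 1.0, []
for _ in range(YEARS):
    use *= 1.03
    take = min(use, stock)
    stock -= take
    naive.append(take)

# Case 2: same underlying demand growth, but price rises as the stock is drawn
# down and actual consumption falls with price (efficiency, substitution, hoarding).
stock, baseline, adaptive = STOCK0, 1.0, []
for _ in range(YEARS):
    baseline *= 1.03
    price = STOCK0 / max(stock, 1e-9)      # scarcer means pricier
    take = min(baseline / price, stock)    # demand tempered by price
    stock -= take
    adaptive.append(take)

print("no price feedback:   output collapses", decline_years(naive), "year(s) after its peak")
print("with price feedback: output tapers over", decline_years(adaptive), "years after its peak")
```

In the first case output grows right up to the year the stock runs out and then stops dead; in the second, cumulative production follows a smooth sigmoid and output declines gradually over decades.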

Furthermore, projections by real experts, such as population trends from mainstream demographers (which in fact follow just such a sigmoid), take into account many more factors than LtG’s simplistic model. Note, for example, that for the majority of the world’s population that is above subsistence level, it isn’t resource scarcity that pushes birth rates down; it’s just the opposite: the richer people get, the fewer children they have.

Nor is this to say that collapses won’t happen. But when they have happened, it has usually been because pernicious government interference broke the normal market responses. This doesn’t necessarily mean one’s own government, of course, especially in the case of oil.

And finally, objecting to my assertion that government regulation had stifled innovation in the nuclear industry, one commenter writes:

“Compare today’s computers with 1970s ones and see how different the reactors might be today if the same kind of development had taken place.”
Yeah, right. This is the old “if cars had developed like computers, they would go 1000 mph and get 2,000 miles per gallon” argument. To which the response is “yes – and they would seat 100 people and be the size of a matchbox.”
The analogy is misleading, at best. It would be better to compare nuclear reactors to passenger jets.

I don’t understand why cars and jets are supposed to be a better example — cars and jets are highly regulated, and computers are not. Furthermore, the performance of jets improved up through about 1970 at a nice exponential rate, and levelled off with the 747 because it hit a sweet spot in the price/performance landscape (and because cheap energy peaked about the same time). The sweet spot is just below Mach 1, above which the cost per passenger-mile triples.

Computers improved so rapidly in price/performance terms because the physics don’t present the kind of “glass ceiling” that supersonic flight does for airplanes. So the key to analyzing how well reactors could have improved over the period is to see just what kind of headroom they had in the design space. An overview of the possibilities can be seen here. In particular, integral fast reactors could achieve a 99% fuel burn-up, improving both fuel efficiency and waste production by a couple of orders of magnitude over the 1970s designs we’re still using:

The Integral Fast Reactor or Advanced Liquid-Metal Reactor is a design for a nuclear fast reactor with a specialized nuclear fuel cycle. A prototype of the reactor was built in the United States, but the project was canceled by the U.S. government in 1994, three years before completion.
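
To put rough numbers on that headroom (a back-of-envelope sketch; the utilization figures are commonly cited approximations, not taken from the passage above):

```python
# Rough fuel-utilization comparison; approximate, commonly cited figures.
# A once-through light-water reactor extracts on the order of ~0.5-1% of the
# energy in mined uranium (essentially just part of the U-235); an IFR-style
# fast reactor with on-site reprocessing is designed to burn ~99%, U-238 included.
lwr_utilization = 0.007    # ~0.7%, optimistic end for once-through LWRs
ifr_utilization = 0.99     # ~99% burn-up cited for the IFR fuel cycle

improvement = ifr_utilization / lwr_utilization
print(f"~{improvement:.0f}x less mined fuel (and long-lived waste) per unit of energy")
# That is the "couple of orders of magnitude" improvement described above; counting
# thorium (several times as abundant as uranium) as fuel stretches the resource further.
```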

In other words, the physics of fission has at least a factor of 1000 free upside in power/waste and is capable of using fuels that are 100-1000 times as abundant as current practice. Instead we get this:

We had a confluence of bad design decisions at TMI, some of them made by the U.S. Congress. U.S. law specifically prohibited using computers to directly control nuclear power plants. …
Now nuclear energy can be mighty dangerous and is not something to be messed with lightly, but another irony in this story is that nuclear power is actually pretty simple compared to many other industrial processes. The average chemical plant or oil refinery is vastly more complex than a nuclear power plant. The nuke plant heats water to run a steam turbine while a chemical plant can make thousands of complex products out of dozens of feedstocks. Their process control was totally automated 30 years ago and had an amazing level [of] safety and interlock systems. A lot of effort was put into the management of chemical plant startup, shutdown, and maintenance. The chemical plant control system was designed to force the highest safety. So when manufacturing engineers from chemical plants looked at TMI, they were shocked to see the low-tech manner in which the reactors were controlled and monitored. To the chemical engineers it looked like an accident waiting to happen.
… And for the next 29 years we didn’t build another nuclear power plant, leaving that mainly to the French and the Japanese.

Yes, progress does hobble to a halt on occasion. But it’s not because we’ve run out of territory. It’s because we’ve shot ourselves in the foot.

9 Responses to “More on Limits to Growth”

  1. Says:

    This is your KEY ERROR and MISTAKE:
    “I consider LtG to have been demolished in detail by people with a lot more expertise in economic modeling than I, more than three decades ago. ”
    Economic modeling will never be able to understand and model reality correctly unless it extends its boundaries to ecology, resources, the biophysical environment, etc.

    I would strongly recommend doing more investigation and study. You indeed do not need any supercomputer to understand basic thermodynamic laws. They are key to everything around us, including the limits to our (humankind’s) exponential expansion. Please spend some time on the basics and the laws of thermodynamics. No need for any calculator. Just brain, paper, and pencil is enough.

  2. J. Storrs Hall Says:

    Just in this morning at Eurekalert: Nano-research on drill cores from the North Sea might help increase extraction rates of oil…

    Turns out that current recovery techniques only get about half the oil from the wells we’ve drilled so far…

  3. There Are Limits To Growth « Tai-Chi Policy Says:

    [...] However, they’re not caused by supply, but by politics. Case in point. [...]

  4. Says:

    Two points Josh.

    First, nuclear power hit a glass ceiling w.r.t. proliferation. IFRs and any fast neutron reactors are quite good at producing weapons-grade material, especially considering that “Integral” in the name means on-site reprocessing. You might call this political interference, but more accurately it’s a concern about security in a world with some very radical and immature social groups. There was a trade-off made in the 70s between further developing nuclear generation and security. There’s certainly no question that good engineering would have resulted in much better reactors (consider the British experience with Dounreay, France’s Superphénix, or the US Enrico Fermi plant). The first generation was not cost-competitive due to triple heat-exchanger loops. The high-level question was whether nations would have kept control of the materials produced, or (thinking further) how badly the potential accidents or intentional terrorist acts would have affected society.

    Second, LtG wasn’t particularly built to prove society would collapse, and the authors themselves knew the model was simple and flawed. What they focussed on was the feedback loops between different parts of the model. I believe that was their real point: taking a first step to modeling and understanding those interrelationships, watching out for problem areas. Unfortunately that cautionary tone regarding feedback was instead projected onto the results of certain runs of the model. Useful articles about LtG will point that out. Modeling has gone far past what was done back then.

    – Ron Fischer

  5. Says:

    Recommended reading: http://europe.theoildrum.com/node/5239

  6. J. Storrs Hall Says:

    Hi Ron!

    Proliferation worries are solvable by engineering — consider the traveling wave design. If the political will had been there, solutions would have been found.

    But that’s a historical detail. For the coming century, nanotech could build a box the size of a phone booth (remember those?) that would capture all your body effluents and reprocess them back into clean air, water, and food for an input of about 100 watts. Total Earth footprint per human: 1 square meter, including the solar power for the gadget. (and the person eats all the meat he wants :-)

    Compare that to the ~2 hectare footprint typically quoted for Americans — a factor of 20000. So with nanotech, without increasing our total footprint at all, we could have 20000 times more people or each be 20000 times richer.
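
    (For the arithmetic: 2 hectares is 20,000 square meters, so going from 20,000 m² per person to 1 m² per person is the factor of 20,000.)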

    I’ll take the latter…

  7. J. Storrs Hall Says:

    Want to save a tree? Note that printing the New York Times costs twice as much as it would for the paper to send every subscriber a free Amazon Kindle.

  8. Says:

    The story about Asimov is critical. If we live in a closed universe, would not an advanced civilization have invaded by now? What does one discover about the nature of true reality that seems to result in stellar isolation? I suspect our true reality is far more advanced than we think. An advanced species might use temporal echoes as time travel, as a way to increase resources. Gaining access to time travel could give resources beyond what anyone could imagine. Quantum computers will be necessary to do the job. We may use temporal echoing to get material. Perhaps temporal echoing is only half the story. Maybe it is part of a gigantic superstructure only the smart know about. Therefore it must be important to ask why Asimov was wrong. It is because we do not ask the right questions.

  9. Says:

    “… we aren’t drilling for oil in the deep oceans at all, but improved technology (and higher prices) would make this possible at some point. E.g. we aren’t even looking at over half the planet yet.”

    We’re not looking beyond about 12,000 feet, because there isn’t any oil in deeper waters. Pelagic sedimentation rates are too low, and any organics are too dispersed to form any quantities of oil before the deep ocean crust subducts.
    If you can’t bring yourself to read peak-oil advocate Kenneth Deffeyes’ “Hubbert’s Peak”, then at least read “Oil 101” by Morgan Downey, and learn how, why, and where oil (and oil shale) are formed.
    Your credibility just took a huge hit with those who are geologically savvy.

    “The US is 2% of the Earth’s surface (10 out of 510 million sq km). Assuming an even distribution, that means total shale oil on Earth is over 1200 years’ supply at current rates.”

    If an even distribution were true, why aren’t Europe, Japan, India, etc. swimming in oil like the Saudis?
    Why not mine oil sands in Florida with nice weather vs. Alberta with cold nasty weather? (there aren’t any oil sands in Florida – that’s why!)
    Look at geological estimates, and see that the U.S. won the oil shale lottery, and has 60+% of the world’s known oil shale resources.

    “In any practical sense, the amount of oil on Earth is a function of the technology you have to extract it.”

    Sorry, oil is formed at definite places and situations, and trapped at a certain percentage depending on the local geology. While technology and economics determine how much of that oil gets found and extracted, there is only so much oil on the Earth. And we’ve used about half of the conventional oil already.
    Also to consider – the energy return on energy invested in exploration/discovery/production of oil.
    -G
