The Weekly Carboholic: carbon dioxide lifetime 50-100x longer than generally reported


There is a great deal of confusion regarding how long carbon dioxide (CO2) persists in the atmosphere. There are at least two reasons for this. The first reason is that there are multiple physical and biological processes that combine to remove CO2 from the air and they behave differently and at different speeds. The second reason is that the Intergovernmental Panel on Climate Change (IPCC) has changed how it describes the lifetime of CO2 in the Summary for Policymakers twice over the course of four assessment reports. And since most politicians and media don’t dive deeper into the assessment reports than the summaries, some confusion is reasonable, if unfortunate.

A new paper titled Atmospheric Lifetime of Fossil-Fuel Carbon Dioxide and reported on by Nature Reports aims to describe and understand the confusion over the lifetime of CO2.

There are essentially three processes that remove CO2 from the air: vegetation, oceanic absorption and reaction with calcium carbonate, and reactions with weathered igneous rocks like granite and basalt. Vegetation absorbs CO2 the fastest, but it’s also one of the least well understood – additional plant growth can release more CO2 depending on local climate variation, and biosphere uptake of CO2 may saturate (releasing as much CO2 as it absorbs) in the next few decades as a result. The next fastest is oceanic absorption, but the ocean can absorb only a limited amount of CO2 before it reaches an equilibrium state with the atmosphere and can’t absorb any more. When that happens, the two slowest processes – reactions with calcium carbonate ocean sediments and with igneous rock – are required to remove the CO2 from the atmosphere and ocean. The calcium carbonate reaction takes thousands to tens of thousands of years to remove even half of the dissolved CO2 from the ocean, while the igneous rock reaction takes tens to hundreds of thousands of years.

The result is that a simple “half-life” model doesn’t work for carbon dioxide – the physical system has too many variables and components to claim that CO2 has a single atmospheric lifetime. Or, to borrow an analogy from the paper:

An analogy can be drawn with radioactive waste, for which the decay of its radioactivity as a whole will not conform to any single measure, since it is composed of a variety of different radionuclides with a wide range of half-lives.

The half-life of radioactive waste cannot be simply and easily defined, and neither can the half-life (lifetime) of CO2 in the atmosphere. Simply saying that the lifetime is 5-200 years isn’t accurate when the geologic processes that ultimately return CO2 to the earth take tens to hundreds of thousands of years to be effective. And since the responses of the ocean and biosphere are themselves dependent on how much CO2 is emitted into the atmosphere, the lifetime of CO2 is also a function of the amount of CO2 emitted in the first place.
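To make the analogy concrete, here is a minimal Python sketch of an emitted CO2 pulse decaying through several processes at once. The fractions and timescales below are invented for illustration, not taken from the paper or from any carbon-cycle model fit, but the qualitative behavior is the point: the curve falls quickly at first, then flattens, and a fixed fraction never decays at all, so no single half-life describes it.

```python
# Illustrative sketch only: CO2 removal modeled as a sum of decay processes
# with very different timescales. All fractions and timescales are made up
# for illustration; they are not the paper's numbers.
import math

def airborne_fraction(t_years):
    """Fraction of an emitted CO2 pulse still in the atmosphere after t years."""
    terms = [
        (0.20, None),    # effectively permanent on human timescales
        (0.30, 400.0),   # slow ocean-sediment chemistry
        (0.30, 40.0),    # ocean mixed-layer uptake
        (0.20, 4.0),     # fast biosphere uptake
    ]
    return sum(a * (1.0 if tau is None else math.exp(-t_years / tau))
               for a, tau in terms)

# The apparent "half-life" depends entirely on which part of the curve
# you look at:
for t in (10, 100, 1000, 10000):
    print(t, round(airborne_fraction(t), 3))
```

Depending on when you look, the apparent half-life ranges from decades to effectively infinite, which is exactly why a single 5-200 year figure misleads.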

As stated above, though, the IPCC also shares some of the blame for the confusion about CO2 lifetime. The first assessment report defined the lifetime of CO2 as 50-200 years while the second and third assessment reports lowered the lower limit to 5 years. The fourth and most recent assessment report changed the language entirely to say (quoted from the paper):

“Carbon dioxide cycles between atmosphere, oceans, and land biosphere. Its removal from the atmosphere involves a range of processes with different time scales. … The remaining 20% may stay in the atmosphere for many thousands of years.”

There are a number of points of information to take from this. The first is that short-term removal of CO2 from the air by plants and the ocean (on time scales of 5-50 years) is not the dominant factor in how fast CO2 is returned to the earth whence it came – chemical processes that operate on geologic time scales of tens to hundreds of thousands of years are the dominant factor. And because human lifetimes are short compared to the lifetime of CO2 in the air and water once it’s been removed from the ground, a certain percentage of the emitted CO2 will persist effectively forever. The second thing to keep in mind is that the more CO2 is emitted, the more will remain behind after the ocean and biosphere saturate, making fast decarbonization of our civilization vital. The third thing to realize is that it may not be possible to return to pre-industrial CO2 concentrations without developing cost-effective absorption technology or radically altering land use to absorb as much CO2 from the atmosphere as possible.

The last thing to understand is that the quoted CO2 lifetimes arbitrarily stop calculating the global warming potential (GWP) of CO2 (to which all other greenhouse gases are referenced) at 100 years or so. If the lifetime were calculated over geologic time, however, the strength of CO2 would be multiplied by 50-100x, effectively lowering the GWP of all other gases by the same amount. This conclusion reinforces the importance of CO2, and of CO2 emission reduction, with respect to climate disruption.
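A rough numerical illustration of the horizon effect: the toy Python calculation below compares CO2, using a made-up multi-timescale decay curve with a permanent fraction, against a hypothetical short-lived gas with a single 12-year lifetime and 100x the per-kilogram radiative efficiency. None of these numbers come from the IPCC; the point is only the trend. Because part of the CO2 pulse never goes away, lengthening the integration horizon inflates CO2’s integrated warming and shrinks the relative GWP of everything else.

```python
# Illustrative sketch of why the integration horizon matters for GWP.
# All numbers are placeholders, not IPCC radiative efficiencies or lifetimes.
import math

def co2_fraction(t):
    # Toy CO2 impulse response with a fraction that effectively never decays.
    return (0.2 + 0.3 * math.exp(-t / 400.0)
                + 0.3 * math.exp(-t / 40.0)
                + 0.2 * math.exp(-t / 4.0))

def short_lived_fraction(t, lifetime=12.0):
    # A methane-like gas with a single ~12-year lifetime.
    return math.exp(-t / lifetime)

def gwp(horizon, efficiency_ratio=100.0, dt=0.5):
    """Ratio of warming integrated over the horizon, per unit mass emitted."""
    steps = int(horizon / dt)
    num = sum(efficiency_ratio * short_lived_fraction(i * dt)
              for i in range(steps)) * dt
    den = sum(co2_fraction(i * dt) for i in range(steps)) * dt
    return num / den

# The short-lived gas's GWP shrinks steadily as the horizon lengthens:
for horizon in (20, 100, 1000, 10000):
    print(horizon, round(gwp(horizon), 1))
```

With these invented numbers the short-lived gas’s GWP falls by well over an order of magnitude between a 100-year and a 10,000-year horizon, which is the same direction of effect the paper describes for CO2’s relative strength over geologic time.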


Improvements in CO2 monitoring

Carbon dioxide monitoring started at Mauna Loa in 1959, and because of the long duration of the measurements, the Mauna Loa record is often considered the gold standard. But as good as those measurements are, and have been, new scientific data shows that CO2 doesn’t mix uniformly into the atmosphere as initially assumed, and local variations in the ocean around Hawaii introduce errors, just as land use changes can impact CO2 measurements elsewhere in the world. An article in the NYTimes talks about some of the new systems being implemented that will improve global monitoring of CO2.

NOAA’s Earth System Research Laboratory (ESRL) has been installing CO2 monitors to track the ebb and flow of CO2 concentration atop communications towers for years now, but most of the towers are relatively short. Short towers provide detailed measurements that apply only for a small area surrounding the tower, limiting their wider utility. Taller towers provide measurements that are usable for a larger area and that provide information on how the various sources of CO2 – automobiles, agriculture, forests, utility plants, etc. – are mixing together in the atmosphere. And the NYTimes article says that ESRL now has eight tall towers gathering data from around the country, with the lead scientist, Dr. Peter Tans, hoping to expand the tall tower network ultimately to 30 towers.

The most interesting news from the article, however, was that the Orbiting Carbon Observatory (OCO) will be launched in January 2009 and will begin monitoring CO2 globally from orbit almost immediately. While the orbit won’t permit the OCO to measure variations in CO2 concentration according to the time of day, its planned two-year mission will enable global measurements of CO2 in all seasons and with near complete or complete coverage of the Earth’s atmosphere. This data will enable scientists to better understand how CO2 concentrations vary from region to region and day to day, something that scientists just don’t know yet. And two years of data may enable scientists to calibrate their existing ground-based measurements to better estimate what CO2 is doing outside the immediate vicinity of the measurement devices.

Better data, more geographic coverage, continuous measurements, independent calibration. All are good things for scientists studying CO2.


British plan for “floodable” towns

In an interesting approach that may deserve international attention, the Guardian newspaper reported last week that the British Department for Environment, Food and Rural Affairs (DEFRA) recommended that new villages along the Thames River be designed to be “floodable” in an attempt to live with periodic flooding.

The basic idea behind the plan is to build homes and other buildings slightly above grade so they don’t need to be sandbagged to be safe, and to provide open areas that are designed to flood. Community parks and recreational lakes that are intended to flood would lower the overall flood level in the community, making property damage much less likely. After all, if your house floods you have to repair or replace it, but if your roads and parks flood, the cost of cleaning up is comparably small. Put another way, if you’re going to have to clean up your public spaces and roads after a flood regardless of what you do, wouldn’t you rather not have to pay to repair or replace homes and businesses while you’re at it?


New technology enables electricity from slower water currents

According to a Telegraph article, most of the world’s rivers and ocean currents flow at 3 knots or less. Existing technologies for generating electricity from water tend to need 6 knots or more to operate efficiently, making most of the world’s flowing water unusable for electricity generation. However, a new technology called VIVACE (Vortex Induced Vibrations for Aquatic Clean Energy) is being developed at the University of Michigan that, in initial testing, appears to operate at speeds of 1.5-2 knots. If the technology, presently only a demonstration project, can be commercialized and deployed, it might be able to provide large amounts of electricity. And if, as developer Prof. Michael Bernitsas expects, the technology is comparably safe for aquatic life, then it might prove inexpensive and safe enough for widespread deployment.

At a projected cost of 5.5 cents per kilowatt-hour, it could be significantly cheaper than many other forms of renewable energy. For that reason alone this technology probably deserves to be looked into more seriously. But history has shown that taking marine power devices from the prototype stage to deployment is a serious problem due to the harshness of the marine environment. Maybe the relative simplicity of this new technology compared to standard turbines or wave energy buoys will make the development cycle less onerous. Time will tell.


First “solar rooftop” completed, others in jeopardy

The Carboholic first reported on Southern California Edison’s (SCE) “solar rooftop” plan back in April. The plan is to rent rooftops on large buildings, mostly warehouses, and to install 250 MW of photovoltaic panels on the roofs by 2010. According to Environment News Service (ENS), the first installation of 2 MW is now complete atop a 600,000 square foot warehouse in Fontana, but the LA Times reports that SCE’s plan to have utility customers pay for the program has drawn criticism from consumer advocates.

California has required that 20% of all electricity used by the state be generated using renewable methods like solar, and the “solar rooftop” program is SCE’s first planned project. The ENS article says that the first three rooftops have all been approved by the California Public Utilities Commission, but that the overall project is still being reviewed and the decision will be made in March 2009. The second rooftop site is a nearly 500,000 square foot industrial building in Chino.

While the LATimes article says that business owners who could rent out their rooftops for solar panels are generally thrilled by SCE’s plan, consumer activists are not so happy with it. Putting solar rooftops up is expected to cost $1 billion, and the utility wants customers to foot the bill. Sepideh Khosrowjah, policy adviser for the Division of Ratepayer Advocates of the California Public Utilities Commission, is quoted in the LATimes article as saying “This is not the most cost-effective renewable [SCE] could invest in,” and given that SCE claims that the first rooftop cost them about 27 cents per kWh, Khosrowjah is certainly right. And given that the article says that 250 MW was approximately the entire production capacity of solar panels in the U.S. last year, advocates who fear an increased price for solar panels also have a good point.

According to the LATimes, SCE already gets 16% of its energy from renewables of one form or another, but the company prefers the solar rooftop plan to large-scale solar thermal in the desert or wind farms because rooftop solar is much faster and cheaper to build. SCE doesn’t have to spend years getting permits and building a large power plant and doesn’t have to pay to get new transmission lines strung at a price of between $1 and $5 million per mile.

If the California PUC doesn’t approve the funding proposal, however, SCE may not install any more solar rooftops, and that would bring their grand experiment in large-scale distributed electricity generation to a screeching halt.


Distributed book publication and printing, anyone?

Sometimes I come across an idea that is just plain neat. The idea of distributing books in an electronic format that is then printed on a special book printer close to where the books are purchased struck me that way immediately. According to the NYTimes Green Inc blog post on the technology, it solves a number of problems relating to the distribution of books – books are heavy and are shipped from centralized printers all over the country or world. Books printed “on demand” would never be over-printed, wouldn’t need lots of cardboard to protect them during shipping, and wouldn’t result in lots of direct fuel carbon emissions from thousands of trucks hauling books around.

In some ways this is an adaptation of the same technology already used by newspaper organizations who print nationally and internationally – they distribute electronic copies of their papers to printers around the world who then print and distribute the print copies to subscribers. If it weren’t for the $100,000 price tag for the printers, it might be widely adopted. But until your local bookstore can afford one to print the books that they don’t keep on the store shelves, it’ll probably remain a niche market.

On the other hand, if carbon is priced via a cap-and-trade scheme, taxation, or direct regulation, the price of books may go up enough to justify the expense for larger bookstore chains with regional distribution centers, or maybe even for larger independent bookstores like Powell’s Books or The Tattered Cover.

Image credits:
Nature Reports, reproduced from “The Long Thaw” by D. Archer
NOAA, via ESRL website
Wikimedia Commons, author: Jialiang Gao

8 replies »

  2. My goodness, what forethought with the ‘floodable’ town. Sometimes it amazes me that it takes us so long to learn…or relearn in this case…ideas that probably should be classified under “common sense”.

    Those printer/binders are pretty neat stuff, and it would be a boon for getting hard to find books without shipping. They are currently in use, but mostly by printing operations. I know one company that has one for very short runs (and they’re certainly making self-publishing far more realistic when combined with electronic pre-press technologies). I haven’t gotten to see one in action, but I have seen the finished product. They make nice books (and this is from someone who’s worked on a bindery line and still compulsively inspects books before I buy them).

  3. If not sobering, the first article is accurate. Don’t forget the buried main point. “The second thing to keep in mind is that the more CO2 is emitted, the more will remain behind after the ocean and biosphere saturate, making fast decarbonization of our civilization vital.” Like Molly suggests…Keep Thinkin! It’s always a simple question: Who do we need to be to be there? The answers define our evolving civilization. Thinking is what is vital. Getting to the 99th monkey will always be a struggle.