The Weekly Carboholic: water vapor effect on climate measured


One of the larger problems with climate models, and with climate science in general, is a lack of fidelity in how the water cycle will be affected by anthropogenic climate disruption. This is especially important given that water vapor in the atmosphere is responsible for the bulk of the energy absorption (aka the greenhouse effect) that keeps the Earth’s average temperature well above freezing. But because of water vapor’s relatively short lifetime in the atmosphere before it’s removed via precipitation or chemical reactions, scientists have generally had a difficult time estimating just how water vapor affects climate. A new paper by A. E. Dessler, Z. Zhang, and P. Yang titled “Water-vapor climate feedback inferred from climate fluctuations, 2003–2008”, based on five years of actual water vapor and temperature data measured across the globe by satellite, has found that water vapor causes global temperatures to rise or fall 2.04 times as much as carbon dioxide (CO2) and the other greenhouse gases alone would cause.
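
To put that multiplier in concrete terms, here is a quick sketch (the 2.04 factor is from the paper; the 1.2 °C baseline warming is purely an illustrative number, not a figure from the study):

```python
# Illustrative only: applying the 2.04x water vapor feedback factor
# from Dessler et al. to a hypothetical amount of warming caused by
# CO2 and other greenhouse gases alone (the 1.2 C is made up here).
feedback_factor = 2.04
warming_without_water_vapor = 1.2  # degrees C, hypothetical

# Total warming once the water vapor response is included.
total_warming = feedback_factor * warming_without_water_vapor
print(round(total_warming, 2))  # 2.45
```

In other words, whatever warming the long-lived greenhouse gases produce on their own is roughly doubled once water vapor responds.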

The first attempt to measure the impact of water vapor directly, using measured data instead of ensemble averages of climate models, was made in a 2004 paper by P. M. D. F. Forster and Matthew Collins titled “Quantifying the water vapour feedback associated with post-Pinatubo global cooling.” The significant cooling following the eruption of Mt. Pinatubo enabled a direct measurement of the amount of cooling resulting from a globally significant climatic event triggered by the mass release of sulfur dioxide into the stratosphere (a known and well understood cooling agent). Forster and Collins calculated that the climate feedback due to water vapor was approximately 1.6x. However, because the event was singular, it wasn’t possible to accurately estimate the feedback factor without the help of climate models. While those models permitted an estimate of the error (0.9 to 2.5x), the large error range and the reliance on climate models gave climate disruption skeptics a way to discredit (fairly or not) the research as insufficiently rigorous and scientific.

Which is why the new paper by Dessler et al. is such a major improvement. It doesn’t rely on climate models at all except as a comparison, and yet it finds a feedback factor that is within one standard deviation of the Forster/Collins result from four years ago. That two different studies using very different methods found similar feedback factors speaks well for the accuracy of both papers.

The small amount of data (only five years) leaves much to be desired, but it is possible to draw some general conclusions. First, it’s becoming less likely that the climate models using an estimated water vapor feedback of 1.6 to 2.0x are wildly wrong due to how they’re modeling water vapor. Second, the results of the models are more likely to be considered realistic with measured data backing them up. And third, scientists who are calculating a low feedback factor (approximately equal to 1x) may have to re-examine their own calculations and data.

As Dessler said in his interview with TerraDaily about the 2x feedback factor:

That number may not sound like much, but add up all of that energy over the entire Earth surface and you find that water vapor is trapping a lot of energy.

If Dessler and the other scientists who have studied the data are correct, then Climate Progress editor and Center for American Progress senior fellow Joe Romm’s post about this paper may very well be right:

A “warming of several degrees Celsius” = the end of life as we know it.

Time will tell, and very likely sooner rather than later.


New data says southern ocean still absorbing CO2

An international team of scientists reported in 2007 that the southern ocean surrounding Antarctica was weakening as a carbon sink as a result of increasing winds causing carbon-rich deep ocean water to upwell and release its dissolved CO2 directly into the atmosphere. This past week saw a new study from another team that appears to partially contradict the first. The new study, published in Nature Geoscience and reported by Reuters, finds that data stretching back to the 1960s doesn’t support the idea that increasing westerly winds around Antarctica are actually reducing the southern ocean’s ability to act as a CO2 sink.

One of the authors, Steve Rintoul of the Centre for Australian Weather and Climate Research and CSIRO, was kind enough to provide me with a copy of the paper for this article. The study involved measurements made by ships traveling throughout the southern ocean since the 1960s plus a massive amount of new data from the free-floating Argo ocean probes. There were a number of interesting results. First, the Antarctic Circumpolar Current (ACC) was found to have been pushed 50 to 80 km closer to the coast of Antarctica as a result of reduced salinity and increasing temperature close to the continent. Second, the entire southern ocean has freshened (become less salty and therefore less dense), very likely as a result of additional precipitation adding freshwater to the ocean’s surface, as predicted by climate models. Third, the waters closest to the Antarctic coast have warmed the most, while much of the southern ocean in all three sectors (Atlantic, Indian, and Pacific) has actually cooled somewhat. Fourth, it’s possible, although difficult to verify with the available data, that the freshening and warming of the southern ocean have increased since the early 1980s. But most importantly, the data shows that the ACC’s poleward shift does not appear to be reducing the southern ocean’s ability to act as a carbon sink, and thus far the ACC appears to be largely insensitive to changes in wind stress.

The problem that the original research ran into is one of model resolution. Global climate models are unable to model the southern ocean at sufficient resolution to accurately resolve the effects of eddies in the ACC and how they interact with the way winds drag water along (aka Ekman transport). High-resolution models of the southern ocean found that the ACC was largely insensitive to anthropogenic disruptions, but coarser-resolution models (as required to run global simulations given limitations in supercomputer power) didn’t capture these effects and so erroneously showed a drop in the southern ocean’s carbon-sinking capacity. With new and more powerful supercomputers coming on-line all the time, this limitation on climate modeling will gradually fade.

The fact that the global climate models are likely wrong in this case is a good thing. The southern ocean is believed to be the largest oceanic carbon sink in the world, and if it’s still absorbing carbon well, that gives human civilization longer to reduce carbon emissions. But just as interesting is the fact that the high-resolution models detected an effect that the lower-resolution models did not. And the measured data analyzed for the study supported a number of general conclusions from modeling, namely the poleward shift of the ACC and the freshening and warming trends of the southern ocean.

All in all, excellent news for scientists testing the accuracy of climate models and for the climate itself.


Ocean acidifying faster than expected

Over the last eight years, researchers have sampled the pH of the water at the same spot off Tatoosh Island every half-hour of every day. This record provided a massive amount of data showing that the Pacific Ocean is becoming more acidic, and at a rate far faster than climate models had anticipated. According to a BBC article on the research, the pH at Tatoosh Island was falling 10-20 times faster than prior predictions based on climate models. Professor Wootton, lead author on the study published in the Proceedings of the National Academy of Sciences, pointed out to the BBC that “biology is affecting pH, through photosynthesis and respiration, but current models don’t include biological activity as part of the story.”

Wootton and his colleagues are planning on integrating their data from Tatoosh Island into the larger pH measurements from the rest of the global ocean in order to determine whether there is something unusual about Tatoosh Island or whether the huge measured changes over the last eight years are also occurring globally. As was pointed out in a Discovery News article about this study, the pH measurements were made at a coastal location instead of in the deep ocean, so there could be a significant delta between the wider ocean pH data and the data local to Tatoosh.

However, Wootton and the other authors also tracked how intertidal life reacted to the pH changes, and the results were stark – the number and range of mussels shrank significantly, and certain kinds of barnacles moved in to take over the ecological niche the mussels were vacating, among other changes. In short, the coastal ecology being studied changed dramatically over just eight years, apparently in response to changes in ocean pH. And as Wootton pointed out in a National Geographic article, “an acidity-driven shift in coastal ecosystems could spell disaster for shellfish industries that rely on mussels and other similar species.”


Missing radiation signature points to thinning Tibetan glaciers

Glaciologists have been studying glaciers for decades, and ever since the U.S. and the Soviet Union conducted atmospheric detonations of nuclear weapons, there has been a radioactive signature somewhere in each glacier from the global fallout stored in the snow that fell in those years. Not any more. According to a Science Daily story, for the first time ever, a glacier has been found that has melted so much that it retains neither the 1962-63 nor the 1951-52 radioactive signal. This means the glacier is either no longer adding ice or, even more likely, is actually thinning due to seasonal melt and has lost whatever ice it had accumulated since the early 1950s.

The problem in this case is that this is the Naimona’nyi glacier, a Tibetan glacier that, combined with others in the area, supplies water for the Indus, Ganges, and Brahmaputra rivers. Those three rivers are the main water supplies for over a billion people in India, Pakistan, and Bangladesh. If the glaciers disappear, as they may well be doing, that could precipitate water conflicts in a region with a long-simmering territorial dispute between two nuclear-armed neighbors, India and Pakistan.

The scientists took the ice core from the Naimona’nyi glacier in 2006 and found only a 1944 radioactive signature. But the team had taken ice cores from other glaciers around the region:

“We have to wonder — if we were to go back to previous drill sites, some drilled in the 1980s, and retrieved new cores – would these radioactive signals be present today?” [Lonnie Thompson, University Distinguished Professor of Earth Sciences at Ohio State] asked.

“My guess is that they would be missing.”


“Cash for Clunkers” to get old gas hogs off the road

The Center for American Progress Action Fund and a partner organization have combined forces to push for a program that they’re both calling Cash for Clunkers. Cash for Clunkers is a proposal to the incoming Congress and President-elect that would function both as a short-term economic stimulus and as a way to increase the fuel efficiency of the passenger vehicle fleet.

Under the plan, owners of cars that are 13 years old or older would receive a cash trade-in value that they could then use to purchase a newer, more fuel-efficient vehicle or to pay for mass transit. The plan would be managed by automobile dealers who, the program summary claims, would benefit from more customers on their lots willing to buy replacement vehicles for their traded-in clunkers. Automakers would likely benefit as well from the additional new-vehicle sales. And in the long run, the car owners most likely to own inefficient clunkers in the first place – the elderly on fixed incomes and the poor – would be better insulated when oil prices rise again as a result of higher global demand and constrained supply.

The environmental benefits of such a program are pretty clear. According to the report, 75% of all automobile pollution is emitted by cars that are at least 13 years old, even though those cars account for only 25% of all miles driven. Moving the majority of these vehicles off the road could reduce oil consumption and the related pollution and carbon emissions by 33%. And with the traded-in vehicles sold for scrap metal and salvaged for precious metals, the cost of buying them would be partly offset as the salvage proceeds were returned to the Treasury. The report claims that auto parts salvage would also help offset the costs, but this would only be the case where the parts were usable on newer, fuel-efficient vehicles – ideally, all the old cars that could use the parts would be off the roads.
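
A quick back-of-the-envelope check shows just how lopsided those report figures are (the 75% and 25% shares come from the report; the rest is simple arithmetic):

```python
# Report figures: cars at least 13 years old emit 75% of automobile
# pollution while accounting for only 25% of the miles driven.
old_pollution_share = 0.75
old_miles_share = 0.25

# Pollution per mile driven, relative to each group's share of miles.
old_intensity = old_pollution_share / old_miles_share              # 3.0
new_intensity = (1 - old_pollution_share) / (1 - old_miles_share)  # ~0.33

# By these figures, an old car pollutes roughly nine times
# as much per mile as a newer one.
ratio = old_intensity / new_intensity
print(round(ratio, 1))  # 9.0
```

That nine-to-one per-mile ratio is why targeting such a small slice of the fleet can have an outsized effect on total emissions.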

This is not the first time that such a program has been proposed, and so there are already some criticisms of the idea. One came from economist Steve Levitt of the Freakonomics blog back in August. Levitt raised several problems with cash-for-clunkers programs:

  1. the programs are more likely to pull rusting, nearly undrivable hulks out of driveways than they are to pull gas guzzlers off the road;
  2. they could lead to more old cars being driven instead of fewer, as people keep their clunkers on the road a few extra years to earn the incentive;
  3. the impact on new car purchases would be limited at best since people driving clunkers are more likely to purchase used cars;
  4. and the redistribution benefits to clunker owners would, over the long run, end up helping all car owners instead.

I asked Jack Hidary, an entrepreneur who has been active on these issues at the federal and local levels, how the “Cash for Clunkers” program addressed Levitt’s concerns. “Cash for Clunkers does not specify the age that a car has to be,” he said. “To qualify, a car just has to have less than an 18 MPG rating.” This essentially negates Levitt’s issues #1 and #4 above. In addition, with minimum registration and licensing timeframes required for the vehicles being turned in, cars sitting on blocks wouldn’t qualify in the first place – they’re not likely to be registered.

Hidary also didn’t believe that the long-term effects of the program would be a problem vis-à-vis redistribution or used cars. The purpose of the plan is to “bias the entire market, used and new, toward high-MPG cars,” said Hidary. “We will be crushing all the cars that come in and taking them out of circulation.”

There is no doubt that removing clunkers and gas guzzlers from the nation’s roads is a good idea. This program, or one very much like it, sounds like an excellent first step toward that goal.


Portugal commits to electric vehicles by 2011

Last week the Prime Minister of Portugal, Jose Socrates, put his nation of 10.6 million people on a path toward zero-emissions transportation with commitments to build 1,300 electric vehicle charging stations, to make 20% of all government vehicles zero-emissions by 2011, and to provide tax incentives to everyone who purchases an electric vehicle. In return, automaker Renault-Nissan will provide and promote electric vehicles nationwide in 2011, a year before it begins marketing electric vehicles globally in 2012.

The article doesn’t mention whether these charging stations will be like standard gasoline service stations or will instead be located in public and private parking areas, something that may determine whether the project succeeds. Batteries take time to charge, and I have a difficult time believing that drivers will be willing to wait at a charging station for hours while their vehicle charges. Ultracapacitors charge quickly enough for a “stop-and-go” charging station, but the latest information I’ve seen shows that they’re too expensive and, thus far, don’t hold enough energy for the 160 km of range that the article claims Renault-Nissan’s vehicles will achieve.

If Portugal is able to pull this transition off, that’s excellent news for them. But Portugal is a small country, both in population and in land area (about the size of the state of Maine), so I have a hard time believing its success can be easily translated across the Pond. The U.S. had 116,855 gasoline service stations in 2006, or roughly one for every 2,500 people. Portugal’s 1,300 electric charging stations, scaled up to the U.S. population, would be nearly 37,000 stations – and that would still be only about a third of the total number of gasoline service stations.
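
Here is the scaling arithmetic spelled out (the station counts are from the article; the population figures are round approximations):

```python
# Scale Portugal's charging-station commitment to the U.S.
portugal_stations = 1_300
portugal_pop = 10.6e6      # from the article
us_pop = 300e6             # rough late-2000s figure
us_gas_stations = 116_855  # 2006 count cited in the article

# People per gasoline station in the U.S. (roughly 2,500).
people_per_gas_station = us_pop / us_gas_stations

# Portugal's commitment scaled to the U.S. population (~36,800),
# and that total as a fraction of existing gas stations (~0.31).
scaled_stations = portugal_stations * us_pop / portugal_pop
fraction_of_gas_stations = scaled_stations / us_gas_stations
print(round(scaled_stations), round(fraction_of_gas_stations, 2))
```

Even a per-capita match of Portugal’s commitment, in other words, would leave the U.S. with only about a third as many charging points as it has gas stations today.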

Image credits: C. A. Pfister/University of Chicago; Getty Images

Replies:

  1. I like the vehicle trade-in idea, not only from an environmental standpoint but also as a better way to help Detroit than throwing money at GM (Ford really doesn’t need a bailout and isn’t asking for one).

     But I still wouldn’t do it. My 23-year-old Toyota still gets gas mileage equivalent to any comparable model that I could buy new…how sad is that? Of course, I don’t have all those bells and whistles that American car buyers just have to have. No cup holders, no AC, no cruise control, no airbags, and no computers. And, yes, I do actually need/use a pickup truck.