- NE Pacific clouds observed to amplify warming
- Skeptical Inquirer publisher calls Inhofe’s list not credible
- The ozone hole has reduced ocean CO2 absorption
- Clinton Climate Initiative and Microsoft partner on Project 2°
- We might as well drive Model-Ts
The question of whether clouds are a positive or a negative feedback remains one of the biggest open questions in climate modeling. A new paper in the journal Science adds another piece of evidence that clouds will amplify the effects of climate disruption rather than dampen them.
The authors of the study analyzed two independent observational records and found that both showed a decrease in cloud cover over the Northeast (NE) Pacific as a result of changes in sea surface temperature (SST), sea level pressure (SLP), and two measurements of the troposphere (the lower atmosphere). The two records were satellite measurements and direct observations from ships, and even though the methods were different, both produced similar results and similar correlations with SST and the other variables. As a result, the conclusions of the paper are stronger than most.
And those conclusions are that, as the NE Pacific heats up, the amount of cloud cover over the region declines. Less cloud cover over water means that the water absorbs more solar energy, further heating up the ocean surface. Similarly, cooler water in the NE Pacific means more clouds that further cool the ocean’s surface. In addition, observations shown in the paper illustrate that this same relationship holds across the entire Pacific.
The authors also compared the new cloud observations to the 18 general circulation models (GCMs) in the Coupled Model Intercomparison Project phase 3 archive, and the results were striking – only two of the models produced the same statistical correlation sign (+ or -) as the observations, and one of those two is a statistical outlier for wind circulation.
According to the paper’s lead author Dr. Amy Clement of the University of Miami, one of the next steps is to extend their results beyond the NE Pacific to the rest of the Pacific and, ultimately, to the rest of the globe. But according to Clement, the biggest problem is that only one GCM accurately reproduces all the observed cloud effects in the NE Pacific.
“This is the best we can do,” she said. “As more models pass this test, then we can begin to have more confidence in the sign of the low cloud feedback and how much that contributes to global climate sensitivity.”
Time, and much more research, will tell.
Thanks to Amy Clement, lead author of the paper, for answering my questions and providing me with a review copy of her paper.
Before Marc Morano left Sen. Inhofe’s (R-OK) staff to become a highly paid climate disruption denier and professional pundit, he collected a list of 687 scientists who supposedly rejected the science underlying climate disruption. It was titled the United States Senate Minority Report on Global Warming, and its purpose was to play a numbers game with the IPCC scientists responsible for writing and reviewing the Fourth Assessment Report Working Group 1 (WG1) scientific review. Supposedly, the 687 “experts” in the Minority Report outnumbered the IPCC scientists because of how few scientists were responsible for writing the IPCC Summary for Policymakers. Given that 2000 or so scientists wrote and/or reviewed the WG1 study, this claim was clearly and obviously false to anyone who understood the way the IPCC WG1 worked, but that didn’t stop Morano and Inhofe from making the claim. Nor did it stop legions of uninformed climate disruption deniers from repeating the claim ad infinitum.
While bloggers and some journalists have put a great deal of time into exposing the Inhofe list, it’s too easy to dismiss bloggers as activists and journalists as members of the mythical “liberal media.” But the avowed skeptical organization the Center for Inquiry isn’t so easily dismissed, especially by individuals claiming the mantle of “climate change skeptic.” Which is why the Center’s scathing investigation of the credibility – or rather the lack thereof – of the Inhofe list is such a big deal.
What the Center’s investigation found is that 80% of the scientists on the list had never published a climate-related paper. The Center could verify that only 10% were for certain involved in climate science, with an additional 5% that could have been. And 4% largely or entirely agreed with the scientific theory of human-caused climate disruption. I think it’s fair to say that the Center’s claim that “[t]hese results cast serious doubt on the Senate Minority Report’s credibility” is a bit understated.
What does this mean for Inhofe’s infamous list? Probably not much. Morano no longer works for Inhofe, after all, and neither man has been overly concerned with verifiable scientific fact when it comes to climate disruption. But having the publisher of what is arguably the premier skeptical publication in the world say that the list is not credible won’t make their lives any easier. And over the long run, that’s probably a good thing.
In 2007, scientists studying the Southern Ocean reported that it was not absorbing CO2 as efficiently as had been expected. A study published in November 2008 partly contradicted the 2007 conclusion, arguing that the original study was flawed because its models lacked the resolution to capture ocean current eddies. A new study published in Geophysical Research Letters adds yet another chapter to the ongoing questions about the Southern Ocean’s ability, or lack thereof, to absorb anthropogenic CO2.
The fundamental absorption mechanism of any gas in water is governed by the partial pressures of the gas in the air versus dissolved in the water. At equilibrium at a constant temperature, the partial pressure of the gas at the water surface equals its partial pressure in the air. Increase the amount of the gas in the air and more of it will slowly dissolve into the water. This is fundamentally why increasing concentrations of CO2 from fossil fuel combustion are driving ocean acidification – the partial pressure of CO2 in the Earth’s atmosphere is increasing, and the ocean is absorbing more CO2 as a result.
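The proportionality described above is Henry’s law. A minimal sketch of the relationship, assuming a rough Henry’s law constant for CO2 in water at about 25 °C (real seawater chemistry, with its carbonate buffering, is far more complicated than this):

```python
# Henry's law sketch: at equilibrium, the dissolved concentration of a gas
# is proportional to its partial pressure in the air above the water.
# K_H below is an approximate constant for CO2 in water at ~25 C; it is an
# illustration, not a seawater value.

K_H = 3.3e-2  # mol / (L * atm), approximate for CO2 at 25 C

def dissolved_co2(ppm):
    """Equilibrium dissolved CO2 (mol/L) for an atmospheric mixing ratio in ppm."""
    partial_pressure_atm = ppm * 1e-6  # ppm of a 1 atm atmosphere
    return K_H * partial_pressure_atm

pre_industrial = dissolved_co2(280)  # ~280 ppm before industrialization
modern = dissolved_co2(387)          # ~387 ppm at the time of writing

print(f"pre-industrial: {pre_industrial:.3e} mol/L")
print(f"modern:         {modern:.3e} mol/L")
print(f"increase:       {(modern / pre_industrial - 1) * 100:.0f}%")
```

The point of the sketch is simply that the equilibrium scales linearly with the atmospheric partial pressure – which is also why anything that raises the CO2 partial pressure at the ocean surface (like upwelling of carbon-rich deep water) reduces how much more CO2 the surface can take up.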
But this simple relationship isn’t sufficient to describe the real ocean. The real ocean has waves, is in contact with wind, and has currents and upwellings that change the relationship on local and regional scales. And it’s these real ocean effects that the study’s authors have modeled using coupled-climate-carbon models (CCCMs), models that focus on the effects of the carbon cycle on climate and vice versa. And the CCCMs have found that the ozone hole over Antarctica has been a significant factor in the reduction of CO2 absorption by the Southern Ocean.
The Southern Annular Mode (SAM) is an atmospheric pattern that moves the westerlies (prevailing mid-latitude winds that blow from west to east) closer to the pole. According to the paper, the SAM has shifted poleward as a result of increasing GHGs and ozone depletion, and wind stresses on the Southern Ocean have increased as a result. The result of increased wind is greater mixing between the surface layers and the deep, carbon-rich layers of the ocean. Essentially, the carbon-poor surface layers are being blown off the deeper layers, leaving the surface with a higher CO2 partial pressure than it would have had if the westerlies hadn’t been blowing so strongly – and so the ocean can’t absorb as much CO2.
The authors ran two sets of models, one with the effects of the ozone hole included and one without, and then compared the model runs with observed changes in the partial pressure of CO2 both in the Southern Ocean and in the air above it. The model that didn’t include the effects of the ozone hole didn’t match reported observations.
In addition, as the SAM exposes deep, carbon-rich water to the surface, ocean acidification of the Southern Ocean accelerates and its effects worsen. These effects include reduced calcification of plankton and larger areas of low oxygen (hypoxia) that could lead to larger ocean dead zones.
In November, the Carboholic reported on a study that claimed the Southern Ocean was still absorbing CO2 and that the problem with studies like this one was that their ocean models weren’t detailed enough. The authors of this study addressed that criticism directly and concluded that more detailed ocean models were not necessary in this case because the CCCMs used were able to replicate the observed changes in CO2. It remains to be seen, however, what happens when the model types from the two studies are combined.
Project 2° is an attempt to provide cities with easy-to-use software tools that can track and manage their greenhouse gas emissions. The idea is that cities all have the same sources of GHG emissions – transportation, electricity generation, energy consumption, and industry – and the similarities mean that all cities have similar needs with respect to their ability to determine where, how, and how much emissions are coming from each sector.
At the moment, the Project 2° website indicates that there are only three cities using the software – Chicago, Houston, and Rotterdam, with Chicago having the most information available on the website. Other resources at the Project 2° website are contact links for access to experts, documentation on official international GHG accounting guidelines, and a detailed demonstration of how the software works.
I don’t know if this will take off or not. Having Microsoft and Autodesk on board certainly doesn’t hurt Project 2°’s chances any, but who knows. Ultimately, though, something like Project 2° is needed if cities are going to be able to get a handle on their GHG emissions and how best to cut them in the near future.
Overall vehicle fuel efficiency hasn’t increased much in decades. Generally speaking, as gasoline and diesel engines have been made more efficient, the efficiency gains have been eaten up with increased vehicle mass and additional gadgets. According to a New Scientist article, modern automobiles, trucks, motorcycles, and buses are not dramatically more efficient than the Ford Model-T.
University of Michigan Transportation Research Institute researchers Michael Sivak and Omer Tsimhoni analyzed total fleet fuel efficiency for cars, trucks, buses, and motorcycles from 1923 to 2006 and found that overall efficiency improved by an average of only 2% per year. In addition, the bulk of that improvement actually took place between the OPEC oil embargo in 1973 and 1991. According to the article, from 1991 to 2006 fuel efficiency improved by a grand total of 1.8%.
One of the most important points in the article is that removing old, low-efficiency vehicles from the road is critical to raising overall fleet fuel efficiency. Thus the various “Cash for Clunkers” ideas that have been implemented locally and the federal plan that was recently signed into law by President Obama.
The other critical point made in the article is not necessarily obvious:
“Society has much more to gain from improving a car from 15 to 16 mpg (6.38 to 6.8 km/l) than from improving a car from 40 to 41 mpg (17 to 17.43 km/l),” [Sivak and Tsimhoni] write in their paper. “Similarly, the benefits are greater from improving a truck from 4 to 4.5 mpg (1.7 to 1.92 km/l) than from improving a truck from 7 to 7.5 mpg (3.19 km/l).”
It’s not necessarily obvious that this claim is correct until you invert the numbers. Improving fuel efficiency from 15 to 16 mpg drops consumption from 6.7 to 6.3 gallons per 100 miles, or 0.4 gallons less fuel burned. Improving efficiency from 40 to 41 mpg means burning only 0.1 gallons less fuel (2.5 vs. 2.4 gallons per 100 miles). In the same way, going from 4.0 to 4.5 mpg is better than going from 7.0 to 7.5 mpg because the improvement is nearly 3 gallons per 100 miles vs. about 1 gallon per 100 miles, respectively.
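The inversion above is easy to verify: fuel burned over a fixed distance scales with 1/mpg, so equal mpg gains at the low end of the scale save far more fuel. A quick sketch of the arithmetic:

```python
# Fuel saved over 100 miles depends on 1/mpg, not on mpg itself,
# which is why small gains on gas guzzlers beat small gains on
# already-efficient vehicles.

def gallons_per_100_miles(mpg):
    return 100.0 / mpg

def savings(mpg_before, mpg_after):
    """Gallons saved per 100 miles by an efficiency improvement."""
    return gallons_per_100_miles(mpg_before) - gallons_per_100_miles(mpg_after)

print(f"15 -> 16 mpg saves {savings(15, 16):.2f} gal / 100 mi")   # ~0.42
print(f"40 -> 41 mpg saves {savings(40, 41):.2f} gal / 100 mi")   # ~0.06
print(f"4 -> 4.5 mpg saves {savings(4, 4.5):.2f} gal / 100 mi")   # ~2.78
print(f"7 -> 7.5 mpg saves {savings(7, 7.5):.2f} gal / 100 mi")   # ~0.95
```

This is the same reasoning behind measuring consumption in gallons per 100 miles (or liters per 100 km, as much of the world already does) instead of miles per gallon.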
NASA Earth Observatory
San Francisco Chronicle
Geophysical Research Letters