- Climate disruption will disrupt volcanism too
- EPA Office of Inspector General finds standard gases not so standard after all
- Driest years in Pacific Northwest drier than expected
- Northeast Passage opened this year for commercial shipping
- 12% of the merchant marine fleet is idled
- Recycling used plastic into fuel
- US Chamber of Commerce and car dealer industry group fight California emissions waiver
- Slowing population growth more effective than renewables at slowing GHG emissions
Nature News reported last week that volcanologists have concluded that climate disruption will increase the number of volcanic eruptions. According to the article, this is because climate disruption is expected to reduce the amount of ice atop volcanoes, and thus reduce the weight of material keeping those volcanoes from erupting.
But there is definitely some evidence that less ice means more dramatic eruptions. “As thick ice is getting thinner, there may be an increase in the explosivity of eruptions,” says Hugh Tuffen from Lancaster University, UK.
As strange as this sounds, it’s well grounded in the geologic sciences. For example, a paper published in 1999 found a correlation between eruptions of Pavlof Volcano on the Alaska Peninsula and the season (see the image above). Specifically, as a result of regional weather patterns, local sea level rises slightly each November, and the added weight of that water is believed to compress the magma chamber that feeds Pavlof. As a result, small periodic eruptions at Pavlof tend to happen in November. Another paper, from 2004, found that volcanic eruptions worldwide tend to track seasonal deformation of the Earth’s crust driven by the water cycle – seasonal variations in groundwater and seawater. This paper studied a much larger number of volcanoes and found that volcanoes in different regions of the world respond to different changes, but the bulk of volcanic eruptions showed some seasonal variation.
Other studies have found increases in volcanism as a direct result of climate change. A paper from 1999 found a strong correlation (less than a 0.2% probability of occurring by chance) between interglacial periods (like the one we’re in now) and increased volcanic activity in eastern California. Several factors could explain the link, one of which is the change in crustal stress as the weight of overlying ice and glacial lakes varies.
What this means is that, as the Nature News article says, we can expect that disruption of the climate will in turn drive disruptions in how volcanoes erupt. Unfortunately, there’s very little data at this point about how climate will affect volcanism, and no modeling at all – the latest climate models all model how the climate responds to volcanism, but none of them presently model how volcanism will respond to the climate.
“The IPCC [Intergovernmental Panel on Climate Change] hasn’t addressed these kinds of hazard,” [Bill McGuire from the Aon Benfield UCL Hazard Research Centre at University College London] says. “You have a better chance of coping with any kind of hazard if you know it’s happening,” he adds. “Climate change is not just the atmosphere and hydrosphere; it’s the geosphere as well.”
Organizations that do pollutant monitoring rely on standard gases to ensure that their equipment functions properly. Each standard gas represents a specific amount of pollutant in a given volume of gas, measured in parts per million (ppm), parts per billion (ppb), or some other convenient measurement for the pollutant in question. The standard gas is then injected into monitoring equipment in order to calibrate the equipment to a known amount of pollutant. From that known amount, the equipment can then track how much pollutant is present in the test environment, either more or less than the calibrated level(s). But this process only works if the standard gases have very close to the amount of the pollutant that the gas is supposed to have.
For example, if a calibration gas used by a utility was certified to contain 100 parts per million of SO2 but actually contained only 96 ppm, the system operator would unknowingly calibrate the continuous emissions monitoring system (CEMS) to read 96 ppm as 100 ppm. This would result in the CEMS overestimating emissions.
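The skew in that example can be sketched in a few lines of Python (the numbers are the hypothetical ones from the example above, not measurements from the OIG report):

```python
# Sketch of how a mislabeled calibration gas skews CEMS readings.
certified = 100.0   # ppm the gas is labeled as containing
actual = 96.0       # ppm the gas really contains

# The instrument is adjusted so this gas reads as its certified value,
# so every subsequent reading is scaled by certified/actual.
scale = certified / actual

true_stack_concentration = 50.0  # ppm, hypothetical
reported = true_stack_concentration * scale
print(round(reported, 2))  # → 52.08, about 4% high: emissions overestimated
```

The same multiplicative error applies to every reading the instrument takes, which is why a single bad calibration gas corrupts an entire monitoring record.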
Last week, the EPA’s Office of the Inspector General found that approximately 11% of the standard gases for blends of SO2, CO2, and nitric oxide (NO) that it had purchased and had independently tested deviated from the stated concentration by 3% or more, when the acceptable range is within 2% of the stated amount. And in an example of the seemingly universal rule of “you get what you pay for,” all the failures came from vendors selling inexpensive standard gases, while all of the expensive gases were acceptable.
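A pass/fail check against that ±2% tolerance is simple to express; the function name here is my own, but the threshold is the one cited above:

```python
# Check whether an independently measured concentration falls within
# the +/-2% tolerance of the certified value (threshold from the OIG report).
def within_tolerance(certified_ppm, measured_ppm, tol=0.02):
    return abs(measured_ppm - certified_ppm) / certified_ppm <= tol

print(within_tolerance(100.0, 98.5))  # 1.5% off → True, passes
print(within_tolerance(100.0, 97.0))  # 3% off → False, fails
```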
This is a severe problem because of the sheer number of things that standard gases are used for. The OIG report points out that accurate measurements are vital for the over $5.1 billion SO2 trading market that has been responsible for a dramatic reduction in acid rain. Accuracy of measurements is similarly important for the $350 million NOX trading market. And as for CO2, the World Bank estimated that the global carbon market was $64 billion in 2007. Metropolitan areas are also monitored by the EPA for air quality and are fined or forced to make changes to local utilities or transportation as a result of those measurements – if the measurements are incorrect, then the EPA could be giving a passing grade to some cities that actually fail air quality standards, or failing cities that should actually pass.
The OIG’s recommendation, which the EPA office responsible for standard gases agreed with, was for the EPA to implement a quality control process, something that the EPA does not currently have in place.
Climate models are always being improved with new understanding of how climate works (especially in two key areas, cloud and aerosol dynamics). But regional climate modeling is particularly difficult for two reasons: climate models are so processor intensive that they cannot yet model the Earth with high horizontal and vertical resolution, and scientists do not know all the regional changes that drive regional climate away from the global average. Put simply, scientists don’t know everything and can’t model in enough detail for accurate regional climate predictions.
Enter a new paper by two U.S. Forest Service scientists who have studied annual streamflow in the Pacific Northwest. They set out to determine whether the annual streamflow (the amount of water flowing out of a watershed in a year) of the driest years differed from the annual streamflow of average or of the wettest years. They used a statistical technique called “linear quantile regression” to detect any difference from the average trend detected in other studies using another statistical technique, “least-squares regression.”
The scientists found that there was a greater reduction in annual streamflow in the driest years than the mean trend had detected. Climate models had previously predicted that there would be little overall change in the annual streamflow because the same amount of water would flow through watersheds, but at different times of the year. Instead, this study discovered that, while this was true for average and wet years, dry years were significantly drier (in a statistical sense) as a result of changes in climate since 1949.
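A toy example (synthetic numbers, not the authors’ data) shows how a least-squares fit to all years can report essentially no trend even while the driest years decline – which is exactly the gap that fitting a low quantile instead of the mean is designed to close:

```python
# Synthetic illustration: mean trend near zero, dry-year trend declining.
import numpy as np

years = np.arange(1949, 2009)
t = years - years[0]
mean_flow = 100.0 + 0.0 * t   # average years: flat (units arbitrary)
dry_flow = 70.0 - 0.3 * t     # driest years: clear decline

# Least-squares linear trends (np.polyfit returns [slope, intercept]).
mean_slope = np.polyfit(years, mean_flow, 1)[0]
dry_slope = np.polyfit(years, dry_flow, 1)[0]
print(mean_slope, dry_slope)  # flat mean trend vs. declining dry-year trend
```

A least-squares fit pooling all years together would average these regimes and understate the dry-year decline; quantile regression fits the lower tail directly.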
As a result, the authors expect that changes in water management throughout the Pacific Northwest may be necessary. The design of water storage reservoirs may need to change in order to hold multiple years’ worth of water. Reduced annual streamflow will have a significant impact on aquatic life in streams that run much lower during dry years than they have in the past, and reduced streamflow will act as a positive feedback with increased air temperature, raising stream temperatures and possibly causing even greater reductions in fish populations. And while overall drying across multiple years already stresses forests, individual dry years can kill off large swaths of forest, lead to more forest fires, and slow the growth of surviving trees.
One of the more interesting points the authors made was that the dry years appear to be tightly correlated with El Niño/Southern Oscillation (ENSO) variation from year to year and with a yearly trend, but the correlation was significantly weaker when they included the cooling trend in the Pacific Decadal Oscillation (PDO). While the authors take pains to point out that this doesn’t conclusively show that the PDO isn’t affecting dry-year annual streamflow, they do “favor” a model that doesn’t include the PDO as a driver of the annual streamflow. And they call for more analyses to better identify the causes of the observed dry-year changes.
“More sophisticated analyses considering other indices, temporal lags, and temporal autocorrelation of indices would likely elucidate more information and provide greater certainty, but this rough analysis presents interesting insights.”
I couldn’t agree more.
Thanks to the paper’s primary author, Dr. Luce, for providing a review copy of his paper.
Historically, Russia’s Arctic coast has been too iced-in for commercial vessels, most of which are designed for hauling their cargo in ice-free waters. But this year, according to a NYTimes article, two German vessels, the Beluga Fraternity and Beluga Foresight, steamed north from South Korea and transited the Northeast Passage, also known as the Northern Sea Route. The route was largely ice-free this year, and the two ice-hardened specialty cargo vessels took advantage of the clear waters to cut thousands of miles off the southern route via the Suez Canal. According to the article, while the Beluga vessels were escorted by at least one nuclear-powered Russian icebreaker the entire time, the icebreakers were not needed this year.
For the moment, the article points out that the Northeast Passage isn’t expected to be open regularly enough for large just-in-time (JIT) shippers like Maersk to use – schedule accuracy is more important to JIT shippers than fuel savings. But if the Arctic sea ice continues to thin and open up the shipping channel during late August and early September, then specialty shippers like Beluga could start making use of the shorter route in order to get their cargo to its destination cheaper and faster.
According to an investigative report in the UK’s Daily Mail, there is a massive fleet of idled shipping vessels anchored off the coast of Singapore and southern Malaysia. These ships, and others taken out of service around the world, represent 12% of the entire global merchant marine fleet, sitting idle. And yet shipyards are continuing to build enough new cargo vessels to increase the total number of vessels by 12% next year.
But according to the report, there are no new ship orders for after 2011, and shipping experts expect that the number of vessels idled by the recession will rise to 25% of the merchant marine fleet in the next two years.
Does this mean that the government’s claims that the recession is over are false? Maybe. I’ve read some discussion that the recent small recovery is a result of companies having to rebuild some inventory after having finally sold off the inventory they had stockpiled before the start of the recession. But that’s a one-time event, and restocking inventory isn’t going to do much for the economy as a whole. What all these ships represent is a lack of advance purchases, either due to a general unavailability of credit or due to companies not expecting enough growth to justify re-expanding their global supply lines. In either case, it’s not good news for the global economy in general. Remember – 90% of all goods are shipped on vessels like this, so a 12% reduction in merchant marine shipping capacity could mean as much as an 11% reduction in overall international trade – in the last year alone.
However, this reduction in shipping is good news for global carbon emissions and reduced marine pollution. Oceanic shipping is estimated to produce between 3 and 5% of the world’s carbon emissions, so idling 12% of vessels will mean a reduction of roughly 0.36 to 0.6% of total carbon emissions this year. That represents a reduction in total emissions of at least 100 million metric tons of CO2 – more than Romania’s entire national emissions (98 million metric tons in 2006).
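The arithmetic behind those figures is straightforward; the global emissions total here is my own rough assumption (on the order of 30 billion metric tons of CO2 per year in the late 2000s), not a number from the article:

```python
# Back-of-envelope emissions savings from idling 12% of the merchant fleet.
global_co2_mt = 30000          # million metric tons CO2/yr, assumed
shipping_share = (0.03, 0.05)  # shipping's estimated share of global emissions
fleet_idled = 0.12             # fraction of merchant fleet idled

for share in shipping_share:
    saved = global_co2_mt * share * fleet_idled
    print(round(saved))        # prints 108, then 180 (million metric tons)
```

Even the low end of that range clears the 100-million-ton mark cited above.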
And, fortunately or not, depending on your particular perspective, the longer the economy stays depressed, the slower carbon emissions will rebound to previous levels, giving human civilization an opportunity to clean up its energy production technologies some in the interim.
Nearly all plastic is made from either natural gas or petroleum feedstock. Most plastic is recyclable in some way – one bottle into another, bottles into clothing, packing material into park benches. But this is simply reshaping the plastic. Now a company in Maryland has figured out how to turn plastic back into a fuel feedstock that can be blended with diesel or gasoline.
According to a Green Inc. article, the cost is about $10 per barrel, and the Maryland plant can convert one ton of plastic into between three and five barrels of oil. However, Environ estimates that nearly 50 million tons of plastic waste are created every year, so a plant that can only convert 6,000 tons per year is a drop in the proverbial bucket. Converting all of that waste back into fuel would take about 10,000 similarly sized conversion plants – or a bunch of much larger ones. And the process itself is energy intensive: producing each barrel of fuel consumes enough electricity to power 2-3 residences for a day.
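A quick sanity check of those figures, using the capacity and yield numbers as quoted above:

```python
# Scale check: how many plants would the global plastic waste stream need?
plastic_waste_tons = 50_000_000   # annual plastic waste, per Environ
plant_capacity_tons = 6_000       # one plant's annual throughput
barrels_per_ton = (3, 5)          # oil yield range quoted

plants_needed = plastic_waste_tons / plant_capacity_tons
print(round(plants_needed))       # → 8333, i.e. "about 10,000" plants

low, high = (plant_capacity_tons * b for b in barrels_per_ton)
print(low, high)                  # → 18000 30000 barrels/yr from one plant
```

For comparison, 18,000-30,000 barrels per year is a few minutes of U.S. oil consumption, which is why this isn’t a fix for oil dependence.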
So this isn’t a solution to the global plastic problem. And it certainly doesn’t help the U.S.’ oil addiction. But if it can be scaled up, then maybe it’s a step in the right direction. After all, there are more environmental problems than just climate disruption – clean water, air pollution, hazardous waste, and yes, even plastics.
According to the NYTimes Wheels blog last week, the U.S. Chamber of Commerce and the National Automobile Dealers Association have asked the EPA to review a waiver it granted to the state of California in June. The waiver allows California to regulate vehicle CO2 emissions independently and more tightly than national standards. A spokesman for the NADA, Sheldon Gilbert, was quoted in the Wheels blog as saying “That’s a fair description” when asked if the filing was a precursor to a court case.
Clearly, the EPA believes the waiver is in accordance with the Clean Air Act, as do the California Air Resources Board and the Center for Auto Safety’s Safe Climate Campaign. However, the president of Clean Air Watch, Frank O’Donnell, believes that this is just the beginning of carbon emissions lawsuits. He’s probably correct – there have already been a few climate-related lawsuits. With the courts now involved, it’s fair to say that Arctic communities will be suing energy companies, developing nations will be suing developed nations, and it’s all going to get a lot uglier before things improve. And at least one major insurer/reinsurer believes that a wave of litigation is inevitable.
There are few taboo subjects when it comes to climate disruption. Environmentalists and activists regularly discuss pollution, energy consumption, the benefits of eating local and seasonal, drinking reclaimed water, even composting human waste. But one thing that is generally considered off-limits is population growth. Given that human reproduction is considered a taboo subject by a large percentage of cultures and religions, this is perhaps unsurprising. But no discussion of humanity’s impact on climate could possibly be complete without occasionally discussing how the mere existence of more people creates climate pressure, taboo subject or not.
A new study conducted by the London School of Economics and commissioned by the British group Optimum Population Trust (OPT) found that slowing population growth via voluntary family planning and contraception is remarkably cost-effective. According to the study, it’s cheaper than all current CO2 reduction technologies except geothermal and sugar cane-derived ethanol (see the table above).
However, this conclusion has met with significant criticism from groups opposed to family planning, contraception, and the like. The San Francisco Chronicle’s blog The Mommy Files has a post on this study, and it points out that a British anti-abortion group has attacked the study as concluding “that fewer children and more abortions means a better environment.” As The Mommy Files notes, the actual OPT study says nothing about more abortions creating a better environment. Instead, the study has the following things to say about abortion:
In addition, a reduction in unintended pregnancies (and hence, population growth) is shown to help with issues of hunger, civil conflict, water shortages, unsafe abortions, deforestation and agriculture.
Better access to contraception and sexual education, especially for girls and women, are excellent ways to reduce unintended pregnancies.
A Washington Post article on the same study also pointed out that there was an Oregon State University (OSU) study that came to basically the same conclusions, but went a different direction. Instead of estimating the monetary savings/cost of reducing CO2, the OSU study estimated how much CO2 a baby born in a given country would add to the atmosphere:
In the United States, each baby results in 1,644 tons of carbon dioxide, five times more than a baby in China and 91 times more than an infant in Bangladesh, according to the Oregon State study. That is because Americans live relatively long, and live in a country whose long car commutes, coal-burning power plants and cathedral ceilings give it some of the highest per-capita emissions in the world.
Just because the estimated costs of reducing CO2 emissions via family planning are lower doesn’t mean this will be enough to keep cumulative emissions from exceeding what many scientists consider “acceptable.” Recent science suggests that warming can be kept “acceptable” (below a 2 °C global temperature rise) only if total cumulative emissions are kept below an additional 1,000 billion tons of CO2 beyond what we’ve already emitted. The OPT study found that the cumulative emissions savings from 2010 to 2050 were 34 billion metric tons. In 2050, the difference between the worst-case and best-case IPCC emissions scenarios is about 500 billion tons of CO2, with the best case staying below the 1,000-billion-ton limit and the worst case exceeding it by 250%. The 34 billion tons saved via family planning represents only 6.8% of that difference, and actual emissions are currently tracking worse than the IPCC worst-case scenario.
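Spelling out the arithmetic in the paragraph above, using the figures as quoted:

```python
# Cumulative-emissions arithmetic from the OPT and IPCC figures.
budget_gt = 1000        # additional CO2 budget for staying below 2 C, Gt
opt_savings_gt = 34     # OPT study's cumulative savings 2010-2050, Gt
scenario_gap_gt = 500   # worst-case minus best-case IPCC scenario in 2050, Gt

share = opt_savings_gt / scenario_gap_gt
print(f"{share:.1%}")   # → 6.8% of the best/worst-case gap
```

In other words, family planning closes well under a tenth of the gap between the scenarios on its own – cheap, but nowhere near sufficient.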
Reducing population will help solve so many problems beyond climate disruption that it’s difficult to argue against it from anything other than religious grounds. But as cheap as it could be, it won’t be enough. We’ll still need increased energy efficiency and more renewable energy and maybe nuclear power and to shut down coal plants wherever possible and to quickly transition away from petroleum-powered transportation.
Solving climate disruption isn’t multiple choice, it’s “all of the above”
Daily Mail/Richard Jones/Sinopix
Optimum Population Trust