
The Weekly Carboholic: Climate disruption will disrupt volcanism too



Nature News reported last week that vulcanologists have concluded that climate disruption will increase the number of volcanic eruptions. According to the article, the reason is that climate disruption is expected to reduce the amount of ice present atop volcanoes and thus reduce the amount of material keeping volcanoes from erupting.

There is also evidence that less ice means more explosive eruptions. “As thick ice is getting thinner, there may be an increase in the explosivity of eruptions,” says Hugh Tuffen from Lancaster University, UK.

As strange as this sounds, it’s well grounded in the geologic sciences. For example, a paper published in 1999 found a correlation between eruptions of Pavlof Volcano on the Alaska Peninsula and the season (see the image above). Specifically, regional weather patterns raise the local sea level slightly each November, and the added weight of that water is believed to compress the magma chamber that feeds Pavlof. As a result, small periodic eruptions at Pavlof tend to happen in November. Another paper, from 2004, found that eruptions worldwide correlate with seasonal deformation of the earth’s crust driven by the water cycle – seasonal movements of groundwater and seawater. That paper studied a much larger number of volcanoes and found that volcanoes in different regions of the world respond to different changes, but the bulk of volcanic eruptions showed some seasonal variation.

Other studies have found increases in volcanism as a direct result of climate change. A paper from 1999 found a strong correlation (less than a 0.2% probability of occurring by chance) between interglacial periods (like the one we’re in now) and increased volcanic activity in eastern California. The authors suggested a number of possible factors, one of which is changing geologic stresses as the weight of overlying ice and glacial lakes changes.

What this means is that, as the Nature News article says, we can expect that disruption of the climate will in turn drive disruptions in how volcanoes erupt. Unfortunately, there’s very little data at this point about how climate will affect volcanism, and no modeling at all – the latest climate models all model how the climate responds to volcanism, but none of them presently model how volcanism will respond to the climate.

“The IPCC [Intergovernmental Panel on Climate Change] hasn’t addressed these kinds of hazard,” [Bill McGuire from the Aon Benfield UCL Hazard Research Centre at University College London] says. “You have a better chance of coping with any kind of hazard if you know it’s happening,” he adds. “Climate change is not just the atmosphere and hydrosphere; it’s the geosphere as well.”


EPA Office of the Inspector General finds standard gases not so standard after all

Organizations that do pollutant monitoring rely on standard gases to ensure that their equipment functions properly. Each standard gas contains a specific amount of pollutant in a given volume of gas, measured in parts per million (ppm), parts per billion (ppb), or some other convenient unit for the pollutant in question. The standard gas is injected into monitoring equipment – for power plants, a continuous emission monitoring system (CEMS) – in order to calibrate the equipment to a known amount of pollutant. From that known amount, the equipment can then track how much pollutant is present in the test environment, whether more or less than the calibrated level(s). But this process only works if each standard gas actually contains very nearly the amount of pollutant it is certified to contain.

For example, if a calibration gas used by a utility was certified to contain 100 parts per million (ppm) of SO2, but only contained 96 ppm, the system operator would unknowingly calibrate the CEMS to read 96 ppm as 100 ppm. This would result in the CEMS overestimating emissions.
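The arithmetic in that example generalizes: once an instrument is calibrated against a mislabeled gas, every subsequent reading is scaled by the same factor. A minimal sketch, using only the hypothetical 100 ppm / 96 ppm numbers from the quote above:

```python
# Hypothetical numbers from the OIG example above: a continuous emission
# monitoring system (CEMS) calibrated against a gas certified at 100 ppm
# of SO2 that actually contains only 96 ppm.
certified_ppm = 100.0
actual_ppm = 96.0

# Calibration maps the instrument's response to the certified value, so
# every subsequent reading is scaled by certified/actual (about 1.042).
bias_factor = certified_ppm / actual_ppm

def reported(true_ppm):
    """Concentration the miscalibrated CEMS would report."""
    return true_ppm * bias_factor

print(reported(96.0))   # a true 96 ppm reads as 100 ppm
print(reported(200.0))  # every other reading is inflated by the same ~4.2%
```

The same factor applies at every concentration, which is why a single bad calibration gas corrupts an entire monitoring record rather than one data point.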

Last week, the EPA’s Office of the Inspector General (OIG) found that approximately 11% of the standard gases for blends of SO2, CO2, and nitric oxide (NO) that it had purchased and independently tested deviated from the stated concentration by 3% or more, when the acceptable range was within 2% of the stated amount. And in an example of the seemingly universal rule of “you get what you pay for,” all the failures came from vendors selling inexpensive standard gases, while all of the expensive gases were acceptable.

This is a severe problem because of the sheer number of programs that rely on standard gases. The OIG report points out that accurate measurements are vital for the over $5.1 billion SO2 trading market that has been responsible for a dramatic reduction in acid rain. Accurate measurements are similarly important for the $350 million NOx trading market. As for CO2, the World Bank estimated that the global carbon market was worth $64 billion in 2007. And metropolitan areas are monitored by the EPA for air quality and are fined or forced to make changes to local utilities or transportation as a result of those measurements – if the measurements are incorrect, the EPA could be giving a passing grade to cities that actually fail air quality standards, or failing cities that should pass.

The OIG’s recommendation, which the EPA office responsible for standard gases agreed with, was for the EPA to implement a quality control process, something that the EPA does not currently have in place.


Driest years in Pacific Northwest drier than expected

Climate models are always being improved with new understanding of how climate works (especially in two key areas, cloud and aerosol dynamics). But regional climate modeling is particularly difficult for two reasons: climate models are so processor intensive that they cannot yet model the Earth with high horizontal and vertical resolution, and scientists do not know all the regional changes that drive regional climate away from the global average. Put simply, scientists don’t know everything and can’t model in enough detail for accurate regional climate predictions.

Enter a new paper by two U.S. Forest Service scientists who have studied annual streamflow in the Pacific Northwest. They set out to determine whether the annual streamflow (the amount of water flowing out of a watershed in a year) of the driest years was changing differently than that of average or wet years. They used a statistical technique called “linear quantile regression” to detect any difference from the average trend detected in other studies via the more common technique of “least-squares regression.”
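The paper applied linear quantile regression to real streamflow records. As a much-simplified sketch of the underlying idea (synthetic data, and a fit through the driest tercile standing in for a low-quantile regression – neither is the authors’ method or data), compare the mean trend with the dry-year trend:

```python
import numpy as np

# Deterministic toy "streamflow" record (an illustration, not the
# authors' data): wet years hold steady, normal years decline slowly,
# and dry years decline fast.
t = np.arange(60)  # years since 1949
flow = np.empty(t.size)
flow[t % 3 == 0] = 120.0                        # wet years: no trend
flow[t % 3 == 1] = 100.0 - 0.1 * t[t % 3 == 1]  # normal: slow decline
flow[t % 3 == 2] = 80.0 - 0.5 * t[t % 3 == 2]   # dry: fast decline

# A least-squares fit over all years sees only a blended average trend...
mean_slope = np.polyfit(t, flow, 1)[0]

# ...while a fit restricted to the driest tercile (a crude stand-in for
# quantile regression at a low quantile) recovers the steeper decline.
dry = t % 3 == 2
dry_slope = np.polyfit(t[dry], flow[dry], 1)[0]

print(round(mean_slope, 2), round(dry_slope, 2))  # dry-year slope is steeper
```

This is the essence of the result: an ordinary least-squares trend averages across wet, normal, and dry years and can hide a sharp decline that is concentrated in the dry tail of the distribution.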

The scientists found a greater reduction in annual streamflow in the driest years than the mean trend had shown. Climate models had previously predicted little overall change in annual streamflow, because the same amount of water would flow through watersheds, just at different times of the year. Instead, this study found that, while that held for average and wet years, dry years were significantly drier (in the statistical sense) as a result of changes in climate since 1949.

As a result, the authors expect that changes in water management throughout the Pacific Northwest may be necessary. The design of water storage reservoirs may need to change in order to hold multiple years’ worth of water. Reduced annual streamflow will have a significant impact on aquatic life in streams that run much lower during dry years than they have in the past, and reduced streamflow will act as a positive feedback with increased air temperature, raising stream temperatures and possibly causing even greater reductions in fish populations. And while overall drying across multiple years already stresses forests, individual dry years can kill off large swaths of forest, lead to more forest fires, and slow the growth of surviving trees.

One of the more interesting points the authors made was that the dry years appear to be tightly correlated with El Niño/Southern Oscillation (ENSO) variation from year-to-year and with a yearly trend, but the correlation was significantly weaker when they included the cooling trend in the Pacific Decadal Oscillation. While the authors take pains to point out that this doesn’t conclusively say that the PDO isn’t affecting dry year annual streamflow, they do “favor” a model that doesn’t include the PDO as a driver of the annual streamflow. And they call for more analyses to better identify the causes of the observed dry year changes.

“More sophisticated analyses considering other indices, temporal lags, and temporal autocorrelation of indices would likely elucidate more information and provide greater certainty, but this rough analysis presents interesting insights.”

I couldn’t agree more.

Thanks to the paper’s primary author, Dr. Luce, for providing a review copy of his paper.


Northeast Passage opened this year for commercial shipping

Historically, Russia’s Arctic coast has been too iced-in for commercial vessels, most of which are designed for hauling their cargo in ice-free waters. But this year, according to a NYTimes article, two German vessels, the Beluga Fraternity and Beluga Foresight, steamed north from South Korea and transited the Northeast Passage, also known as the Northern Sea Route. This route was largely ice free this year, and the two ice-hardened specialty cargo vessels took advantage of the clear waters to cut thousands of miles off the southern route via the Suez Canal. According to the article, while the Beluga vessels were escorted by at least one nuclear powered Russian icebreaker the entire time, the icebreakers were not needed this year.

For the moment, the article points out that the Northeast Passage isn’t expected to be open regularly enough for large just-in-time (JIT) shippers like Maersk to use – schedule accuracy is more important to JIT shippers than fuel savings. But if the Arctic sea ice continues to thin and open up the shipping channel during late August and early September, then specialty shippers like Beluga could start making use of the shorter route in order to get their cargo to its destination cheaper and faster.


12% of the merchant marine fleet is idled

According to an investigative report in the UK’s Daily Mail, there is a massive fleet of idled shipping vessels anchored off the coast of Singapore and southern Malaysia. These ships, and others taken out of service around the world, represent 12% of the entire global merchant marine fleet, sitting idle. And yet shipyards are continuing to build enough new cargo vessels to increase the total number of vessels by 12% next year.

But according to the report, there are no new ship orders for after 2011, and shipping experts expect that the number of vessels idled by the recession will rise to 25% of the merchant marine fleet in the next two years.

Does this mean that the government’s claims that the recession is over are false? Maybe. I’ve read some discussion that the recent small recovery is a result of companies having to rebuild some inventory after having finally sold off the stock they had piled up before the start of the recession. But that’s a one-time event, and restocking inventory isn’t going to do much for the economy as a whole. What all these ships represent is a lack of advance purchases, either due to a general unavailability of credit or due to companies not expecting enough growth to justify re-expanding their global supply lines. In either case, it’s not good news for the global economy. Remember – 90% of all goods are shipped on vessels like these, so a 12% reduction in merchant marine shipping capacity could mean as much as an 11% reduction in overall international trade – in a single year.

However, this reduction in shipping is good news for global carbon emissions and marine pollution. Oceanic shipping is estimated to produce between 3 and 5% of the world’s carbon emissions, so a 12% reduction in vessels means a reduction of roughly 0.36 to 0.6% of total carbon emissions this year. That represents a cut of at least 100 million metric tons of CO2 – more than Romania’s entire national emissions (98 million metric tons in 2006).
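The arithmetic behind those figures works out as follows; the 3-5% shipping share and the 12% idled fraction come from the article, while the ~30,000 million metric tons of annual global CO2 emissions is an assumption on my part (a round number for the late 2000s):

```python
# Back-of-envelope check of the shipping-emissions figures. The 3-5%
# shipping share and the 12% idled fleet are from the article; the
# ~30,000 Mt/year global CO2 total is an assumption on my part.
global_emissions_mt = 30_000  # million metric tons of CO2 per year (assumed)
idled_fraction = 0.12

savings = {}
for share in (0.03, 0.05):
    savings[share] = {
        "pct_of_total": share * idled_fraction * 100,
        "mt_co2": global_emissions_mt * share * idled_fraction,
    }
    print(f"{share:.0%} share -> {savings[share]['pct_of_total']:.2f}% "
          f"of total emissions, ~{savings[share]['mt_co2']:.0f} Mt CO2")

# Even the low end (~108 Mt) exceeds Romania's 98 Mt (2006) total.
```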

And, fortunately or not, depending on your particular perspective, the longer the economy stays depressed, the slower carbon emissions will rebound to previous levels, giving human civilization an opportunity to clean up its energy production technologies some in the interim.


Recycling used plastic into fuel

Nearly all plastic is made from either natural gas or petroleum feedstock. Most plastic is recyclable in some way, either by turning one bottle into another, or by turning bottles into clothing or by turning packing material into park benches. But this is simply reshaping the plastic. Now a company in Maryland has figured out how to turn plastic back into a fuel feedstock that can be blended with diesel or gasoline.

According to a Green Inc. article, the cost is about $10 per barrel, and the process converts one ton of plastic into between three and five barrels of fuel; the Maryland plant can handle about 6,000 tons of plastic per year. However, Environ estimates that nearly 50 million tons of plastic waste are created every year, so a plant that can convert only 6,000 tons per year is a drop in the proverbial bucket. Converting all of that waste back into fuel would take about 10,000 similarly sized conversion plants, or a smaller number of much larger ones. And the process itself is energy intensive – each barrel of fuel represents enough electricity to power two to three residences for a day.
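A quick scale check of those numbers (all figures are as quoted from the article):

```python
# Scale check for the plastic-to-fuel numbers quoted above.
annual_plastic_waste_tons = 50_000_000  # Environ estimate
plant_capacity_tons = 6_000             # the Maryland plant, per year
barrels_per_ton = (3, 5)                # reported conversion yield

plants_needed = annual_plastic_waste_tons / plant_capacity_tons
fuel_low = annual_plastic_waste_tons * barrels_per_ton[0]
fuel_high = annual_plastic_waste_tons * barrels_per_ton[1]

# ~8,333 plants -- the article rounds this to "about 10,000"
print(f"{plants_needed:,.0f} plants of this size")
print(f"{fuel_low:,} to {fuel_high:,} barrels of fuel per year")
```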

So this isn’t a solution to the global plastic problem. And it certainly doesn’t help the U.S.’ oil addiction. But if it can be scaled up, then maybe it’s a step in the right direction. After all, there are more environmental problems than just climate disruption – clean water, air pollution, hazardous waste, and yes, even plastics.


US Chamber of Commerce and car dealer industry group fight California emissions waiver

According to the NYTimes Wheels blog last week, the U.S. Chamber of Commerce and the National Automobile Dealers Association (NADA) have asked the EPA to review a waiver it granted to the state of California in June. The waiver allows California to regulate vehicle CO2 emissions independently and more tightly than national standards. A spokesman for the NADA, Sheldon Gilbert, was quoted in the Wheels blog as saying “That’s a fair description” when asked if the filing was a precursor to a court case.

The EPA clearly believes the waiver is in accordance with the Clean Air Act, as do the California Air Resources Board and the Center for Auto Safety’s Safe Climate Campaign. However, the president of Clean Air Watch, Frank O’Donnell, believes that this is just the beginning of carbon emissions lawsuits. He’s probably correct – there have already been a few climate-related lawsuits. But with the courts now involved, it’s fair to say that Arctic communities will be suing energy companies, developing nations will be suing developed nations, and it’s all going to get a lot uglier before things improve. And at least one major insurer/reinsurer believes that a wave of litigation is inevitable.


Slowing population growth more effective than renewables at slowing GHG emissions

There are few taboo subjects when it comes to climate disruption. Environmentalists and activists regularly discuss pollution, energy consumption, the benefits of eating local and seasonal, drinking reclaimed water, even composting human waste. But one thing that is generally considered off-limits is population growth. Given that human reproduction is considered a taboo subject by a large percentage of cultures and religions, this is perhaps unsurprising. But no discussion of humanity’s impact on climate could possibly be complete without occasionally discussing how the mere existence of more people creates climate pressure, taboo subject or not.

A new study conducted by the London School of Economics and commissioned by the British group Optimum Population Trust (OPT) found that reducing the number of people on the planet via voluntary family planning and contraception was remarkably cost-effective. According to the study, it’s cheaper than all current CO2 reduction technologies except geothermal power and sugar cane-derived ethanol (see the table above).

However, this conclusion has met with significant criticism from groups opposed to family planning, contraception, and the like. The San Francisco Chronicle’s blog The Mommy Files has a post on this study, and it points out that a British anti-abortion group has attacked the study as concluding “that fewer children and more abortions means a better environment.” As the Mommy Files points out, the actual OPT study says nothing about more abortions creating a better environment. Instead, the study has the following things to say about abortion:

In addition, a reduction in unintended pregnancies (and hence, population growth) is shown to help with issues of hunger, civil conflict, water shortages, unsafe abortions, deforestation and agriculture.

Better access to contraception and sexual education, especially for girls and women, are excellent ways to reduce unintended pregnancies.

A Washington Post article on the same study also pointed out that an Oregon State University (OSU) study came to basically the same conclusions, but approached them from a different direction. Instead of estimating the monetary savings/cost of reducing CO2, the OSU study estimated how much CO2 a baby born in a given country would add to the atmosphere:

In the United States, each baby results in 1,644 tons of carbon dioxide, five times more than a baby in China and 91 times more than an infant in Bangladesh, according to the Oregon State study. That is because Americans live relatively long, and live in a country whose long car commutes, coal-burning power plants and cathedral ceilings give it some of the highest per-capita emissions in the world.

Just because the estimated costs of reducing CO2 emissions via family planning are lower doesn’t mean this will be enough to keep cumulative emissions from exceeding what many scientists consider “acceptable.” Recent science suggests that warming can be kept “acceptable” (below a 2 °C global temperature rise) only if total cumulative emissions are kept under an additional 1,000 billion tons of CO2 beyond what we’ve already emitted. The OPT study found that the cumulative emissions savings from 2010 to 2050 would be 34 billion metric tons. In 2050, the difference between the worst-case and best-case IPCC emissions scenarios is about 500 billion tons of CO2, with the best case staying below the 1,000-billion-ton limit and the worst case exceeding it by 250%. The 34 billion tons saved via family planning represents only 6.8% of that difference, and actual emissions are currently tracking worse than the IPCC worst-case scenario.
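The 6.8% figure is simple division of the two numbers above:

```python
# The family-planning savings as a share of the 2050 scenario spread.
# Both figures come from the studies quoted above.
savings_gt = 34           # billion metric tons CO2 saved, 2010-2050 (OPT)
scenario_spread_gt = 500  # best- vs. worst-case IPCC gap in 2050

share = savings_gt / scenario_spread_gt
print(f"{share:.1%} of the scenario spread")  # 6.8% of the scenario spread
```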

Reducing population will help solve so many problems beyond climate disruption that it’s difficult to argue against it from anything other than religious grounds. But as cheap as it could be, it won’t be enough. We’ll still need increased energy efficiency and more renewable energy and maybe nuclear power and to shut down coal plants wherever possible and to quickly transition away from petroleum-powered transportation.

Solving climate disruption isn’t multiple choice; it’s “all of the above.”
Image credits:
Daily Mail/Richard Jones/Sinopix
Optimum Population Trust

27 replies »

  1. While all of this is fascinating..a recitation of symptoms still doesn’t prove that CO2 can change climate. Just because things happen is not proof that human activities cause it.

    “State by State, Selling the Lie

    By Joseph D’Aleo, Fellow of the American Meteorological Society

    As part of a well thought out and executed plan to convince the public there is global warming despite the cold and snow records of the last two years, get state climate action plans approved, keep the grant gravy train rolling through the university systems, and get government legislation or carbon control legislation approved that will benefit Wall Street and the government at our expense is underway.

    Detailed well produced reports are being dribbled out state by state warning of a ridiculously warm and severe climate future. They are based on the same climate models which have failed miserably in the first decade showing strong warming while the globe cooled, sea levels accelerating up while they have stopped rising and heat records increasing in frequency while we have had fewer heat records in any decade since the 1800s, and disappearing snow while all time snow records occurred in the last two years. But don’t confuse the issue with facts. These reports are timed to affect the decisions made by congress w/r to Cap-and Tax. ”
    read the rest

    • Just as denying the existence of all the science I’ve been reporting over 2 years doesn’t make it go away, Judy.

      And quoting D’Aleo doesn’t help your cause either. Of course, you don’t know enough math, science, or logic to understand that he’s a flat-out liar, so you don’t even realize that quoting him destroys any credibility you think you have.

  2. That is a pathetic response, Brian. Just calling somebody a liar isn’t quite scientific…is it?
    Pretending that CO2 causes climate to change is a fraud, a scam and you are complicit.

    As for whose word I would take on these issues…D’Aleo’s over yours anytime.

    Joseph D’Aleo, Executive Director, Certified Consultant Meteorologist

    Joseph D’Aleo was the first Director of Meteorology at the cable TV Weather Channel. He has over 30 years experience in professional meteorology. Mr. D’Aleo was Chief Meteorologist at Weather Services International Corporation and Senior Editor of “Dr. Dewpoint” for WSI’s popular web site. He is a former college professor of Meteorology at Lyndon State College. He has authored and presented a number of papers as well as published a book focused on advanced applications enabled by new technologies and how research into ENSO and other atmospheric and oceanic phenomena has made skillful seasonal forecasts possible. Mr. D’Aleo has also authored many articles and made numerous presentations on the roles cycles in the sun and oceans have played in climate change.

    Mr. D’Aleo is a Certified Consultant Meteorologist and was elected a Fellow of the American Meteorological Society (AMS). He has served as a member and then chairman of the American Meteorological Society’s Committee on Weather Analysis and Forecasting, and has co-chaired national conferences for both the American Meteorological Society and the National Weather Association. Mr. D’Aleo was elected a Councilor for the AMS.

    Joseph D’Aleo is a graduate of the University of Wisconsin BS, MS and was in the doctoral program at NYU.

    Mr. D’Aleo’s areas of expertise include climatology, natural factors involved in climate change, weather and climate prediction, and North Atlantic Oscillation (NAO).

    • Judy, I’ve debunked D’Aleo and illustrated that he’s a liar several times – every time you’ve presented me with a graph that he ginned up or trumpeted, in fact. And in this debate, scientific data and fact beat a degree every time. It’s really too bad that you haven’t realized that yet.

      Oh, by the way – you’re committing a logical fallacy called “appeal to authority.” Since you can’t win on the science and data, you appeal to the fact that D’Aleo is a meteorologist and therefore supposedly knows more than I do. Unfortunately for you, he’s been shown to be a liar by myself and many, many, many others repeatedly, and so his “authority” is nonexistent.

      I heard it put another way by a climatologist at one point – “Climatologists can’t predict the weather on a day to day basis. What makes [meteorologists] think that they can predict the climate?”

  3. You guys are tooooo funny! You pretend that name calling can change reality.

    “In a speech last week at the U.N.’s World Climate Conference in Geneva, Professor Mojib Latif of Germany’s Leibniz Institute of Marine Sciences at Kiel University, one of the world’s foremost climate modelers and a lead author for the United Nations Intergovernmental Panel on Climate Change acknowledged that the Earth has been cooling and is likely to continue that trend for the next couple of decades. Al Gore, call your office.

    Latif has been looking into the influence of cyclical changes to ocean currents and temperatures in the Atlantic, a feature known as the North American Oscillation. When he factored these natural fluctuations into his global climate model, Professor Latif found the results brought the allegedly endless rise in global temperatures to a screeching halt.

    Latif conceded the planet has not warmed for nearly a decade and that we are likely entering “one or even two decades during which temperatures cool.” Latif still believes in a warming trend and thinks it will resume. But he at least acknowledges the empirical evidence of cooling, that there are factors at work here other than your SUV, and that doom will not occur the day after tomorrow.

    None of the alarmists and their supercomputer climate models ever predicted even a 30-year respite in their apocalyptic scenarios. Neither did they predict the sun, that thermonuclear furnace in the sky that has more influence on earth’s climate than any number of Ford Explorers, would suddenly go quiet for an indefinite period.

    Charles Perry, a research hydrologist with the U.S. Geological Survey in Lawrence, Kan., says there’s a growing sense in the scientific community that the earth may be entering into a “grand minimum” — an extended period with low numbers of sunspots that results in cooler planetary temperatures.

    In July through August of this year, 51 consecutive days passed without a sunspot, one day short of the record. As of Sept. 15, the current solar minimum — with 717 spotless days since 2004 — ranks as the third longest on record.

    Perry cites data indicating that global temperature fluctuations correspond to a statistically significant degree with the length of the sunspot cycle and variations in solar activity. 1816, the “year without a summer,” was during an 1800 to 1830 grand minimum when Europe became significantly cooler.

    Latif and others conclude that, at the very least, we have time to think about it and analyze and learn. We don’t have to fight global warming by inflicting global poverty. More things on Earth affect climate than are dreamed up in computer models.”

    • There are certainly some questions, and maybe the climate will cool down. I, for one, would be thrilled if the climatologists turned out to be wrong this time. It’s not particularly likely, but it would be better for humanity if they were. That said, however, there’s a significant amount of disagreement about what’s going to happen. The Hadley Center (UK Met) anticipates that the temperature of the planet will heat up again this year or next as the sun comes out of its recent minimum. So now we have another model that predicts the temperature will cool down. OK. Given that the Hadley Center model was the only one to correctly model the effects of clouds and heating on part of the Pacific, I’m sure you’ll understand why I’m partial to the results of the Hadley model.

      I’ve read other news sources (as opposed to editorial sources like the one you cut-and-pasted), like this one from the UK Daily Mail, and they have a quote that smacks your editorial around a little:

      However Dr Latif still believes that carbon dioxide released from the burning of fossil fuels stored underground for millions of years will still warm the planet in the longer term.

      And this quote from a New Scientist article:

      “People will say this is global warming disappearing,” he told more than 1500 of the world’s top climate scientists gathering in Geneva at the UN’s World Climate Conference.

      “I am not one of the sceptics,” insisted Mojib Latif of the Leibniz Institute of Marine Sciences at Kiel University, Germany. “However, we have to ask the nasty questions ourselves or other people will do it.”

      Regardless, though, I find it wholly unsurprising that you, who have touted David Evans’ disdain for climate models in general, are touting the results of a climate model that may agree with your own views. If all the climate models are bullshit, Judy, how is it that this model is suddenly, magically correct?

      Your rank hypocrisy is showing, Judy.

  4. God, you are so funny.

    The disinformationalist trick of accusing the opponents of hypocricy when warmists get hoist on their own petards is manifest above.

    The point of the article and Latif’s speech is that the models were wrong….and it is cooling…..(in spite of CO2 levels.)

    You don’t have a leg to stand on anymore…so you try shooting the messenger from the floor.


  5. It’s simple logic, Judy, something at which you’re remarkably inept. If the models were all wrong before, then there’s no reason to assume that this one model suddenly “got it right.” The only thing that changed is that Latif’s new model agrees with you – sort of.

    And so yes, touting it is hypocrisy.

    As for your so-called cooling, I’ve addressed that as well with you more times than I care to count. And the result is that your claims don’t hold statistical water. The last time I ran the numbers, the “trend” that I looked at was so far within the noise that it would take at least 70 years for it to exceed one standard deviation (the point at which the trend is 63% likely to not be reproduced by random data, and the absolute lowest level of significance most scientists and statisticians accept). And the trend was one of D’Aleo’s 7-year trends. Oh, and it suffered from the traditional endpoint problem – it started in an El Niño year and ended in a La Niña year, artificially lowering any trend.

    Facts, Judy. Data. Come back when you actually have some.

  6. Brian…you are doing it again.

    The climate modeler admitted the models were wrong…the rest of your fiddle is faddle.

  7. The tactics are always the same.

    It’s cooling Brian. The models didn’t predict it because they gave too great a value to CO2. It is now admitted even by Real Climate that there probably won’t be any more warming until at least 2020…and all your denial and name calling can’t change that.

    The Dog Ate Global Warming
    Interpreting climate data can be hard enough. What if some key data have been fiddled?

    By Patrick J. Michaels

    Imagine if there were no reliable records of global surface temperature. Raucous policy debates such as cap-and-trade would have no scientific basis, Al Gore would at this point be little more than a historical footnote, and President Obama would not be spending this U.N. session talking up a (likely unattainable) international climate deal in Copenhagen in December.

    Steel yourself for the new reality, because the data needed to verify the gloom-and-doom warming forecasts have disappeared.

    Or so it seems. Apparently, they were either lost or purged from some discarded computer. Only a very few people know what really happened, and they aren’t talking much. And what little they are saying makes no sense.

    In the early 1980s, with funding from the U.S. Department of Energy, scientists at the United Kingdom’s University of East Anglia established the Climate Research Unit (CRU) to produce the world’s first comprehensive history of surface temperature. It’s known in the trade as the “Jones and Wigley” record for its authors, Phil Jones and Tom Wigley, and it served as the primary reference standard for the U.N. Intergovernmental Panel on Climate Change (IPCC) until 2007. It was this record that prompted the IPCC to claim a “discernible human influence on global climate.”

    Putting together such a record isn’t at all easy. Weather stations weren’t really designed to monitor global climate. Long-standing ones were usually established at points of commerce, which tend to grow into cities that induce spurious warming trends in their records. Trees grow up around thermometers and lower the afternoon temperature. Further, as documented by the University of Colorado’s Roger Pielke Sr., many of the stations themselves are placed in locations, such as in parking lots or near heat vents, where artificially high temperatures are bound to be recorded.

    So the weather data that go into the historical climate records that are required to verify models of global warming aren’t the original records at all. Jones and Wigley, however, weren’t specific about what was done to which station in order to produce their record, which, according to the IPCC, showed a warming of 0.6° +/– 0.2°C in the 20th century.

    Now begins the fun. Warwick Hughes, an Australian scientist, wondered where that “+/–” came from, so he politely wrote Phil Jones in early 2005, asking for the original data. Jones’s response to a fellow scientist attempting to replicate his work was, “We have 25 years or so invested in the work. Why should I make the data available to you, when your aim is to try and find something wrong with it?”

    Reread that statement, for it is breathtaking in its anti-scientific thrust. In fact, the entire purpose of replication is to “try and find something wrong.” The ultimate objective of science is to do things so well that, indeed, nothing is wrong.

    Then the story changed. In June 2009, Georgia Tech’s Peter Webster told Canadian researcher Stephen McIntyre that he had requested raw data, and Jones freely gave it to him. So McIntyre promptly filed a Freedom of Information Act request for the same data. Despite having been invited by the National Academy of Sciences to present his analyses of millennial temperatures, McIntyre was told that he couldn’t have the data because he wasn’t an “academic.” So his colleague Ross McKitrick, an economist at the University of Guelph, asked for the data. He was turned down, too.

    Faced with a growing number of such requests, Jones refused them all, saying that there were “confidentiality” agreements regarding the data between CRU and nations that supplied the data. McIntyre’s blog readers then requested those agreements, country by country, but only a handful turned out to exist, mainly from Third World countries and written in very vague language.

    It’s worth noting that McKitrick and I had published papers demonstrating that the quality of land-based records is so poor that the warming trend estimated since 1979 (the first year for which we could compare those records to independent data from satellites) may have been overestimated by 50 percent. Webster, who received the CRU data, published studies linking changes in hurricane patterns to warming (while others have found otherwise).

    Enter the dog that ate global warming.


    • Statistics isn’t a “tactic,” Judy. It’s the mathematical tool by which correlation and causation are determined. Given that you’re willing to let others speak for you and have no interest in learning the science, it’s not a surprise that you’re so easily led astray on this stuff.

      You claim that “[t]he models didn’t predict it because they gave too great a value to CO2.” This is factually inaccurate, although you can be forgiven for not understanding why it’s wrong – most people don’t get this particular nuance of climate modeling any better than you do. So here’s a quick education.

      Climate models are run over and over and over in order to determine what general trends show up. Then those runs are averaged together in order to remove short-term variations (i.e., “weather”), because those variations aren’t what the climatologists care about, and a single run of any model is statistically useless (it has an indeterminate amount of error) for making projections. In the process, though, the short-term variations that can “predict” periods of flat or cooling climate are intentionally removed in favor of more statistically robust conclusions.
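      That averaging step can be sketched with synthetic data (NumPy assumed; the 20 “runs” below are just a linear warming trend plus random noise, standing in for real model output):

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(100)                 # 100 years of simulated climate
trend = 0.02 * years                   # underlying 2 degC/century warming signal
# 20 "model runs": the same forced trend, different random short-term variability
runs = trend + rng.normal(0.0, 0.3, size=(20, years.size))

ensemble_mean = runs.mean(axis=0)      # averaging suppresses the "weather"

# A single run can show flat or cooling decade-long stretches...
single_decadal_steps = np.diff(runs[0][::10])
print("single-run decadal steps:", np.round(single_decadal_steps, 2))
# ...but the ensemble mean tracks the forced trend much more closely.
print("ensemble-mean total change:", round(ensemble_mean[-1] - ensemble_mean[0], 2))
```

      The point of the sketch: the ensemble mean sits much closer to the underlying trend than any individual run does, which is exactly why averaged model output cannot show the short cooling spells that individual runs (and the real climate) do.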

      I’ve pointed this study out to you before, and the fact that you’re making this BS argument tells me that you still haven’t read it, but here it is – again: Paper shows cooler periods entirely expected

      It’s true that there has been no statistically significant warming since 1998. But this isn’t a statistically meaningful result. There was an unusually strong El Niño in 1998 that made that year unusually hot, while in 2008, and thus far through 2009, there have been La Niña conditions that have made the last year or so colder. As a result, a statistical least-squares fit of a line to the data would be expected to show little warming for the last decade. But as the paper’s authors point out, if you choose 1999 as your starting point, all of a sudden there’s statistically valid warming again.
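      The endpoint problem is easy to demonstrate with a toy series (NumPy assumed; the 1998 spike and all the numbers here are stylized for illustration, not real temperature data):

```python
import numpy as np

years = np.arange(1990, 2010)
temps = 0.02 * (years - 1990)          # steady 0.2 degC/decade underlying warming
temps[years == 1998] += 0.25           # stylized strong El Nino spike in 1998

def decadal_trend(start):
    """Least-squares slope (degC/decade) fit from `start` through 2009."""
    mask = years >= start
    return np.polyfit(years[mask], temps[mask], 1)[0] * 10

print(decadal_trend(1998))  # starting at the spike year cuts the fitted trend roughly in half
print(decadal_trend(1999))  # starting one year later recovers the full 0.2 degC/decade
```

      Nothing about the underlying warming changed between the two fits; only the choice of starting year did.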

      The problem, says the paper, is that natural variability is added atop the carbon dioxide (CO2) driven warming signal. And when that natural variability is greater on short time scales than the CO2 signal, then it’s all but inevitable that there will be decades where warming appears to stop only to start up again later. This is illustrated in the figure [below] for two periods in a single model run.

      It’s a misconception that the Earth’s climate will warm perpetually without any variability – the real climate doesn’t work like that. Instead, the misconception is probably a result of poor communication about the differences between model averages (which intentionally smooth out the variability seen in the image above in order to discover underlying trends) and the results of a single model run. The Earth’s actual climate response is going to appear much closer to a single model run, with all the attendant peaks and valleys, than to any multiple model average.

      The authors also analyzed the statistical significance of positive vs. negative trends in the models and the measured data and found that even when negative trends existed in the model results, 0.0% of those negative trends were statistically significant at the 95% confidence level. On the other hand, 26.0% of positive trends were statistically significant at that level. This means that even when the models show negative trends, the trends are so small as to be effectively meaningless.

      Here’s the image, BTW:

      And then there’s this excerpt from a publication by Met Office scientists in the Bulletin of the American Meteorological Society, which says in part:

      Near-zero and even negative trends are common for intervals of a decade or less in the simulations, due to the model’s internal climate variability. The simulations rule out (at the 95% level) zero trends for intervals of 15 yr or more, suggesting that an observed absence of warming of this duration is needed to create a discrepancy with the expected present-day warming rate.

      The 10 model simulations (a total of 700 years of simulation) possess 17 nonoverlapping decades with trends in ENSO-adjusted global mean temperature within the uncertainty range of the observed 1999–2008 trend (−0.05° to 0.05°C per decade). Over most of the globe, local surface temperature trends for 1999–2008 are statistically consistent with those in the 17 simulated decades. (emphasis added)

      In other words, there were 17 periods across 10 different model runs that were statistically indistinguishable from the real observed global temperature trend of the last decade.

      So much for your “the models don’t predict it” canard.

      I decided to take a half-step back on my calling D’Aleo a liar, though – he’s either a liar, or he’s so inept at statistics that he doesn’t understand that he’s misrepresenting the data. Either way, though, he’s not someone you should be looking up to as an authority on climate disruption.

  8. As usual, saying I wrote something I didn’t. I was referring not to your statistics, but to your misrepresentation of fact.

    There is a disconnect you keep dancing around…the fact that there is no longer a correlation between CO2 and temperature, and then there is this:

    “IPCC’s own table refutes their claim that humans are prime source of increases in CO2

    23 Sep 09 – “The IPCC is so messed up that even their own publications refute their conjecture,” says this article by Norm Kalmanovitch, P. Geoph.

    “Throughout the 1990’s global CO2 emissions from fossil fuels were increasing on average by about 500 million metric tonnes of CO2 per year. This table from the IPCC 2001 Third Assessment Report shows the annual increase in atmospheric CO2 over this time period to be 11,700 million metric tonnes of CO2 per year. REALITY CHECK.pdf

    “This clearly demonstrates that only 500 of the 11,700 million metric tones of annual increase in atmospheric CO2 was from fossil fuels.

    “This means that 95.73% of the increase was naturally sourced.

    This also means that “the increase in atmospheric CO2 concentration, regardless of source, is not causing an increase in global temperature as demonstrated by the past eight years of cooling with steadily increasing CO2 concentration and the ever increasing CO2 emissions from fossil fuels.”

    In other words, the IPCC’s own table refutes their claim that humans are the prime source of increases in CO2.

    • Judy, you claim that there’s no longer a correlation. The two papers I refer you to, and the image above that you’re conveniently ignoring, illustrate that there will be many decades when it appears that there is no correlation between temperature and CO2 due to the effects of other factors. But in both papers, the models show that warming resumes after a period of statistically inconsequential warming or statistically inconsequential cooling.

      As far as that .pdf, there are so many problems with it that I almost don’t know where to start.

      First, the table comes from an Energy Information Administration report that uses the IPCC Third Assessment Report from 2001 as its source. That doesn’t mean the table itself is from the IPCC TAR. Without a table number from the TAR itself, the table in the .pdf could be from nearly anywhere.

      Second, even if we do assume that the table is taken verbatim from the TAR, it’s still from the Third Assessment Report, which was published in 2001. We now have the Fourth Assessment Report (AR4), which supersedes the TAR in all respects (even when the AR4 says that its conclusions agree with those of the TAR).

      Third, the .pdf fails entirely on its own bad math, even without potential data problems. It claims that “[t]hroughout the 1990’s global CO2 emissions from fossil fuels were increasing on average by about 500 million metric tonnes of CO2 per year” and that “[t]his clearly demonstrates that only 500 of the 11,700 million metric tonnes of annual increase in atmospheric CO2 was from fossil fuels.” But the table is titled “Global Natural and Anthropogenic Sources and Absorption of Greenhouse Gases in the 1990s” (emphasis mine). So if the table is meant to represent the entire decade from 1990-1999, then the .pdf fails to multiply its “500 million metric tonnes of CO2 per year” by 10 years, for a total of 5,000 million metric tonnes. That’s 42.7% of the total, not the 4.27% that the .pdf claims.
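      The unit mismatch is pure arithmetic, and easy to verify (all figures in million metric tonnes of CO2, taken from the paragraphs above):

```python
# Figures from the discussion above, in million metric tonnes of CO2.
claimed_annual_fossil = 500        # the .pdf's per-year fossil-fuel figure
atmospheric_increase = 11_700      # the increase cited from the table
years_in_decade = 10

# The .pdf compares a per-year number against what the table's title says
# is a decadal figure. Matching the time periods changes the answer by a
# factor of ten:
pdf_share = claimed_annual_fossil / atmospheric_increase * 100
corrected_share = claimed_annual_fossil * years_in_decade / atmospheric_increase * 100

print(f"{pdf_share:.2f}%")         # the .pdf's figure, about 4.27%
print(f"{corrected_share:.1f}%")   # after matching the units, about 42.7%
```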

      Fourth, the .pdf fails to get the data right too. According to the EIA’s estimates of global CO2 emissions from flaring and consumption of fossil fuels (available in Excel format here), the total emissions for 1990 was actually 21,683 million metric tons, not the 500 that the .pdf claims. And the combined emissions for the entire decade is 223,477 million metric tons, or an average of 22,348 million metric tons per year. This number is so close to the number in the table that the table very likely represents the average for the 1990s as it was estimated earlier this decade.

      And given those four problems, all the conclusions reached after the wrong math are also wrong.

      So, Judy, if this little .pdf is one of the reasons you believe that CO2 can’t affect climate, then this part of your reasoning is based on wrong information.

      Do you have any more easily demolished dross like this?

  9. When the facts don’t fit the paradigm, pretend facts don’t count …or pretend they are wrong.
    How long a time would you like to get the “appear”-ance of correlation back, Brian?

    “Late Carboniferous to Early Permian time (315 mya — 270 mya) is the only time period in the last 600 million years when both atmospheric CO2 and temperatures were as low as they are today (Quaternary Period ).

    Temperature after C.R. Scotese
    CO2 after R.A. Berner, 2001 (GEOCARB III)

    “There has historically been much more CO2 in our atmosphere than exists today. For example, during the Jurassic Period (200 mya), average CO2 concentrations were about 1800 ppm or about 4.7 times higher than today. The highest concentrations of CO2 during all of the Paleozoic Era occurred during the Cambrian Period, nearly 7000 ppm — about 18 times higher than today.”

    The Carboniferous Period and the Ordovician Period were the only geological periods during the Paleozoic Era when global temperatures were as low as they are today. To the consternation of global warming proponents, the Late Ordovician Period was also an Ice Age while at the same time CO2 concentrations then were nearly 12 times higher than today– 4400 ppm. According to greenhouse theory, Earth should have been exceedingly hot. Instead, global temperatures were no warmer than today. Clearly, other factors besides atmospheric carbon influence earth temperatures and global warming.”

    Take a look at the graph of the history of temperature in relation to CO2… there is little in the way of correlation over history.

    But then, you know that.

    When the scam is acknowledged, I wonder if any of you climate shills will be charged with criminal conspiracy to defraud.

    • You’re not even trying, Judy. Provide me with something that’s actually a challenge sometime.

      I do find your lack of acknowledgment that basic mathematics and easily checked references disproves your last post to be revealing.

      As for this one, I’ll simply refer you to my debunking of this exact myth (Myth #6), Judy – it describes in far greater detail why that entire passage doesn’t matter. Furthermore, I’ve debunked the logical fallacy you’re committing at Myth #21, namely “CO2 and the Earth’s temperature have been uncorrelated before, so they must be uncorrelated this time.”

      [I]f you push your foot down on the gas 10 times and the car accelerates each time, you can realistically expect that the next time you put your foot down on the gas the car will accelerate again. That’s science. But what if you put your foot on the gas and nothing happens? Assuming that that car would have accelerated was a reasonable scientific expectation, but once the car didn’t accelerate, you have to leave your prediction behind and actually figure out what’s busted in the car. Or, to put this analogy in terms that more directly parallel global heating, what if you’re sitting in the car and it accelerates even before you put your foot on the gas? Did you accidentally put your foot on the gas and not realize it? Or has something broken under the hood that you’ll [need to] fix before you can slow down again?

  10. Where would you be without your favorite phrase, “logical fallacy” when you can’t find anything else to say.

    There was no correlation during various ages, there is no correlation now. There was a seeming one for a few years…it’s gone now.


    You are still on the floor Brian. TRY LOOKING AT THE SUN INSTEAD OF YOUR NAVEL.

    • “THERE CAN BE NO CAUSATION WITHOUT CORRELATION.” This is incorrect, but it’s a common and totally understandable error. I suffered from it myself for a number of years. It’s often the case, but not always. More often, long term causation is masked by short term effects, and so the lack of correlation often means that there’s noise to filter out somehow.

      In real systems, there are usually multiple causal factors that trade off as to which factor is dominating at any given moment. For example, in my professional field of electrical engineering, we have something called a “state machine.” State machines are essentially mathematical algorithms that accept inputs from the outside world, determine what the next operational state is, and then transition from the current state to the next state. The operation of the state machine is 100% correlated with the totality of the inputs – that’s how the state machine is designed – but which input matters depends entirely upon which state the machine is presently in. Sometimes the state machine may ignore all other inputs except the one that indicates a computer keyboard “enter” key has been pressed. Other times the state machine looks at every single input available and calculates what to do based on the sum total of all the inputs.

      As a result, however, if you look at only a single input, you’ll often find periods that the operation of the state machine is totally uncorrelated with that single input. After all, if the state machine is waiting for you to press the “enter” key but you’re trying the number keys, the state machine will just sit there and do nothing. Long-term causation, in this case, is uncorrelated with short-term correlation.
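      That behavior can be sketched with a toy two-state machine (purely illustrative; the states, keys, and outputs here are assumptions for the example, not a model of any real system):

```python
# A toy state machine: in state "idle" it responds only to "enter"; in
# state "active" it responds to every key (and "q" sends it back to idle).
# Its behavior is fully determined by (state, input), yet it can look
# uncorrelated with any single input stream viewed in isolation.

def step(state, key):
    if state == "idle":
        # while idle, every input except "enter" is ignored
        return ("active", "woke up") if key == "enter" else ("idle", None)
    # while active, every input produces a response
    next_state = "idle" if key == "q" else "active"
    return (next_state, f"handled {key}")

state = "idle"
outputs = []
for key in ["1", "2", "3", "enter", "a", "b", "q", "4"]:
    state, out = step(state, key)
    outputs.append(out)

print(outputs)
# The number keys produce nothing while the machine is idle, but once
# "enter" arrives, the very same kinds of keys suddenly matter.
```

      The machine is 100% deterministic, yet if you only watched the number keys you would conclude they have no effect at all – which is the long-term-causation-without-short-term-correlation point.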

      The same is true of climate. The response of the climate is 100% correlated with all the inputs to the climate and its past state. But that doesn’t mean that at any given instant one factor isn’t dominating another, or that we understand all the inputs. So it’s entirely possible to have long-term causation without short-term correlation.

  11. It’s like talking to a madman… What an imagination you have.

    “So it’s entirely possible to have long-term causation without short-term correlation”…is definitely a cake-taker.

    Brian…you can make stuff up, use false analogies and generally lie through your teeth while tap dancing around facts, but it can’t change the reality that it will be cooling for too long to salvage the scam. The Danish Prime Minister has given up looking for a treaty to come out of Copenhagen.

    People don’t care anymore because there are real things to worry about.

    Here are some quotes from Obama’s speech at the UN followed by the reality.

    “…[T]he threat from climate change is serious, it is urgent, and it is growing.”

    Reality: global mean temperatures increased slightly from 1977 to 2000. Temperatures have been flat since then.

    “Rising sea levels threaten every coastline.”

    Reality: sea levels have been rising on and off since the end of the last ice age 13,000 years ago. The rate of sea level rise has not increased in recent decades over the nineteenth and twentieth century average.

    “More powerful storms and floods threaten every continent.”

    Reality: there is no upward global trend in storms or floods.

    “More frequent drought and crop failures breed hunger and conflict in places where hunger and conflict already thrive.”

    Reality: there is no upward global trend in major droughts. Reversals in large-scale cycles have meant that the southward march of the Sahara Desert into the Sahel has been reversed in recent years and the Sahara is now shrinking.

    “On shrinking islands, families are already being forced to flee their homes as climate refugees.”

    Reality: some Pacific islanders may want to emigrate to New Zealand or Australia and are claiming that their islands are disappearing as the reason, but shrinkage has been minimal in recent decades because sea level rise has been minimal.

    President Obama’s policy prescriptions are energy rationing and energy poverty disguised as growth and prosperity. The emissions reductions that he promises the United States will make through cap-and-trade legislation are dead in the water in the U. S. Senate and would not survive a second vote in the U. S. House. If enacted, cap-and-trade would consign the economy to perpetual stagnation and make the U. S. into a second-rate economic power.

    His policy prescription for poor countries is to promise them massive “financial and technical assistance”. The track record of paying off poor countries is that it has lined the pockets of corrupt leaders and bureaucracies with billions and tens of billions of dollars, but has done nothing to help those countries become prosperous. What these countries need is free markets and abolishing barriers to trade. The global warming policies advocated by the Obama Administration and the Democratic-controlled Congress would raise trade barriers and foster energy poverty throughout the world. Energy rationing is not the way forward and is not a message of hope for the poorest people in the world, who lack access to electricity and modern transportation.”

    and if you’d like to see the graphs which back up the reality

  12. I long ago gave up any hope of convincing you to accept that facts and data matter. You’re a lost cause. But there are two reasons I continue to engage in these discussions. First, I’m unwilling to allow your willful ignorance to go unchallenged. Second, you’re so bad at it that you’re fun to toy with. You’re a convenient example that illustrates just how flawed denier arguments actually are.

    So, feel free to ignore the analogy. It’s not false, although it’s certainly possible that I didn’t put it into familiar enough terms to be easily understood. And as for my lying, all you have to do to know that I’m being honest is some basic addition. But apparently that’s beyond you.

    As with all the cut/paste links you’ve put up here to support your arguments, this one is riddled with false information.

    The first two images at that site are nearly identical to ones I’ve debunked a few times. The first compares warming that occurred during the late 20th century, and thus far in the 21st, to the linear century-scale warming trend that the IPCC expects. But the warming rate is not actually constant – it increases as the amount of CO2 in the atmosphere increases, and because human activity is emitting a LOT of CO2 and emissions are accelerating, the temperature is expected to increase more toward the end of this century. In other words, the rate of temperature rise is going to go up – 0.9 degrees per century now will ramp up to 2 degrees or more per century later in the century. It’s called exponential growth, the same kind of growth that makes your bank balance go up with interest.
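    The compound-growth arithmetic is easy to sketch (the 0.9 and 2 degrees-per-century figures come from the paragraph above; the annual growth rate in the code is purely illustrative):

```python
# A warming rate that itself grows, compound-interest style. The starting
# rate (0.9 degC/century) comes from the text; the 0.9%/year growth in
# the rate is an illustrative assumption, not a projection.
rate = 0.9 / 100          # degC per year at the start of the century
growth = 1.009            # illustrative annual growth factor for the rate
temp = 0.0
for year in range(100):
    temp += rate          # accumulate this year's warming
    rate *= growth        # the rate itself compounds upward

print(round(temp, 2))         # total warming over the century
print(round(rate * 100, 2))   # rate at century's end, degC/century
```

    With these assumed numbers the rate more than doubles by century’s end, so total warming ends up well above what a constant 0.9 degC/century would give – the compounding, not the starting rate, does the work.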

    The second image I’ve debunked so many times that it’s not even funny. I’ve referred to my debunking of it twice already in the comments above. Here it is one last time. Read it this time. Study it. Understand it.

    There does appear to have been a slowdown of sea level rise in the last few years. I won’t deny that. But if you look at the rest of the satellite altimetry data (larger and more recent image than the one referenced at the link here), there are at least two other periods when someone could argue that “sea level rise stopped” – mid 1993 to mid 1996 and mid 1997 to mid-2000. And if you notice, the 60 day smoothing trend is again approaching the long-term trend. Whether it will exceed the trend or not will be for new data to show later this year. As for your claim that it hasn’t exceeded the long-term trend, care to provide data for that one? Claims without data to support them, especially from you, are meaningless.

    I’ll grant you that tropical cyclones don’t appear to be trending. How cyclones will respond to climate disruption is a question the experts haven’t answered yet. I found it interesting that the source of the cyclone information, Ryan Maue of Florida State University, doesn’t necessarily agree that hurricane activity is at its lowest. From his FSU website:

    Global Hurricane Frequency [storms with maximum intensity greater than 64 knots] has dramatically collapsed during the past 2-3 years. When measured using 12 or 24 month running sums, the number of tropical cyclones at hurricane intensity is clearly at a 30-year low. HOWEVER, the number of tropical cyclones with intensity greater than 34-knots has remained at the 30-year average (83 storms per year). More on the distinction in an upcoming paper currently submitted for publication.

    Personally, I think I may track down the paper and try to understand it.

    As far as the drought trend goes, the table they have from the WRI doesn’t actually address a trend, or lack thereof, in the frequency of drought. Instead it tallies the number of deaths due to drought, extreme temperature, wildfires, etc. I find the numbers interesting, since they apparently ignore drought-driven famine as a cause of death. I’m sure the table’s original authors had a reason for that, but given that drought played a part in a mid-1980s famine that affected at least 8 million Ethiopians, the fact that the table shows only 130,000 drought deaths makes it clear that it’s being misused here. And alas, there’s no other data. Claims without data are meaningless.

    As for the atolls/Pacific islands, the data in that image looks a lot different from the data in this one for the Marshall Islands, or this one for Fiji. Given that the satellite altimetry data was good enough for the link’s authors to try and debunk the global sea level rise, I wonder why they chose to use a hard to read, unreferenced image from “the Australian government” instead of the satellite altimetry data for the countries in question. Maybe because the altimetry data didn’t support their claim and was harder to spin into propaganda.

  13. Trying to “save face” is more likely to be the reason for your continuing to try to rebut what I write.
    Given that the numbers of people who read your “creative writing” are few, I continue to play with you because it is fun to watch the spin and fantastical prevarications you are capable of.

    The fact is the case for AGW has consisted of faked data supplied by enough creeps who cashed in on the $79 billion the US government lavished on the scam.

    For the sake of the few people who read this weekly ode to science fiction, here’s the rest of “The Dog Ate Global Warming”.

    Roger Pielke Jr., an esteemed professor of environmental studies at the University of Colorado, then requested the raw data from Jones. Jones responded:

    Since the 1980s, we have merged the data we have received into existing series or begun new ones, so it is impossible to say if all stations within a particular country or if all of an individual record should be freely available. Data storage availability in the 1980s meant that we were not able to keep the multiple sources for some sites, only the station series after adjustment for homogeneity issues. We, therefore, do not hold the original raw data but only the value-added (i.e., quality controlled and homogenized) data.

    The statement about “data storage” is balderdash. They got the records from somewhere. The files went onto a computer. All of the original data could easily fit on the 9-inch tape drives common in the mid-1980s. I had all of the world’s surface barometric pressure data on one such tape in 1979.

    If we are to believe Jones’s note to the younger Pielke, CRU adjusted the original data and then lost or destroyed them over twenty years ago. The letter to Warwick Hughes may have been an outright lie. After all, Peter Webster received some of the data this year. So the question remains: What was destroyed or lost, when was it destroyed or lost, and why?

    All of this is much more than an academic spat. It now appears likely that the U.S. Senate will drop cap-and-trade climate legislation from its docket this fall — whereupon the Obama Environmental Protection Agency is going to step in and issue regulations on carbon-dioxide emissions. Unlike a law, which can’t be challenged on a scientific basis, a regulation can. If there are no data, there’s no science. U.S. taxpayers deserve to know the answer to the question posed above.

    — Patrick J. Michaels is a senior fellow in environmental studies at the Cato Institute and author of Climate of Extremes: Global Warming Science They Don’t Want You to Know.

    Ta ta! I’m off to try to educate people about something real….the genocidal swine flu vaccination program. See all the information here.

    • Judy: If by “I’m off” you mean we won’t be seeing you for awhile, that’d be great. I know Brian enjoys making a fool of you in public, and showing fools for what they are does serve a valid purpose. That said, I have concerns about signal-to-noise ratio, and am not in love with the idea of liars (perhaps paid liars) using our forum as a disinformation platform for bought up corporate interests and their deranged allies.

      If you enjoy being a tool that Brian uses for his own interests, I guess I won’t stop you. But if you slink off like the whipped sock puppet that you are, I can live with that, too.

  14. Come on. That was all disproved by Tim Ball. Here is the one real scientific article on atmosphere by Tim Ball, and he proves it all here, summarizing a lifetime of his work.

    Ball, T. Book Review, Agricultural Dimensions of Global Climate Change, Ed. HM Kaiser and TE Drennen. Canadian Journal of Agricultural Economics 42 (2): 212-214, (1994).

    “…For example, carbon dioxide from anthropogenic sources increased the most from 1940 to 1980 while global temperature decreased. Atmospheric carbon dioxide readings plummeted at Mauna Loa in the last two years. “: Tim Ball.

    Try to disprove that.