Dirt. It’s all around you. You wash it off your car. You run and hike on it. You buy it to plant your new roses in. And yet, according to an amazing Boston Herald story titled “The Future of Dirt,” soil scientists are only now beginning to understand how it really works. And given that the global population is expected to rise by about 50% in the next 50 years, we’re going to need to figure out ways to keep what soil we have from degrading and even improve the fertility of our dirt if possible.
By 2050, according to Rattan Lal, a professor of soil science at Ohio State University, “All the necessities of food, feed, fiber, and fuel are going to be met by less than one-tenth of an acre per person, on average. And we already have seriously degraded a lot of the available land. So unless you can restore some of it you will just run out.”
This warning is not hyperbole. There is evidence that ancient civilizations collapsed because they overused their dirt and either stripped all the nutrients out of it or irrigated it so often that it eventually turned alkaline or salty and incapable of growing much of anything. The article mentions David Montgomery, a University of Washington geologist who has written a book that traces the fall of classical Greece, imperial Rome, various Pacific Island cultures, and the Mayans to poor soil. But some of the richest soil on the planet, the terra preta of South America and parts of Africa, was created by indigenous people mixing organic matter (food wastes and manure) and biochar (a particular form of charcoal) into the unenriched native soils.
Biochar, [researchers] have found, enhances the retention of water and nutrients, decreases the need for fertilizer, encourages microbial growth, and allows more air to reach crop roots. It also breaks down at a far slower rate than traditional fertilizers and soil additives. Depending on how the charcoal is made and applied, estimates of its life span range from decades to millennia.
According to the Boston Herald article, other scientists are working on creating fully synthetic soils from industrial wastes, and while these soils apparently work, they’re very expensive and, at this point anyway, are hardly optimal. I imagine that much of the cost is in purging the industrial wastes (such as byproducts from drug manufacturing, fly ash from power plants, and aluminum smelting byproducts) of their toxic contaminants like mercury, cadmium, lead, arsenic, etc.
The article also discusses new crop rotations that use one crop to replace the nutrients that the prior crop removed, as well as low- or no-till cultivation methods, being developed or already in use, that reduce soil erosion and help keep sequestered carbon dioxide in the soil instead of releasing it into the air. It can take 10,000 years to build 6 inches of topsoil, averaging about 0.015 mm per year. That means it will take about 65 years to replace 1 mm of topsoil, and that’s the amount that’s estimated to be lost per year, as a global average.
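As a quick back-of-the-envelope check of those topsoil numbers (the figures are the article’s, not independent measurements), the arithmetic works out like this:

```python
# Back-of-the-envelope check of the topsoil formation vs. loss rates quoted
# above. All input figures come from the article itself.

MM_PER_INCH = 25.4

formation_depth_mm = 6 * MM_PER_INCH   # 6 inches of topsoil ~= 152.4 mm
formation_time_yr = 10_000             # years needed to build that depth

rate_mm_per_yr = formation_depth_mm / formation_time_yr
print(f"Formation rate: {rate_mm_per_yr:.4f} mm/year")  # -> 0.0152 mm/year

loss_mm_per_yr = 1.0                   # estimated global average annual loss
years_to_replace_1mm = loss_mm_per_yr / rate_mm_per_yr
print(f"Years to replace 1 mm: {years_to_replace_1mm:.1f}")  # -> 65.6 years
```

So each year of average loss takes roughly 65 years of natural soil formation to undo, which is the article’s point in a single ratio.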
How does this relate to global heating? As the planet heats up, droughts and floods are expected to become more common. Both degrade soils, either directly via solar heating or indirectly via wind and water erosion. Agriculture releases nitrogen and carbon compounds into the atmosphere unless done very carefully. Biochar has the potential to both improve soil fertility and sequester carbon in the earth for decades to millennia. And since history has many examples of civilizations that faded as their soils collapsed, keeping our global soil alive and well will be key to keeping our civilization healthy, especially as we decarbonize our carbon-based society.
Sticking with the theme of dirt, I stumbled across an interesting article in South Africa’s Independent Online (IOL) about the junction between genetically engineered crops and organic growing techniques.
While genetic engineering can offer new varieties, such as pest-resistant corn, organic farming can help to achieve higher yields without damaging the environment, as it does not use synthetic pesticides or fertilisers, [University of California at Davis professor Pamela Ronald] said via telephone.
“Genetic engineering is a way to make seeds … Farmers rely on seeds for good yields, but seeds cannot solve everything,” she said.
“You need some way to add fertiliser and control the pests. That’s where organic farming has a lot to contribute.”
What I find most interesting is that we have a scientist who is using genetic engineering to develop rice strains that are either disease- or flood-resistant, and yet she’s also calling for organic cultivation of her own products, not industrial cultivation. After all, organic cultivation requires less fertilizer, reduces topsoil erosion, and generally has a lower impact on the local climate. And, given the expected speed of global heating, we cannot afford to wait several human generations (and scores or hundreds of generations of our various staple crops) while we use conventional selective breeding to produce flood, disease, drought, insect, and salt tolerance or resistance in our food crops. The population of the planet is rising too fast.
So unless you’re willing to support activities that reduce human population directly – war, disease, famine, et al – you’ll have to feed the population of the planet somehow. And it’s reasonably likely that genetically engineered crops of one kind or another will be a big piece of that particular puzzle.
Three weeks ago, the Weekly Carboholic introduced readers to Project Vulcan, a project with the goal of mapping carbon dioxide (CO2) emissions in the U.S. with greater temporal and spatial resolution than had ever been attempted previously. This week we have an update, specifically that Vulcan has released data for 2002 that gives CO2 emissions for specific sectors (transportation, utilities, agriculture, etc.) on a county-by-county basis. And from this data we can extract the 20 counties with the highest total CO2 emissions in the country.
Those counties are, in order:
- Harris, Texas
- Los Angeles, California
- Cook, Illinois
- Cuyahoga, Ohio
- Wayne, Michigan
- San Juan, New Mexico
- Jefferson, Alabama
- Wilcox, Alaska
- East Baton Rouge, Louisiana
- Titus, Texas
- Carbon, Pennsylvania
- Porter, Indiana
- Jefferson, Ohio
- Indiana, Pennsylvania
- Middlesex, Massachusetts
- Bexar, Texas
- Hillsborough, Florida
- Suffolk, New York
- Clark, Nevada
- Duval, Florida
These 20 counties, of a total of 3141 counties nationwide, represent 14 different states and about 11.4% of the entire CO2 emissions of the United States. In other words, 0.6% of the counties in the U.S. account for 11.4% of all CO2 emissions. That’s huge, and it suggests that we need to pour a lot of money into those counties to get their CO2 emissions down. After all, it’s a well-known precept of engineering that, when addressing multiple problems of varying magnitudes, you tackle the problems that give you the biggest benefit first. Duh.
But there are other ways to look at this data. The top 50 counties (taken from Vulcan’s Excel-formatted data directly), representing 24 states, account for 21.9% of all emissions. And it takes 183 counties, representing 46 states, to exceed 50% of the United States’ total CO2 emissions. In other words, even if you dropped the top 183 counties’ CO2 emissions to zero by throwing massive amounts of investment at them, those counties still span every state but Hawaii, Idaho, Maine, and South Dakota. If we use EIA government data on state emissions in 2002 instead of Project Vulcan’s data, we find that the top 11 states account for 52.66% of all CO2 emissions. Those states are, in order from highest emitters to lowest: Texas, California, Pennsylvania, Ohio, Florida, Illinois, Indiana, New York, Louisiana, Michigan, and Georgia.
In other words, 20 counties may be only 0.6% of all U.S. counties, but they span 28% of all states. And 183 counties may be only 5.8% of all U.S. counties, but they span 92% of all states. This illustrates the difficulty of the problem, because targeting only the highest-magnitude emitters will still require that we pour in investment and make civilizational changes pretty much nationwide.
Well, at least across 92% of the nation….
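The county-versus-state arithmetic above is easy to verify. Here’s a minimal sketch, using only the counts quoted in this post (the underlying Vulcan data itself isn’t embedded here):

```python
# Check the county/state fractions quoted in the Vulcan discussion above.
# Counts come from the post; the 2002 Vulcan dataset is not embedded here.

TOTAL_COUNTIES = 3141
TOTAL_STATES = 50

# (number of counties, number of states they span, share of U.S. CO2 emissions)
groups = [
    (20, 14, 0.114),   # top 20 counties
    (183, 46, 0.50),   # counties needed to pass 50% of emissions
]

for n_counties, n_states, emissions_share in groups:
    county_frac = n_counties / TOTAL_COUNTIES
    state_frac = n_states / TOTAL_STATES
    print(f"{n_counties} counties = {county_frac:.1%} of all counties, "
          f"spanning {state_frac:.0%} of states, "
          f"with ~{emissions_share:.1%} of U.S. CO2 emissions")
```

Running this reproduces the 0.6%/28% and 5.8%/92% figures, which is exactly the mismatch that makes a purely targeted approach hard.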
The greenhouse gas index, based on data from 60 sites around the world, showed that last year’s carbon dioxide increase added 2.4 molecules to every million molecules of air, a measurement known as parts per million, or ppm.
Unfortunately, Reuters apparently chose to engage in a bit of unnecessary hyperbole with this line:
The rise continued in 2008, according to a chart of global carbon dioxide emissions online here, which showed world emissions of this gas heading off the chart at over 386 ppm.
While it’s certainly true that the graph shows the 2007 data at the upper maximum of the graph, it’s hardly “off the chart” in the sense of historical increases year-to-year. 2007’s increase of 2.42 ppm was the 4th highest increase since NOAA started tracking global CO2 concentrations, after 1998, 1987, and 2005. One correlation is that 1997-1998, 1986-1987, 2004-2005, and 2006-2007 were all El Nino years, with the El Nino of 1998 being one of the most powerful El Ninos on record, if not the most powerful.
If you choose to look at the Mauna Loa Observatory data, which goes back to 1959 instead of just 1980 for the NOAA data, 2007 is the 7th highest behind 1998, 2002, 2005, 2003, 1987, and the tie between 1983 and 1988. Again, all of these years are El Nino years or within 1 year after an El Nino (1988, specifically).
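The ranking procedure behind those “Nth highest increase” claims is simple: take annual mean CO2 concentrations, difference consecutive years, and sort. A minimal sketch of that calculation is below; the concentration values are placeholders for illustration, not the actual NOAA or Mauna Loa record:

```python
# How a "largest annual CO2 increase" ranking can be computed: difference
# consecutive annual means and sort descending. Values below are placeholders
# for illustration only, NOT the actual NOAA/Mauna Loa record.

annual_means = {   # year -> annual mean CO2 concentration (ppm), illustrative
    2003: 375.8,
    2004: 377.5,
    2005: 379.8,
    2006: 381.9,
    2007: 383.8,
}

years = sorted(annual_means)
increases = {y: annual_means[y] - annual_means[y - 1] for y in years[1:]}

# Rank years by the size of the year-over-year increase, largest first
for year, delta in sorted(increases.items(), key=lambda kv: -kv[1]):
    print(f"{year}: +{delta:.2f} ppm")
```

With the real annual means substituted in, this is the computation that puts 2007’s 2.42 ppm increase 4th in the NOAA record and 7th in the Mauna Loa record.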
So while CO2 concentrations continue to rise, the 2.42 ppm increase itself isn’t necessarily the biggest problem. The problem is that there has yet to be a decrease in the rate of increase of CO2 concentrations. If you look at the Mauna Loa Observatory data, the black line shows a localized slowing around Hawaii in the rate of increase (possibly a result of the La Nina event presently ongoing in the Pacific) – when the global trend starts to show such a flattening of the curve, THAT will be good news.