“Scientists Announce That Humanity Can Afford to Burn Twice as Much Carbon as Previously Thought.” File that thought among headlines-you-never-saw-in-the-New-York-Times. But buried in Eduardo Porter’s Economics Scene column last month endorsing nuclear power as mitigation for global warming was just such a suggestion.

Porter cites an “authoritative new study” for the proposition that “humanity must spew no more than 900 billion more tons of carbon dioxide into the air from now through 2050.” That sounds like bad news — but the 2050 carbon budget limit underlying Bill McKibben’s “Do the Math” campaign for divestment from fossil fuel industries is just 565 billion tons. I used that number in an earlier post to calculate a sustainable global per capita direct carbon footprint of a scant one ton annually (about the equivalent of a single one-way flight from NY to LA). And the study Porter relies on was produced by the same group that McKibben relies on.

What accounts for the difference between the 565 billion tons Bill McKibben has popularized and the 900 billion tons cited in the 2013 study? The Grantham Research Institute on Climate Change and the Environment, which produced the study, explains that the 565 billion ton budget released in 2011 assumed no mitigation measures for non-carbon greenhouse gases such as methane, while implementation of available methane controls in waste management and agriculture would allow a substantially larger carbon budget. A substantial (and not yet implemented) investment in carbon capture and storage might extend the carbon budget by an additional 125 billion tons — resulting in a 2050 carbon budget of over a trillion tons, roughly twice the earlier figure.
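For readers who want to check the per capita arithmetic, here is a rough back-of-the-envelope sketch. The population figure and the number of years remaining to 2050 are my own illustrative assumptions, not numbers from the Grantham study; the one-ton and two-ton “direct” footprints discussed in this post further allocate only a personal-consumption share of these totals, a split the sketch does not attempt to reproduce.

```python
# Rough back-of-the-envelope check of the per capita carbon arithmetic.
# Assumptions (mine, for illustration only): a constant world population
# of about 7 billion and roughly 38 years from now through 2050.

BUDGETS_TONS_CO2 = {
    "565 billion ton budget (2011)": 565e9,
    "900 billion ton budget (2013)": 900e9,
}

POPULATION = 7e9        # assumed constant for simplicity
YEARS_TO_2050 = 38      # approximate years remaining in the budget period

for label, budget in BUDGETS_TONS_CO2.items():
    per_capita_annual = budget / (POPULATION * YEARS_TO_2050)
    print(f"{label}: about {per_capita_annual:.1f} tons CO2 per person per year")

# Prints roughly 2.1 and 3.4 tons per person per year; the "direct"
# footprint figures in the post carve out only the portion attributable
# to an individual's own travel and energy use.
```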


This news is not enough to get me to go out and buy a car or start hopping on jet planes again. But the implicit global per capita direct carbon footprint of two tons is much less out of reach than a single ton. And keep in mind that the Grantham study assumes a near complete cessation of carbon burning after 2050 to stay within the 2 degrees Celsius limit on warming — 75 million tons spread out over the second half of the century. This suggests that the goal of policy makers, activists, and legal thinkers should be abolition of fossil fuels, not market-based rationing. (My current research includes looking at the parallels between the global abolition of slavery, which took well over a century of activism to accomplish, and the challenge of responding to climate change.)


The 900 billion ton carbon budget posited by the Grantham study is based on avoiding catastrophic climate change in excess of 2 degrees Celsius. Also in the department of good news that may be bad news, Andrew Revkin’s Dot Earth blog this week points to a comprehensive analysis by the Yale Forum on Climate Change and the Media of global temperature records for the past decade showing a slowing in the rate of warming. The analysis is careful not to draw any conclusions, but it does point out that the observed data are most consistent with those global climate models that assume a lower sensitivity of the global climate to greenhouse gas forcings. The analysis considers other explanations for the disparity between the model consensus and the observed data and concludes that these explanations are either inadequate to explain the discrepancy or already accounted for in the models in question. The less sensitive models most consistent with the observations predict an ultimate warming of about 2 degrees Celsius even under a “business as usual” scenario for carbon emissions. In other words, it is just possible that, even in the absence of measures to limit global carbon emissions, the world might avoid exceeding the 2 degrees C threshold for catastrophic climate change.

So where is the bad news? The bad news is that, given the huge inertia in our energy economy and the psychological and political barriers to a consensus for action on climate change, scientific uncertainty becomes a driver of inaction. While environmental law and international norms may have adopted the opposite approach in the form of the precautionary principle, political and social systems prefer inaction in the face of uncertainty. The suggestion that we might be able to keep burning something more like one trillion tons of carbon, rather than 565 billion, or that under a business as usual scenario the globe will not exceed the 2 degrees C threshold, can only delay an aggressive response to climate change.

We are engaged in one huge and dangerous science experiment with Planet Earth. Each climate model is a hypothesis about how the global climate system fits together, making a prediction based on a set of parameters. Observations over the course of decades may validate or disprove particular global climate models, and may ultimately allow us to reject the most pessimistic (highest climate sensitivity) models. None of the observations about the recent slowdown in surface warming casts doubt on the fact that the planet is warming at an unprecedented rate due to anthropogenic sources. Indeed, the observed data suggest a huge increase in deep ocean thermal energy, which may be moderating the surface temperature increase.

We don’t have a “control” planet for our global experiment — so if the next decade’s data show a return to rapid surface warming consistent with the more pessimistic global models, we can’t simply abandon the experiment and start over. And while a 2 degrees C increase in global temperature may be the consensus threshold for avoiding catastrophe, it is not an acceptable outcome.