Over the past 30 years, the climate research community has made valiant efforts to answer the "climate sensitivity" question: What is the long-term equilibrium warming response to a doubling of atmospheric carbon dioxide? Earlier this year, the Intergovernmental Panel on Climate Change (1) concluded that this sensitivity is likely to be in the range of 2° to 4.5°C, with a 1-in-3 chance that it is outside that range. The lower bound of 2°C is slightly higher than the 1.5°C proposed in the 1970s (2); progress on the upper bound has been minimal.

Figure: Carbon dioxide-induced warming under two scenarios simulated by an ensemble of simple climate models. (Left) CO2 levels are stabilized in 2100 at 450 ppm; (right) the stabilization target is recomputed in 2050. Shading denotes the likelihood of a particular simulation based on goodness-of-fit to observations of recent surface and subsurface-ocean temperature trends (7, 8). Simulations are plotted in order of increasing likelihood, so worse-fitting models are obscured. The bar labeled "EQM" shows the models' likelihood against their long-term equilibrium warming at 450 ppm. How these likelihoods are translated into forecast probabilities is controversial, and the more asymmetric the likelihood function, the greater the scope for controversy.


On page 629 of this issue, Roe and Baker (3) explain why. The fundamental problem is that the properties of the climate system that we can observe now do not distinguish between a climate sensitivity, S, of 4°C and S > 6°C. In a sense, this should be obvious: Once the world has warmed by 4°C, conditions will be so different from anything we can observe today (and still more different from the last ice age) that it is inherently hard to say when the warming will stop. Roe and Baker formalize the problem by showing how a symmetric constraint on the strength of the feedback parameter f (which determines how much energy is radiated to space per degree of surface warming) gives a strongly asymmetric constraint on S. The reason is simple: As f approaches 1, S approaches infinity. Roe and Baker illustrate the point with the information provided by recent analyses of observed climate change, atmospheric feedbacks, and "perturbed physics" experiments in which uncertain parameters are varied in climate models.
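To see why the asymmetry is unavoidable, recall the standard feedback relation S = S0/(1 - f), where S0 is the no-feedback (Planck) warming for doubled CO2, roughly 1.2°C. The short sketch below uses round illustrative numbers, not Roe and Baker's fitted values, to show how a symmetric uncertainty interval in f maps onto a strongly skewed interval in S.

```python
# Illustrative only: how a symmetric interval in the feedback factor f maps
# onto an asymmetric interval in climate sensitivity S = S0 / (1 - f).
# S0 ~ 1.2 C is the no-feedback (Planck) response to doubled CO2; the f
# values below are round numbers chosen for illustration, not fitted estimates.

S0 = 1.2  # no-feedback warming for doubled CO2, in degrees C

def sensitivity(f):
    """Equilibrium warming for doubled CO2 given total feedback factor f."""
    return S0 / (1.0 - f)

f_best, f_err = 0.65, 0.13            # a symmetric "f_best +/- f_err" constraint
for f in (f_best - f_err, f_best, f_best + f_err):
    print(f"f = {f:.2f}  ->  S = {sensitivity(f):.1f} C")

# The lower bound lands about 0.9 C below the best estimate, the upper bound
# about 2 C above it; and as f -> 1, S -> infinity.
```

Pushing f only a little further, to 0.9, more than doubles S again; at f = 1 the expression diverges, and that divergence is where the long upper tail comes from.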

It might be objected that some models that displayed high sensitivities in perturbed physics experiments also poorly reproduce the energy
budget at the top of the atmosphere (4) and hence perform poorly in short-term climate forecasts (5). Likewise, the fact that direct studies of atmospheric feedbacks provide only a weak constraint on S does not mean that no stronger constraint is possible. But these objections miss Roe and Baker's main point: The fact that uncertainties in climate processes add up to give an approximately Gaussian uncertainty in f means that there are innumerable ways of generating a climate model with f close to unity and hence a very high S. Ruling all of these out requires us to find observable quantities that are consistently related to S in all physically plausible climate models, and to show that observations of these quantities are inconsistent with a high S. Despite much searching, such observations remain elusive.
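A toy Monte Carlo calculation makes the "innumerable ways" point concrete. In the sketch below, the total feedback f is the sum of a few independent components; the component means and spreads are invented for illustration, but any comparable set gives the same qualitative result, because it is the near-Gaussian sum combined with the 1/(1 - f) mapping that produces the heavy upper tail in S.

```python
# A minimal Monte Carlo sketch of the Roe-and-Baker argument. The individual
# feedback means and spreads below are invented illustrative numbers, not
# published estimates; only the qualitative behaviour matters.
import random

S0 = 1.2                      # no-feedback warming for doubled CO2 (C)
feedbacks = [                 # (mean, standard deviation) of each component
    (0.30, 0.08),             # e.g. water vapour / lapse rate (illustrative)
    (0.10, 0.05),             # e.g. surface albedo (illustrative)
    (0.25, 0.12),             # e.g. clouds (illustrative)
]

samples = []
for _ in range(100_000):
    # Independent Gaussian components add to an approximately Gaussian total f...
    f = sum(random.gauss(mu, sd) for mu, sd in feedbacks)
    if f < 1.0:                          # f >= 1 would imply a runaway response
        samples.append(S0 / (1.0 - f))   # ...but S inherits a long upper tail

samples.sort()
for q in (0.05, 0.50, 0.95, 0.99):
    print(f"{int(q * 100)}th percentile of S: {samples[int(q * len(samples))]:.1f} C")
```

Even though f itself is symmetric and well behaved, the upper percentiles of S sit far above the median, which is exactly the situation Roe and Baker describe: many different combinations of ordinary-looking feedbacks put f close to unity.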

There are even more fundamental problems. Roe and Baker equate observational uncertainty in f with the probability distribution for f. This means that they implicitly assume all values of f to be equally likely before they begin. If, instead, they initially assumed all values of S to be equally likely, they would obtain an even higher upper bound. This sensitivity of the results to prior assumptions shows that the real problem with the upper bound on climate sensitivity is not that it is high (in which case we could hope that more data will bring it down), but that it is controversial: Opaque decisions about statistical methods, which no data can ever resolve, have a substantial impact on headline results.
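A toy Bayesian calculation shows how large that impact can be. Suppose, purely for illustration, that the data are summarised as a Gaussian likelihood for f. Combining that likelihood with a prior uniform in f, versus a prior uniform in S (which has to be cut off somewhere, here arbitrarily at 20°C), gives very different upper bounds on S; the change-of-variables factor |df/dS| = S0/S^2 is what separates the two.

```python
# Toy illustration of prior sensitivity, using an invented Gaussian likelihood
# for f. A prior uniform in f and a prior uniform in S give different upper
# bounds on S from the same data, via the Jacobian |df/dS| = S0 / S**2.
import math

S0, f_mean, f_sd = 1.2, 0.65, 0.13       # illustrative values only

def likelihood_f(f):
    return math.exp(-0.5 * ((f - f_mean) / f_sd) ** 2)

# Evaluate both posteriors on a grid in S, capped (arbitrarily) at 20 C.
S_grid = [0.5 + 0.01 * i for i in range(1951)]               # 0.5 ... 20.0 C
lik = [likelihood_f(1.0 - S0 / S) for S in S_grid]           # likelihood in terms of S
post_flat_S = lik[:]                                          # prior uniform in S
post_flat_f = [L * S0 / S**2 for L, S in zip(lik, S_grid)]    # prior uniform in f

def percentile(post, q):
    """S value below which a fraction q of the posterior mass lies."""
    total, running = sum(post), 0.0
    for S, p in zip(S_grid, post):
        running += p
        if running >= q * total:
            return S

for name, post in (("uniform in f", post_flat_f), ("uniform in S", post_flat_S)):
    print(f"prior {name}: 95th percentile of S = {percentile(post, 0.95):.1f} C")
```

With these invented numbers the uniform-in-f prior puts the 95th percentile of S near 8°C, while the uniform-in-S prior pushes it well above 15°C, with the exact figure governed mainly by where the arbitrary 20°C cut-off is placed. The data are identical in the two cases; only the prior has changed.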

All this would be very bad news if avoiding dangerous anthropogenic interference in the climate system required us to specify today a stabilization concentration of carbon dioxide (or equivalent) for which the risk of dangerous warming is acceptably low. Fortunately, we do not need to.

To understand why, consider two scenarios for carbon dioxide-induced warming, based on large numbers of runs of a simple climate model constrained by recent temperature observations (6-10). In the first scenario, carbon dioxide concentrations are stabilized at 450 ppm from 2100 onward. If S turns out to be close to our current best estimate, then achieving this concentration target gives an eventual equilibrium warming of 2°C (see the figure, left, dashed line). But S is uncertain; thus, even if we stabilize at 450 ppm, we cannot rule out much more than 2°C of eventual warming, as shown by the shaded plume. Notice that observed temperature trends provide a much stronger constraint on forecast warming even 50 years after stabilization than on the long-term equilibrium response (shown by the bar labeled EQM). Hence, if the true climate sensitivity really is as high as 5°C, the only way our descendants will find that out is if they stubbornly hold greenhouse gas concentrations constant for centuries at our target stabilization level.
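The flavour of these runs can be reproduced with a very small model. The sketch below is a two-box energy balance model (a mixed layer coupled to a deep ocean) in the spirit of the simple model described in note 6, not the actual configuration of refs. 7 and 8, and every parameter value is an illustrative assumption. With an assumed S of 3°C it stabilizes near the 2°C equilibrium for 450 ppm, but much of the gap between the transient and equilibrium warming is still being closed centuries after 2100, which is why near-term observations constrain the transient response so much more tightly than the equilibrium response.

```python
# A minimal two-box energy balance sketch (mixed layer + deep ocean), in the
# spirit of the simple model cited in note 6 but not its actual configuration.
# All parameter values and initial conditions are illustrative assumptions.
import math

F2X    = 3.7           # radiative forcing for doubled CO2 (W/m2)
S      = 3.0           # assumed equilibrium climate sensitivity (C per doubling)
LAM    = F2X / S       # feedback parameter (W/m2 per C)
C_MIX  = 8.0           # mixed-layer heat capacity (W yr m-2 per C), illustrative
C_DEEP = 100.0         # deep-ocean heat capacity, illustrative
GAMMA  = 0.7           # mixed-layer / deep-ocean exchange coefficient (W/m2 per C)

def co2(year):
    """Illustrative pathway: rise from 370 ppm in 2000 to 450 ppm by 2100, then hold."""
    return min(370.0 + (450.0 - 370.0) * (year - 2000) / 100.0, 450.0)

T, T_deep = 0.6, 0.2   # rough warming to date relative to pre-industrial (C)
for year in range(2000, 2501):                               # one-year time steps
    F = F2X * math.log(co2(year) / 278.0) / math.log(2.0)    # CO2 forcing (W/m2)
    uptake = GAMMA * (T - T_deep)                            # heat flux into the deep ocean
    T      += (F - LAM * T - uptake) / C_MIX
    T_deep += uptake / C_DEEP
    if year % 100 == 0:
        print(f"{year}: {T:.2f} C above pre-industrial")

print(f"equilibrium at 450 ppm: {F2X * math.log(450.0 / 278.0) / math.log(2.0) / LAM:.2f} C")
```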

In reality, of course, our descendants will revise their targets in light of the climate changes they actually observe. Suppose that, in 2050, they simply divide our 450-ppm target forcing by the fraction by which the observed carbon dioxide-induced warming trend between 2000 and 2050 over- or under-estimates our current best-guess forecast (11). They then recompute concentration paths to stabilize at this revised level in 2200.
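As a worked example of that revision rule, with invented numbers for both the forecast and the observed trend:

```python
# Worked example of the mid-century target revision described above, with
# invented numbers for the forecast and observed 2000-2050 warming trends.
import math

F2X   = 3.7                                                # forcing for doubled CO2 (W/m2)
C_PRE = 278.0                                              # pre-industrial CO2 (ppm)
F_450 = F2X * math.log(450.0 / C_PRE) / math.log(2.0)      # forcing of the 450-ppm target

forecast_trend = 1.00   # best-guess CO2-induced warming over 2000-2050 (C), illustrative
observed_trend = 1.25   # warming actually observed over 2000-2050 (C), illustrative

# The observed trend overshot the forecast by a factor of 1.25, so the target
# forcing is divided by that factor and converted back into a concentration.
revised_forcing = F_450 * forecast_trend / observed_trend
revised_target  = C_PRE * 2.0 ** (revised_forcing / F2X)

print(f"revised stabilization target for 2200: about {revised_target:.0f} ppm")
```

With these numbers the target drops to roughly 410 ppm; had the observed trend come in below the forecast, the same arithmetic would relax the target above 450 ppm.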

The long-term carbon dioxide concentration consistent with a 2°C warming (which we call C2K) is currently uncertain, but the risk of a low (and hence expensive) C2K is much better constrained by data than is the risk of a high (and hence dangerous) climate sensitivity. This is because C2K, like f, scales approximately with things we can observe, and hence is not subject to the problems that bedevil efforts to constrain sensitivity. The uncertainties in how the available policy levers translate into global emissions, and how emissions translate into concentrations through the carbon cycle, are so large that uncertainty in the final concentration we are aiming for in 2200 is probably the least of our worries, provided we resist the temptation to fix a concentration target early on. Once fixed, it may be politically impossible to reduce it.

The temperature response to this adaptive stabilization scenario (see the figure, right) is much better constrained because it depends on current trends, not on S. If S turns out to be toward the upper end of the current uncertainty range, we may never find out what it is: Some models with S = 4°C are effectively indistinguishable from others with S = 6°C under this scenario. But provided our descendants have the sense to adapt their policies to the emerging climate change signal, they probably won't care.

An upper bound on the climate sensitivity has become the holy grail of climate research. As Roe and Baker point out, it is inherently hard to find. It promises lasting fame and happiness to the finder, but it may not exist and turns out not to be very useful if you do find it. Time to call off the quest.

References and Notes
1. Intergovernmental Panel on Climate Change (IPCC), Climate Change 2007: The Physical Science Basis, S. Solomon et al., Eds. (Cambridge Univ. Press, Cambridge, 2007).
2. J. Charney et al., Carbon Dioxide and Climate: A Scientific Assessment (National Academy of Sciences, Washington, DC, 1979).
3. G. H. Roe, M. B. Baker, Science 318, 629 (2007).
4. B. M. Sanderson, C. Piani, W. J. Ingram, D. A. Stone, M. R. Allen, Clim. Dyn., 10.1007/s00382-007-0280-7 (2007).
5. M. Rodwell, T. N. Palmer, Q. J. R. Meteorol. Soc. 113, 118 (2007).
6. The model is a simple energy balance atmosphere coupled to a diffusive ocean as used in (7), with data constraints updated as (8). In such simple models, feedbacks are assumed to be independent of the climate state, despite evidence to the contrary (9, 10).
7. D. J. Frame et al., Geophys. Res. Lett. 32, 10.1029/2004GL022241 (2005).
8. D. J. Frame et al., Geophys. Res. Lett. 33, 10.1029/2006GL025801 (2006).
9. C. A. Senior, J. F. B. Mitchell, Geophys. Res. Lett. 27, 2685 (2000).
10. G. Boer, B. Yu, Clim. Dyn. 21, 167 (2003).
11. Note that this does not require our descendants to "discover" the true S, unlike the similar "learning" scenario of Yohe et al. (12).
12. G. Yohe, N. Andronova, M. Schlesinger, Science 306, 416 (2004).
13. We thank the James Martin School and the Tyndall Centre for support.

10.1126/science.1149988

M. R. Allen is in the Department of Physics, University of Oxford, Oxford OX1 3PU, UK. E-mail: myles.allen@physics.ox.ac.uk

D. J. Frame is in the Oxford University Centre for the Environment, Oxford OX1 3QY, UK. E-mail: dframe@atm.ox.ac.uk


Comment:
New Scientist summarizes this article.


Climate is too complex for accurate predictions

by Jim Giles

Climate change models, no matter how powerful, can never give a precise prediction of how greenhouse gases will warm the Earth, according to a new study.

The result will provide ammunition to those who argue not enough is known about global warming to warrant taking action.

The analysis focuses on the temperature increase that would occur if levels of carbon dioxide in the atmosphere doubled from pre-Industrial Revolution levels. The current best guess for this number - which is a useful way to gauge how sensitive the climate is to rising carbon levels - is that it lies between 2.0°C and 4.5°C. And there is a small chance that the temperature rise could be up to 8°C or higher.

To the frustration of policy makers, it is an estimate that has not become much more precise over the last 20 years. During that period, scientists have established that the world is warming and that human activity is very likely to blame, but they are no closer to putting a figure on exactly how much temperatures are likely to rise.

Positive feedback

It now appears that the estimates will never get much better. The reason lies with feedbacks in the climate system. For example, as the temperature increases, less snow will be present at the poles. Less snow means less sunlight reflected back into space, which means more warming.

These positive feedbacks accelerate global warming and also introduce uncertainty into estimates of climate sensitivity, say Gerard Roe and Marcia Baker of the University of Washington in Seattle.

What is more, they found that better computer models or observational data will not do much to reduce that uncertainty. A better estimate of sensitivity is the holy grail of climate research, but it is time to "call off the quest", according to a commentary published alongside the paper.

Deep uncertainties

That is likely to fuel attacks by critics in the oil industry and elsewhere who argue against investing in measures like clean energy until more is known about climate change. Others say that we need to act even if climate sensitivity lies at the low end of the scale, since coastal areas would still be threatened by rising seas, for example.

Ultimately, the papers also illustrate the limits on how far models, even those run on powerful supercomputers, can help politicians make decisions.

"This finding reinforces not only that climate policies will necessarily be made in the face of deep, irreducible uncertainties," says Roger Pielke, a climate policy expert at the University of Colorado at Boulder, US. "But also the uncomfortable reality - for climate modellers - that finite research dollars invested in ever more sophisticated climate models offer very little marginal benefit to decision makers."