Climate science: The investment forecast

Reiner Schnur

New studies predict that the risk of extreme rainfall over Europe and Asian monsoon regions is increasing, with more floods likely worldwide. Such long-range forecasting is pushing at the limit of current climate models.


The world’s climate changed during the twentieth century and is continuing to do so. At least part of this change is caused by anthropogenic emissions of greenhouse gases and sulphate aerosols, according to the latest report of the Intergovernmental Panel on Climate Change (IPCC)1. Changes in extreme climate2, such as hot spells, droughts or floods, potentially have a much greater impact on society than changes in mean climate, such as summertime temperature averaged over several decades3,4. So the ability to assess future risks associated with extreme events is increasingly important to policy-makers.

Two new studies (on pages 512 and 514 of this issue)5,6 report evidence that increases our confidence in observed and projected changes in extreme rainfall and flooding. Notably, Palmer and Räisänen5 link the results of several climate models with a decision-making tool, and show that some approaches to forecasting are more valuable than others. Such tools will become increasingly important to both climate-modellers and policy-makers. For their part, Milly et al.6 show that the frequency of severe floods in large river basins has increased during the twentieth century (Fig. 1), with only a small likelihood that these changes are due to natural climate variability.

Palmer and Räisänen5 analyse the output of 19 climate models and estimate that, over the next century, very wet winters will be up to five times more likely than today for much of central and northern Europe. They also predict that the probability of very wet summers in the Asian monsoon region will rise by a similar magnitude, increasing the risk of flooding. By applying a simple ‘cost-loss’ analysis to a hypothetical investment decision, they show that, when assessing extreme events, it is better to consider their changing frequency across a set of different climate-change projections (the ‘probabilistic’ approach) than to rely on a single projection or on the averaged projection from several climate models (the ‘consensus’ approach used regularly by the IPCC1).

To estimate future climate change caused by anthropogenic emissions of, say, CO2, climate scientists use numerical models that simulate the effects of atmospheric changes on climate. But such climate models are inherently imperfect, owing to physical processes that are either not completely understood or not yet adequately represented because of limited computational power. So there is a fundamental modelling uncertainty attached to a given simulation from any climate model, over and above any naturally occurring climate variability. The consensus approach to this problem is to assume that the influence of modelling uncertainty can be limited by averaging several different models that differ slightly in their representation of climate. The IPCC report1 relies heavily on this approach to provide ‘best-guess’ projections of future climate change that describe the features common to an ‘ensemble’ of different model simulations. But there is a problem if this consensus approach is applied to the analysis of extreme climate events (Fig. 2). Using a consensus projection for planning purposes would inevitably underestimate the risk associated with extreme rainfall.
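The smoothing effect can be made concrete with a toy calculation. The sketch below (Python, with entirely invented numbers standing in for climate-model output) estimates the probability of seasonal rainfall exceeding a fixed threshold in two ways: by pooling exceedances over all ensemble members (the probabilistic route) and by first averaging the members and then counting exceedances of the ensemble mean (the consensus route). It illustrates the averaging artefact only; it is not the authors’ calculation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Three hypothetical 20-year series of seasonal rainfall (arbitrary units),
# standing in for three climate-model simulations.
simulations = rng.gamma(shape=4.0, scale=25.0, size=(3, 20))

threshold = 200.0  # hypothetical "extreme rainfall" threshold

# Probabilistic route: count exceedances in every member, then pool them.
pooled_prob = np.mean(simulations > threshold)

# Consensus route: average the members first, then count exceedances.
ensemble_mean = simulations.mean(axis=0)
consensus_prob = np.mean(ensemble_mean > threshold)

print(f"pooled (probabilistic) estimate:    {pooled_prob:.2f}")
print(f"ensemble-mean (consensus) estimate: {consensus_prob:.2f}")
# Averaging smooths away the peaks, so the consensus estimate is typically
# much lower, and often zero, which is the effect sketched in Fig. 2.
```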

Palmer and Räisänen take a more explicit approach to the problem of accounting for modelling uncertainty and adopt probabilistic techniques that are now routinely used in short- and medium-range weather forecasting, in which uncertainties owing to incomplete knowledge of initial conditions have to be considered7. The authors analyse 80-year projections from 19 different global climate models that include interactions between the oceans and the atmosphere. First, the models were run at a fixed CO2 concentration to define a ‘baseline’ ensemble of simulations corresponding to today’s climate conditions. Next, they were run with a 1% per year increase in CO2, which is slightly faster than is predicted for the twenty-first century, but seems reasonable because these model integrations neglect the contributions of other greenhouse gases.


Figure 1 Heavy floods, such as this one in Oregon City, United States, in 1996, can have a huge impact on human affairs. Milly et al.6 show that the frequency of severe floods has increased during the twentieth century, with only a small chance that these changes are caused by natural climate variability; anthropogenic climate change seems more likely.

Photo credit: Bob Galbraith/AP Photo.


By comparing the relative frequencies of extreme events in the baseline and climate-change ensembles, Palmer and Räisänen obtain a measure of the changing risk associated with anthropogenic climate change. Although this technique is not new, this is the first time it has been applied to a large multi-model ensemble in a climate-change context to assess the frequency of extreme events. This approach is better at accounting for model and sample uncertainty, although systematic errors common to all models cannot be avoided. To date, most studies have used single projections or ensembles from one climate model (see, for example, Milly et al.6 or Kharin and Zwiers8).
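The bookkeeping behind such a risk estimate is simple in principle. The sketch below (Python, with invented synthetic data rather than output from the 19 models) computes the relative frequency of exceeding a ‘very wet’ threshold in a baseline ensemble and in a perturbed ensemble, and reports their ratio; the published analysis additionally has to deal with sampling uncertainty and model error, which this toy version ignores.

```python
import numpy as np

def exceedance_frequency(ensemble, threshold):
    """Fraction of all simulated seasons, pooled over members, above the threshold."""
    return float(np.mean(np.asarray(ensemble) > threshold))

rng = np.random.default_rng(1)

# Invented stand-ins: members x seasons of seasonal rainfall (arbitrary units).
baseline = rng.gamma(4.0, 25.0, size=(19, 80))    # fixed-CO2 control runs
perturbed = rng.gamma(4.0, 30.0, size=(19, 80))   # increased-CO2 runs with a wetter tail

# Define "very wet" relative to the baseline climate.
threshold = np.percentile(baseline, 95)

p_base = exceedance_frequency(baseline, threshold)
p_pert = exceedance_frequency(perturbed, threshold)
print(f"baseline frequency:  {p_base:.3f}")
print(f"perturbed frequency: {p_pert:.3f}")
print(f"risk ratio:          {p_pert / p_base:.1f}")
```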

This work5 also demonstrates the economic value of probabilistic forecasts of extreme events to policy-makers. By applying a simple cost-loss decision model to a hypothetical long-term investment decision in a region prone to flooding, the authors show that it is consistently better to base decisions on a probabilistic forecast than on a single simulation. (For assessing extreme risks, a consensus forecast is never better than simply assuming that the frequency of events will remain constant, because ensemble averaging always underestimates extreme events; Fig. 2.) Although the cost-loss model used by Palmer and Räisänen is too simple to be used in real-world decision-making, their approach suggests a reliable way to make better use of ensemble predictions in climate-change research and risk assessment.
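In its textbook form, a cost-loss model is just a threshold rule: protecting against an event costs C per period, being hit unprotected costs L, so it pays to protect whenever the forecast probability of the event exceeds C/L. The sketch below uses hypothetical numbers and this generic rule (not Palmer and Räisänen’s specific set-up) to show how a single model that underestimates the risk leads to a worse decision than a well-calibrated probabilistic forecast.

```python
def expected_cost(act: bool, p_event: float, cost: float, loss: float) -> float:
    """Expected expense for one period, given the true event probability."""
    return cost if act else p_event * loss

def decide(p_forecast: float, cost: float, loss: float) -> bool:
    """Cost-loss rule: protect when the forecast probability exceeds C/L."""
    return p_forecast > cost / loss

# Hypothetical numbers, chosen only for illustration.
cost, loss = 1.0, 10.0     # protection cost vs flood loss per season
p_true = 0.15              # "true" probability of a damaging flood

forecasts = {
    "probabilistic": 0.15,  # well-calibrated multi-model probability
    "single model": 0.05,   # one simulation that happens to underestimate the risk
}

for label, p_forecast in forecasts.items():
    act = decide(p_forecast, cost, loss)
    cost_per_season = expected_cost(act, p_true, cost, loss)
    print(f"{label}: protect={act}, expected cost per season = {cost_per_season:.2f}")
```

Here the single-model probability falls below C/L, protection is forgone and the expected loss is larger; a consensus-average forecast fails in the same way whenever averaging pulls the apparent risk below the decision threshold.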

Studies such as these make a strong case for improving computational resources in climate research. Much larger ensembles are needed for more reliable estimates of extreme events, particularly for very rare events that have a great potential for a disastrous impact on a country’s economy and inhabitants. Multi-model ensembles of the size used by Palmer and Räisänen are available only for future changes in CO2 concentration. Similar model integrations will be needed for a more complete picture of the factors contributing to climate change, such as other greenhouse gases and sulphate aerosols, including any uncertainties attached to these agents.

Today’s climate models are not good at predicting extreme climate events in local areas, such as flooding in a given river basin, because they are limited in their resolution to a coarse grid size of about 200 kilometres. In Palmer and Räisänen’s study, only a few river basins, such as those of the Brahmaputra and Ganges in India and Bangladesh, are large enough to allow their results for extreme seasonal precipitation to be translated into a heightened risk of river flooding. This is also why Milly et al.6 use only the largest basins in their analysis of great floods. For the average river basin, climate-change simulations would need a much higher resolution of tens of kilometres, but this will not be available for quite some time. Until computational power increases significantly, climate scientists will have to patch models together, taking the results of ensemble climate projections, for example, and inserting their output into a high-resolution hydrological model for a specific river basin.
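The ‘patching together’ described here can be pictured as a two-stage pipeline: take coarse-grid precipitation from a climate projection, aggregate it over the basin of interest, and feed it to a separate, higher-resolution hydrological model. The sketch below is a deliberately crude illustration with an invented grid, invented basin weights and a toy bucket-style runoff model; real downscaling and hydrological modelling are far more elaborate.

```python
import numpy as np

def basin_precip(coarse_field: np.ndarray, weights: np.ndarray) -> float:
    """Area-weighted precipitation over a basin from a coarse (~200 km) model grid."""
    return float(np.sum(coarse_field * weights) / np.sum(weights))

def bucket_runoff(precip_series, capacity=50.0, storage=0.0):
    """Toy bucket model: storage fills with rain and any overflow becomes runoff."""
    runoff = []
    for p in precip_series:
        storage += p
        spill = max(0.0, storage - capacity)
        runoff.append(spill)
        storage -= spill
    return runoff

rng = np.random.default_rng(2)

# One season of daily rainfall (mm/day) on a hypothetical 3 x 3 patch of grid cells.
coarse = rng.gamma(2.0, 3.0, size=(90, 3, 3))

# Hypothetical fraction of the river basin lying within each grid cell.
weights = np.array([[0.0, 0.1, 0.0],
                    [0.2, 0.5, 0.1],
                    [0.0, 0.1, 0.0]])

daily_precip = [basin_precip(day, weights) for day in coarse]
daily_runoff = bucket_runoff(daily_precip)
print(f"days with simulated runoff: {sum(r > 0 for r in daily_runoff)}")
```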

Reiner Schnur is at the Max Planck Institute for Meteorology, Bundesstrasse 55, 20146 Hamburg, Germany.
e-mail: [email protected]

1. Houghton, J. T. et al. (eds) Climate Change 2001: The Scientific Basis (Cambridge Univ. Press, 2001).

2. Meehl, G. A. et al. Bull. Am. Meteorol. Soc. 81, 413–416 (2000).

3. Easterling, D. R. et al. Science 289, 2068–2074 (2000).

4. McCarthy, J. J. et al. (eds) Climate Change 2001: Impacts, Adaptation and Vulnerability (Cambridge Univ. Press, 2001).

5. Palmer, T. N. & Räisänen, J. Nature 415, 512–514 (2002).

6. Milly, P. C. D., Wetherald, R. T., Dunne, K. A. & Delworth, T. L. Nature 415, 514–517 (2002).

7. Palmer, T. N. Rep. Prog. Phys. 63, 71–117 (2000).

8. Kharin, V. V. & Zwiers, F. W. J. Clim. 13, 3760–3788 (2000).


Figure 2 The pitfalls of forecasting extreme climate events. Palmer and Räisänen5 show that it is better to consider the frequency distribution of such events among a number of different climate simulations than to use only a single simulation or to take an ensemble (consensus) average of several models. For example, the probability of summertime rainfall at a given location exceeding a certain threshold (solid horizontal line) will be unrealistically low in an ensemble average (green line) of several simulations (thin lines) because of the smoothing effect of averaging. In the example shown, this probability would be zero for the ensemble average, but positive in each one of the three simulations. Using only a single simulation for forecasting the probability of extreme rainfall would give answers of between 5% and 15% in this example (corresponding to one and three flooding events over 20 years), depending on the model used.

[Figure 2 schematic: three simulated rainfall time series over 20 years, their ensemble mean, a threshold line, and asterisks marking extreme-rainfall events.]
