The task of calculating the possible effect of increasing CO2 on the temperature of the Earth/atmosphere system is an awesome one. There are simple models which give plausible answers, and much more complex ones, called general circulation models (GCMs), which give a multiplicity of answers even though they use the same physics. This section begins with some of the IPCC model results.

These are some results from 19 GCMs. Each was given the problem of showing what would happen to global temperature and to global rainfall if the concentration of CO2 were to increase by 1% per year, so that the concentration doubles in about 70 years. The temperature graph (a) shows a large range of possible values for the predicted changes and, on average, the models predict a temperature increase of about 1.8°C for the doubling of CO2. The individual results range from 0.8°C to 3.0°C, so the result is effectively 1.8°C ± 1°C, which does not inspire much confidence in the procedures.
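As a quick check on the doubling time quoted above, compound growth of 1% per year doubles a concentration after ln 2 / ln 1.01 ≈ 69.7 years. The short Python sketch below is just this arithmetic; it is not taken from any of the models.

```python
import math

# Doubling time for a quantity growing at 1% per year (compound growth):
# (1 + r)^t = 2  =>  t = ln(2) / ln(1 + r)
r = 0.01
t_double = math.log(2) / math.log(1 + r)
print(f"Doubling time at {r:.0%} per year: {t_double:.1f} years")  # about 69.7 years
```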

Graph (b), showing the predictions of rainfall, is even less inspiring. Will it get wetter or drier if the concentration of CO2 increases? We just don't know.

There are serious difficulties with large, complex models, and this quote from Goody & Yung's book, Atmospheric Radiation, sums up the situation:

Line-by-line calculations are often adopted as a standard against which to test certain approximations. Their value in a relative context is indisputable, but that should not be taken to mean that line-by-line calculations are necessarily of high absolute accuracy. This comment is relevant to an implicit assumption in much of the current literature: that more and more detailed physics encoded onto larger and larger computers will eventually yield accurate weather and climate predictions. This is more an article of faith than a demonstrable proposition. It is also possible to argue that numerical complexity hides or introduces its own sources of error, in addition to making it impossible to penetrate the algorithms of another investigator.

Another apt comment on modelling difficulties comes from William Kininmonth:

High powered computers allow us to carry out more complex modelling but the veracity of models relies on the specification of the individual interactions between the variables.

Weather forecasting models rely on initial specification of mass and momentum fields and, largely, the ability to conserve momentum (conservation of mass, although not of critical importance, is generally a basic specification). Within a few days of simulation the mass and momentum fields have diverged irreconcilably from the true evolution of the atmosphere as errors and computational uncertainty expand and propagate.

Climate forecasting is quite different from weather forecasting. It relies on an ability to conserve energy and to accurately reflect the transformation of energy within the climate system (solar radiation, sensible heat, latent energy, potential energy, terrestrial radiation, etc). Climate forecasting is a much more difficult task because of the exchange of energy and momentum between mediums, especially the gaseous and liquid fluids of the atmosphere and oceans. 

Although computing power might allow us to do modelling in powerful new ways the integrity of the models is only as good as the specifications of the interactions between the components. Climate models are severely limited in this respect, despite the access to powerful computers. 
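Kininmonth's point about errors expanding until the forecast diverges irreconcilably from the real atmosphere can be illustrated with a toy system. The sketch below integrates the Lorenz (1963) equations, a drastically simplified convection model and not one of the GCMs discussed here, from two initial states differing by one part in a million; within a few tens of model time units the two trajectories bear no resemblance to each other.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz (1963) system."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

# Two initial conditions differing by one part per million in x.
a = np.array([1.0, 1.0, 20.0])
b = a + np.array([1e-6, 0.0, 0.0])

for step in range(1, 3001):            # 30 model time units
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 500 == 0:
        print(f"t = {step * 0.01:5.1f}  separation = {np.linalg.norm(a - b):.3e}")
```

The growing separation printed at each stage is the divergence Kininmonth describes, here stripped of all the additional parameterization problems that a full weather or climate model faces.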

And from Grant Petty:

Even now [2004], however, a fully comprehensive treatment of radiation and other physical processes remains too complex a problem for the most powerful computers to tackle for the entire atmosphere at once. General circulation models therefore continue to rely on grossly simplified representations of these processes, with the attendant risk of error in the model's predictions. Finding ways to improve the accuracy of radiation and other physical parameterizations within the limits of available computing power is a major focus of current research in atmospheric science.

Grant W. Petty, A First Course in Atmospheric Radiation, Sundog Publishing, Wisconsin, 2004.

MORE PREDICTIONS COMPARED WITH OBSERVATIONS

Here is a diagram from the latest IPCC report [AR4, 2007] that shows some predictions of future climate based upon 'scenarios' that are described below. The terrestrial temperature record has been extended to 2010 [black dots] and the temperature anomalies from satellite measurements [MSU, microwave sounding units, processed by the University of Alabama in Huntsville] are also drawn onto the diagram [crosses].

Figure TS.26 from the IPCC's Fourth Assessment Report (AR4, 2007). Model projections of global mean warming compared to observed warming. Observed temperature anomalies are shown as annual (black dots) and decadal average values (black line). Projected trends and their ranges from the IPCC First (FAR) and Second (SAR) Assessment Reports are shown as green and magenta solid lines and shaded areas, and the projected range from the TAR is shown by vertical blue bars. These projections were adjusted to start at the observed decadal average value in 1990. Multi-model mean projections from this report for the SRES B1, A1B and A2 scenarios are shown for the period 2000 to 2025 as blue, green and red curves with uncertainty ranges indicated against the right-hand axis. The orange curve shows model projections of warming if greenhouse gas and aerosol concentrations were held constant from the year 2000 - that is, the committed warming.
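For readers who want to reproduce the style of comparison described in the caption, the sketch below shows how annual anomalies can be reduced to decadal averages and how a projection can be offset to start at an observed decadal value around 1990. The numbers used here are placeholders, not the observational, MSU or model series plotted in the figure.

```python
import numpy as np

# Hypothetical annual global-mean temperature anomalies (degrees C) for 1980-2009.
# Placeholder values only - substitute a real observational series.
rng = np.random.default_rng(0)
years = np.arange(1980, 2010)
annual_anomaly = 0.015 * (years - 1980) + rng.normal(0.0, 0.1, years.size)

# Decadal averages (cf. the black line in the figure): mean over each 10-year block.
decade_starts = years[::10]                       # 1980, 1990, 2000
decadal_mean = annual_anomaly.reshape(-1, 10).mean(axis=1)

# Offset a projection so that it starts at the observed decadal average value
# around 1990, as the caption says was done for the FAR/SAR/TAR projections.
proj_years = np.arange(1990, 2026)
raw_projection = 0.02 * (proj_years - 1990)       # placeholder linear trend
adjusted_projection = raw_projection + decadal_mean[decade_starts == 1990][0]

print(dict(zip(decade_starts.tolist(), decadal_mean.round(2).tolist())))
print(adjusted_projection[:5].round(2))
```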

Since 2005 the wheels seem to have come off the IPCC computer: the Earth has cooled significantly, although this year [2009] there has been a temperature increase. The observations fall well below the values predicted by the models.

Here are some of the SRES scenarios referred to in the figure:

A1. The A1 storyline and scenario family describes a future world of very rapid economic growth, global population that peaks in mid-century and declines thereafter, and the rapid introduction of new and more efficient technologies. Major underlying themes are convergence among regions, capacity building and increased cultural and social interactions, with a substantial reduction in regional differences in per capita income. The A1 scenario family develops into three groups that describe alternative directions of technological change in the energy system. The three A1 groups are distinguished by their technological emphasis: fossil intensive (A1FI), non-fossil energy sources (A1T), or a balance across all sources (A1B) (where balanced is defined as not relying too heavily on one particular energy source, on the assumption that similar improvement rates apply to all energy supply and end-use technologies).

A2. The A2 storyline and scenario family describes a very heterogeneous world. The underlying theme is self-reliance and preservation of local identities. Fertility patterns across regions converge very slowly, which results in continuously increasing population. Economic development is primarily regionally oriented, and per capita economic growth and technological change are more fragmented and slower than in other storylines.

B1. The B1 storyline and scenario family describes a convergent world with the same global population, that peaks in mid-century and declines thereafter, as in the A1 storyline, but with rapid change in economic structures toward a service and information economy, with reductions in material intensity and the introduction of clean and resource-efficient technologies. The emphasis is on global solutions to economic, social and environmental sustainability, including improved equity, but without additional climate initiatives.