Flaws in Climate Models
I have spoken with several senior figures from the world of finance, people with considerable familiarity with the very complex models that play a major role in the activities of financiers. They all told me that it is extremely unwise to put too much trust in complex models, citing the collapse of Long-Term Capital Management, which invested in accordance with a complex model devised by two Nobel prize winners. All the predictions of catastrophic effects from the increase in human emissions of CO2 stem from climate models. For supporters of cuts to human emissions, these seem to have the same infallibility as if their results had been brought down from Mt Sinai, engraved in stone by the finger of God! But even supporters of climate models are aware that this is not the case. In an article in the New Scientist of 27 January 2011, “Casting a critical eye on climate models”, Anil Ananthaswamy stated: “Our knowledge about the Earth is not perfect, so our models cannot be perfect. And even if we had perfect models, we wouldn’t have the computing resources needed to run the staggeringly complex simulations that would be accurate to the tiniest details.” More fundamentally, the climate models fail miserably to meet many of the 127 principles of good forecasting put forward by the Institute for Forecasting (available at http://www.forecastingprinciples.com/). A 2007 paper found that GCM climate models violated 72 of these 127 agreed principles (Green, K. & Armstrong, J. S., 2007, “Global warming: Forecasts by scientists versus scientific forecasting”, Energy and Environment, 18: 997-1022; see also http://www.forecastingprinciples.com/images/stories/pdf/ags2011congress.pdf).
Ananthaswamy also stated: “There are important phenomena missing from the IPCC’s most recent report. Consider a region that starts warming. This causes the vegetation to die out, leading to desertification and an increase in dust in the atmosphere. Wind transports the dust and deposits it over the ocean, where it acts as fertiliser for plankton. The plankton grow, taking up CO2 from the atmosphere and also emitting dimethyl sulphide, an aerosol that helps form brighter and more reflective clouds, which help cool the atmosphere. This process involves carbon flow, aerosols, temperature changes, and so on, but all in specific ways not accounted for by each factor alone.”
Climate models are deterministic: that is, every factor known to influence climate significantly must be included if the model is to predict a future climate state. But some climate processes are too complex or too small in scale to be properly represented in the models, or scientists know too little about the processes in question. These include atmospheric convection; land surface processes such as reflectivity and hydrology; and cloud cover and its microphysics. Modellers parametrise these factors: that is, they make guesses about their values. Depending on the values selected, they could make the model show no warming at all, or increases in warming so large that they would strain credulity. Unsurprisingly, the modellers have chosen values which produce a mild warming, in order to make their scenarios seem more credible. But these values are largely arbitrary. One suspects that the notorious “tweaking” of the climate models, done so that they could reproduce the current climate, was achieved by manipulating these arbitrary parameters. But of course one cannot be sure, because the modellers refuse to specify how they achieved their results.
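The dependence of model output on a single tunable value can be illustrated with a toy calculation. The sketch below is not real GCM code; it uses the textbook zero-dimensional relation that equilibrium warming equals radiative forcing divided by the climate feedback parameter, and sweeps that parameter across a range of plausible guesses to show how widely the “projected” warming swings.

```python
# Toy zero-dimensional energy-balance relation: equilibrium warming for a
# doubling of CO2 is dT = F / lambda, where F is the radiative forcing
# (about 3.7 W/m^2 for doubled CO2) and lambda is the climate feedback
# parameter -- a quantity that, like the parametrisations discussed above,
# must be chosen rather than derived from first principles.

F_2XCO2 = 3.7  # radiative forcing for doubled CO2, W/m^2

def equilibrium_warming(feedback_param):
    """Equilibrium temperature change (K) for a doubling of CO2."""
    return F_2XCO2 / feedback_param

# Sweeping the feedback parameter across a plausible range shows how
# strongly the result depends on this single tunable value.
for lam in (0.5, 1.0, 2.0):  # W/m^2 per K
    print(f"lambda = {lam:.1f}  ->  warming = {equilibrium_warming(lam):.2f} K")
```

A factor-of-four spread in the chosen parameter yields a factor-of-four spread in the warming, from about 1.85 K to 7.4 K in this toy case.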
In fact the problem is much worse than that. Gabriele Gramelsberger, a philosopher of science who has specialised in the study of simulation (see her paper “Embodied Information-Lifelike Algorithms and Cellular Machines”, available at http://www.rethinkclimate.org/debat/rethink-nature/?show=fww), states that today’s climate models use grid boxes averaging 60 to 100 kilometres in the horizontal and several hundred metres in the vertical, depending on the number of levels used by the model and the maximum altitude. The problem is that no process taking place on a scale smaller than the grid resolution can be represented directly in the model. Subscale processes therefore have to be included explicitly as parametrisations. Each parametrisation computes results that are then added to the dynamically computed results. This is one of the reasons why climate models keep growing: the role of parametrisation is expanding constantly. Typical parametrisations include processes in clouds, which are too small to be resolved at the global grid scale. Clouds are important agents in weather and climate processes; therefore every climate model includes a cloud parametrisation scheme.
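The mechanics described above, a grid of boxes whose dynamically computed tendency has the output of each sub-grid parametrisation added to it at every time step, can be sketched in a few lines. This is purely illustrative, not code from any actual climate model; the two “schemes” and their numerical values are invented for the example.

```python
# Illustrative sketch (not real GCM code): parametrised tendencies are
# simply added to the dynamically computed tendency in each grid box.

EARTH_SURFACE_KM2 = 5.1e8  # total surface area of the Earth, km^2

def n_grid_boxes(resolution_km):
    """Rough count of horizontal grid boxes at a given resolution."""
    return EARTH_SURFACE_KM2 / resolution_km**2

def step_temperature(temp, dt, dynamical_tendency, parametrisations):
    """One toy time step: dynamics plus the summed sub-grid parametrisations."""
    total = dynamical_tendency + sum(p(temp) for p in parametrisations)
    return temp + dt * total

# Hypothetical sub-grid schemes, each returning a temperature tendency (K/step):
cloud_scheme = lambda t: -0.02      # reflective clouds cool the box
convection_scheme = lambda t: 0.01  # convective heat transport warms it

print(f"Boxes at 100 km resolution: {n_grid_boxes(100):.0f}")
new_t = step_temperature(288.0, 1.0, 0.05, [cloud_scheme, convection_scheme])
print(f"Updated box temperature: {new_t:.2f} K")
```

Even at 100 km resolution the globe needs only about 51,000 horizontal boxes, so everything smaller than a box, including individual clouds, exists in the model only through such added terms.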
Current climate models are AOGCMs (coupled atmosphere-ocean general circulation models), designed to simulate complete earth systems. The dynamics of the ocean follows the same basic equations that drive the atmosphere, and its subscale processes have to be parametrised as well. This creates a major problem. Climate models are weakened not only by their inability to model some larger-scale processes, about which simply not enough is known; when it comes to parametrising smaller processes, there is often no relevant information available at that level of detail for a large part of the earth’s surface. The information may be available for most of Europe (6.8% of the earth’s land area), but it is almost certainly missing for much of Asia (29%), South America (12.0%), Central America (1.54%), northern and Arctic North America, most of Africa (24.4%), and much of Australia (5.9%). Thus for something like ninety per cent of the earth’s land territory, modellers are forced to guess much of the information in these parametrisations. This can hardly be called true “science”.
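The “something like ninety per cent” figure follows directly from the land-area shares just quoted: if detailed surface data exist mainly for Europe, the remainder of the land surface is data-poor. A quick check of the arithmetic, using only the percentages given in the text:

```python
# Shares of the Earth's land surface, as given in the text (per cent):
land_share = {
    "Europe": 6.8,
    "Asia": 29.0,
    "South America": 12.0,
    "Central America": 1.54,
    "Africa": 24.4,
    "Australia": 5.9,
}

# If detailed surface data exist mainly for Europe, the share of land
# without such data is roughly everything else:
without_data = 100.0 - land_share["Europe"]
print(f"Land area lacking detailed data: ~{without_data:.1f}%")  # ~93.2%
```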
Kevin Trenberth, IPCC senior scientist and lead author, has pointed out that it is incorrect to say that the models offer predictions, as they are too simplified for that. What they offer are “scenarios”. In his words: “In fact, since the last report it is also often stated that the science is settled or done and now is the time for action. In fact there are no predictions by the IPCC at all. And there never have been. The IPCC instead proffers ‘what if’ projections of future climate that correspond to certain emissions scenarios. There are a number of assumptions that go into these emissions scenarios. They are intended to cover a range of possible self consistent ‘story lines’ that then provide decision makers with information about which paths might be more desirable. But they do not consider many things like the recovery of the ozone layer, for instance, or observed trends in forcing agents. There is no estimate, even probabilistically, as to the likelihood of any emissions scenario and no best guess.”
Trenberth continues: “Even if there were, the projections are based on model results that provide differences of the future climate relative to that today. None of the models used by IPCC are initialized to the observed state and none of the climate states in the models correspond even remotely to the current observed climate. In particular, the state of the oceans, sea ice, and soil moisture has no relationship to the observed state at any recent time in any of the IPCC models. There is neither an El Niño sequence nor any Pacific Decadal Oscillation that replicates the recent past; yet these are critical modes of variability that affect Pacific rim countries and beyond. The Atlantic Multidecadal Oscillation, that may depend on the thermohaline circulation and thus ocean currents in the Atlantic, is not set up to match today’s state, but it is a critical component of the Atlantic hurricanes and it undoubtedly affects forecasts for the next decade from Brazil to Europe. Moreover, the starting climate state in several of the models may depart significantly from the real climate owing to model errors. I postulate that regional climate change is impossible to deal with properly unless the models are initialized.”