The standard line of argument used to support climate change (global warming) mitigation measures starts with a number of claims:
In principle, all of these claims are true. A conclusion is then drawn from these established facts, that:
When written or said like that, it seems totally convincing. In point of fact, logic of this kind has convinced millions to accept the predictions of Al Gore, the IPCC, Michael Mann and others. The conclusion does not necessarily follow, though. It might, but it need not. To understand why, we need to remember that science and engineering are both based principally on a process of making measurements, not simply on comparing qualities. Thus, to either validate or refute this conclusion we need to look more carefully at what carbon dioxide actually does to climate, and how large that effect is. Then, and only then, are we doing proper science.
The relationship between the amount of carbon dioxide in the atmosphere (normally measured in parts per million) and the resulting greenhouse effect is well established and understood. The equation describing it was laid down by Svante Arrhenius in the 19th century, and has never been seriously challenged. It is a logarithmic relationship. To get the change in retained energy due to any increase in carbon dioxide, we divide the new concentration by the old, take the natural logarithm of that ratio, and multiply the result by a constant:

F = alpha * ln(C_new / C_old)
The unknowns here are the sensitivity constant alpha, and the relationship between forcing F (the energy imbalance, in Watts per square metre) and actual temperature rise. The IPCC used to reckon on a value of seven for sensitivity, but have recently reduced their estimate to nearer five. If we take it as 5.5 and assume that human activities have increased the carbon dioxide from 270ppm to 400ppm over the last century, then we get a figure for forcing of 2.16 Watts per square metre. If we then take 0.8 degrees Celsius per Watt per square metre as the relationship between forcing and temperature difference (which seems to be the generally accepted value), that gives us a theoretical rise of 1.72 degrees Celsius.
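As a sanity check, the arithmetic above can be sketched in a few lines of Python. Note that the values of alpha and the 0.8 conversion factor are this article's working assumptions, not settled constants:

```python
import math

ALPHA = 5.5   # assumed sensitivity constant (W/m^2 per natural-log unit of CO2 ratio)
LAMBDA = 0.8  # assumed conversion factor: degrees C of warming per W/m^2 of forcing

def forcing(c_new_ppm, c_old_ppm, alpha=ALPHA):
    """Arrhenius-style radiative forcing for a change in CO2 concentration."""
    return alpha * math.log(c_new_ppm / c_old_ppm)

def temperature_rise(c_new_ppm, c_old_ppm, alpha=ALPHA, lam=LAMBDA):
    """Warming implied by that forcing, using the linear factor lam."""
    return lam * forcing(c_new_ppm, c_old_ppm, alpha)

f = forcing(400, 270)            # roughly 2.16 W/m^2
dt = temperature_rise(400, 270)  # roughly 1.7 C
print(f"forcing = {f:.2f} W/m^2, rise = {dt:.2f} C")
```

Running this reproduces the figures quoted above, give or take rounding in the last decimal place.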
OK, so that is a temperature increase, but most people would not, I think, regard it as sufficient to cause catastrophic effects.
The important thing to realise is that all logarithmic effects are a 'law of diminishing returns': if we were to add another 130ppm of carbon dioxide by doubling our fossil fuel use, that would not create another 1.7C rise. In fact, the additional warming would be about 1.2C. Not all that much of an increase, considering just how much extra fossil fuel we'd have to burn to achieve it.
It's worth taking this relation a bit further and seeing where it leads. Let's suppose the whole world went SUV-mad, and we quadrupled our rate of fossil fuel usage as compared to today. What would that do to temperatures? Well, we take the natural log of the ratio 790ppm/270ppm, multiply the result by 5.5, then by 0.8, then subtract the 1.72C we already have. That gives us a planet 4.7C hotter than in the pre-industrial era, and almost exactly three degrees hotter than the present day. Now, that is starting to sound like it might be a concern, but hey, we'd have to go stark crazy with those fossil fuels to reach that kind of figure.
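Both scenarios, the extra 130ppm and the quadrupled fuel use, fall out of the same logarithmic formula. A short Python sketch makes the diminishing returns visible (again taking the article's assumed values of 5.5 for sensitivity, 0.8 for the forcing-to-temperature factor, and 270ppm as the pre-industrial baseline):

```python
import math

ALPHA = 5.5           # assumed sensitivity constant
LAMBDA = 0.8          # assumed degrees C per W/m^2 of forcing
BASELINE_PPM = 270.0  # pre-industrial CO2 concentration used in the text

def rise_since_preindustrial(c_ppm):
    """Warming relative to the 270ppm baseline, per the logarithmic relation."""
    return LAMBDA * ALPHA * math.log(c_ppm / BASELINE_PPM)

today = rise_since_preindustrial(400)               # roughly 1.7 C
extra_130 = rise_since_preindustrial(530) - today   # roughly 1.2 C more, not another 1.7
quadrupled = rise_since_preindustrial(790)          # roughly 4.7 C above pre-industrial
print(f"today: {today:.2f} C above pre-industrial")
print(f"adding 130ppm adds only: {extra_130:.2f} C")
print(f"at 790ppm: {quadrupled:.2f} C ({quadrupled - today:.2f} C above today)")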
The above is an oversimplification, since some of the carbon dioxide we add to the air will dissolve in the oceans and thus be taken out of the cycle; exactly how much is hard to quantify. Extra CO2 will also strongly promote plant growth, and the consequent regrowth of vegetation may absorb still more CO2. Even so, it is probably a good enough ballpark figure.
The point here is, even the IPCC recognise the relatively minor and harmless nature of such a temperature rise. The more alarming scenarios which they raise assume that some kind of 'positive feedback' mechanism will be invoked once a specific temperature threshold is reached, and that this positive feedback will multiply the temperature rise several times over, to perhaps a 6C-10C rise. Proposed positive feedback mechanisms include:
This is the point at which we step out from under the auspices of classical, reliable science, and into the realm of guesswork. None of these positive feedbacks can be shown to exist, or, if they do exist, to have the claimed effect. The conclusions of the IPCC are based on computer modelling which includes such positive feedbacks.
Analysis of these proposed feedbacks is extremely complex; so complex, in fact, that there is no classical science available to predict their effects in any detail. That situation, I have to say, strongly favours the alarmists among us by providing a rich source of unprovable (but also undismissable) arguments for scenarios of doom. What I will say is that an application of common sense will dismiss most of them.
The known temperature rise due to human activities is less than 2C, and in reality more like 1C. Meanwhile, the diurnal and seasonal temperature changes in most parts of the world are at least ten times greater. Here in the UK we sometimes experience -20C on a winter night, and 30C on the odd summer day. That is a 50C range, even in a temperate zone. So, the enhanced greenhouse effect is at most one twenty-fifth of the natural range of temperature variation. That raises the question: if a 2C increase could cause a catastrophe, why has a 50C change never done so? To claim that (say) 31C is needed instead of just 30C would seem to be facile; after all, it would be an extremely unlikely coincidence that the tipping-point for climate catastrophe just happens to be one degree warmer than that scorcher of a day (phew) we had last year.
The counterargument might of course be that aggregate temperature, not local temperature, is what counts. Such an argument might apply to Arctic ice mass, I guess. Where it does not make any sense is with 'trigger-point' phenomena such as clathrate release. If the region with the clathrate deposits reaches the critical temperature on a particularly warm day, then gas is released regardless of whether that high temperature is due to human activities or to local weather. Once released, the gas cannot be released again, so that region has been rendered innocuous and cannot contribute to any future feedbacks; a bit like Wigner energy release in graphite-moderated reactors. If the temperature swings of regional weather went high enough to trigger gas release on some days, then over time all such deposits would be rendered innocuous. The fact that this does not seem to happen in the Arctic suggests that the trigger point for gas release is well above the range of regional or seasonal variation, which in turn is much larger than the 1-2C effect of human CO2 release. If that is indeed the case, then a trigger effect of that kind is not all that likely, at least not within any feasible CO2-induced temperature increase.
Wikipedia, Arrhenius' equation.
IPCC website - Climate data section.