The Models – Uncertainty? In her presentation at the Royal Society-sponsored Workshop on Handling Uncertainty in Weather and Climate Prediction, Judith Curry presented a few slides that must be unsettling to climate modelers as well as to the IPCC – she dared to question the purpose of expanding the climate models as matters now stand. Among her major points are:
· Increasing the complexity of the models does not necessarily yield greater scientific certainty.
· There are too many degrees of freedom in the models, resulting in great uncertainty and large areas of ignorance. [Degrees of freedom can be roughly defined as independent pieces of information that are allowed to vary within the model.]
· Highly unlikely scenarios should not dominate political decision making.
· Improving the models for societal needs rests on three dubious assumptions: that the models are fit for the purpose, useful, and the best choice for it.
In her summary, Curry raises a fundamental issue: that with the high costs of model production runs, “climate models are becoming less fit for the purpose of increasing our understanding of the climate system.”
TWTW would add that the resources would be better spent on understanding the natural drivers of climate change than on chasing carbon-dioxide-caused climate change. According to a summary of government estimates, the US has spent over $35 billion on climate science and over $150 billion on global warming / climate change. Although the US has some excellent instruments onboard satellites, the bulk of the monies has been misallocated.
Please see links under Seeking a Common Ground and, on the new US supercomputer for climate change, under Defending the Orthodoxy.
*******************
The Data v. A Statistic: TWTW is not enamored of reporting global air surface temperatures with a single statistic, such as a global average. Causes for concern include issues with air-surface instruments, the likelihood that minimum temperatures have been increasing due to slight changes in the surrounding area, and the frequent manipulation of the data by reporting agencies without clear justification. In addition, a single statistic obscures the geographic composition of the warming / cooling. As the satellite data show, the warming is concentrated in the northern part of the Northern Hemisphere. Please see: http://nsstc.uah.edu/climate/2011/November/trend_Dec78_Nov11_alt.png
Even with the satellite data there is a tendency to develop a trend using regression analysis or similar tools. An examination of the historic satellite data itself shows two long periods with no warming trend separated by a discontinuity. The exact dates of the discontinuity depend upon the views of the researcher, but they are around the time of the great El Niño of 1998 or shortly thereafter. This discontinuity is a jump in the average temperature anomaly from -0.1 deg C to +0.1 deg C. The cause of this discontinuity should be a subject of great interest, but it is lost on the Climate Establishment. Please see: http://nsstc.uah.edu/climate/.
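The paragraph above contrasts two descriptions of the satellite record: a single regression trend versus two flat segments separated by a step near the 1998 El Niño. The following minimal Python sketch shows how each description would be fit. It uses synthetic monthly anomalies, not the actual UAH series; the step date, levels, and noise are assumed for illustration only.

```python
# Sketch: fit a single linear trend vs. a two-level "step" model to a
# synthetic anomaly series shaped like the pattern described above:
# roughly -0.1 C before the 1998 El Nino, roughly +0.1 C after.
# (Illustrative only; values and dates are assumed, not the actual UAH data.)
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1979, 2012, 1 / 12.0)          # monthly time axis, 1979-2011
step = np.where(years < 1998.5, -0.1, 0.1)        # assumed step near the 1998 El Nino
anom = step + rng.normal(0.0, 0.15, years.size)   # add monthly noise

# Description 1: one straight-line trend over the whole record (least squares)
slope, intercept = np.polyfit(years, anom, 1)
resid_trend = anom - (slope * years + intercept)

# Description 2: two flat segments separated by a discontinuity at 1998.5
before, after = anom[years < 1998.5], anom[years >= 1998.5]
resid_step = np.concatenate([before - before.mean(), after - after.mean()])

print(f"linear trend: {slope * 10:+.3f} C/decade, residual SD {resid_trend.std():.3f}")
print(f"step model:  means {before.mean():+.3f} -> {after.mean():+.3f} C, "
      f"residual SD {resid_step.std():.3f}")
```

On this synthetic series the two-level fit leaves somewhat smaller residuals than the straight-line trend; whether the same holds for the real record is the question the paragraph raises.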
*******************
The Quote of the Week: In the oral arguments in the litigation over the EPA finding that greenhouse gases endanger human health and welfare, cited in last week’s TWTW and in the TWTW of March 3, 2012, Angeline Purdy was introduced as the technical expert on the scientific evidence that there is 90 to 99% certainty in the findings of the IPCC and its models. Clearly, there is some difference of opinion between Ms. Purdy and researchers such as Phil Jones of CRU as well as Judith Curry. Four-time IPCC expert reviewer Vincent Gray of New Zealand would take great exception to Ms. Purdy’s comments.
Validation is a rigorous process during which unknowns are greatly reduced or eliminated. Even basic assumptions in the IPCC climate models, such as the assumption that warming caused by carbon dioxide will be amplified by an increase in water vapor over the tropics, have not been validated.
The IPCC grossly overstated the certainty of its science and its models and understated natural variability. That gross overstatement is now part of the US legal system: the EPA used the overstated certainty in its endangerment finding, which the US Federal Court of Appeals accepted.
The entire episode reveals that the public has no protection in the Federal courts against overzealous government agencies that claim scientific support for their regulations. These miscarriages of justice must be remedied either by permitting challenges to government pronouncements on science or by establishing special scientific courts in which the jurists are well versed in the principles of science, the scientific method, scientific knowledge, and uncertainty.