GWPF: Computer predictions of climate alarm are flawed

Posted: February 21, 2017 by oldbrew in alarmism, climate, Critique, Forecasting, predictions, Uncertainty

Image credit: relativelyinteresting.com

Results so far from climate models are very unconvincing, despite huge resources of manpower and technology.

London, 21 February: Claims that the planet is threatened by man-made global warming rest on science built on inadequate computer modelling. That is the conclusion of a new briefing paper published today by the Global Warming Policy Foundation (GWPF).

The report’s author, eminent American climatologist Professor Judith Curry, explains that climate alarm depends on highly complex computer simulations of the Earth’s climate. 

But although scientists have expended decades of effort developing them, these simulations still have to be “tuned” to get them to match the real climate. This makes them essentially useless for trying to find out what is causing changes in the climate and unreliable for making predictions about what will happen in the future. 

Professor Curry said: “It’s not just the fact that climate simulations are tuned that is problematic. It may well be that it is impossible to make long-term predictions about the climate – it’s a chaotic system after all. If that’s the case, then we are probably trying to redesign the global economy for nothing.” 

Prof Curry recently announced that she was abandoning academic life due to the attacks on her research and the “craziness” of the climate debate. 

Full paper – ‘Climate models for the layman’ (pdf)

Source: Press Release: Computer Predictions Of Climate Alarm Are Flawed — GWPF

Comments
  1. oldbrew says:

    The attribution problem:

    ‘…current GCMs are not fit for the purpose of attributing the causes of 20th century warming or for
    predicting global or regional climate change on timescales of decades to centuries, with any high level of confidence.’
    – Prof. Curry

    This is where the public gets taken for an enormous ride, by some politicians and their supporters pretending there is a high level of confidence – ‘the science is settled’ and suchlike nonsense.

  2. rishrac says:

    As reported in the Denver Post by Mead Gruver, the National Center for Atmospheric Research has a new supercomputer, said to be the 20th fastest, for climate change research. It doesn’t matter how fast the computer is if it runs flawed programs: they are still going to get the same results. And they still are not going to be able to tell us any more about whether Australia or California will get rain, when, how much, or how long dry spells will last. Or when a drought in the American Midwest will begin, or what to plant when.
    More arm-flailing about an overheated planet. They have a huge credibility problem. Whose temperature data are they going to be using?

  3. catweazle666 says:

    “In sum, a strategy must recognise what is possible. In climate research and modelling, we should recognise that we are dealing with a coupled non-linear chaotic system, and therefore that the long-term prediction of future climate states is not possible.”

    IPCC Working Group I: The Scientific Basis, Third Assessment Report (TAR), Chapter 14 (final para., 14.2.2.2), p774.

    IOW, all a faster computer does is get the wrong answer quicker.

    Ironically, it was climate scientist Ed Lorenz who first pointed this out.
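The “coupled non-linear chaotic system” point in the TAR quote can be illustrated with Lorenz’s own 1963 equations. The sketch below is purely illustrative and not from the GWPF paper: plain Python, the classic parameters sigma=10, rho=28, beta=8/3, and a simple fixed-step RK4 integrator. It perturbs one initial condition by one part in a billion and measures how far the two trajectories drift apart:

```python
def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz (1963) system."""
    x, y, z = state
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def rk4_step(state, dt):
    """One fixed-step fourth-order Runge-Kutta step."""
    def nudge(s, k, h):
        return tuple(si + h * ki for si, ki in zip(s, k))
    k1 = lorenz(state)
    k2 = lorenz(nudge(state, k1, dt / 2))
    k3 = lorenz(nudge(state, k2, dt / 2))
    k4 = lorenz(nudge(state, k3, dt))
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

def separation(n_steps, dt=0.01, eps=1e-9):
    """Distance between two runs whose starting x differs by eps."""
    a = (1.0, 1.0, 1.0)
    b = (1.0 + eps, 1.0, 1.0)   # perturbed by one part in a billion
    for _ in range(n_steps):
        a, b = rk4_step(a, dt), rk4_step(b, dt)
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5

# Short horizon: the billionth-part perturbation has barely grown.
print(separation(100))
# Long horizon: the two runs sit on entirely different parts of the attractor.
print(separation(3000))
```

After a few thousand steps the two runs are completely decorrelated, which is why long-range simulations rely on ensembles of perturbed initial conditions rather than a single deterministic run.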

  4. AlecM says:

    IPCC pseudoscience is based on science fraud from 1976. However, US teaching is based on Goody (1964) and later Goody and Yung; Yung was involved with the 1976 fraud. So anyone who states that Goody misinterpreted Planck’s radiative physics gets shouted down, because not only is Planck treated as perfect, but Bose and Einstein created from his work what is now called Quantum Electrodynamics, so they are treated as super-perfect too.

    However, just like Planck, the latter assumed a vacuum: the atmosphere is not a vacuum and GHG IR physics has some little quirks which make atmospheric IR emission occur at the surface. Put in Maxwell’s Equations and that surface vanishes for all self-absorbed bands, partially so for non self-absorbed bands. For the atmospheric window and no clouds the surface radiates directly to Space.

    Bottom Line: the models assume 40% more radiant energy than reality, offset by a spurious Kirchhoff’s Law argument devised by the late husband of the MO’s recent Chief Scientist. It depends on incorrect cloud aerosol optical physics created by Sagan and Pollack, and Hansen in the 1960s.

    That bad optical physics creates positive feedback, an artefact of the modelling.

  5. Graeme No.3 says:

    I am always intrigued by the claim that because the models closely match the temperature of the last X years, their projections for AD 2050 or AD 2100 must be taken seriously.
    I have a graph of the last 40 years of the stock market that plots every variation with 100% accuracy. It is quite useless for predicting where the market will be in six months, let alone over the next 50 years.
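The stock-market analogy can be made concrete. In this toy sketch (hypothetical numbers, pure Python — not anything from the paper), a polynomial “tuned” to pass through every historical data point exactly, via Lagrange interpolation, reproduces the past perfectly and is still worthless a few steps beyond it:

```python
def lagrange(xs, ys, x):
    """Evaluate the unique polynomial through the points (xs, ys) at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# Ten years of a bounded, gently varying made-up "index" (stays near 100).
history_x = list(range(10))
history_y = [100, 102, 101, 103, 105, 104, 106, 105, 107, 106]

# In-sample: the tuned curve reproduces every historical point.
fit_err = max(abs(lagrange(history_x, history_y, x) - y)
              for x, y in zip(history_x, history_y))

# Out-of-sample: five steps past the data, the same curve is absurd.
forecast = lagrange(history_x, history_y, 15)

print(fit_err)    # essentially zero: a perfect hindcast
print(forecast)   # huge magnitude: a worthless forecast
```

Matching history exactly is no evidence of out-of-sample skill; a heavily tuned model has to be validated on data it was not fitted to.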

  6. Ron Clutz says:

    The essay by Curry is good, but can benefit from the incisive critique by R.G. Brown. A compilation of the two is here:

    https://rclutz.wordpress.com/2016/11/16/putting-climate-models-in-their-place/

  7. oldbrew says:

    The fact that the computer models consistently predict too much warming points strongly towards the man-made warming thesis being either faulty or downright wrong, although other sources of error could also be in play.

