Why climate predictions are so difficult

Posted: March 31, 2019 by oldbrew in atmosphere, climate, opinion, predictions, radiative theory

So climate sensitivity… is likely somewhere between 1.5 and 4.5 degrees. This result has not changed until today, about 40 years later. And that’s exactly the problem.
– – –
Maybe the problem lies in thinking that that is the problem.

Climate Etc.

by Judith Curry

An insightful interview with Bjorn Stevens.


Comments
  1. JB says:

    “The computational power of computers has risen many millions of times, but the prediction of global warming is as imprecise as ever.”

    –Computational power (which includes AI) has never resolved the problem of understanding. If a person does not understand the data set being computed, as well as the effects of the filtering criteria used, they will not understand the result either. GIGO. A supercomputer is no more insightful than an abacus.

    “The accuracy of the predictions has not improved, but our confidence in them has grown,”

    –Come again? Listen to yourselves.

    “Simulating natural processes in the computer is always particularly sensitive when small causes produce great effects.”

    –Whenever this kind of response appears in computer programming, it is usually a sign that something is very amiss with the programmer’s assumptions. Obviously the large-scale change in the model is not reflecting Natural behavior.

    “You have to make do with more or less plausible rules of thumb.”

    –Another indication that what is lacking is understanding, not modeling efficiency. Rules of thumb are ONLY used to establish a general reference point, which can easily be out of harmony with actual operating conditions.

    “We need a new strategy,”… “We need new ideas,”

    –In summation, “We’re completely stumped, and don’t know where to go from here. None of what we’ve tried so far makes sense.”

    Sorry if such an article tends to confirm my bias. What we have here is a simulacrum of Keystone Kops.

  2. oldbrew says:

    Don’t be sorry JB. They are going round in circles with the climate sensitivity conundrum and will continue to do so, presumably because giving up means loss of faith in the ‘greenhouse gas’ climate theory.

  3. Gamecock says:

    True, JB.

    To model the atmosphere, you have to understand the atmosphere. We don’t.

    Modeling requires codification of behaviors, which is then programmed. Data is entered, and the model tells you what will happen under those data conditions, based on known behaviors. Global Circulation Models attempt to report what is going to happen to the atmosphere over long periods of time without knowing all the behaviors affecting the atmosphere – some are even left out as, oh, that’s too complicated. See: clouds.

    GCM is intellectually vacant in design.

  4. E.M.Smith says:

    I would also note that the GCM codes I have read are remarkably devoid of bounds checks on iterative math. Computer math has strict limits on the size of numbers, at both the large end and the small; exceed them and you get overflow or underflow. Your max large number becomes max negative, or your min small number does not reach zero but jumps over it to become very small with a negative sign.

    There are all sorts of ways digital math is tricky. This normally isn’t too much of a problem for a few functions in a row, but repeat it for hundreds (or worse, thousands) of iterations and you approach certainty that your math will be wrong. GCMs iterate for millions of operations.

    There is arbitrary-precision math available, but it is terribly slow, so it is not used in the GCMs.

    I have zero confidence that the models even do what the researchers think they programmed.

    Take a chaotic system with intense iterations and use regular float or double precision math and you have created a random number generator.

    (One of my first FORTRAN assignments was to write a random number generator… using floating point math and iterations about zero….)
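
E.M.Smith’s point – that a chaotic system iterated in ordinary floating-point math behaves like a random number generator – can be illustrated with the logistic map. This is my own minimal sketch in Python, not code from any GCM: two trajectories starting roughly one double-precision rounding error apart become completely uncorrelated within a few dozen iterations.

```python
def logistic_orbit(x, steps):
    """Iterate the chaotic logistic map x -> 4x(1-x) and return the orbit."""
    out = []
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)
        out.append(x)
    return out

a = logistic_orbit(0.3, 80)
b = logistic_orbit(0.3 + 1e-15, 80)  # perturbed by ~one rounding error

# After one step the difference is still about 1e-15; by step 50+ the
# orbits have fully decorrelated and the gap is of order one.
print(abs(a[0] - b[0]))
print(max(abs(p - q) for p, q in zip(a[50:], b[50:])))
```

The same mechanism applies to accumulated rounding in long iterative runs: once the error growth saturates, the output carries no memory of the intended trajectory.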

  5. oldbrew says:

    RESEARCH ARTICLE Open Access
    How much has urbanisation affected United Kingdom temperatures?
    Ian L. M. Goddard, Simon F. B. Tett
    First published: 28 March 2019

    [From the abstract]
    For an urban fraction of 1.0, the daily minimum 2‐m temperature was estimated to increase by 1.90 ± 0.88 K while the daily maximum temperature was not significantly affected by urbanisation. This result was then applied to the whole United Kingdom with a maximum Tmin urban heat island intensity (UHII) of about 1.7 K in London and with many UK cities having Tmin UHIIs above one degree.
    . . .
    4 DISCUSSION AND CONCLUSIONS

    The observed increase in T min can be attributed to an increased intensity of the UHI during the hours after sunset and into the night. Many studies have previously shown that UHII is maximised during the night (Arifwidodo and Tanaka, 2015; Montávez et al., 2000; Ripley et al., 1996). The intensity is maximised during these hours, as heat absorbed by urban structures will be re‐radiated back into the atmosphere at a slower rate, due to smaller sky views, than natural structures. [bold added]

    https://rmets.onlinelibrary.wiley.com/doi/full/10.1002/asl.896

  6. hunterson7 says:

    Scott Adams, the “Dilbert” creator, has another interesting post up on YouTube.
    Here is an excerpt from his summary:
    “The Russian climate model is the only one predicting current data
    Other top 30 models aren’t accurately predicting new data
    Russian climate model says climate change NOT a danger
    NOBODY has challenged the accuracy of Russian model”

    He speaks directly to the issue of climate model failure.

  7. cognog2 says:

    Yes. It is the thinking that is the problem. Few, if any, comprehend that water just does NOT comply with the concepts behind the greenhouse effect (GHE). Currently the mindset is grooved into statistical research in attempts to justify this GHE, based almost exclusively on matters of radiation.
    A shift to basic thermodynamics, particularly that of water, would reveal much and challenge the view that water somehow produces a positive feedback to the GHE, when in fact it is strongly negative, through the behaviour of clouds.

    The problem, I suggest, lies in the charter of the IPCC which precludes consideration of natural processes other than AGW in the context of climate change.

    http://cognog2.com

  8. ivan says:

    There are a few things that ensure that their models will never predict anything to do with climate.

    For a start, the climate is a chaotic system and they can’t even measure it, let alone model something like that with any accuracy.

    As Donald Rumsfeld said:
    There are known knowns; these are the things that we know we know.
    There are known unknowns; these are the things we know we don’t know.
    But there are also unknown unknowns; these are the things we don’t know we don’t know.

    It is the unknown unknowns that always mess up everything. Even if they started measuring everything in the atmosphere in one-metre cubes, including over the oceans and mountains, they would still miss something.

    As cognog2 says, the IPCC has to change, but they will never do that because it is designed to support the socialist idea of a UN-led one world government and is the head of the UN Church of Climatology Cult.

  9. stpaulchuck says:

    “The accuracy of the predictions has not improved, but our confidence in them has grown,”
    what a blazing load of codswallop! Well, our confidence in them has grown too: we are more and more certain your models suck, as does the GHE nonsense itself.

    As for the so-called greenhouse effect: in light of the many current papers by people like Scafetta, and Nikolov and Zeller, it is about as scientific as the search for the transmutation of lead into gold. But just like ‘lead into gold’, this farce is a rent seeker’s gold mine of sponsorship/patronage. The hundreds of millions spent on going through the climatic ‘entrails of sheep’, looking for the magic numbers for climate sensitivity for the GHE farce, is truly sad when that money could be spent pursuing real science.
    ———-
    “The climate system is a coupled non-linear chaotic system, and therefore the long-term prediction of future climate states is not possible.” – IPCC TAR WG1, Working Group I: The Scientific Basis
    ———-
    So would they be willing to bet the lives of their entire families on the accuracy of their 50-year and 100-year predictions? If only.

  10. Gamecock says:

    E.M.Smith:

    (One of my first FORTRAN assignments was to write a random number generator… using floating point math and iterations about zero….)

    Some 30 years ago when things were slow at work, I decided to make myself a little 5-card draw poker game. Using my native FORTRAN.

    I used the Random system function to “shuffle the deck.” It was awful. Hardly random.

    So I set about creating my own random number generator:

    I used the system clock.

    Get the system floating-point time A. Move it to integer I, which truncates the decimal part. Subtract integer value I from the floating-point value into floating point B, yielding the decimal part. Multiply I times B: the whole-number part times the decimal part.

    I found it very reliably random – i.e., never repeating. Every deal was different.
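
Gamecock’s recipe translates almost line for line into code. Below is a hypothetical sketch in Python rather than FORTRAN (the function name is mine); note that this is a clock-seeded trick, not a statistically tested generator:

```python
import time

def clock_random():
    """The recipe as described in the comment: take the floating-point
    system time A, truncate to integer I, keep the fractional part
    B = A - I, and return I * B (whole part times decimal part)."""
    a = time.time()   # floating-point seconds since the epoch
    i = int(a)        # whole-number part (truncation drops the decimals)
    b = a - i         # decimal part, in [0, 1)
    return i * b

# To "shuffle the deck" you would reduce the result to a card index:
card = int(clock_random()) % 52
```

Successive calls give different values because the fractional second keeps moving, which is why every deal looked different; whether the output is statistically uniform is another question entirely.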

  11. gbaikie says:

    –Gamecock says:
    March 31, 2019 at 3:26 pm
    True, JB.

    To model the atmosphere, you have to understand the atmosphere. We don’t.–

    I would say you have to understand the ocean to understand the atmosphere. And we don’t.

    And understanding our variable star, the Sun, and its effects upon Earth is also important.

  12. hunterson7 says:

    “Predictions are hard, especially about the future”.
    -American proverb attributed to Tommy Lasorda

  13. Phoenix44 says:

    I would add that every model requires its starting points to be accurate, even if it is a really good model. That is pretty much impossible for climate. You cannot model change without knowing what you are changing.

    I believe modellers know their forecasts are rubbish, but instead of admitting the problems they keep pushing down a blind alley.

  14. oldbrew says:

    Science of Doom has a go at climate sensitivity numbers…

    Surely the patterns of warming and cooling, the patterns of rainfall, of storms matter hugely for calculating the future climate with more CO2. Yet climate models vary greatly from each other even on large regional scales.

    https://scienceofdoom.com/2019/03/31/opinions-and-perspectives-9-pattern-effects/
    – – –
    Radiation-only theorists will never get anywhere with so-called ‘climate sensitivity’ IMO.

    They pretty much admit they aren’t getting anywhere – no noticeable progress in 40 years – but the penny doesn’t drop 😐
    (From the blog post: This result has not changed until today, about 40 years later.)

    How about some new thinking?

  15. tom0mason says:

    It’s well known these models are poor at showing clouds and precipitation; here are a couple more items these models fail at —

    Not good at modeling pressure changes —
    https://www.sciencedaily.com/releases/2018/10/181016132032.htm
    The findings raise serious questions about the accuracy of regional climate projections in the UK and neighbouring parts of Europe because meteorological conditions in those regions are closely linked to air-pressure changes over Greenland.

    https://www.nature.com/articles/s41598-019-41334-7
    Uncertainty in hydrological analysis of climate change: multi-parameter vs. multi-GCM ensemble predictions …

    When model predictions are found to be inaccurate, … large while uncertainty in their temperature projections was small, and vice versa. … based on scenarios from regional climate models.

    These models aren’t accurate enough to be the basis of changing the fundamental economics of the world.

  16. tom0mason says:

    Also of note —
    Of course the models can’t see a hiatus, because they’re tuned to force too much warming with rising CO2. This is already known, as this shows —
    https://agupubs.onlinelibrary.wiley.com/doi/full/10.1029/2018EA000401
    From the conclusion …

    Temperatures in the tropical 200‐ to 300‐hPa layer meet all four conditions. We present a trend model robust to general forms of autocorrelation and the possible existence of a step change associated with the 1979 PCS. Comparing observed trends to those predicted by models over the past 60 years reveals a clear and significant tendency on the part of models to overstate warming. All 102 CMIP5 model runs warm faster than observations, in most individual cases the discrepancy is significant, and on average the discrepancy is significant. The test of trend equivalence rejects whether or not we include a break at 1979 for the PCS, though the rejections are stronger when we control for its influence. Measures of series divergence are centered at a positive mean and the entire distribution is above zero …

    [my bold]

  17. oldbrew says:

    With their models which don’t work they can detect which bits of the climate system have been affected by humans, as if by magic. Remarkable 😆

  18. stpaulchuck says:

    totally agree Rog. The entrails of sheep would do as well as this CO2/forcings/GHE farce.

    I had a nice little chat on another blog with a fellow who brought up the error range issue. We’d both worked in areas where equipment needed to be certifiably calibrated from traceable standards. We had a fun time picking apart nonsense like quoting temperatures and forecasts to the tenth or hundredth of a degree when the readings are in one-degree increments, and thus already +/- 0.5 degree right out of the gate. I understand one of the most common measuring devices is instrumentally accurate to about +/- 0.7 degrees. Etc., etc. Has anyone ever seen, or even heard of, the calibration charts for these recording stations? And calibration traced to where?

    The other bit of math is that CO2 is about 3.5% of total greenhouse (*spit*) gases and that humans produce about 3.5% to 4% of that CO2. Worst case (idiotic, actually) predictions of warming are for about 4 degrees over the next 100 years. Doing simple calculator math, if we wiped out humanity entirely we could reduce total warming by about 0.0056 degrees.

    As anyone paying attention knows, water vapor is a MUCH larger factor in this, but even if their idiot ‘forcings’ nonsense was true and we leveraged the CO2 issue by the upper end of the value, say 4:1, then we would reduce the future warming by 0.0224 degrees. That right there tells you that this whole thing is a cash cow for rent-seeking scam artists posing as scientists.

    IMAO It’s about the Benjamins.
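
For what it’s worth, the back-of-envelope multiplication in stpaulchuck’s comment can be checked mechanically. This sketch (Python, my own) takes the commenter’s percentages at face value; it verifies only the arithmetic, not the figures themselves:

```python
# The commenter's figures, taken at face value -- not verified climate data.
co2_share_of_ghg = 0.035   # "CO2 is about 3.5% of total greenhouse gases"
human_share = 0.04         # upper end of "about 3.5% to 4%"
worst_case_warming = 4.0   # "about 4 degrees over the next 100 years"

# Warming attributable to human CO2 under those assumptions:
avoided = co2_share_of_ghg * human_share * worst_case_warming
# With the 4:1 "forcings" leverage the comment mentions:
leveraged = 4 * avoided

print(round(avoided, 4))    # 0.0056
print(round(leveraged, 4))  # 0.0224
```

So the 0.0056 and 0.0224 in the comment do follow from the stated percentages; the argument stands or falls on whether those percentages are right.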

  19. oldbrew says:

    Follow-up post: What’s the worst case? Climate sensitivity
    Posted on April 1, 2019 by curryja

    Are values of equilibrium climate sensitivity > 4.5 C plausible?

    https://judithcurry.com/2019/04/01/whats-the-worst-case-climate-sensitivity/
    – – –
    Only in climate cloud cuckoo land 🙂
