Paul the Weatherman strikes again

Posted: August 13, 2010 by tallbloke in solar system dynamics

My local TV weatherman Paul Hudson has stuck his neck out on his BBC blog again – Good Lad! You’ll get a mention in the ‘Hall of Fame’ for this. 🙂

Should the sun’s role in weather and climate be re-assessed?

Paul 'The Weatherman' Hudson

The idea that changes in solar activity can affect our weather and climate has very much fallen out of fashion in recent times. Most climate scientists’ efforts have been directed towards the impact of greenhouse gases on global temperatures, and what a warmer planet could mean to weather and climate.

Most, but not all, meteorologists dismiss the idea that the sun could play an important role in determining our weather, and hence climate.

This may, at least in part, be down to the fact that forecasts these days are heavily reliant on powerful supercomputers that can’t incorporate the influence of the sun, simply because the precise mechanism of how the sun impacts our weather is either not understood, or impossible to model.

But it wasn’t that long ago that eminent climatologists such as Professor Lamb at the University of East Anglia conducted research which showed, amongst other things, a link between low solar activity and pressure patterns over Greenland.

In his day, forensic analysis of weather data was the only way to forecast the weather, but sadly much of his work, and work like it, has been largely forgotten as the weather industry becomes more and more reliant on computer simulations of the atmosphere.

But it seems that it may be becoming a fashionable area of research once more.

Read the rest here.

  1. DirkH says:

    There was more money to be made with CO2, that’s all.

  2. Tim Channon says:

    Nice when people start talking sensibly.

    A somewhat significant solar linkage has been there in the data for many years.

    Now I will prove it, or I hope it will be sufficiently self evident.

    I was doing some work using the Armagh Observatory data. This inland Northern Ireland location is broadly similar to a good deal of England.
    There are problems with this data, which I will mention later.

    Such data is widely available to professionals but is withheld from the likes of you and me; I am talking about proper instrument readings, before they are mangled via bad practices.

    Daily minima and maxima are available over a fairly long timespan.

    It is therefore trivial to produce transforms with reasonably good detail of a solar signature, if there is one, but the sampling still significantly violates Nyquist (a long, important story in there).
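    The Nyquist point can be sketched with a toy example (all numbers illustrative, nothing to do with the Armagh data itself): a cycle shorter than two days, sampled once a day, shows up in the transform at a spurious long period.

```python
import numpy as np

# Illustrative only: a 1.1-day oscillation sampled once a day aliases to
# a spurious ~11-day cycle, because daily sampling cannot resolve any
# period shorter than two days (the Nyquist limit).
true_period = 1.1                      # days, below the 2-day limit
t = np.arange(365)                     # one sample per day for a year
x = np.sin(2 * np.pi * t / true_period)

# Aliasing folds the frequency down: f_alias = |f_true - round(f_true)|
f_true = 1 / true_period               # cycles per day
f_alias = abs(f_true - round(f_true))  # = 1/11 cycles per day

# The spectrum of the sampled series peaks at the alias, not the truth.
spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(len(t), d=1.0)
peak = freqs[np.argmax(spectrum[1:]) + 1]
print(1 / f_alias, 1 / peak)           # ~11-day apparent period
```

    The daily min/max record is in exactly this position for anything sub-daily: whatever is there gets folded down onto longer apparent periods.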

    Critical here is the difference between the minimum and maximum readings.

    The solar signature is different, therefore the weather is different.

    And what happens when the “standard” maths averages the two to form a mean?

    Does that mean that solar cycle length is showing and affects max and min differently?

    Note that even with data that long, this is pushing the limits of Fourier analysis, which includes an increasing uncertainty on the numbers. This is why I developed novel software which can remove some of the normal limits.
    Amusement here right now: it just occurred to me I have never shown plots of spectra from the output. Dumbo.

    Many of these spectral terms are vague, yet some kinds of modulation, for example, can be usefully mimicked; a large subject. Experience probably comes into reading such output, spotting when rather more is going on.

    The visible solar cycle still resists decomposing into components (folks around the world have been trying for eons), making it very difficult to clearly identify what is solar and what isn’t. Matched filters don’t help because we do not know which facets of solar activity actually have an effect, nor what law is involved, other than that it is clearly not linear.
    More ambiguity still appears when magnetics come into play.

    The Armagh data.
    I discovered there are data errors, such as data for 29th Feb in a non-leap year, and other peculiarities. This has been untangled well enough, but so far I have not fed the corrected data back; there are still some outstanding problems.

    Happily, the scanned paper records are online (the National Lottery seems to have paid for this). Further work has led me to be increasingly unsure of exactly what has been done with the data.
    This semi-hidden starting page might be a good one.

    The Fahrenheit/Centigrade conversions are low grade, which might introduce artefacts.

    There are poorly documented adjustments, so the paper record does not match the published data. How valid is it? Sorting this out would be a lot of work.

    Some digging shows there were more site changes at the observatory than are mentioned in the papers I have seen.

    My broad take on the data is that like most parts of the world not very much happens apart from weather. I suspect change is played up more than it ought to be.

    The past chief at Armagh has published what could be viewed as somewhat sceptical work.

    The Nyquist problem is usually ignored; moreover, I suspect most of the people working in meteorology are poorly aware of the problem. It takes very many years for some ideas to percolate through to being solidly taught.

    For temperature, hourly readings are probably good enough. Today many automatic stations report at 10-minute intervals.

    The big problem is the maths. Correct signal processing for decimating the data from, say, 10 minutes to monthly is impossible to do in the short term: it always hits the filter-at-the-end-of-a-dataset issue.

    On top of that, it is not possible to describe the correct shape at monthly resolution using one sample; it might need three. For that matter, reconstruction filters are necessary. This spins off into questions of meaning and the purpose of the data; it may or may not matter.

    Then there is the Shannon problem. If decimation occurs, the trade is between time resolution and amplitude resolution. Sampling at, say, 0.1°C becomes finer (0.01 or 0.001) after decimation, if you get my meaning; otherwise information is thrown away.

    Does this matter? Yes if further data analysis is done.

    I am only aware of one published exercise which discusses the effect of processing and even that is far from complete. (I expect there are more papers)

    As an exercise to eyeball what happens I found a reasonable set of hourly data and processed this both conventionally as in meteorology and by signal processing.

    The work is incomplete, but it was immediately clear that the processing matters, and that is on top of knowing that conventional processing introduces artefacts. (I’ve even seen papers reporting on what was probably fictitious.)

    The general effect is that the data is less noisy and has a different spectrum. This is not because filtering is used, although that might be counter-intuitive.
    Quite often with datasets, a look at the noise floor speaks volumes about what has been done.

    The actual numbers for monthly etc. are different.
    A counter-intuitive point, or at least unexpected by those not familiar with this stuff: if there is, for example, a very cold patch during January, this affects the preceding December data, which is correct, but conventional maths will not show this.
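    The December/January point can be demonstrated on a toy series (illustrative values only): a symmetric, zero-phase moving average, the crudest stand-in for proper signal-processing decimation, lets an early-January cold spell pull down late-December values, while a calendar-month mean cannot see it at all.

```python
import numpy as np

# Toy series: 62 days spanning December and January, flat at 5 C except
# a sharp cold spell entirely inside early January (day 0 = 1 December).
temp = np.full(62, 5.0)
temp[31:41] = -10.0                    # ten-day cold patch in January

# Calendar-month arithmetic: December's mean is untouched by January.
dec_mean = temp[:31].mean()            # exactly 5.0

# A symmetric 21-day moving average (a crude zero-phase low-pass
# filter) lets the cold spell influence the preceding days, which is
# physically reasonable for a smoothed series.
kernel = np.ones(21) / 21
smooth = np.convolve(temp, kernel, mode="same")
print(dec_mean, smooth[28])            # late-December value dips below 5
```

    Any proper low-pass filter is two-sided in this way, so filtered monthly values genuinely differ from calendar-month means.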

    The world has a long instrumental temperature record? Ho ho ho. It is a mess and then some. All we can do is make the best of what exists and at least try to correct it when the opportunity arises. Data hiding is very bad…

    If there were good data, things like site changes would show clearly in the data.

    I’d better shut up.

  3. DirkH says:

    Very impressive, Tim!

  4. tallbloke says:

    Tim, those are really interesting plots! I’ve put the max above and min underneath so we can see the differences more easily. Right click the image and view in a new window to see the detail.

    If you’d like to make a graph with the max in a different colour, we could overlay the curves and make a new post out of this.

    Tim Channon Armagh temperature Max and Min.

  5. Tim Channon says:

    Not so simple. Overlay is confusing, an orgy where it’s difficult to tell which lump and bump belongs to which.

    Der team might have ideas on how best to present the data and so…

    Just reproduced the chirps and dumped the data, including raw min/max, here.

    An attempt at overlay is inside.
    Driving gnuplot on something like this is too hard.

    The chirp data is 5 columns (doric, red sandstone, fluted (use earplugs))
    Column 1 and 2, complex data values
    Column 3, machine computed maxima
    Column 4 and 5, XY data

    Plots shown use 3, 4, 5 where 3 is data labels in a suitable mode

    Use log X axis

    The 45-year term is dodgy; the bin width is excessive, but it is there or thereabouts. Not shown is “something” at a longer period.

    Raw data has missing values marked na

    The data-generating software is in C and I wrote it.
    A to-do for sometime is to figure out a good way to automagically do octave or decade versions. Quite tricky.

  6. Tim Channon says:

    Goodness me now here is a surprise.
    PS just added a coda

    I had let the synthesizer run on the minimum temperature data, fiddled with it, and wondered about making some software changes (undoing some recent changes until I figure out an optimisation which works); speed is critical even in C, so much so that I avoid the co-pro.

    Been running for hours in the background. Was going to delete but took a look anyway.

    In there is the 11 year term you see in the chirp for min temperature. I’d kicked it to allow amplitude modulation, with fundamental if it wants.

    The whole thing looks like, errm, Armagh min temperature, but twiddle it, remove the uninteresting annual term etc., leaving an approximate simulation of the longer-term factors in the data. (The effect is similar to a low-pass filter without end effects.)

    Plot it; it looks sane, much the same as seen before, similar to a low-pass filter.

    Out of curiosity I plotted the 11 year term. This is a component of the whole simulation. Sometimes they give insight, sometimes not, huge story.

    This result is, in last year’s parlance, awesome, and hey, we are dealing with cool.

    We have modulation big time in bursts just like solar data. Even the timing looks sane, note though that I cannot mimic solar data at all well. More twiddles could be included.

    This suggests minimum temperature is modulated; could it hint at greater temperature swings at some times more than others? This needs input on what the weather was like. I think there were harsher winters, much as shown.

    Reminder, this is a tiny signal in the whole. Right hand scale.

    I’ve allowed the output to run past the last input data (2004) to 2011. A model cares not a jot about the date.

    Easiest to put a plot here but I am worried about space.


    I cannot explain in a few words what this software does; there are various ways of trying. One is: imagine a Fourier transform and its inverse inside a feedback loop, adjusted according to the error term against the real input data, producing a model of the data, which is the output. Something like that, but actually not that. It has various good side effects, such as being in essence an FT on DFT data with no bin width or picket fence; on analysis work it is accurate on all parameters without interpolation etc.
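    Without claiming anything about Tim's actual software, the escape from DFT bin-width limits can be sketched as direct model fitting: for any trial frequency, amplitude and phase form a linear least-squares problem, so the frequency itself can be scanned continuously rather than in bins.

```python
import numpy as np

# Hypothetical sketch, not the software described above: recover an
# off-bin period by least-squares model fitting instead of reading a
# DFT peak.
rng = np.random.default_rng(1)
t = np.arange(200.0)
f_true = 1.0 / 11.3                    # an 11.3-sample cycle, off any DFT bin
y = 2.0 * np.sin(2.0 * np.pi * f_true * t + 0.7) + rng.normal(0.0, 0.3, 200)

def sse(f):
    # For a fixed frequency, amplitude and phase are linear parameters,
    # so an exact least-squares fit is cheap.
    A = np.column_stack([np.sin(2.0 * np.pi * f * t),
                         np.cos(2.0 * np.pi * f * t)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return float(np.sum((y - A @ coef) ** 2))

# Scan frequency far more finely than the DFT bin spacing of 1/200.
freqs = np.linspace(0.05, 0.15, 2001)
best_f = freqs[np.argmin([sse(f) for f in freqs])]
print(1.0 / best_f)                    # period recovered close to 11.3
```

    The feedback-loop description above presumably refines such a model iteratively against the residual rather than scanning a grid, but the bin-free principle is the same.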

    It gets more crazy. Wondering, I switched the generator for the 11y term to an incomplete type, not finished, called orbit, which does work, but the modulated versions are missing a parameter adjustment. Orbit is potentially asymmetric and the form is based on 1/(1-x).

    Something rare happened. R² was about 0.57, fair enough on temperature data, especially long-period. Instant jump to 0.73, a high figure; I’m not sure I’ve ever seen such a jump before.

    On looking at the output, it is bizarre. Other stats say it isn’t a bad match; R² is just there as a human confidence thing. I will have to look into what is going on. Turning on annual etc., it looks entirely normal.

    Too way out to show. Think about it and see how it relates to the original data.

    Sigh, I need to fix the missing stuff, a horrible long task.

  7. tallbloke says:

    Tim, this blog is about emerging ideas and novel methods and interpretations. Criticism is encouraged, but ridicule is not. So don’t worry, and feel free to use this thread for your musings; your method does create predictions, which is what weather forecasting is all about. I’ll try to make time to use your ODS file to make an overlaid graph of Armagh max/min which is clear enough to read.

    On the subject we discussed last week, this paper may be of interest to you:
    It studies absorption lines in CO2 of IR re-reflected from Earth to Moon and back to Earth. Some novel results which may have a bearing on your concerns about CO2 re-radiation of incoming solar wavelengths.

  8. Tenuc says:

    Interesting article from the BBC back in April this year:-

    “Low solar activity link to cold UK winters”

    Perhaps this organ of political propaganda is setting the stage for a volte-face on the CAGW myth, which will be replaced by CAGF(reezing)?

    Interesting to note that the above report did not mention that the rest of the NH had a severe winter too!

  9. DirkH says:

    Tim, what you’re doing looks like an optimization algorithm to me; like a gradient descent to find the minimum of a cost function.

    Did you consider using a genetic algorithm to give the machine more freedom in overcoming local minima?
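    A minimal sketch of the suggestion (all choices illustrative: population size, mutation scale, cost function; nothing to do with Tim's optimiser): truncation selection plus Gaussian mutation on a deliberately multimodal cost.

```python
import numpy as np

# Illustrative genetic algorithm, not anyone's production optimiser:
# minimise a 1-D Rastrigin-style cost with many local minima (global
# minimum at x = 0) by truncation selection plus Gaussian mutation.
rng = np.random.default_rng(2)

def cost(x):
    return x**2 + 10.0 * (1.0 - np.cos(2.0 * np.pi * x))

pop = rng.uniform(-5.0, 5.0, 40)               # random initial population
initial_best = cost(pop).min()
for _ in range(200):
    parents = pop[np.argsort(cost(pop))[:10]]  # keep the fittest quarter
    children = rng.choice(parents, 30) + rng.normal(0.0, 0.3, 30)  # mutate
    pop = np.concatenate([parents, children])  # elitism: parents survive
final_best = cost(pop).min()
print(initial_best, final_best)        # elitism: final is never worse
```

    With elitism the best member can only improve; whether it hops out of a local minimum depends on the mutation scale, which is exactly the tuning question a genetic approach raises.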

  10. DirkH says:

    Tim, your amplitude modulation has a full period of about 200 years (if we interpret it as a beat, german “Schwebung”, an addition of two sinusoidal functions with slightly different frequencies). This can’t be a coincidence IMHO…
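    The beat ("Schwebung") arithmetic behind that reading is simple (periods purely illustrative): two sinusoids of nearby periods T1 and T2 sum to a carrier whose amplitude envelope repeats with period 1 / |1/T1 - 1/T2|.

```python
# Beat period of two sinusoids with nearby periods; the values here are
# illustrative, not fitted to any dataset.
T1, T2 = 11.0, 10.4                    # e.g. two near-11-year cycles
T_beat = 1.0 / abs(1.0 / T1 - 1.0 / T2)
print(T_beat)                          # about 190.7 years
```

    So two cycles only 0.6 years apart near the 11-year scale already beat at roughly the ~200-year scale being discussed.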

    What are these 200 or 206 year solar cycles called? tallbloke, you might know that…

  11. tallbloke says:

    De Vries cycles.

  12. Tim Channon says:

    TB: You found an interesting paper there

    A lot of work has been done by amateur astronomers on equipment compensation, much the same as mentioned in the paper.

    As I read it, they are showing reflection from the sunlit Earth, not self-emission. Unfortunately they show no equivalent experiment on emission from the dark side of the Earth.

    The surface reflection seems to be shown as zero; it ought to be there at non-absorption frequencies, but the data shows the opposite (fig. 4). I don’t understand.

    I’ll take another read after a few days.

  13. Tim Channon says:

    DirkH: I am using a technique I developed during the 1980s for massive multi-dimensional optimisation, after trying just about every technique I could find. It was subsequently very successful in solving hard problems in commerce; you might well have used some of the results.
    Nothing clever about it except that it is simple, boring and works. Slowly.

    I am quite sure that elegance could be used but I am alone and have to do things within my own capability.

    Genetic has crossed my mind. (ooo look girlies… 😉
    I will have dabbled way back in that kind of thing. Okay, I’ll leave that ambiguous.

    What would genetic actually do? The hard problem is discovering and doing something with chaotic behaviour. I think it is very likely there are loosely timed systems like that to do with the Earth. It might even be that this is much of what is seen, with the primary synchronising timing being the annual cycle.

    As it stands, the software is well able to jump mode, and sometimes has to be prevented: nope, do what you are told, i.e. lock onto something.

    Other terrestrial externals are few. I did an exercise to try to figure out what gravitational force acts on the Earth, excluding the Sun and the Earth-Moon system. The answer was not what I expected: it is dominated by one term. Trouble is that this does not obviously appear in terrestrial datasets, nor so far obviously as modulation.
    I was looking for a specific item to do with the Chandler wobble, which I can simulate very precisely in two ways; which one is correct is another story.
    This in my mind tends to downplay externals apart from solar.

    A lot of people are talking about mathematically based cycles, but numbers without a mechanism to do anything are not a lot of use.

    The AM is faster than that, but I am going to rework it because this looks like a problem which needs more serious addressing.

    As TB says, De Vries (or deVries, or …), which is circa 200 years.

    This is very clear in the Dome Fuji 10Be data. On looking at that, I note it seems to be a related sequence, as is suggested by the associated paper, without actually saying that.
    If treated as 200 y modulated by 1000 y, the result is notable, suggesting a greater amplitude during the Little Ice Age.
    However, snow accumulation is related to, errm, precipitation, and if that is related to solar and solar is to do with the 10Be isotope, it is self-referential, making conversion from 10Be amount to a flux very dicey.

    This wasn’t intended for publishing, but I have made it available anyway.
    I used this dataset as a test for a new function in the software to do with irregular sampling.

    Click to access fuji-200.pdf

    I need to do the same exercise on the Armagh maximum temperature data.

    Hopefully it is complementary in some way.

    Or nothing, a null result is still a result.

  14. DirkH says:

    Ok, just wanted to bring it to your attention. I’m not saying genetic algorithms are the silver hammer for everything. Still no success with my own attempts… which are in a different area. 😉

  15. Tim Channon says:

    If you want to discuss optimisers DirkH, it’s not difficult to find my email address, but getting past filtering is not so easy.

    The same term came out of the Armagh max temperature data, so it is not that directly. (surprise surprise)

    I did a complex subtraction of the two chirp pairs of complex-number series, scaling one to try to get a sensible comparison.

    This didn’t really add much insight. Two terms seem to stand out:

    Very close to 7 years (why such a precise number?) and circa 26 years, neither of which makes much sense.
    There might be something around the longer solar cycle periods, but not a clear result.
    A little digging in the data yielded nothing.

    This puts those datasets back to sleep here; I know about them and my brain knows. Wait and see.

    Turning to another long dataset, one that is little known:
    WMO 10384-0 Berlin Tempelhof

    The data is poor; I am only aware of monthly values and very little other information, such as site changes or instrument changes.

    This runs from the year 1701, although the early records are obviously of little interest; they are way off.

    The data is fairly complete and without anything too obvious in the way of changes from January 1756.

    For a little fun I have put this together

    Click to access berlin-temperature-text.pdf

  16. tallbloke says:

    Hi Tim,
    I’m not too surprised the Berlin temperature data doesn’t show much of a solar signal. Being on the leeward side of the Alps and in the middle of the continent makes it subject to anomalous cloud and precipitation. It is a nice long record though. Could you put up a plot with a higher resolution on the y-axis for us?

  17. Tim Channon says:

    This do?

    That explains most of what has been done. It does not cite a data source; it came from a .zip which came my way, but it seems to partly cross-check with other sources.
    Signal processing has been used. Full cutoff is just over annual, and three datapoints have been retained per year.

    The station breaks need further explanation. I have noticed a shorthand in many published datasets where a single missing month in an otherwise complete but composite dataset coincides with known butt-joining of individual datasets. In this instance that is what I suspect.
    The break in 1932 makes sense in this context, when Nazi Germany built a flagship world aerodrome there. Similarly, 1993 makes sense in the context of a major airport as well as the move to fully automated weather stations.

    The dataset seems to show:

    A low around the 1810 cool period.
    A low in the 1890s, when Alpine glaciers were known to be far advanced.
    A blip during WWII.

    The end of the record doesn’t entirely make sense. Probably growing UHI followed by moving the weather station?

    This might be an interesting text, including discussion about thermometer scales: the record for Uppsala, Sweden.

    I’m not happy with the idea of a local record standing in for global change; that isn’t going to happen. We probably could have a long discussion about what is sensible: data errors, assumptions, widespread abuse of maths and so on.

  18. tallbloke says:

    Very nicely done, thank you.