Peter Berenyi: Hey – Climate Modellers! Wake Up!

Posted: November 11, 2012 by tallbloke in solar system dynamics

My thanks to Peter Berényi for this short, sharp and to-the-point wake-up call to climate modellers worldwide. Kevin Trenberth would be wise to take heed, given the Nature article he wrote in 2010. Peter is a mathematician by training, with a firm background in physics. He used to work in acoustics, where he designed, supported and performed experiments; 12–14 years ago he switched to IT to make a living. His earlier article here at the Talkshop on Earth's energy balance is well worth a read too.

The fundamental issue with computational climate models is an epistemological one. Fitting multiple models, and computational ones of high Kolmogorov complexity at that, to a single run of a unique instance is not science, never was and never will be. The very paradigm of climate modelling, as it has been practiced over the last several decades, is flawed and should be abandoned immediately.

The proper approach is to seek a simple theory that fits many experimental runs of multiple physical instances, but GCMs are as far from this requirement as anything can possibly get.
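To illustrate the point with a toy example (mine, not part of the original argument): arbitrarily many complex models can be tuned to fit a single run of a noisy process equally well, and only fresh, independent runs can tell them apart.

```python
# Toy illustration: models of increasing complexity fit a SINGLE run of a
# noisy process better and better in-sample, yet only independent runs
# expose which ones have learned nothing general.
import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(0)

def one_run(n=200):
    """One realization of an AR(1) process, standing in for 'the climate record'."""
    x = np.zeros(n)
    for k in range(1, n):
        x[k] = 0.9 * x[k - 1] + rng.normal()
    return x

t = np.linspace(0.0, 1.0, 200)
observed = one_run()   # the single run every model is tuned against
fresh = one_run()      # an independent run nobody got to see

for deg in (3, 10, 25):  # increasingly 'complex' curve-fit models
    model = Polynomial.fit(t, observed, deg)
    fit_err = np.mean((model(t) - observed) ** 2)
    new_err = np.mean((model(t) - fresh) ** 2)
    print(f"degree {deg:2d}: error on tuned run {fit_err:5.2f}, on fresh run {new_err:5.2f}")
```

The in-sample error keeps falling with model complexity while the error on the fresh run does not; a single realization simply cannot arbitrate between the candidates.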

It should therefore be realized that there is no such thing as "climate science" as an autonomous field, independent of the general, and as yet unresolved, problems of physics.

Non-equilibrium thermodynamics of complex systems (with a vast number of non-linearly coupled degrees of freedom) in the presence of significant radiant heat is one of the few areas of semi-classical physics where little progress has been made, basically because of the difficulty of building proper lab models of such systems. That is, we still do not understand, on a general level, what is going on.

But terrestrial climate is just one example of such a system. Why would anyone in her right mind expect to understand it better than the general case?

Go back to the lab, guys and gals, and do actual experiments on real physical objects. Not on a simulacrum of Earth, of course, because that is impossible. Study other objects, filled with semi-transparent fluids of complex chemical composition, on a rotating table to induce as much turbulence as possible. Send a vigorous flow of free energy through the system, with a high rate of entropy production, and isolate it from its environment in all respects except radiative coupling. Put it into a vacuum chamber whose walls are kept at a low temperature, by liquid nitrogen perhaps. Run it with an effective temperature as high as the construction materials permit (several hundred degrees centigrade at least). It is even better if some component of the fluid has a phase transition close to the operating temperature of the device.
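As a rough sketch of the steady-state energy balance such an apparatus would have to satisfy (my notation, added for clarity, not part of the original proposal):

$$P_{\mathrm{in}} = \varepsilon \sigma A \left( T_e^4 - T_w^4 \right)$$

where $P_{\mathrm{in}}$ is the free-energy input, $A$ the radiating surface area, $\varepsilon$ the effective emissivity, $\sigma$ the Stefan–Boltzmann constant, $T_e$ the effective temperature of the fluid body and $T_w$ the wall temperature. With $T_e \approx 600\ \mathrm{K}$ and liquid-nitrogen walls at $T_w \approx 77\ \mathrm{K}$, the wall term is negligible, $(77/600)^4 \approx 3 \times 10^{-4}$, so essentially all input power leaves as radiation characterized by $T_e$: a cleanly radiatively coupled system, as required.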

As soon as such a system is adequately understood, that is, once you are in a position to construct a computational model of it, based on full theoretical insight, that can reliably predict (not project!) its behavior in multiple experimental runs, even when it is perturbed in any number of ways, notably by changing the optical depth of the fluid filling it in various spectral bands, then, and only then, can you return to climate.

That’s the way science is done, not the other way around.

Please note that this requirement does not apply to collecting adequate climate data. That is a must, because later on, even with more insight, measurements missed in the past would still be missing, forever.

_______________________________________

On the lack of proper experiments and theory, the wiki is your friend in this case: <http://en.wikipedia.org/wiki/Non-equilibrium_thermodynamics#Quasi-radiationless_non-equilibrium_thermodynamics_of_matter_in_laboratory_conditions>

There is much theoretical work going on in the field, of course,
unfortunately with little experimental backup. But it is ignored
by *mainstream* climate science anyway.

For a review see:

Christopher Essex, Dallas C. Kennedy and Sidney A. Bludman, *The Nonequilibrium Thermodynamics of Radiation Interaction*, invited contribution to *Variational and Extremum Principles in Macroscopic Systems*, H. Farkas and S. Sieniutycz, eds., Amsterdam: Elsevier Science, 2004. <http://home.earthlink.net/~dckennedy/pubs/nonEqThermoRad.pdf>

I could imagine some promising directions for theory to go, like SOC
(Self-Organized Criticality), SAD (Sandpile Avalanche Dynamics), MEPP
(Maximum Entropy Production Principle) and the like, but with no
experiments theory can only go so far. We definitely need more constraints
to be able to handle such systems properly.
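For the curious, a minimal sketch of the canonical SOC/avalanche toy model, the Bak–Tang–Wiesenfeld sandpile (illustrative only, and of course no claim that climate behaves this way):

```python
# Minimal Bak-Tang-Wiesenfeld sandpile, the canonical SOC / sandpile
# avalanche dynamics toy model referenced above.
import numpy as np

rng = np.random.default_rng(2)
N = 50
grid = np.zeros((N, N), dtype=int)
avalanche_sizes = []

for _ in range(20000):
    # Drop one grain at a random site.
    i, j = rng.integers(0, N, size=2)
    grid[i, j] += 1
    size = 0
    # Topple every site holding 4+ grains; grains fall off at the edges.
    while True:
        unstable = np.argwhere(grid >= 4)
        if len(unstable) == 0:
            break
        for i, j in unstable:
            grid[i, j] -= 4
            size += 1
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < N and 0 <= nj < N:
                    grid[ni, nj] += 1
    if size:
        avalanche_sizes.append(size)

# After a transient, avalanche sizes approach a power law: the hallmark of SOC.
print(np.histogram(avalanche_sizes, bins=[1, 2, 4, 8, 16, 32, 64, 128])[0])
```

The system tunes itself to a critical state with no external fine-tuning, which is exactly why SOC keeps being proposed as a candidate constraint on complex driven systems.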

Why, I don’t even think we can do a detailed computational model of a
boiling pot on a stove yet.

Comments
  1. Hans Jelbring says:

    All modellers wake up!
    What Peter is basically telling us is that when a complex reality is simplistically modelled, the output will be trash with no scientific value (not a validated approximation of nature). In short: whatever the input into that kind of model, it will produce BS as output. I fully agree with that conclusion. For a while it was claimed that more computer power would solve all problems. It did not.

    Peter is advocating an educational effort in a laboratory environment, which might teach a serious scientist how difficult it is to predict the output of a complex system; that is surely hard, or even impossible, to achieve. It still won't be enough to explain the riddle of climate change recorded in geological archives (ice ages etc.). If it were, the answers would already have been found. The major physical factors that decide climate change on Earth, Jupiter, Mars and our Sun cannot be replicated in a laboratory, since gravity is constant in such an environment.

    The gravity constant isn't constant at celestial scales. One strong indication of that is the variation in LOD (Length Of Day): Earth slows down when the Moon is passing Earth's equatorial plane, regardless of its distance to Earth, which must be strange observational evidence for a Newton scholar.

    What is needed for predicting climate change is an understanding of how the energy exchange between celestial bodies actually works (including atmospheric mass transfer oscillations in the atmospheres of Earth, Mars, Jupiter, the Sun, etc.). Knowledge in this area can only be improved by examining observed data, which is at hand today. Existing models are too simplistic, mainly because a complete physical understanding of gravity is lacking.

    This lack of understanding has led to the introduction of a number of strange concepts into the world of physics, such as dark matter and dark energy, which can be associated with another well-known concept named the Dark Ages. It is a shame when scientists become the leaders and advocates of superstition. "Selling" bad modelling results is a way to do just that.

  2. tallbloke says:

    Judy Curry has a thread running on this subject too:

    Climate model discussion thread

    I was impressed with her Royal Society presentation, which I watched her deliver at the Uncertainty meeting at Chicheley Hall last month. You can hear it yourself here:

    http://downloads.royalsociety.org/audio/KAV/TM/2012_07/Curry.mp3

  3. Paul Vaughan says:

    The problem is not so much a lack of understanding of physics as a lack of deep intuitive understanding of how to quantitatively identify universal constraints in chaotic systems via carefully tuned spatiotemporal aggregation criteria.

    Lab experiments like those described by Peter could potentially be of help, but only if the experimenters develop deep intuitive understanding of aggregation criteria fundamentals. Without this, they’ll just end up haphazardly modeling dozens of lab systems using the same type of strictly untenable assumptions they use to model climate.

    There's no escaping the need to learn how to tune summaries to bring universal constraints on spatiotemporal chaos into focus (via the Central Limit Theorem).
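    A minimal illustration of the aggregation idea (my toy example, with the logistic map standing in for spatiotemporal chaos): windowed summaries of a chaotic series converge to a stable distribution, exposing a constraint that is invisible in the raw values; the window length is the "tuning".

```python
# Illustrative only: aggregating chaotic output (here, the logistic map)
# yields stable, near-Gaussian summary statistics via the CLT, even though
# individual values are unpredictable.
import numpy as np

def logistic_series(n, x0=0.4, r=4.0):
    xs = np.empty(n)
    x = x0
    for k in range(n):
        x = r * x * (1 - x)
        xs[k] = x
    return xs

series = logistic_series(100_000)
for window in (10, 100, 1000):
    means = series[: len(series) // window * window].reshape(-1, window).mean(axis=1)
    print(f"window {window:4d}: mean {means.mean():.4f}, spread {means.std():.4f}")
# The spread shrinks roughly as 1/sqrt(window): an aggregate constraint
# on the chaos that no single trajectory value reveals.
```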

    There are analogies with the mechanical systems of a car, but in fluid systems multiple wheels run off differentials and gearing doesn’t advance in discrete steps.

    That is why I have developed a complex-wavelet-based metric that can simultaneously measure both amplitude and frequency modulation. It succeeds in identifying multidecadal terrestrial waves from sunspot numbers, and it also succeeds in identifying ENSO from semi-annual length of day.

    I cannot believe that this metric is not in widespread use in fluid mechanics, but I do not know what it is called by specialists in that field. I’ve no doubt I’ve reinvented the wheel, as continually happens in applied mathematics. Although my background spans 7 fields of study, it is humbling to realize that I am not sufficiently hybridized to solve the multidecadal climate puzzle 100% independently.
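    For concreteness, a minimal sketch of what such a metric could look like (generic complex-Morlet AM/FM estimation; illustrative code and parameters, not the actual metric described above):

```python
# Sketch: instantaneous amplitude and frequency near a target frequency f0
# via convolution with a complex Morlet wavelet. Edge effects and the
# normalisation are handled only roughly here.
import numpy as np

def morlet_am_fm(x, fs, f0, w=6.0):
    """Return instantaneous amplitude and frequency of x near f0 (Hz)."""
    n = len(x)
    t = (np.arange(n) - n // 2) / fs
    s = w / (2 * np.pi * f0)                      # wavelet scale for centre freq f0
    wavelet = np.exp(2j * np.pi * f0 * t) * np.exp(-t**2 / (2 * s**2))
    wavelet /= np.sqrt(np.pi) * s * fs            # rough normalisation
    c = np.convolve(x, wavelet, mode="same")      # complex wavelet coefficients
    amp = np.abs(c)                               # amplitude modulation
    phase = np.unwrap(np.angle(c))
    freq = np.gradient(phase) * fs / (2 * np.pi)  # frequency modulation, in Hz
    return amp, freq

# Toy usage: a test signal with both AM and FM around 5 Hz.
fs = 100.0
t = np.arange(0, 20, 1 / fs)
x = (1 + 0.3 * np.sin(0.5 * t)) * np.sin(2 * np.pi * (5 + 0.2 * np.sin(0.3 * t)) * t)
amp, freq = morlet_am_fm(x, fs, f0=5.0)
```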

    If anyone knows what this metric is called by fluid chaos experts, please let me know. I informally provide a coarse overview here:

    Open thread weekend

    Giving a detailed overview is far beyond the scope of constraints on my current time & resources, but I can offer brief clarification here if needed. If I ever make it back to local academia, I’ll be willing to work with a group of multidisciplinary hybrids to formalize.

    The issue raised by Peter is absolutely fundamental. Over the coming years & decades it needs to be addressed sensibly by sensible mainstream leaders to correct the fatally corrupting accumulating impacts of dark deception & dark ignorance that have arisen via unchecked incompetent leadership, something we all share responsibility for tolerating.

  4. oldbrew says:

    ‘The proper approach is to seek a simple theory’

    As long as it’s not the GHG theory of course 🙂

  5. Berényi Péter says:

    Actually, I don't believe the current generation of climate modellers is inclined to wake up. Probably no climate modeller will ever wake up, because that would put them out of a job. Just see the reception within the community of the most important (and, unfortunately, unique) lab experiment in climate science, Dr. Kirkby's CERN CLOUD, along with the funding issues accompanying it.

    But, even if truth never triumphs, its opponents die out eventually, don’t they?

    On the other hand, I have high hopes that some bright young PhD student may make it, possibly from outside "climate science", for that field is all but doomed by now. This is why any poll on the opinion of climate scientists (like "97% of them believe" whatever) is irrelevant. If a field of science slips into pseudo-scientific practices, it is always experts from neighboring fields who can deliver reliable judgement, not the guys within. How much chance does a homeopath have of denying the effectiveness of solutions diluted until not a single molecule of the advertised agent is left behind?

  6. Berényi Péter says:

    Paul Vaughan says:

    Lab experiments like those described by Peter could potentially be of help, but only if the experimenters develop deep intuitive understanding of aggregation criteria fundamentals. Without this, they’ll just end up haphazardly modeling dozens of lab systems using the same type of strictly untenable assumptions they use to model climate.

    Yes, they may need deep intuitive understanding and such to get anywhere, but only as a heuristic device. That is the nice thing about experiments conducted under lab conditions: they can be repeated over and over, while controlling all the important parameters. If a model does not fit perfectly, it can easily be falsified.

    Any haphazard modelling attempt using strictly untenable assumptions would end up just like that.

    If actual experiments were done on a multitude of physical instances of closed non-equilibrium thermodynamic systems, coupled radiatively to their environment, one would expect one of two outcomes (a minimal sketch of the kind of benchmark check meant in the second case follows below).
    1. It may turn out that such systems are not bound by any general law beyond simple and well-known ones (like conservation of energy and non-negativity of entropy production). That would probably mean their long-term behavior can follow any number of trajectories within wide margins, i.e. it is pretty unpredictable. But then there is no reason to expect the terrestrial climate system, a humble member of that wide class, to be any different.
    2. A more likely outcome, in my opinion, is to find proper variational / extremum principles that apply to wide classes of such systems, especially to those having a vast number of coupled internal degrees of freedom. That would be a step forward in theory, which would advance our understanding of the climate system on a general level. In practice it would put strict restrictions on the expected statistical behavior of trajectories: a benchmark GCMs could be checked against and rejected if their long-term behavior did not conform to it. That is, they would become falsifiable, in the Popperian sense, which they currently are not.
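    Purely as an illustration of outcome 2 (hypothetical data and statistic; nothing here is a real GCM diagnostic), such a benchmark test might look like this:

```python
# Illustrative falsification harness: reject a model whose long-run
# statistics violate a hypothesized benchmark distribution. The
# 'benchmark' would come from theory/lab runs; here both sides are fake.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)

def long_run_statistic(trajectory):
    # Stand-in aggregate: time-mean of some entropy-production proxy.
    return trajectory.mean()

model_runs = [rng.normal(0.2, 1.0, 1000) for _ in range(50)]   # model output
model_stats = [long_run_statistic(r) for r in model_runs]

benchmark_stats = rng.normal(0.0, 0.05, 50)  # what the principle would demand

stat, p = ks_2samp(model_stats, benchmark_stats)
print(f"KS p-value = {p:.3g}: " + ("reject model" if p < 0.05 else "consistent"))
```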

  7. Berényi Péter says:

    It may well be the case that there are many, perhaps fundamentally different, classes of non-equilibrium thermodynamic systems. Each and every living being, animals, plants, fungi and bacteria included, is such a system, after all.

    This diversity may extend to inanimate systems as well. If so, we have no idea what class our climate system belongs to, or if there’s such a classification at all. It is even less clear what observables might define class boundaries and what dynamical properties would follow from class membership. Provided of course, this approach is workable.

    That is another question that can never be settled theoretically, much less by in silico experiments. One needs lots of dirty experimentation, not on binary representations but on actual physical objects, tortured in the lab until they confess.

  8. Berényi Péter says:

    Computers, while switched on, also belong to a prominent class of non-equilibrium systems, by the way. Just take away the power, or stop the fans to prevent waste heat removal, and you'll know.

    I wonder if the climate system is Turing complete or not. Or if the Church–Turing thesis is applicable to it at all.

  9. Paul Vaughan says:

    Péter Berényi,

    I can go along with most, if not all, of what you say, assuming the parties conducting the experiments are sufficiently competent to undertake the task with integrity. Probably only a fraction of climate scientists would be qualified to oversee such a project.

    I hope some agencies will see the merit in your proposal and fund it. In addition to yielding a physical system classification system based on aggregate constraints, the suite of experiments might also yield a classification system for quantitative methods used to identify the aggregate constraints using (potentially very small) subsets of the collected data. I know many academics who would find such a project quite delectable.

    Thank you for volunteering a stimulating contribution.

  10. tallbloke says:

    I recall a rotating table experiment undertaken by geophysicists investigating geomagnetism and the terrestrial ‘dynamo’ using a big cylinder full of liquid sodium.

    http://www.sciencedirect.com/science/article/pii/S0031920111001592

  11. Michael Hart says:

    We live in the era in which mass computing power first became available to almost all citizens, not just a very few. In future, I suspect that climate modelling will be more widely seen as one of the disciplines in which technology briefly ran ahead of the skills being taught to employ it correctly, to the detriment of science.

    An example of another mis-step in human science that comes to my mind is analytical techniques such as gas chromatography / mass spectrometry enabling the detection of truly tiny amounts of, say, dioxins. The fact that some dioxins are very toxic and were first associated with synthetic pesticides led some to think, erroneously, that a) all environmental dioxins are of human origin, and b) they are necessarily harmful in the amounts detected.

    In many cases “environmental toxins” were already present, and claims of their harmful effects were not substantiated at the new lower detection levels.

    Give Greenpeace a Geiger-counter and you just know that they will find some ‘harmful’ radio-isotopes somewhere that can be blamed on humans.

  12. Some food for thought.

    James Dyke and Axel Kleidon, "The Maximum Entropy Production Principle: Its Theoretical Foundations and Applications to the Earth System", Entropy 2010, 12, 613–630. doi:10.3390/e12030613

    “For example, the Earth’s atmosphere is not in a state of MEP with respect to short to long wave radiation absorption and emissions because there are no real degrees of freedom for the system to do otherwise.”

    What they mean is that most of the entropy production in the climate system happens when short-wave radiation coming from the Sun gets absorbed and thermalized. A pitch-black planet would therefore produce a considerably larger outgoing entropy stream from the same incoming SW radiation than Earth does, which is partially covered by clouds (and to a lesser extent by snow/ice). That cover makes the planet pretty bright, i.e. it reflects part of the solar radiation (~30%) back to space without increasing its entropy much. That makes MEPP inapplicable to the entire climate system.
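    To put rough numbers on this (my back-of-envelope estimate, not from the paper): with solar constant $S \approx 1361\ \mathrm{W\,m^{-2}}$ and planetary albedo $\alpha$, the absorbed flux per unit surface area is $(1-\alpha)S/4$, arriving at $T_\odot \approx 5778\ \mathrm{K}$ and thermalized at roughly $T_E \approx 255\ \mathrm{K}$, so the entropy production per unit area is approximately

    $$\dot{\sigma} \;\approx\; \frac{(1-\alpha)\,S}{4}\left(\frac{1}{T_E} - \frac{1}{T_\odot}\right).$$

    Going from $\alpha = 0.3$ to a pitch-black $\alpha = 0$ scales this by $1/0.7 \approx 1.43$, i.e. roughly 40% more entropy production from the same sunlight (radiation entropy carries extra factors of order $4/3$, but they do not change the ratio).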

    However, it remains an enigma what is meant by the phrase "there are no real degrees of freedom for the system to do otherwise". Terrestrial albedo surely depends on lots of things, and there are considerable regional / temporal variations. At the same time, overall albedo seems to be well regulated; as far as it is measured, its average value is restricted to a narrow range.

    One can put it more directly: overall albedo is determined by a vast number of internal degrees of freedom, and why those are considered unreal is not explicated at all.

    My ten cents go for a different explanation. We should look for some other extremum principle for systems radiatively coupled to their environment, not MEPP:

    In the Archean, in spite of the Sun being much weaker, Earth had liquid water on it most of the time. The most straightforward explanation is that its albedo was lower due to fewer clouds, while an exposed water surface is almost black as seen from above.

    It is not news that radiation behaves somewhat differently from material objects. If we know, for example, the starting and end points of a light beam in a medium with varying refractive index, we can calculate the entire crooked path based on the assumption that light follows the fastest possible route.

    It does not work that way with objects having non-zero rest mass. There we have to rely on the principle of least action, which is a bit more complicated. In the end, of course, it turns out that the behavior of light described above is only a specific case of this more fundamental principle.
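    In symbols (standard textbook forms, added here only for clarity): Fermat's principle for light and Hamilton's principle for massive bodies read

    $$\delta \int \frac{ds}{v} = 0 \qquad \text{and} \qquad \delta \int L(q, \dot{q}, t)\, dt = 0,$$

    with the first (stationary travel time, equivalently stationary optical path $\int n\, ds$) recoverable as a special case of the second kind of variational statement.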

    But a purely theoretical derivation of terrestrial albedo is moot. The proper way to shed some light on this question is to bring it into the lab and measure the properties of irradiated globs of fluids, both radiative ones and others. Only then will we be able to verify any theory about it, and see whether MEPP is a specific case of a more general extremum principle that holds in the absence of radiative interaction, or whether we have run into something entirely different and more marvelous.

  13. Berényi Péter says:

    I have stumbled upon this page, Extremal principles in non-equilibrium thermodynamics, in the archives of the Azimuth Project. Saving the planet, indeed, while basic physics is still in fragments, unfinished.

    I like to describe the situation seen by Peter Berényi in somewhat different terms. Today's climate models possess no underlying statistical population. It follows that the counts of observed events, called "frequencies", do not exist for modern climatology. The ratios of frequencies, called "relative frequencies", do not exist. The theoretical counterparts of relative frequencies, called "probabilities", do not exist as scientific concepts. The concepts called "true" and "false", which are simply the results of confining the numerical values of probabilities to 0 and 1, do not exist as scientific concepts. Logic, which relies upon the concepts of "true" and "false", does not exist. Thus modern climatology lacks the means for determining whether an inference is correct or incorrect. If this state of affairs is unclear, that is a result of multiple applications of the equivocation fallacy on the part of climatologists. Applications of this fallacy make it seem to many as though the elements of a scientific methodology that are missing from climatology, including logic, are present (http://wmbriggs.com/blog/?p=7923).

    For logic and the scientific method to be brought to global warming climatology, a statistical population must be installed underneath the climate models. The models must be addressed to predicting the outcomes of events in this population. For statistical significance, the resulting models will necessarily describe the climate at a much greater level of abstraction than current models.