Archive for the ‘methodology’ Category

This article is part II of “A new Lunar thermal model based on Finite Element Analysis of regolith physical properties”, written primarily by gallopingcamel (Peter Morcombe), edited and prepared for WordPress by Tim Channon.


Figure 1 (click full size)

Modeling the Moon

A few months ago an analysis of the Moon’s equatorial temperature was posted here using two different types of engineering software. Tim Channon used SPICE circuit analysis software originally developed at Berkeley, while I used Quickfield, a finite element analysis program developed by the Tor Cooperative, a Russian firm, and marketed outside Russia by Tera Analysis. In addition, several detailed comments were received from “br”, who used LTSPICE from Linear Technology Inc.

Two very different methods. The results were identical.

Both Quickfield (in Student edition) and LTSPICE are freely available for download for those interested in replication or for further investigation.
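Both tools solve the same underlying problem: one-dimensional heat conduction into the regolith under a periodic insolation cycle (in SPICE the layers become an RC ladder). As a rough illustration of the physics, here is a minimal explicit finite-difference sketch. All parameter values are illustrative guesses, not those of the Quickfield or SPICE models.

```python
import math

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def simulate_regolith(cycles=2, layers=20, dz=0.01, dt=60.0,
                      k=9e-4, rho=1500.0, cp=600.0,
                      S=1361.0, albedo=0.1, eps=0.95,
                      period=2.55e6):
    """Explicit 1-D finite-difference model: heat conducts through
    `layers` slabs of thickness dz; the surface absorbs sunlight over
    half the rotation period and radiates as a grey body.
    Returns (min, max, mean) surface temperature over the last cycle.
    All parameter values are illustrative, not the published model's."""
    alpha = k / (rho * cp)                       # thermal diffusivity, m^2/s
    T = [250.0] * layers
    samples = []
    steps = int(cycles * period / dt)
    last_cycle_start = steps - int(period / dt)
    for n in range(steps):
        phase = (n * dt % period) / period
        sun = max(0.0, math.cos(2.0 * math.pi * phase))  # zero at night
        flux_in = S * (1.0 - albedo) * sun
        flux_out = eps * SIGMA * T[0] ** 4
        cond = k * (T[1] - T[0]) / dz
        new = T[:]
        # surface slab energy balance
        new[0] = T[0] + dt * (flux_in - flux_out + cond) / (rho * cp * dz)
        # interior diffusion
        for i in range(1, layers - 1):
            new[i] = T[i] + alpha * dt / dz ** 2 * (T[i+1] - 2.0*T[i] + T[i-1])
        new[-1] = new[-2]                        # insulated lower boundary
        T = new
        if n >= last_cycle_start:
            samples.append(T[0])
    return min(samples), max(samples), sum(samples) / len(samples)
```

Even this toy version reproduces the qualitative Diviner picture: daytime peaks near the subsolar equilibrium temperature, a long cold night moderated by heat stored in the top few centimetres, and a mean well below the naive uniform-sphere figure.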


Jennifer Marohasy has a new post, “Revisionist Approach Destroys Information About Natural Cycles Embedded in Climate Data”, which should interest Talkshop readers. The mention of Ken Ring is perhaps less welcome, given his reputation for excessive claims; caveat emptor.

Her take is from an Australian perspective, mentioning a Senator, and the lead author is Australian.

Periodicities in mean sea-level fluctuations and climate change proxies: Lessons from the modelling for coastal management
R.G.V. Baker, S.A. McGowan
BCSS, Faculty of Arts and Sciences, University of New England, Armidale, NSW 2351, Australia
Available online 12 July 2014

Published by Elsevier, so it is paywalled.


Science in action

The Hockeyshtick highlights a peer reviewed paper that puts media personality and ‘official climate science’ promoter Bill Nye in the spotlight.

‘Not only did the authors find that addition of the non-greenhouse gas Argon had similar heating effects to CO2, the Argon control actually heated up slightly more than in the greenhouse gas CO2 experiment, definitively proving that such experiments assume the wrong “basic physics” of radiation were responsible for the heating observed, instead of the limitation of convection due to CO2 having a greater density compared to air.’
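The density ranking at the heart of that argument is easy to check from molar masses via the ideal gas law (molar masses are standard values; the specific-heat figures in the comment below are round numbers):

```python
# Ideal-gas densities at 25 C and 1 atm, to check the density argument
# in the quoted passage. Molar masses in g/mol (standard values).
R = 8.314          # J mol^-1 K^-1
T = 298.15         # K
P = 101325.0       # Pa

molar_mass = {"dry air": 28.97, "Ar": 39.95, "CO2": 44.01}
density = {gas: P * M / 1000.0 / (R * T) for gas, M in molar_mass.items()}

for gas, rho in density.items():
    print(f"{gas}: {rho:.3f} kg/m^3")
# Both Ar and CO2 are denser than air, so both suppress convection in a
# closed container; Ar's lower specific heat (monatomic, roughly
# 520 J/kg/K vs roughly 840 for CO2) means the same absorbed energy
# warms it slightly more, consistent with the result quoted above.
```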



Academic economist Richard Tol has been on the receiving end of some nasty misrepresentation published by the notoriously alarmist UK small-circulation newspaper ‘The Guardian’. One of its ‘columnists’, Dana Nuccitelli, an employee of a big oil and gas outfit called Tetra-Tech, has been writing inaccurate and scurrilous pieces on Tol since he decided to check the quality and accuracy of a paper Dana co-authored with cartoonist John Cook.

Cook runs a parody website called ‘Skeptical science’ which sends up the climate debate with a collection of joke impressions of climate-sceptical talking points and ‘mainstream climate science responses’ to them. Somehow, the Guardian, a self important and supposedly highbrow newspaper, mistook Dana for a real commentator on science and gave him a job as a blogger. Richard writes:

The Guardian has published six hatchet jobs impugning me and my work. The first four are under investigation by the Press Complaints Commission.

For hatchet job #5 and #6, the Guardian granted me the right to reply by return email. They were published together, without a clear structure and in the wrong order, with the first piece heavily edited. Here are the originals.


Reblog from The Hockey Schtick: a new paper in the Quarterly Journal of the Royal Meteorological Society. Pesky radiosonde data again. Maybe the balloon has gone up on the models’ Climate Sim World.

New paper finds climate models violate the ‘basic physics’ of the 2nd law of thermodynamics
A paper published today in the Quarterly Journal of the Royal Meteorological Society finds climate models violate the ‘basic physics’ of the Second Law of Thermodynamics with respect to simulating conventional turbulent heat flow, one of the most important mechanisms of heat transfer in the atmosphere.

According to the authors,
“Numerical models of the atmosphere should fulfil fundamental physical laws. The Second Law of thermodynamics is associated with positive local entropy production and dissipation of available energy.”
i.e. entropy always increases and energy always dissipates per the second law of thermodynamics. …
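The requirement the authors state can be made concrete: when heat flows between two regions, the local entropy production is the flux times the difference of inverse temperatures, and it is positive only for down-gradient flow. A minimal sketch of that bookkeeping (a two-box simplification, not the paper’s formalism):

```python
def entropy_production(flux, T_from, T_to):
    """Entropy production rate (W/K per unit area) when heat `flux`
    flows from a region at T_from to one at T_to.
    Positive for down-gradient (second-law-compliant) flow,
    negative for the up-gradient flow the paper objects to."""
    return flux * (1.0 / T_to - 1.0 / T_from)

# Down-gradient: 10 W/m^2 from 300 K to 280 K -> positive production
ok = entropy_production(10.0, 300.0, 280.0)
# Up-gradient: the same flux run "uphill" -> negative production,
# i.e. a second-law violation if a turbulence scheme produces it
bad = entropy_production(10.0, 280.0, 300.0)
```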

Link to THE HOCKEY SCHTICK and paywalled paper.


From EUrActiv:


A big challenge for the next European Commission will be to disconnect its evidence gathering processes from the “political imperative” that’s driving policy proposals, according to Anne Glover, the EU’s chief scientific advisor.

Speaking before the EU elections last week, Glover reflected upon her role, which was introduced by the outgoing President of the European Commission, José Manuel Barroso.

Glover was appointed in December 2011 to provide the President of the EU Executive with first-class independent scientific advice. A trained biologist who holds a chair in Molecular and Cell Biology at the University of Aberdeen, she previously served as chief scientific advisor for Scotland (2006-2011).

More than two years into her job, she seems to have learned a great deal about the internal workings of the EU’s flagship institution.

And her assessment of what goes on inside the Commission’s walls is not rosy.


Figure 1

“A review of Holocene solar-linked climatic variation on centennial to millennial timescales: Physical processes, interpretative frameworks and  a new multiple cross-wavelet transform algorithm”
Willie Soon, Victor M. Velasco Herrera, Kandasamy Selvaraj, Rita Traversi, Ilya Usoskin, Chen-Tung Arthur Chen, Jiann-Yuh Lou, Shuh-Ji Kao, Robert M. Carter, Valery Pipin, Mirko Severi, Silvia Becagli


This is a review work introducing a new proxy, nitrate, and fancy wavelet methods. It discusses the 1400–1800 year “cycle” but sits on the fence on whether these are one and the same.


Physics secret of the pyramid builders?

Posted: May 3, 2014 by oldbrew in methodology

Great Pyramid of Giza [image credit: Wikipedia]

Great Pyramid of Giza
[image credit: Wikipedia]

‘There have been many hypotheses about the Egyptian pyramid construction techniques. These techniques seem to have developed over time; later pyramids were not built the same way as earlier ones. Most of the construction hypotheses are based on the idea that huge stones were carved with copper chisels from stone quarries, and these blocks were then dragged and lifted into position. Disagreements chiefly concern the methods used to move and place the stones.‘ – Wikipedia

Q. How do you haul a sledge with a massive stone on it through the desert?


Article by Peter Morcombe (gallopingcamel) with some assistance from Tim Channon.


While investigating Nikolov & Zeller’s “Unified Theory of Climate” it seemed odd that professional scientists could not agree what the temperature of an airless Earth should be. Given that one needs to know this in order to compute the Greenhouse Effect (GHE), I tried to settle the question by analyzing the Diviner LRE data that accurately mapped the Moon’s surface temperature. This effort failed as my spreadsheet could not handle even the “Level 3” data. The Diviner team did much better and showed that the Moon’s average temperature is 197.3 Kelvin.

While the temperature of the Moon is now known with impressive precision, would an airless Earth have the same temperature or would the different rates of rotation have an effect?
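The reason rotation rate (and the averaging itself) matters is the T⁴ emission law: averaging the temperature is not the same as averaging the flux. A sketch contrasting the uniform-temperature effective value with the area-weighted mean of a non-rotating airless sphere (local radiative equilibrium on the dayside, an assumed nightside temperature; both are deliberate simplifications, not the Diviner team’s method):

```python
import math

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def t_eff(S, albedo):
    """Effective temperature if absorbed sunlight is spread uniformly
    over the whole sphere (the textbook ~255 K calculation for Earth)."""
    return (S * (1.0 - albedo) / (4.0 * SIGMA)) ** 0.25

def t_mean_airless(S, albedo, T_night=50.0, n=100000):
    """Area-weighted mean surface temperature of a non-rotating airless
    sphere: each dayside point sits in local radiative equilibrium,
    T = T_ss * cos(zenith)**0.25, and the nightside is pinned at an
    assumed T_night. Both assumptions are simplifications."""
    T_ss = (S * (1.0 - albedo) / SIGMA) ** 0.25   # subsolar temperature
    dz = (math.pi / 2.0) / n
    num = den = 0.0
    for i in range(n):
        z = (i + 0.5) * dz                        # solar zenith angle
        w = math.sin(z)                           # area weight on the sphere
        num += T_ss * math.cos(z) ** 0.25 * w * dz
        den += w * dz
    day_mean = num / den                          # analytically 0.8 * T_ss
    return 0.5 * day_mean + 0.5 * T_night

# Earth-like numbers: uniform averaging gives ~255 K, while the airless
# integral gives a far colder mean. The gap between the two answers is
# exactly the disagreement described above.
uniform = t_eff(1361.0, 0.3)
airless = t_mean_airless(1361.0, 0.3)
```

Faster rotation and regolith heat storage pull the true mean up from the non-rotating limit towards the uniform value, which is why the rotation-rate question in the paragraph above is not idle.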


My thanks to Tony Thomas for giving the talkshop the exclusive of his take on this breaking news item:

Gergis findings re-surface – the Hockey Stick lives!

By Tony Thomas 31-03-2014

Hello again Hockey Stick, goodbye global Medieval Warming Period.

These are the conclusions of a multi-proxy 1000-year climate reconstruction published today (March 31) in Nature Climate Change, by Dr Raphael Neukom of the Oeschger Centre at the University of Bern, and Dr Joelle Gergis of the University of Melbourne.

Dr Neukom summed up for a University of Melbourne press release:

The study showed the ‘Medieval Warm Period’, as identified in some European chronicles, was a regional phenomenon. 

During the same period, temperatures in the Southern Hemisphere were only average. Our study revealed it was not a common climate event that many people have previously assumed.

The paper claims that in 99.7 percent of the results, the warmest decade of the millennium occurred after 1970.

The press release says, “And surprisingly, only twice over the entire past millennium have both hemispheres simultaneously shown extreme temperatures.

One of these occasions was a global cold period in the 17th century; the other was the current warming phase.”[1]


Guest Post emailed to me  by Tony Thomas, originally published at Quadrant online:

Finally, Some Real Climate Science
Tony Thomas 18-3-2014

The American Physical Society has been amongst the loudest alarmist organisations whipping up hysteria about CO2, but a review of its position that has placed three sceptics on the six-member investigatory panel strongly suggests the tide has turned.

The 50,000-strong American body of physicists, the American Physical Society (APS), seems to be turning significantly sceptical on climate alarmism.

The same APS put out a formal statement in 2007 adding its voice to the alarmist hue and cry. That statement caused resignations of some of its top physicists (including 1973 Nobel Prize winner Ivar Giaever and Hal Lewis, Emeritus Professor of Physics, University of California, Santa Barbara).[1] The APS was forced by 2010 to add some humiliating clarifications but retained the original statement that the evidence for global warming was ‘incontrovertible’.[2]

By its statutes, the APS must review such policy statements each half-decade and that scheduled review is now under way, overseen by the APS President Malcolm Beasley.

The review, run by the society’s Panel on Public Affairs, includes four powerful shocks for the alarmist science establishment.[3]


Nicola Scafetta and Richard Willson have a new paper in press which contains the most thorough analysis yet of the intercomparison of the empirical ACRIM and modeled PMOD TSI series. It’s a comprehensive yet readable paper of high interest to all diligent climate researchers interested in determining the relative strengths of various climate drivers. It is also an important historical document for philosophers of science investigating the shift from observation-based empirical solar science to model-based dogma underpinning preconceptions of the power of trace gases to control Earth’s surface temperature. The IPCC and Team Wassup’s Leif Svalgaard are not going to like it, and will therefore try to ignore it, thus further undermining their credibility.


ACRIM total solar irradiance satellite composite validation versus TSI proxy models
Nicola Scafetta & Richard C. Willson

From the paper:

PMOD TSI composite (Fröhlich and Lean 1998; Fröhlich 2004, 2006, 2012) is essentially a theoretical model originally designed to agree with Lean’s TSI proxy model (Fröhlich and Lean 1998). It relies on postulated but experimentally unverified drifts in the ERB record during the ACRIM Gap, and other alterations of the published ERB and ACRIM results, that are not recognized by their original experimental teams and have not been verified by the PMOD by original computations using ERB or ACRIM1 data.


Nicola Scafetta has reminded me to revisit his global average surface temperature (GST) forecast (cyan area), which he derived at the start of 2012 from a simple phenomenological model using solar system planetary frequencies. It is clearly much more accurate than the IPCC projection from 2000 (shown in green on the figure below).

Click for full size image

Click for full size image

See below the break for more information on the Solar-Planetary Theory Nicola used.
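In outline, a phenomenological model of this kind is a least-squares fit of a trend plus sinusoids at assumed astronomical periods to the temperature record. A self-contained sketch on synthetic data (the periods, amplitudes, trend and noise level here are invented for illustration, not Scafetta’s published values):

```python
import math, random

def design_row(tc, periods):
    """Basis functions: intercept, linear trend, sin/cos at each period."""
    row = [1.0, tc]
    for P in periods:
        w = 2.0 * math.pi / P
        row += [math.sin(w * tc), math.cos(w * tc)]
    return row

def lstsq(X, y):
    """Solve the normal equations X^T X b = X^T y by Gaussian
    elimination with partial pivoting (small, dense systems only)."""
    m = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(m)] for i in range(m)]
    b = [sum(r[i] * v for r, v in zip(X, y)) for i in range(m)]
    for i in range(m):
        p = max(range(i, m), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, m):
            f = A[r][i] / A[i][i]
            for c in range(i, m):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    coef = [0.0] * m
    for i in range(m - 1, -1, -1):
        coef[i] = (b[i] - sum(A[i][c] * coef[c]
                              for c in range(i + 1, m))) / A[i][i]
    return coef

# Synthetic "temperature" series: linear trend plus 60-yr and 20-yr
# cycles plus noise. All numbers below are invented placeholders.
random.seed(0)
periods = [60.0, 20.0]
times = [0.1 * i for i in range(1640)]            # years since 1850
truth = [0.004 * tc + 0.15 * math.sin(2 * math.pi * tc / 60.0)
         + 0.05 * math.cos(2 * math.pi * tc / 20.0) for tc in times]
obs = [v + random.gauss(0.0, 0.05) for v in truth]

X = [design_row(tc, periods) for tc in times]
coef = lstsq(X, obs)
fit = [sum(c * x for c, x in zip(coef, row)) for row in X]
amp60 = math.hypot(coef[2], coef[3])              # recovered 60-yr amplitude
```

The fit recovers the trend and cycle amplitudes from the noisy series; the forecasting step then amounts to extrapolating the fitted basis functions forward, which is where such models stand or fall.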


prof. Giovanni P. Gregori – Professor of Terrestrial Physics and CNR researcher at the Istituto di Acustica O.M. Corbino, C.N.R., Rome. 1963-2001: CNR researcher at IFA/CNR (Istituto di Fisica dell’Atmosfera), Rome, tasked with studying Sun-Earth relations. His work on polar aurorae and geomagnetism (1963-1975) led him to a model of the magnetosphere (1970-1972) considered one of his best results.

One of our merry band of collaborators on our Special Edition of Pattern Recognition in Physics, the journal axed by executive officer Martin Rasmussen of parent publishing house Copernicus, and castigated by science blogger Anthony Watts, is Italian physics professor Giovanni P. Gregori. Here’s the letter he sent to Rasmussen:

Martin Rasmussen, Esq.,
Copernicus Publications.

Ref.: Pattern Recognition in Physics

Dear Mr. Rasmussen,

following the letter by the Viscount Monckton of Brenchley, I guess I have to spend a few words on this unfortunate controversy.

I would like to begin by recalling a few statements by Jules-Henri Poincaré (1854-1912).

“La liberté est pour la Science ce que l’air est pour l’animal”
["Freedom is for Science much like air for an animal"
Dernières pensées, appendice III]

“La pensée ne doit jamais se soumettre, ni à un dogme, ni à un parti,
ni à une passion, ni à un intérêt, ni à une idée”
["Never submit thought to any dogma, or to any party,
or to any passion, or to any interest, or to any idea"]

“La pensée n’est qu’un éclair au milieu d’une longue nuit.
Mais c’est cet éclair qui est tout”
["Thought is but a flash of lightning in the middle of a long night.
But this flash is everything"]

Science is made of ideas, both correct and wrong. How can we assess what is correct if it is not compared with what is wrong? Observations, models, extrapolations, forecasts, etc. are not science. They are only tentative applications of science. But science is made of ideas.


Having been down a similar route with my own simple model, which replicates HadSST3 to a Pearson R^2 of 0.9 for monthly data, I’m happy to put up this model by Andrew McRae, which achieves an accuracy of R^2 = 0.95 for smoothed data using a slightly different technique. Sunspot numbers are a major component, although so far as I can tell there is no integration to simulate ocean heat retention as there is in my own effort. Hopefully he’ll make the spreadsheet available for sharing with interested parties:

Andrew writes:

I was inspired by the work of Dan Pangburn and decided to try to create a simple climate model using the external solar magnetic forcing and internal 60yr ocean cycle as the main factors, with a bit of CO2 thrown in just so it doesn’t feel left out.
The results were quite… interesting.

Here is a screen shot of the model output compared to measurements, plus a few background details in the caption of which I’m sure all of you are already aware but I wanted to write the caption for a potentially wider audience.
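The R² figures quoted above are coefficients of determination between model output and observations. For readers wanting to replicate the comparison, a minimal implementation (the series below are invented placeholders, not HadSST3 or the model’s actual output):

```python
def r_squared(model, obs):
    """Coefficient of determination of `model` against `obs`:
    1 - SS_residual / SS_total. This equals the squared Pearson
    correlation only when the model is an ordinary least-squares
    fit to obs; for an independent model it can be lower."""
    n = len(obs)
    mean_obs = sum(obs) / n
    ss_res = sum((o - m) ** 2 for o, m in zip(obs, model))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

# Invented example series, standing in for observed anomalies and
# a model's hindcast of them.
obs   = [0.10, 0.12, 0.08, 0.15, 0.20, 0.18, 0.25, 0.22]
model = [0.09, 0.13, 0.10, 0.14, 0.19, 0.20, 0.23, 0.24]
score = r_squared(model, obs)
```

Note that a high R² on smoothed data is easier to achieve than on monthly data, since smoothing removes much of the variance the model would otherwise have to explain.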


How to eliminate malaria

Posted: November 7, 2013 by tchannon in methodology

If it proves correct, a paper by Lena Huldén, Ross McKitrick and Larry Huldén in a journal of the Royal Statistical Society gives the blueprint for how to eliminate malaria.

The solution turns out to be reducing the number of humans sleeping as one unit below a threshold of 4 to 4.5; with fewer, malaria will die out.

At first this might seem illogical as a causal mechanism, but consider the feeding habit of the malaria mosquito: it later returns close to the same place for another blood meal, hence infection is transmitted between the humans sleeping there.

Break the threshold and the parasite becomes less infective than is necessary for survival.

This is more good statistical work.

I’ll let Bishop Hill lead, link there to a lay explanation by Ross.
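The threshold behaves like a basic reproduction number: if each infected sleeper passes the parasite on to fewer than one other person on average, transmission chains die out. A toy branching-process sketch with an invented per-contact probability (this is an illustration of the threshold idea, not the paper’s statistical model):

```python
import random

def extinction_fraction(household_size, p_contact=0.28, trials=500,
                        max_generations=60, cap=500, seed=1):
    """Toy branching process for within-household malaria transmission.
    Each infected person infects each of the other (household_size - 1)
    sleepers independently with probability p_contact (an invented
    number). Mean offspring = p_contact * (household_size - 1); the
    chain almost surely dies out when that falls below 1.
    Returns the fraction of simulated outbreaks that went extinct."""
    rng = random.Random(seed)
    extinct = 0
    for _ in range(trials):
        infected = 1
        for _ in range(max_generations):
            if infected == 0 or infected > cap:
                break
            contacts = infected * (household_size - 1)
            infected = sum(1 for _ in range(contacts)
                           if rng.random() < p_contact)
        if infected == 0:
            extinct += 1
    return extinct / trials

# Below the threshold (4 sleepers: mean offspring 0.84) outbreaks die
# out; above it (6 sleepers: mean offspring 1.40) many persist.
small = extinction_fraction(4)
large = extinction_fraction(6)
```

The sharp change in outcome either side of mean offspring = 1 is the generic reason a hard threshold in sleeping-unit size can appear in the data.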


Excerpt from Miles Mathis’ long paper on the precession of the perihelion of Mercury (short version here). I’m posting this for its general applicability, without getting into a Hu Flung Dung about relativity (which Mathis isn’t challenging anyway – just the errors in its calculation).

I have heard some complain, regarding my papers, that I talk too much. Physicists and mathematicians are used to being fed just the equations, with very little or no explanation of what the equations are representing. But although I understand the complaint, I refuse to recognize it as valid. In my opinion, the slowness of progress in physics has been caused by this refusal of mathematicians and scientists to tell us what is going on. How else could Einstein’s major mistake in this problem have stood for so long? It is because he gave us no explanations with his equations. He skipped all or most of the procedural steps, and just supplied the bald derivation of the number. Yes, the use of the tensor calculus made the math quite lengthy, but the procedure was still but a skeleton. Einstein refused to lead us through his maze, so that we could see what was happening all along.


Via GWPF

Date: 08/10/13 Richard Lindzen, MIT

Each IPCC report seems to be required to conclude that the case for an international agreement to curb carbon dioxide has grown stronger. That is to say the IPCC report (and especially the press release accompanying the summary) is a political document, and as George Orwell noted, political language “is designed to make lies sound truthful and murder respectable, and to give an appearance of solidity to pure wind.”

With respect to climate, we have had 17 years without warming; all models show greater tropical warming than has been observed since 1978; and arctic sea ice is suddenly showing surprising growth. And yet, as the discrepancies between models and observations increase, the IPCC insists that its confidence in the model predictions is greater than ever.


The IPCC base their claims about how much warming is ‘in the pipeline’ on the rate at which carbon dioxide is eventually removed from the atmosphere, the ‘e-folding time’. This is different to the time it takes for any joe-average CO2 molecule to be re-absorbed in the carbon cycle, and reflects the assumptions made about the way the carbon cycle operates.

The IPCC relies on the ‘Bern model’, which was cooked up many years ago by Bert Bolin and other atmospheric scientists of the warmist persuasion. The Bern model makes assumptions which lead to a very long e-folding time of hundreds of years, a figure which has been disputed by several able researchers, and discussed here at the talkshop in previous posts.

Now talkshop co-blogger Tim Channon has made novel use of data showing what has happened to levels of CO2 carrying the radioactive carbon-14 isotope since the atmospheric nuclear bomb tests of the early 1960s. The results look like another hole below the waterline for the IPCC and the climate alarmists. – tb
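The contrast at issue: a Bern-style response represents a CO2 pulse as a constant fraction plus a sum of exponentials with very different time constants, which is what produces the long effective e-folding time, whereas a single-exponential low-pass response decays far faster. A sketch comparing the two impulse responses (the Bern coefficients are the widely quoted Bern2.5CC values; the 16-year alternative is an assumed figure of the kind the bomb-test data suggest):

```python
import math

# Widely quoted Bern2.5CC impulse-response coefficients (IPCC AR4,
# Table 2.14 footnote); treat the comparison as illustrative.
BERN_A0 = 0.217
BERN_TERMS = ((0.259, 172.9), (0.338, 18.51), (0.186, 1.186))

def bern_response(t):
    """Fraction of an emitted CO2 pulse remaining after t years under
    the Bern-style multi-exponential response: a constant airborne
    fraction plus three decaying modes."""
    return BERN_A0 + sum(a * math.exp(-t / tau) for a, tau in BERN_TERMS)

def single_exp_response(t, tau=16.0):
    """Single e-folding alternative (tau is an assumed value, not an
    IPCC number)."""
    return math.exp(-t / tau)

# After a century the Bern model still retains over a third of the
# pulse, while a 16-year low-pass response has essentially all gone.
frac_bern = bern_response(100.0)
frac_simple = single_exp_response(100.0)
```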


Figure 1


Figure 2

Figures 1 and 2 demonstrate northern[1] and southern[2] hemisphere decay from a Dirac injection[3] of a test signal. The effect is very close to perfectly linear: proportionality between pressure and the effect of pressure holds over more than an order of magnitude of data variation (hence a linear law), with deviation of less than 1%. This seems to destroy the IPCC claim of a non-simple law.

In addition the effect is a simple low pass filter on all kinds of atmospheric carbon dioxide. A later article might cover this in detail.
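Given the bomb-spike series, the e-folding time follows directly from a log-linear least-squares fit. A sketch on synthetic data (the real Δ14C values must come from the published records; the numbers below are invented placeholders):

```python
import math

def efolding_time(times, excess):
    """Least-squares fit of ln(excess) = a - t/tau, i.e. a single
    exponential decay; returns the e-folding time tau in years."""
    logs = [math.log(v) for v in excess]
    n = len(times)
    tbar = sum(times) / n
    ybar = sum(logs) / n
    slope = (sum((t - tbar) * (y - ybar) for t, y in zip(times, logs))
             / sum((t - tbar) ** 2 for t in times))
    return -1.0 / slope

# Synthetic bomb spike: excess Delta-14C decaying with a 16-year
# e-folding time (both the 700 and the 16 are invented placeholders).
years = list(range(1964, 1994))
excess = [700.0 * math.exp(-(yr - 1964) / 16.0) for yr in years]
tau = efolding_time(years, excess)
```

On real data the fitted tau is only meaningful if the residuals from the straight line in log space are small, which is exactly the linearity check the figures above are making.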

Note: this article is cross-posted from the author’s blog; discussion is probably more appropriate on the Talkshop. Some of the content has been the subject of wide discussion around the ’net but not, so far as I know, here. — Tim


I’ve registered my interest in setting up some Massive Open Online Courses (MOOCs) at Google’s new venture in partnership with edX, the long-running MOOC provider partnership of Harvard, MIT and 26 other leading global educational institutions. It offers the possibility of using free services to run courses with interactive learning tools in an educationally supportive environment. Given the complex nature of the material we are dealing with here at the talkshop, I see possibilities for taking advantage of what Google is offering. We could use the collaborative space for holding online conferences with video, and whiteboard scratchpads everyone can doodle on for instant sharing of concepts, data etc. I’m wary of Google, but they do some things well, and the wider the audience for our ideas the better. Nothing we do is a secret, we believe in open and collaborative development of ideas, so it all seems to mesh. Ideas, worries and criticisms welcome.

The Conversation has this:

The entrance of Google onto the Massive Open Online Courses market this week has the potential to reignite the spirit of openness that saw these alternative routes into higher education emerge in the first place.

The internet giant is to work with Harvard University and the Massachusetts Institute of Technology on a website, which will go live next year.

MOOCs have exploded over the past year. With companies like Coursera teaming up with a number of US universities to offer free online courses and the Open University launching FutureLearn to offer an alternative for UK universities, it seems that everyone is frantically scrambling for a MOOC solution.