Lisbon workshop output example: Climate Datasets

Posted: January 30, 2011 by tallbloke in climate

Contributor B. Kindseth says:
“In engineering, there are established properties for every material that you use and established specifications for manufacturing processes and analytic procedures. When a model is built, every input is based on solid scientific and empirical ground. But it does not end there. After a design is completed, hardware is built and tested. Data from the tests are used to match the model to the data. I do not see much similarity in the global warming science community.

The IPCC documents are not a statement of science, but propaganda documents by definition, which only present one side of the science. We need to get back to the basics of science, possibly with a web site which is an encyclopedia of basic information. That should include definitions of acceptable statistical procedures.”

Absolutely! This is one of the issues we discussed at Lisbon, with particular reference to climate datasets. Our table had a discussion on this and produced a statement at the end of the hour. Here it is:

There are numerous different datasets used to measure aspects of climate and its changes over time. There is a need for a thorough and objective assessment of the strengths and weaknesses of the different datasets for the purposes of climate measurement.

There is also a need for a wider understanding of the implications of selecting particular indices when there are multiple measures of the same underlying quantity, especially when there is no agreement on what the most appropriate measurement is.

The climate sciences would benefit from the adoption of procedures, definitions and datasets which are validated by agreed standards. We recommend the commencement of a process which will lead to the fulfillment of this aim.

I know this seems elementary and obvious, but you’d be surprised at the amount of negotiation between the various points of view expressed to arrive at this short statement. Negotiation isn’t easy! Perhaps the main point to take away is that this represents the common ground achieved through discussion between people who don’t usually discuss these issues together. Make of it what you will.

Comments
  1. Tenuc says:

    Back to basics is exactly where climate science needs to go. Because current assumptions about climate depend upon these unreliable data-sets, those assumptions should go out the window as of now.

    They also need to stop treating climate as a simple linear system and start working to invent a toolkit which will enable them to reliably assess what is a highly non-linear dynamic system driven by deterministic chaos.

    Unless real science starts to be done on the issue of climate, no progress can be made.

  2. tallbloke says:

    Hi Tenuc. Yes, that’s why I mostly pontificate about internal climate changes and calculate solar stuff, which is less controversial and can be better defined.

  3. Roger Andrews says:

    “There are numerous different datasets used to measure aspects of climate and its changes over time.” Indeed there are. Three of them are sea surface temperatures, lower troposphere temperatures and ocean heat content, none of which show any warming since about 2000. But the data set we are supposed to be using is the surface air temperature record, and it shows about 0.3 °C of warming since 2000. (Why is it different to the other data sets? After years of head-scratching I still don’t know, but it isn’t UHI effects.)

    “The climate sciences would benefit from the adoption of … data sets which are validated by agreed standards.” Indeed they would. Right now the world’s “official” surface temperature time series is HadCRUT3, which is constructed by applying large corrections that range from ad-hoc to demonstrably invalid to the sea surface and surface air temperature records and then by adding these two incompatible data sets together. Speaking as someone with experience in validating data sets to regulatory agency standards, I can guarantee that anyone who took HadCRUT3 to a bank and tried to borrow money against it would rapidly be defenestrated.

  4. R. de Haan says:

    And still the most prominent claim, besides the claim that the planet is warming due to CO2, is “consensus”. May I have a big laugh?

    Ignorance and tunnel vision provide a better explanation.

  5. tallbloke says:

    Over on Judy Curry’s site, commenter Theo Goodwin says:

    Theo Goodwin | January 30, 2011 at 7:53 pm
    Total reconciliation is a total no brainer. Simply get together a panel of AGW scientists and sceptics that is satisfactory to people on the internet, broadcast their meetings live on the internet, and give each panelist a live internet connection. The topic to be debated is the most important in all categories listed. The topic is:
    Design a measurement system for temperature and other essential items, such as heat, that will satisfy both sceptics and AGW scientists on the panel and that can be readily understood by the educated citizen. Design a system for management of the measurement system that is transparent to every educated citizen and, thereby, guarantees that measurement reports are not biased. Design a system for implementation of both the measurement and management systems. Until these systems are up and running, redirect all funding for climate research to the creation and implementation of these systems.

    This no-brainer proposal is necessary to the stated goals because:

    1. Disagreement with mutual respect
    There can be no respect until the data are trustworthy and their management transparent.

    2. Find better ways to communicate criticism.
    Science cannot engage in its own natural process, which is data driven criticism, until there is trustworthy data and transparent systems for data management and reporting.

    3. Find better ways to admit mistakes without damage to reputation.
    Accept that the data shows that your hypotheses are not confirmed. Such acceptance is part-and-parcel of science and should occasion no harm to reputation.

    4. Find some common ground, something to work on together.
    Trustworthy data and transparent management of reports on data.

    5. Find where interests intersect.
    See 4.

    6. Importance of transparency
    See 5.

    7. Communication engenders trust.
    Communication about trustworthy data managed in a transparent process not only engenders trust but embodies it.

    8. Search for win-win solutions.
    The first and only win-win solution that is available is a system of measurement that everyone embraces, a system of management that is transparent to all, and a system of reporting whose transparency renders bias impossible. Until this win-win solution is embraced, there are no other win-win solutions.

  6. Beano says:

    Unfortunately we all know that the science content of The Climate Crisis was lost years ago. It’s all U.N. and activist politics now, all about Agenda 21, which includes the transfer of wealth.

    Ideologically opposed climate and earth scientists may start to agree to sort out their differences, but the political forces have moved on.

  7. P.G. Sharrow says:

    The actual land air temperature records still exist. The databases are still intact. The problem is in the “adjustments” and in the use of the data in the climate computer databases to make propaganda pronouncements. E.M. Smith on his “Chiefio” site examined the GISS raw database in great depth and found no warming signal.

    The air temperature increase consists of “adjustments”: selectively ignoring cool stations, adding UHI to rural station records that show no warming, applying cooling corrections to old records, and lately concentrating on the use of recordings from the middle of international airports.
    The adjusted records used to show warming in California come from 4 airports near the coast, and these are used to project the temperature for the whole 680-mile-long state, from the ocean to the 10,000 ft snow-covered mountains.
    The arctic temperature is projected from stations up to 1,200 km away.

    But the actual stations are still collecting the records. New databases can be created. Anthony Watts has made a good start on laying the groundwork for station examination, to create usable data from each station so that “corrections” will be acceptable to real scientists. pg

  8. I have been using a 2002 set of the Coop summary of the daily records from all ~22,000+ reporting stations. From my examination of the original records from several local stations, compared to the Coop TD3200 data set, it appears that the archived daily raw data is still uncorrupted and unadjusted. It would seem they took the lazy way out and only adjusted or modified the monthly average data from the stations I looked at, hiding the small changes in the process of computing the average, without quoting all of the metadata of the stations used each day for the whole month.

    For my analog forecast purposes, the raw daily data from all reporting stations is used, for maximum definition of the effects of large bodies of water and of UHI, with stations an average distance of 15 to 20 miles apart, gridding the data on 0.1 degree Lat/Long coordinates. The post-beta product has improved 1 °F (single degree) contour line steps for temperatures, and a better method of seasonal shift adjustment that yields closer tracking of extreme excursions of invading cold air masses.
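
    A minimal sketch of this kind of gridding step, assuming each daily station record is just (lat, long, mean temperature) and using simple inverse-distance weighting. The station values, search radius and function names here are illustrative assumptions, not the actual Coop/TD3200 tooling:

    ```python
    import math

    # Hypothetical daily station records: (lat, long, mean temp in F).
    # Real Coop/TD3200 records carry much more metadata; this is a toy set.
    stations = [
        (39.05, -95.68, 41.0),
        (39.12, -95.40, 39.5),
        (38.95, -95.95, 42.2),
    ]

    GRID_STEP = 0.1  # 0.1 degree lat/long cells, as described above

    def idw_temp(lat, lon, stations, power=2.0, max_dist_deg=0.5):
        """Inverse-distance-weighted temperature at one grid node."""
        num = den = 0.0
        for s_lat, s_lon, s_temp in stations:
            d = math.hypot(lat - s_lat, lon - s_lon)
            if d < 1e-9:
                return s_temp          # node sits exactly on a station
            if d > max_dist_deg:
                continue               # ignore stations too far away
            w = 1.0 / d ** power
            num += w * s_temp
            den += w
        return num / den if den else None  # None = no nearby data that day

    def grid_day(stations, lat0, lat1, lon0, lon1):
        """Build one day's grid; cells with no nearby station stay None."""
        grid = {}
        n_lat = int(round((lat1 - lat0) / GRID_STEP)) + 1
        n_lon = int(round((lon1 - lon0) / GRID_STEP)) + 1
        for i in range(n_lat):
            for j in range(n_lon):
                lat = round(lat0 + i * GRID_STEP, 1)
                lon = round(lon0 + j * GRID_STEP, 1)
                grid[(lat, lon)] = idw_temp(lat, lon, stations)
        return grid

    grid = grid_day(stations, 38.9, 39.2, -96.0, -95.3)
    ```

    Because missing station-days simply leave None in the affected cells, interrupted series only blank out local detail until the data returns, which matches the point made further down the thread.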

    With an improved gridding program for increased tracking of station metadata, I will add an algorithm for excising the past pulses (of about 5 to 7 days in length) of high precipitation trends due to past outer planet synod conjunctions. Because the pattern repeats slightly later every year, these repeating patterns can be compiled into a composite set of values for the outer planetary interference (in the sun/earth/lunar base pattern) one can expect for the current period, to be reinserted into the formation process of the current grid from which the maps are generated.
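
    A toy illustration of the compositing idea, under the assumption that past cycles sit in one long daily series and each repeat starts a fixed number of days later than a pure cycle-length repeat. The cycle length and shift used below are placeholders, not the commenter’s actual values:

    ```python
    import math

    def composite_cycles(daily_values, cycle_len, shift_per_cycle):
        """Average aligned repeats of a cycle into one composite series.

        Assumes repeat c starts at c * (cycle_len + shift_per_cycle),
        i.e. the pattern recurs slightly later each time around.
        """
        composite = []
        for day in range(cycle_len):
            samples = []
            c = 0
            while c * (cycle_len + shift_per_cycle) + day < len(daily_values):
                samples.append(daily_values[c * (cycle_len + shift_per_cycle) + day])
                c += 1
            composite.append(sum(samples) / len(samples))
        return composite

    # Toy usage: three repeats of a 30-day pattern, each starting 2 days later.
    series = [math.sin(2 * math.pi * (d % 32) / 30) for d in range(3 * 32)]
    print(composite_cycles(series, cycle_len=30, shift_per_cycle=2)[:5])
    ```

    The composite can then be compared against, or blended into, the current grid when generating the maps.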

    As a usable working tool for understanding the dynamics at work, I hope to put together a program for the selection and viewing of stand-alone maps for each of the past cycles, as well as a composite of the three compared to the most recent actuals (yesterday’s, and the past 60-90 days), in a progression of concurrent surface maps and satellite photos. From these a forecaster could evaluate the differences between past cycles and the current weather, and tweak the next 60-90 days, making for a more interactive forecasting process than is now used.

    As long as the original raw data is still within a degree of what was written down years ago when it was collected, it should be possible to forecast to within 5 degrees more than a week out, and adjusting the real-time forecast with the above past-data viewing tool should make an interesting piece of open source free software.

  9. tallbloke says:

    Awesome.
    The Central England Temperature records are available to researchers. Ulric Lyons has a copy, but those are annual averages. I don’t know about daily records. Can Anthony Watts help?

  10. I would want/need daily values and lat/long metadata for each reporting station. Interrupted series from stations would only slightly affect the maps in that area on the days data was missing; when the data is present, the increased detail returns. Because I am not trying to establish a long-term average, but to display the most detailed map data possible, gaps in area coverage and temporarily dropped data from random stations will not degrade the output.

    Starting on database availability searches tomorrow; my daughter (my tech assistant and developer contact) went out on a date tonight.

    I have not contacted Anthony Watts directly as of yet, nor has he contacted me. I have a nasty habit of trying to be totally self-sufficient, which slows me down some, I’d bet.