NIST and TSI: “unknown systematic bias”

Posted: June 18, 2013 by tchannon in Measurement, methodology, Solar physics, Uncertainty


Image from PDF on NASA/NIST TSI Workshop, July 18-20, 2005, R. C. Willson

The brief presentation linked above shows some of the story, more follows…

The range of absolute total solar irradiance (TSI) values measured by different exo-atmospheric radiometers is currently about 5 W/m2, which is about 0.35 % (3500 × 10⁻⁶, Fig. 1) of the exo-atmospheric absolute TSI value at a distance of 1 astronomical unit (AU) from the Sun. This difference is greater than the individual standard uncertainties reported for most of these instruments, and greater than the 0.02 % per decade value typically stated as required to understand solar vs. anthropogenic forcing in climate change. The discrepancy between different instruments during the same time indicates the presence of unknown systematic bias.

Sources of Differences in On-Orbital Total Solar Irradiance Measurements
and Description of a Proposed Laboratory Intercomparison.
Journal of Research of the National Institute of Standards and Technology
Volume 113, Number 4, July-August 2008
J. J. Butler, National Aeronautics and Space Administration
B. C. Johnson, J. P. Rice, and E. L. Shirley
National Institute of Standards and Technology

paper here, PDF
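The quoted figures are easy to sanity-check. A minimal sketch, assuming an absolute TSI of roughly 1366 W/m2 (the commonly used value of that era; the exact reference value is my assumption, not taken from the paper):

```python
# Rough arithmetic behind the quoted figures.
tsi = 1366.0   # W/m^2, assumed absolute TSI at 1 AU (paper-era convention)
spread = 5.0   # W/m^2, range between instruments, as quoted

relative_spread = spread / tsi   # about 0.0037, i.e. roughly the 0.35 % quoted
required = 0.02 / 100            # 0.02 % per decade stability goal

print(f"instrument spread: {relative_spread:.2%}")
print(f"spread vs. required stability: about {relative_spread / required:.0f}x")
```

So the disagreement between instruments is nearly twenty times larger than the per-decade stability the climate application is said to need.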

This paper seems to originate in 2005, so it is old, but it is nevertheless a telling admission: the US equivalent of the UK’s NPL admitting things are a travesty, that for all the effort the results are wrong.

Perhaps the meaning is not entirely clear.

We know the various satellite-based instruments for measuring TSI disagree; this well-known plot shows it:


NIST are stating that the differences should not exist and that they do not know why they do. I assume this is because NIST were involved with the metrology of several (or all) of these instruments. (Metrology is the name for the field, the science of measurement if you like; there ought to be a metrologist as well as a statistician in every science team.)


Oops. This is from the SORCE December 2006 newsletter, where there is more information about ground-based testing of the TIM instrument retained as a reference (the other is flying). Still no dice: the comparison just shows error.

This does reveal some thinking.

The bridging problem

I go on about bridging quite often in connection with meteorological instruments. Any significant change of equipment or conditions makes it necessary to run the old and new together for an extended period; only then can the old be decommissioned. Expensive, but that is how it is, and it is violated all the time.

The newsletter mentions this using different language, but also mentions an intent to produce absolute instruments so superb they can simply be dropped in. Sorry, I don’t care how good they are: good practice is just that. As it stands the results agree only to the same order of magnitude (the same decade), which is not accuracy of the same order, and then an order better still is necessary. Even at that I am sceptical the instrument measures the entire solar output, nor have they mentioned the electronics or signal processing.
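The bridging idea above amounts to estimating the offset between old and new instruments over the overlap period, then splicing the records onto one scale. A minimal sketch with synthetic numbers (not real TSI data):

```python
import statistics

# Synthetic overlapping daily readings from an old and a new instrument (W/m^2).
old = [1365.8, 1365.9, 1366.0, 1365.7, 1365.9]
new = [1361.1, 1361.2, 1361.3, 1361.0, 1361.2]

# The bridging offset is the mean difference over the overlap period.
offset = statistics.mean(o - n for o, n in zip(old, new))

# Continue the old record on the new instrument's scale.
spliced = [o - offset for o in old]
print(f"offset: {offset:.2f} W/m^2")
```

The point of the overlap requirement is that without it there is no way to estimate `offset`, and any absolute error in either instrument goes straight into the long-term record.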

Finally on this: “Until good absolute accuracy is achieved, the long-term TSI record relies on data continuity via mission overlap [the bridging, except not sufficient]; and the short-term future for continued measurements is bright, with the SORCE/TIM, VIRGO, and ACRIM 3 funded to last until the launch of the Glory/TIM and the PICARD’s TSI instruments in 2008.”


An Orbital Sciences Taurus XL rocket blasts off from Vandenberg Air Force Base, Calif., Friday carrying NASA’s Glory environmental research satellite. The spacecraft was lost when the rocket’s protective nose cone fairing failed to separate.
(Credit: Orbital Sciences Corp.)

The Glory spacecraft failed to reach orbit after liftoff aboard a Taurus XL rocket on March 4 at 5:09 a.m. EST from Vandenberg Air Force Base in southern California.

$424m according to cnet

The 1,200-pound solar-powered Glory spacecraft, also built by Orbital Sciences, was designed to precisely measure how much solar energy enters and leaves Earth’s atmosphere and how small particles called aerosols, both manmade and natural, affect the global environment.

Nothing serious. Backup satellites are normal.

The cause of the failure?

The summary report provides an overview of the mishap investigation board’s findings. The board’s complete report is not available for public release because it contains information restricted by U.S. International Traffic in Arms Regulations and information proprietary to the companies involved.

Summary report: they don’t know what caused it. Or at least not in public.

While the T9 MIB was able to identify the proximate cause and two possible intermediate causes for the T9 mishap, they were unable to identify the root cause for this failure. As a matter of explanation, an intermediate cause is between the proximate cause and the root cause in the causal chain. The root cause is the factor or set of factors that contributes to, or creates the proximate cause. Typically multiple root causes contribute to an undesired outcome.

The T9 MIB was unable to determine a root cause for the mishap mainly due to limited flight telemetry and the inability to recover the payload fairing hardware for analysis that would have enabled the determination of a definitive intermediate cause or causes.

Undesired outcome, hmm… how to spend $500m excavating a hole in the Pacific, plus heaven knows how much collateral expenditure.



Slide from presentation (PDF)

SORCE was supposed to live for 5 years, but the failure of GLORY pushed things, with battery cell failure starting to cause real trouble.


Due for launch mid-2013

The TCTE instrument, largely built alongside the original SORCE/TIM, provides a means of quickly readying a replacement instrument for a flight on the Air Force’s existing STPSat-3 mission for a mid-2013 launch.

Cobbled together out of scrap? Let’s hope this time it works.

Here is a barely legible poster discussing how the dataset transfer might be done

Poster as PDF. (2.4MB)

[UPDATE]  A short article on the Talkshop is linked here which provides news on the satellite and launch date, currently late-2013 [/UPDATE]

I’m struck by how little progress has been made over 100+ years of trying to determine solar intensity.


Article first published at deadal earth, June 14th

  1. Doug Proctor says:

    Non-technical people – and even some engineering types – don’t understand that some data is hard, and some, soft. You can have vast amounts of soft data but you can never get very close to the accuracy or precision of one bit of hard data: I’m thinking of tree-rings vs. a mercury thermometer.

    Also, a lot of non-technical people don’t realise that a lot of data is not measurement but calculation. Satellite data, whether TSI, sea level, or oceanic heat content, are all calculations that depend on so many assumptions you wonder where the real +/- is. For some reason, though, if a computer gives the answer, it is considered as hard as it comes: a GCM is better than balloon measurements, and a satellite measurement with lasers is better than a tidal gauge anchored to a rock.

    As you note about TSI: what is being measured/calculated is not necessarily (actually) the same thing as what the sun sends to Earth to warm us up, but Trenberth among others acts to the nth degree as if it is. And on it goes.

    There is a “limit to knowledge” far beyond Heisenberg’s Uncertainty Principle. When people are told they can have a world in which everything is nailed to the floor, they get not just a false sense of security, but a false sense of predictability. I wouldn’t care so much but for all the trouble and expense such a foolish, foolish concept causes.

    There is “noise” in everything we do. Worrying about things that are smaller than what we can detect without a Cray computer and a statistician in tow is worse than a waste of time when our world needs serious attention to problems that bang us over the head every day.

  2. michael hart says:

    I hear you, Doug. While there is a combinatorial number of different ways to say “the data must be wrong” and then adjust it until it is “right”, I’ll continue taking it with a tablespoon of salt.

    I’ll go with the predictions. And if IPCC supporters continue to insist they don’t make predictions then I’ll continue to think they have nothing.

  3. suricat says:

    I totally concur with the OP’s misgivings and I empathise with Doug Proctor’s response.

    I’ll take a leaf from my Granddad’s repertoire and say:

    Believe nothing that you hear and only half of what you see!

    This suggests that I should believe, at least, half of what I ‘read’ here (and understand)! 🙂

    Best regards, Ray.

  4. Brian H says:

    How is this different from investigators finding wide random differences between what they expect and what they see? Take Feynman’s advice, and junk your guesses to date.

  5. Gail Combs says:

    And Leif Svalgaard and company have decided that TSI is constant and that it is the sunspot numbers that need ‘Adjusting’ to go along with this super precise solar information.

    (Do I need the /sarc off tag?)

  6. RichardLH says:

    Providing a ‘cross calibration’ of various instruments that are supposed to be reporting the same source has always been a problem. Not much changed here then.

  7. tchannon says:

    That’s the stuff of metrology but meteorology and climatology have different standards.

    Something I have not looked into, and might do eventually, is digging out all the orbits; betcha there are issues. This would fit with the complaints about the satellites being within the far atmosphere of Earth, which may or may not be an issue.

    It might be the case that there is a hidden additional agenda, the use of a copy instrument in a different orbit, which might shed some light on things. There are going to be hearts in mouths when first data arrives.

    So far the space results are far from precise.

    Lesson few learn: the difference between absolute and relative measurement.

    Absolute is very very hard.

  8. RichardLH says:

    Off topic here really as this is for temperature rather than TSI;

    Just posted this on WattsUpWithThat 🙂

    Proposal – Forced Cross Calibration of Global temperature data series

    It should be possible to force the various Global temperature data sources into alignment simply by adjusting their offsets and scales to determine a best fit over their whole overlap period, 1979 to today. They are supposed to be reporting the same thing after all, average global temperature as measured/estimated by them over that whole period of time. Using the corrective parameters derived from the above step, we can then back project/cross calibrate the thermometer data to create a satellite referenced temperature data series backwards in time, beyond the overlap period and out to the end of the thermometer record. An overlap period of 34 years for the records so far should be sufficient for reasonable accuracy in the parameter choices.

    Align OLS trends in the sources by using offset and scale factors (currently by trial and error). Using OLS trends over the whole period to determine parameter choice allows for the likely best fit, given the relatively short overlap time period. Also OLS trends have no implicit reference points so are ‘floating’ in this regard thus making them more amenable to cross calibration of this type.
    • BEST
    Offset: -0.4
    Scale: 0.5

    • HADCrut4 Global mean
    Offset: -0.16
    Scale: 0.86

    • RSS – No adjustment

    • UAH
    Offset: 0.1

    Apply the cross calibration data so obtained to the thermometer-based data sources backwards in time to obtain a satellite cross-referenced temperature series.

    Satellite referenced Historical Global temperature data series output

    Processing Steps
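    RichardLH says his offsets and scales are currently found by trial and error; an ordinary least-squares fit over the overlap period produces the same kind of parameters directly. A sketch of the general idea with synthetic numbers (this is my illustration, not his actual procedure or data):

```python
def fit_offset_scale(series, reference):
    """Least-squares scale a and offset b such that a*series + b ~= reference."""
    n = len(series)
    mx = sum(series) / n
    my = sum(reference) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(series, reference))
    var = sum((x - mx) ** 2 for x in series)
    a = cov / var
    b = my - a * mx
    return a, b

# Synthetic example: a "thermometer" anomaly series that runs warm and
# over-scaled relative to a satellite-style reference (degC).
ref = [0.0, 0.1, 0.2, 0.3, 0.4]
thermo = [0.4, 0.6, 0.8, 1.0, 1.2]     # offset +0.4, scale 2.0 vs. ref

a, b = fit_offset_scale(thermo, ref)   # recovers scale 0.5, offset -0.2
calibrated = [a * t + b for t in thermo]
```

    Once fitted over the 1979-to-today overlap, the same `a` and `b` would be applied to the thermometer record before 1979, which is the back-projection step of the proposal.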

  9. tckev says:

    It is truly amazing to read that, at a cost of so many millions, this has not been sorted. And then to have some ‘climate scientists’ say with supreme confidence and apparent precision that they know what the TSI value is and that it varies little just beggars belief.
    What is the taxpayer paying for again?

  10. DirkH says:

    Nice. When the claimed energetic imbalance at TOA is 0.5 W/m^2, or about one thousandth of the main signal, insolation, it is obvious that questions of precisely measuring insolation over longer periods of time and bigger distances become crucial.

    As long as such measurements are not available the warmists have no case; but given that public debate never reaches above the level of a 5 year old they just turn it around and use the lack of data to claim that their computer is right, and no alarmist journalist calls them out.

    Pseudoscience is alive and well funded.