An Interview Given by Dr. Ned Nikolov (a.k.a. Den Volokin) to Ben Guarino,
a Staff Writer at The Washington Post
Sep. 17, 2016
Research Paper Withdrawal by the Journal Advances in Space Research
Q1: As succinctly as possible, could you tell me why you chose to publish this work under a pseudonym?
A1: We adopted pseudonyms as a measure of last resort because we could not get an unbiased and fair review from scientific journals under our real names. This is explained in more detail in the attached letter we sent to the chief editor of the Journal Advances in Space Research (JASR) on Sep. 17, 2015. In brief, our real names became known to the climate-science blogosphere in 2012, when a poster we had presented at an International Climate Conference in Denver in 2011 became available online and caused broad and intense discussions. When we later tried to publish elements of this poster as separate articles in scientific journals, we discovered that journal editors and reviewers would reject our manuscripts outright after Googling our names and reading the online discussion. The journals oftentimes justified the rejections with criticisms outside the scope of the manuscript at hand. On two occasions, journal editors even refused to send our manuscripts out for review after reading the blogs and realizing the broader theoretical implications of our results, although the manuscripts themselves did not explicitly discuss any new theory. For example, our first paper was rejected four times by different journals while submitted under our real names before it was finally accepted by SpringerPlus once we submitted it under pseudonyms.
To summarize, publishing new research findings that go against the grain of mainstream theories and belief systems is challenging to say the least, since such findings are often met with fierce resistance from both the scientific community and socio-political institutions. Hiding our names was not an attempt to dodge responsibility, but to allow the readership (including journal editors and reviewers) to see our results for what they really are without being influenced by prejudices related to our true identities. Anonymity in science has a long history and is recognized by scholars as a useful approach to advancing knowledge (see for example Neuroskeptic 2013; Hanel 2015). Our decision to use pseudonyms was guided by the understanding that a new message is more important than the name of the messenger.
Q2: How did you arrive at these pseudonyms? Did you think it would be likely those names would be eventually linked to your real identities?
A2: We purposefully chose pseudonyms that were not difficult to decipher yet shielded our identities well enough to permit an unbiased reading and review of our work. We wanted pseudonyms that could relatively easily be linked to our true identities if needed in the future.
Q3: You published once previously under these names, is this correct? Have you published other papers under different names?
A3: Yes, this is correct. The first paper in the series on our new climate concept was published in the open-access journal SpringerPlus in 2014 under these pseudonyms. We have not published any other papers under pseudonyms before.
Q4: What is Tso Consulting?
A4: This is explained in the attached letter to the JASR editor (see pp. 3 – 4).
Q5: Regarding the Advances in Space Research paper, this discusses a new model for determining the average surface temperature of rocky planets, broadly speaking based on solar radiation and atmospheric pressure. For a lay audience, as much as possible, could you describe the new “macro-level thermodynamic relationship” that emerges from such an analysis?
A5: The model described in the JASR paper is empirical in nature, meaning that it was derived from observations. Specifically, in our model development we used a technique called Dimensional Analysis (DA). DA is a method for extracting physically meaningful relationships from measured data without reference to any theory. In other words, DA is a data-exploration technique aimed at inferring (discovering) new physical laws and relationships. It has been successfully used in the past to solve complex problems in physics, engineering, mathematical biology, and biophysics.
We started our quest by asking: “What controls the long-term average surface temperature of a planet?” Instead of looking at theoretical explanations, we decided to answer this question by analyzing data from a broad range of planetary environments in the Solar System. Our initial premise was that the factors controlling Earth’s mean global temperature must also be responsible for determining the temperature on other planetary bodies. After an extensive query of the peer-reviewed literature, we selected six bodies for analysis: Venus, Earth, the Moon, Mars, Titan (a moon of Saturn), and Triton (a moon of Neptune). Our selection was based on three criteria: a) presence of a solid surface; b) availability of high-quality data on near-surface temperature, atmospheric composition, and total air pressure/density, preferably from direct observations; and c) representation of a wide range of physical environments defined in terms of solar irradiance and atmospheric properties. Using vetted NASA measurements from numerous published sources, we assembled a dataset of incoming solar radiation, surface temperature, near-surface atmospheric composition, pressure, density, and a few other parameters for the selected planetary bodies. We then applied DA to group the available data into fewer non-dimensional variables (ratios), forming 12 prospective models that describe the average planetary surface temperature as a function of solar radiation reaching the orbit of a planet, atmospheric greenhouse-gas concentrations, greenhouse-gas partial pressures, total atmospheric pressure, and total atmospheric density. Next, we performed a series of regression analyses to find the mathematical model best capable of describing the non-dimensional data. One non-linear model outperformed the rest by a wide margin. This model describes the atmospheric greenhouse effect as a function of total atmospheric pressure alone.
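[Editor's illustration] The model-selection step described above — fitting several candidate non-dimensional models by regression and keeping the best performer — can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the candidate functional forms, the single-parameter grid search, and the toy data are all invented for demonstration and bear no relation to the paper's actual 12 models or NASA dataset.

```python
# Toy sketch of selecting the best-fitting non-dimensional model by
# least squares. Data and candidate forms are invented for illustration.
import math

# toy non-dimensional data: x = a pressure ratio, y = a temperature ratio
xs = [0.1, 0.5, 1.0, 2.0, 5.0]
ys = [1.02, 1.10, 1.21, 1.49, 2.72]  # roughly follows y = exp(0.2*x)

def sse(model, a):
    """Sum of squared errors of a one-parameter model over the toy data."""
    return sum((model(x, a) - y) ** 2 for x, y in zip(xs, ys))

candidates = {
    "power law   y = x**a":    lambda x, a: x ** a,
    "linear      y = 1 + a*x": lambda x, a: 1 + a * x,
    "exponential y = exp(a*x)": lambda x, a: math.exp(a * x),
}

best = None
for name, model in candidates.items():
    # crude 1-D grid search over the single free parameter a
    a_best = min((i / 1000 for i in range(1, 1000)), key=lambda a: sse(model, a))
    err = sse(model, a_best)
    if best is None or err < best[2]:
        best = (name, a_best, err)

print(best[0])  # the exponential candidate wins on this toy data
```

In the paper's terms, the winning candidate would be the non-linear pressure-only model; here the grid search stands in for the paper's regression analyses.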
In our study, we call the Greenhouse Effect an Atmospheric Thermal Enhancement (ATE), quantified as the ratio of the planet’s actual surface temperature (Ts) to the temperature the planet would have in the absence of an atmosphere (Tna): ATE = Ts/Tna. The ‘no-atmosphere’ temperature, Tna, depends on solar irradiance and is computed from the physical model of Volokin and ReLlez (2014). The figure below illustrates the final pressure-temperature relationship emerging from the data and its success in reproducing the relative atmospheric thermal effects of the 6 planetary bodies.
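[Editor's illustration] The ATE ratio defined above is a one-line computation. The sketch below uses placeholder temperatures, not the vetted NASA values from the paper, and the Tna figure is a round number chosen only to make the arithmetic concrete; the paper computes Tna from the Volokin and ReLlez (2014) model.

```python
# Atmospheric Thermal Enhancement as defined in the interview:
# ATE = Ts / Tna (actual surface temperature over no-atmosphere temperature).

def ate(ts_kelvin: float, tna_kelvin: float) -> float:
    """Ratio of a planet's actual mean surface temperature to the
    temperature it would have with no atmosphere at all."""
    return ts_kelvin / tna_kelvin

# Placeholder example: a body with Ts = 288 K and a modeled airless
# temperature Tna = 197 K has ATE ~ 1.46, i.e. its atmosphere enhances
# the surface temperature by roughly 46%.
print(round(ate(288.0, 197.0), 2))
```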
This newly discovered relationship possesses several important characteristics that make it as meaningful as a physical law. These include high accuracy, a broad scope of validity, statistical robustness, and a close qualitative similarity to other well-known pressure-temperature relationships such as the dry adiabatic temperature curve. To our knowledge, this is the first and only model in planetary science capable of accurately describing the average global surface temperature of planetary bodies across such a wide range of radiative and atmospheric environments. This relationship suggests that the long-term equilibrium temperature of Earth is part of a cosmic continuum controlled primarily by two factors – solar flux at the top of the atmosphere and total surface atmospheric pressure. These features give our model the significance of a macro-level thermodynamic relationship heretofore unknown to science. By macro level we mean applicable to a planetary-scale quantity, such as the global average surface temperature, across a broad range of conditions. The term thermodynamic refers to the interaction between temperature, pressure, volume, and energy.
The theoretical implications of this new relationship are numerous and fundamental in nature. To name just a few, our model suggests that: a) the greenhouse effect of a planetary atmosphere is in fact a pressure phenomenon that is independent of atmospheric composition; b) solar irradiance and atmospheric pressure determine the planet’s baseline (‘backbone’) equilibrium temperature while constraining annual and decadal temperature variations to a narrow range around that baseline value; hence, large deviations in Earth’s global temperature are not possible without a significant change in either the total atmospheric mass or the incoming solar radiation; c) the climate system is well buffered and has no tipping points, i.e. functional states that foster rapid and irreversible changes in the global surface temperature cannot occur.
Q6: Have these variables – upper atmosphere solar irradiance and atmospheric surface pressure — been ignored by other planetary temperature models? If so, do you have a sense why?
A6: Yes and no. Solar irradiance is a main driver of climate in all current models, including the 3-D Global Circulation Models (GCMs), which are the preferred tool for studying planetary climates today. In GCMs, pressure only indirectly affects the surface temperature through the atmospheric optical depth. According to the standard Greenhouse theory, atmospheric pressure impacts the energy content of a climate system only through its effect on the infrared absorption lines of greenhouse gases. A higher atmospheric pressure broadens the absorption lines, thus increasing both the thermal infrared opacity of an atmosphere and the down-welling longwave radiation, which is believed to control the surface temperature. Our empirical model, however, suggests that pressure directly impacts the surface temperature through added force (by definition, pressure is a force applied over a unit area). The direct effect of pressure on the internal energy and temperature of a gaseous system is well understood in classical thermodynamics, as exemplified by the Ideal Gas Law. Fundamentally, there cannot be kinetic energy or temperature without a force, i.e. without some form of pressure. Even electromagnetic radiation exerts pressure!
A change of temperature due to a change of pressure without any addition or subtraction of heat is known as an adiabatic process. Adiabatic heating, a.k.a. heating by compression, is a basic principle behind the working of diesel engines, a technology we have successfully utilized for over 120 years. The results from our empirical data analysis suggest that the thermal effect of the atmosphere is analogous to the compression heating found in diesel engines, except that it is caused by gravity. Therefore, the direction of causality in the real system appears to be different from that assumed in GCMs. In the real system, pressure controls temperature, and these two in turn control the atmospheric optical depth; in climate models, by contrast, pressure along with greenhouse-gas concentrations controls the atmospheric optical depth, which in turn controls temperature. This discrepancy has fundamental implications for projecting future climatic changes. For example, according to our model, altering the atmospheric optical depth by increasing ambient greenhouse-gas concentrations cannot in principle affect the surface temperature, because a change in the system’s temperature requires a net change in the applied force, while the optical depth, being a dimensionless quantity, by definition carries no force. This is why the effect of CO2 on climate is only visible in model outputs but has never been observed or demonstrated in reality.
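[Editor's illustration] The diesel-engine analogy invoked above rests on the standard textbook relation for adiabatic compression of an ideal gas, T2 = T1 · (P2/P1)^((γ−1)/γ). The sketch below is that textbook formula only — it is not the authors' planetary model, and the 20:1 compression ratio is a typical diesel figure used purely as an example.

```python
# Textbook adiabatic-compression relation for an ideal gas:
#   T2 = T1 * (P2/P1) ** ((gamma - 1) / gamma)
# gamma is the heat-capacity ratio (about 1.4 for air).

def adiabatic_temperature(t1_kelvin, p1, p2, gamma=1.4):
    """Final temperature after adiabatic compression from p1 to p2."""
    return t1_kelvin * (p2 / p1) ** ((gamma - 1.0) / gamma)

# Example: air at 300 K compressed 20:1, as in a diesel cylinder,
# reaches roughly 700 K -- hot enough to ignite fuel without a spark.
print(round(adiabatic_temperature(300.0, 1.0, 20.0)))
```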
Q7: Dr. Nikolov, you told me that no serious scientist would deny that average global temperature is increasing. You do, however, take a critical approach to the idea it has an anthropogenic cause, is that correct? Therefore, would you consider yourselves to be outside the mainstream consensus regarding climate science?
A7: The climate is always changing. The question is what’s forcing the changes? The available evidence from both direct observations and reconstructed geological records does not support the hypothesis that CO2 and other heat-absorbing trace gases control Earth’s climate. Explaining this properly, however, requires a longer discussion. If the global temperature is independent of atmospheric composition as suggested by our inter-planetary analysis, then there is no mechanism for human-induced carbon dioxide emissions to impact Earth’s climate.
The progress of science is not driven by consensus! If you study the history of scientific discoveries, you will find that theoretical breakthroughs, i.e. the introduction of fundamentally new concepts, have always been carried out by individuals or small groups of researchers outside the mainstream consensus. For example, there was once a unanimous consensus that the Earth was at the center of the Universe and all celestial bodies revolved around us. Likewise, 120 years ago, there was a consensus among physicists and engineers that heavier-than-air machines could not fly. The truth about physical phenomena can only be uncovered through careful observations, proper experimentation, and unbiased, sound reasoning. Blind adherence to the consensus of the day oftentimes hinders the advancement of knowledge.
Q8: If so, does this stance make it difficult to publish your work?
A8: There is no doubt that trying to publish research results that do not conform to accepted theories or mainstream beliefs poses a challenge in today’s world of academic political correctness. This is not just our experience, and it is not just happening in climate science. In my view, it is a worldwide phenomenon. For science to be useful to society, it must be based on a free and open inquiry into physical reality, where the publishing of novel findings and the proposing of new hypotheses based on such findings are not constrained by political considerations. In other words, scientific theories and conceptual paradigms should not be institutionalized, as has been done in some areas of science.
Q9: And if the cause is not anthropogenic, do you have a hypothesis as to why not?
A9: As explained above, our finding that the atmospheric thermal effect (a.k.a. the Greenhouse Effect) is entirely due to pressure and independent of atmospheric composition implies most likely natural causes for the observed warming during the 20th Century. There are several lines of evidence, discussed in numerous papers by different research teams over the past 10 years, that support this hypothesis. For instance, both satellite observations and ground measurements show that the cloud cover and cloud albedo (the fraction of solar radiation reflected back to space) declined appreciably between 1980 and the early 2000s. As a result, the amount of solar radiation reaching the surface has increased. This is known as ‘global brightening’. The rate of this increase is more than enough to explain the observed surface warming over the past 30 years. The figure below shows how closely the global temperature follows changes in global cloud cover according to satellite data. Note that cloud-cover variations precede temperature changes by about 12 months, indicating that clouds drive temperature (cloud data are from Dim et al. 2011; temperature data are from the University of Alabama at Huntsville dataset).
The question then arises: what controls cloud cover? This is an area of intense research at the moment, but the available evidence thus far indicates that the global cloud cover is affected by the Sun’s magnetic activity – high solar activity creates conditions for fewer clouds (causing warming), while low activity promotes more clouds (causing cooling). According to the above figure, the appreciable slowdown of global warming after the year 2000 is likely the result of increased low-level clouds. The Sun’s influence on Earth’s cloud cover and albedo, although small in absolute terms, is sufficient to cause global temperature variations on the order of ±0.7 °C, which is the size of the climatic change we have observed since 1850. The good news is that solar-induced changes in cloud cover are buffered by negative feedbacks within Earth’s climate system, making it impossible for the global temperature to deviate more than ±0.7 °C around a central mean.
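[Editor's illustration] The claim that cloud-cover variations lead temperature by about 12 months is, methodologically, a lagged-correlation result. The sketch below shows the generic technique on invented toy series — it uses none of the satellite or UAH data cited above, and the 12-sample lead is built into the toy data purely to demonstrate that the method recovers it.

```python
# Generic lagged-correlation sketch on invented toy data: find the lag
# at which a 'driver' series best predicts a 'response' series.
import math

def pearson(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def best_lag(driver, response, max_lag):
    """Lag (in samples) maximizing |corr(response[t], driver[t - lag])|."""
    def corr_at(lag):
        return pearson(driver[:len(driver) - lag], response[lag:])
    return max(range(max_lag + 1), key=lambda lag: abs(corr_at(lag)))

# toy monthly series: 'temp' is an inverted copy of 'cloud' shifted 12 samples
cloud = [math.sin(2 * math.pi * t / 40) for t in range(120)]
temp = [-cloud[t - 12] if t >= 12 else 0.0 for t in range(120)]
print(best_lag(cloud, temp, 24))  # recovers the built-in 12-sample lead
```

Note that a lagged correlation alone does not establish causation; it only shows which series moves first.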
Further indication that the 20th Century warming was most likely caused by natural forcing is provided by a rather interesting study by Viterito (2016), who found that global temperature variations since 1980 were highly correlated with global seismic activity, which is a source of geothermal heat. The author reports that seismic activity precedes temperature changes by 2 years. These results strongly suggest that the observed warming over the past 35 years could not possibly be due to anthropogenic factors.
The observed statistical link between global temperature, cloud cover, and seismic activity points towards possible electric or electromagnetic effects of the Sun upon the entire Earth-atmosphere system, which are currently unknown to science. This is uncharted territory, where future research efforts should focus.
Q10: The paper passed peer-review and was only withdrawn once the question of pseudonyms was raised. Would you consider the withdrawn Advances in Space Research paper to be controversial in and of itself? Are there conclusions to be drawn from it that contradict existing mainstream theories of climate change?
A10: The paper is not controversial as far as the data are concerned, since we have used publicly available, vetted observations from NASA. The Dimensional Analysis and subsequent non-linear regressions employed are also standard techniques. However, the results obtained are unexpected (even to us) and have theoretical implications pointing to a new paradigm in understanding the physical nature of the Greenhouse Effect.
Q11: Do you anticipate any reaction from the U.S. Department of Agriculture, where I understand you are both employed?
A11: It is only I, Ned Nikolov, who is currently employed by USDA. Dr. Zeller is a retired meteorologist from the US Forest Service… It is difficult to predict what the reaction of my employer would be (if any), given that I have closely followed their instructions to conduct this research in my spare time and not to show my federal affiliation on any papers published on this topic.
Q12: Would you recommend that more journals accept pseudonyms or at least double-blind review? (I believe Dr. Nikolov, you said only about 1 in 10 currently allow for such peer review.)
A12: Yes! As studies have shown, anonymity in science is critical for the advancement of knowledge, especially when new theoretical paradigms are being introduced. I also think that double-blind review should become standard practice at all scientific journals. Another change that should be universally adopted, in my view, is a ban on the current practice of rejecting manuscripts based on reviewers’ personal opinions about the importance or implications of the reported results. If the analyses are done correctly and the stated conclusions are supported by the numerical results, the manuscript should be accepted for publication. It should be left to the broader readership to decide, after the paper is published, what the importance or relevance of the reported findings is to the field.
Editor's Note: Some very minor grammatical and style amendments have been made to the supplied transcript of the WashPo interview for clarity and readability.