Author manuscript; available in PMC: 2018 Feb 1.
Published in final edited form as: J Radioanal Nucl Chem. 2016 Jun 21;311(2):1019–1022. doi: 10.1007/s10967-016-4912-4

Believable Statements of Uncertainty and Believable Science

Richard M Lindstrom 1
PMCID: PMC5455790  NIHMSID: NIHMS859821  PMID: 28584391

Abstract

Nearly fifty years ago, two landmark papers appeared that should have cured the problem of ambiguous uncertainty statements in published data. Eisenhart’s paper in Science called for statistically meaningful numbers, and Currie’s Analytical Chemistry paper revealed the wide range in common definitions of detection limit. Confusion and worse can result when uncertainties are misinterpreted or ignored. The recent stories of cold fusion, variable radioactive decay, and piezonuclear reactions provide cautionary examples in which prior probability has been neglected. We show examples from our laboratory and others to illustrate the fact that uncertainty depends on both statistical and scientific judgment.

Keywords: Data quality, uncertainty, traceability

Introduction

Two papers were published in 1968 that clearly set out the terms of discussion for two fundamental concepts in measurement science. Churchill Eisenhart’s four-page paper in Science, “Expression of the Uncertainties of Final Results” [1], pointed out the several meanings that the literature might be understood to imply by an uncertainty expressed in the shorthand form a ± b. “If no explanation is given, many persons will take ±b to signify bounds to the inaccuracy of a. Others may assume that b is the ‘standard error,’ or the ‘probable error,’ of a, and hence the uncertainty of a is at least ±3b, or ±4b, respectively. Still others may take b to be an indication merely of the imprecision of the individual measurements, that is, to be the ‘standard deviation,’ or the ‘average deviation,’ or the ‘probable error’ of a single observation.” As a consequence, far too great a fraction of the data in the scientific literature “cannot be critically evaluated because the minimum of essential information is not present.” Eisenhart recommended unambiguous and statistically valid procedures for expressing uncertainty which should have cleared the air and set the standards for future publications.

Also in 1968, Lloyd Currie’s paper in Analytical Chemistry, “Limits for Qualitative Detection and Quantitative Determination” [2], applied eight common definitions of detection limit from the literature to a simple measurement example in radiation counting, and showed that the resulting estimates cover nearly three orders of magnitude. He then re-examined, from the statistical point of view of hypothesis testing, what detection and measurement should mean in analytical chemistry, rigorously defining three quantities: the critical level, the detection limit, and the determination limit. Currie’s formulation led to an American Chemical Society symposium on the topic [3] and has been incorporated in many rules of practice governing measurement procedures, international standards [4], regulations, and software. The culmination of Currie’s early work was the adoption of a harmonized international position (ISO-IUPAC) on the nomenclature, concepts, and formulation of detection decisions, detection limits, and quantification limits [5, 6].
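In the simplest case of a net signal with a well-known, constant standard deviation σ0 and 5 % rates assumed for both false positives and false negatives, the three quantities reduce to the familiar working expressions (the notation here follows common later usage rather than necessarily the symbols of [2]):

L_C = k_\alpha \sigma_0 \approx 1.645\,\sigma_0, \qquad L_D = L_C + k_\beta \sigma_0 \approx 3.29\,\sigma_0, \qquad L_Q = k_Q \sigma_0 = 10\,\sigma_0

These are, respectively, the net signal above which detection is decided, the true signal that will be detected with probability 1 − β, and the signal that can be determined with a relative standard deviation of 10 %.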

In order to foster consistency of data reporting in all of science, in 1993 the International Organization for Standardization (ISO) issued its Guide to the Expression of Uncertainty in Measurement (GUM) [7] and the International Vocabulary of Basic and General Terms in Metrology (VIM) [8]. Directed toward professional metrologists, these publications incorporated some concepts unfamiliar to practicing laboratory workers. Several organizations, e.g., NIST [9], Eurachem/CITAC [10], IAEA [11], and BIPM [12], published supplemental interpretations with examples appropriate to their fields. The intent of these standards is to foster the publication of reliable data, quantitatively and unambiguously traceable to the International System of Units (SI).
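The central quantity in the GUM is the combined standard uncertainty of a result y = f(x_1, …, x_N); for uncorrelated input quantities it follows from the law of propagation of uncertainty,

u_c^2(y) = \sum_{i=1}^{N} \left( \frac{\partial f}{\partial x_i} \right)^2 u^2(x_i)

It is the need for these sensitivity coefficients, the partial derivatives of the measurement equation, that motivates the numerical shortcut discussed below.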

Data quality in practice

In practice, of course, not all routine measurements require detailed uncertainty evaluation every time: the effort must be proportional to its importance, following the principle of fitness for use. When numbers are to be published and used by others, however, Eisenhart’s admonitions apply despite the effort required. “The concepts of traceability are not always well accepted by the analytical chemistry community. There is a benign kind of neglect towards these ideas or even straight hostility.” [13] Part of the reason is that a full ISO-compliant treatment of uncertainties requires differentiation of the measurement equation, which may be difficult. For example, extracting the derivative of the equation used to reduce counting data to standard conditions

A_0 = \frac{C \lambda\, e^{\lambda t_e}\, P^{\delta/\Delta}\, \left(e^{\lambda\Delta} - 1\right)}{\left(1 - e^{-\lambda\Delta}\right)\left(1 - e^{-\lambda\tau}\right) \lambda\Delta}

with respect to λ is daunting to most people. However, numerical re-calculations of the measurement equation employing finite differences instead of derivatives [14, 15, 10] can quantitatively show the effect upon the uncertainty of the final result due to uncertainties in the parameters. Applied to neutron activation analysis [16–18], this approach clearly shows the relevant uncertainties.
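To make the finite-difference approach concrete, here is a minimal sketch in Python of a Kragten-style calculation applied to a simplified decay-correction function with the same general structure as the equation above (the pileup/dead-time factor is dropped). The function names, input values, and uncertainties are purely illustrative, not measured data:

import math

def decay_corrected_rate(C, lam, t_e, Delta, tau):
    # Simplified reduction of net peak counts C to a decay-corrected rate:
    # exp(lam*t_e) undoes decay over the wait time, (1 - exp(-lam*Delta))
    # accounts for decay during the counting interval, and
    # (1 - exp(-lam*tau)) for saturation during irradiation.  The pileup /
    # dead-time factor of the full equation is omitted in this sketch.
    return (C * lam * math.exp(lam * t_e)
            / ((1.0 - math.exp(-lam * Delta)) * (1.0 - math.exp(-lam * tau))))

def kragten_uncertainty(f, values, uncerts):
    # Kragten finite-difference propagation: shift each input by its
    # standard uncertainty, record how much the result changes, and
    # combine the individual changes in quadrature.
    y0 = f(**values)
    contributions = {}
    for name, u in uncerts.items():
        shifted = dict(values)
        shifted[name] = values[name] + u
        contributions[name] = f(**shifted) - y0
    u_y = math.sqrt(sum(c * c for c in contributions.values()))
    return y0, u_y, contributions

# Purely illustrative inputs (counts and seconds); the half-life is taken
# as roughly 26 h to echo the 76As example, not as a recommended value.
values = dict(C=1.0e5, lam=math.log(2.0) / (26.0 * 3600.0),
              t_e=3600.0, Delta=1800.0, tau=7200.0)
uncerts = dict(C=320.0,                      # ~ Poisson counting uncertainty
               lam=0.001 * values["lam"])    # assumed 0.1 % decay-constant uncertainty

y0, u_y, contributions = kragten_uncertainty(decay_corrected_rate, values, uncerts)
print(f"A0 = {y0:.4g} +/- {u_y:.2g}  ({u_y / y0:.2%} relative)")
for name, c in contributions.items():
    print(f"  shift from {name}: {c:+.3g}")

The individual contributions show at a glance which input dominates the combined uncertainty, which is the practical payoff of the numerical approach.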

The two classic 1968 papers [1, 2] and the ISO GUM set the standards for clarity in future publications, yet we still see many papers submitted for review and even published with uncertainties given as simply “±” or omitted completely, or with many more digits than are truly significant. As Eisenhart pointed out, these practices have consequences for the users of these papers. As an example from our own work, a newly published half-life for 76As [19] was accepted by the evaluators because of a plausible (but novel and incomplete) description of the uncertainties in the measurement, even though this value conflicted with previous measurements in the literature (notably a set of seven determinations with six different kinds of detectors [20]). This half-life would have led to an inaccurate INAA value for Arsenic Implant in Silicon (SRM 2134) had we used it in our measurements [21]. A redetermination of the half-life [22, 23] in agreement with the previous consensus is slowly driving out the incorrect value from tabulations of nuclear data.

Data quality and good science

Although science is always open to new ideas, misunderstanding the real uncertainties in a measurement, in physics as well as statistics, has led to some conspicuously mistaken conclusions. Scientific judgment (prior probability, in Bayesian terms) needs to be applied, especially to unexpected observations, and the null hypothesis that the observation is in error needs to be explicitly tested. Perhaps the best-known recent example is cold fusion [24] where, in the rush to publish, blank experiments were not adequately done and people knowledgeable in nuclear science were not consulted. Early measurements in our laboratory at NIST [25] without the Pd/D electrochemical cell showed neutrons (from cosmic rays) and gammas (mostly from 214Bi) in quantitative agreement with Fleischmann and Pons’s paper. Other workers were also unable to duplicate the publicized work electrochemically, so that research in low-energy nuclear reactions (LENR) has nearly, but not entirely, stopped.

Since the discovery of radioactivity, more than eighty attempts have been made to influence the rate of radioactive decay. None has had an effect, with the sole exception of decay modes that involve the orbital electrons, the physics of which is well understood [26]. More recently, an unexplained anomaly in the decay curve of 54Mn was observed to be coincident with a strong solar flare, and statistical anomalies in other decay measurements made at Brookhaven National Laboratory and at the Physikalisch-Technische Bundesanstalt (PTB) were found to have annual periodicity [27, 28]. The cause was hypothesized to be related to solar neutrinos affecting the value of the decay constant.

To test this connection, measurements were done in our laboratory at NIST to compare the decay rates of paired intense sources of 198Au with greatly different surface/volume ratios (sphere vs. foil or wire), and thus greatly different internal antineutrino fluxes [29, 30]. The half-lives of the paired sources were found to be indistinguishable, contrary to the prediction of the neutrino hypothesis. Decay rates measured at Delft in the antineutrino flux of the operating HOR reactor were found to be no different from those measured with the reactor shut down [31], casting further doubt on the hypothesis. Other arguments weaken the solar connection [32]; for example, a 137Cs source decayed at the expected rate as the MESSENGER spacecraft traveled from Earth to Mercury’s orbit at 0.4 AU [33]. Recent measurements at PTB have caused most of the earlier anomalies to disappear as more sources of experimental bias have been revealed and eliminated [34, 35]. Half-life measurement is subject to many sources of bias, not all of which are readily detected [36]. Although there is decreasing evidence for non-constant radioactive decay rates, the search continues in some laboratories.

The surprising observation that light is emitted from collapsing bubbles produced by ultrasonic agitation of water has led to the hypothesis that the energy of the collapse might be sufficient to cause nuclear fusion, a process called sonofusion. Indeed, both tritium and neutrons were claimed to be detected in deuterated acetone under cavitation [37]. Other laboratories failed to duplicate these measurements, and the pursuit quickly collapsed as a combination of self-deception and fraud was revealed. Inspired by this, another group searched for nuclear transformations in sonicated water, and found increased amounts of uranium and even transuranic nuclides [38].

Subsequently [39] it was claimed that neutrons generated in the ultrasonic probe caused measurable changes in element concentrations. That work was accepted for publication with the editor’s comment “This paper has been evaluated by six peers and has been considered to contain questionable results. However, as it reports on results difficult to prove but indisputably important if correct, the editor takes full responsibility for making it public.” Other observations led the same group to believe that cavitation increases the rate of alpha decay of 228Th [40]. The physical evidence has been sharply criticized [41], and more reasonable explanations of the observation have been proposed [42].

More recently, piezonuclear transmutations of 63Cu to 65Zn via multiple neutron capture were claimed [43], even though gamma spectrometry showed that the ultrasonic probe itself was no more radioactive after operation than before. The publication of this report resulted in strong criticism [44, 45] from the neutron activation analysis community, with the admonition, echoing others [41, 42], that the reviewers and editors of journals have an important responsibility to see that only verifiable facts and theories appear in the published literature.

Error and bias have many ways to creep into laboratory measurements. Currie (pers. comm., 1990) has pointed out that in a real measurement process “d.f. < 0 always; since the number of variables exceeds the number of observations, scientific insight is essential.” Anomalous observations may indeed point to new phenomena, but simple explanations are usually most probable. “The first principle is that you must not fool yourself — and you are the easiest person to fool.” (R. P. Feynman, 1974 Caltech commencement address).

Summary and conclusions

Science is a communal activity whose practitioners build upon each other’s work. To exploit the literature we must understand its limitations, which is possible only if the authors of publications understand the uncertainties in their measurements and conclusions, and make us, the readers, understand them in the same way.

Acknowledgments

I have profited from many years of discussions with Lloyd A. Currie.

References

1. Eisenhart CE. Expression of the Uncertainties of Final Results. Science. 1968;160:1201–1204. doi: 10.1126/science.160.3833.1201.
2. Currie LA. Limits for Qualitative Detection and Quantitative Determination: Application to Radiochemistry. Anal Chem. 1968;40:586–593.
3. Currie LA. Detection: Overview of Historical, Societal, and Technical Issues. In: Currie LA, editor. Detection in Analytical Chemistry (ACS Symp Ser 361). Am Chem Soc; Washington: 1988. pp. 1–62.
4. Currie LA. Nomenclature in Evaluation of Analytical Methods Including Detection and Quantification Capabilities. Pure Appl Chem. 1995;67:1699–1723.
5. Currie LA. International Recommendations Offered on Analytical Detection and Quantification Concepts and Nomenclature. Anal Chim Acta. 1998;391:103–134.
6. Currie LA. Ch. 2 (Presentation of the Results of Chemical Analysis) and Ch. 18 (Quality Assurance of Analytical Processes). In: Inczédy J, Ure AM, Lengyel T, Gelencsér A, Hulanicki A, editors. IUPAC Compendium of Analytical Nomenclature. Blackwell Science; Oxford: 1998.
7. ISO. Guide to the Expression of Uncertainty in Measurement. Internat Stds Org; Geneva: 1993.
8. ISO. International Vocabulary of Basic and General Terms in Metrology (VIM). Internat Stds Org; Geneva: 1993.
9. Taylor BN, Kuyatt CE. Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results (NIST Tech Note 1297). Natl Inst Stds & Tech; Gaithersburg, MD: 1994.
10. Ellison SLR, Rosslein M, Williams A, editors. Quantifying Uncertainty in Analytical Measurements. 2nd ed. Eurachem/CITAC; 2000. QUAM:2000.P1.
11. IAEA. Quantifying Uncertainty in Nuclear Analytical Measurements (TECDOC-1401). IAEA; Vienna: 2004.
12. BIPM. Evaluation of measurement data—Supplement 1 to the “Guide to the expression of uncertainty in measurement”—Propagation of distributions using a Monte Carlo method (JCGM 101). 2008.
13. Adams F. Traceability and Analytical Chemistry. Accred Qual Assur. 1998;3:308–316.
14. Rees CE. Error propagation calculations. Geochim Cosmochim Acta. 1984;48:2309–2311.
15. Kragten J. Calculating Standard Deviations and Confidence Intervals with a Universally Applicable Spreadsheet Technique. Analyst. 1994;119:2161–2166.
16. Robouch P, Arana G, Eguskiza M, Pommé S, Etxebarria N. Uncertainty Budget for k0-NAA. J Radioanal Nucl Chem. 2000;245:195–197.
17. Greenberg RR, Lindstrom RM, Mackey EA, Zeisler R. Neutron Activation Analysis: A Primary Method of Measurement, Chapter 2: Evaluation of Uncertainties for NAA Measurements Using the Comparator Method of Standardization. Spectrochim Acta B. 2011;66:208–232.
18. Kubešová M, Kučera J. How to calculate uncertainties of neutron flux parameters and uncertainties of analysis results in k0-NAA? J Radioanal Nucl Chem. 2012;293:87–94.
19. Mignonsin EP. Determination of Half-lives by Gamma-Ray Spectrometry: Improvement of Procedure and Precision. Appl Radiat Isotop. 1994;45:17–24.
20. Emery JF, Reynolds SA, Wyatt EI, Gleason GI. Half-Lifes of Radionuclides—IV. Nucl Sci Eng. 1972;48:319–323.
21. Greenberg RR, Lindstrom RM, Simons DS. Instrumental Neutron Activation Analysis for Certification of Ion-Implanted Arsenic in Silicon. J Radioanal Nucl Chem. 2000;245:57–63.
22. Lindstrom RM, Blaauw M, Fleming RF. The half-life of 76As. J Radioanal Nucl Chem. 2003;257:489–491.
23. Unterweger MP, Lindstrom RM. Ionization chamber measurements of the half-lives of 24Na, 42K, 76As and 198Au. Appl Radiat Isotop. 2004;60:325–327. doi: 10.1016/j.apradiso.2003.11.035.
24. Fleischmann M, Pons S. Electrochemically induced nuclear fusion of deuterium. J Electroanal Chem. 1989;261:301–308.
25. Lindstrom RM. Investigation of Reported Cold Nuclear Fusion. In: O’Connor C, editor. NIST Tech Note 1272. U.S. Govt. Printing Off; Washington: 1989. pp. 258–261.
26. Hahn H-P, Born H-J, Kim JI. Survey on the Rate Perturbations of Nuclear Decay. Radiochim Acta. 1976;23:23–37.
27. Fischbach E, Buncher JB, Gruenwald JT, Jenkins JH, Krause DE, Mattes JJ, Newport JR. Time-Dependent Nuclear Decay Parameters: New Evidence for New Forces? Space Sci Revs. 2009;145:285–335.
28. Jenkins JH, Fischbach E, Buncher JB, Gruenwald JT, Krause DE, Mattes JJ. Evidence of correlations between nuclear decay rates and Earth-Sun distance. Astropart Phys. 2009;32:42–46.
29. Lindstrom RM, Fischbach E, Buncher JB, Greene GL, Jenkins JH, Krause DE, Mattes JJ, Yue A. Study of the dependence of 198Au half-life on source geometry. Nucl Instrum Methods A. 2010;622:93–96.
30. Lindstrom RM, Fischbach E, Buncher JB, Jenkins JH, Yue A. Absence of a self-induced decay effect in 198Au. Nucl Instrum Methods A. 2011;659:269–271.
31. de Meijer RJ, Blaauw M, Smit FD. No evidence for antineutrinos significantly influencing exponential β+ decay. Appl Radiat Isotop. 2011;69:320–326. doi: 10.1016/j.apradiso.2010.08.002.
32. Norman EB, Browne E, Shugart HA, Joshi TH, Firestone RB. Evidence against correlations between nuclear decay rates and Earth-Sun distance. Astropart Phys. 2009;31:135–137.
33. Fischbach E, Chen KJ, Gold RE, Goldsten JO, Lawrence DJ, McNutt RJJ, Rhodes EA, Jenkins JH, Longuski J. Solar Influence on Nuclear Decay Rates: Constraints from the MESSENGER Mission. Astrophys Space Sci. 2012;337:39–45.
34. Schrader H. Half-life measurements of long-lived radionuclides—New data analysis and systematic effects. Appl Radiat Isotop. 2010;86:1583–1590. doi: 10.1016/j.apradiso.2009.11.033.
35. Kossert K, Nähle OJ. Disproof of solar influence on the decay rates of 90Sr/90Y. Astropart Phys. 2015;69:18–23.
36. Pommé S. The uncertainty of the half-life. Metrologia. 2015;52:S51–S65.
37. Taleyarkhan RP, West CD, Lahey RT, Nigmatulin RI, Cho JS, Block RC, Xu Y. Nuclear Emissions During Self-Nucleated Acoustic Cavitation. Phys Rev Lett. 2006;96:034301. doi: 10.1103/PhysRevLett.96.034301.
38. Cardone F, Mignani R. Possible Observation of Transformation of Elements in Cavitated Water. Internat J Mod Phys. 2003;B17:303–317.
39. Cardone F, Mignani R, Perconti W, Pessa E, Spera G. Possible evidence for production of an artificial radionuclide in cavitated water. J Radioanal Nucl Chem. 2005;265:151–161.
40. Cardone F, Mignani R, Petrucci A. Piezonuclear decay of thorium. Phys Lett A. 2009;373:1956–1958.
41. Ericsson G, Pomp S, Sjöstrand H, Traneus E. Piezonuclear reactions – do they really exist? Phys Lett A. 2009;374:750–753.
42. Kowalski L. Comment on “Piezonuclear decay of thorium”. Phys Lett A. 2010;374:696–697.
43. Albertini G, Cardone F, Lammardo M, Petrucci A, Ridolfi F, Rosada A, Sala V, Santoro E. Atomic and isotopic changes induced by ultrasounds in iron. J Radioanal Nucl Chem. 2015;304:955–963.
44. Rossbach M. Letter to the editor of JRNC. J Radioanal Nucl Chem. 2015;304:965–966.
45. Lindstrom RM, et al. Rebuttal to “Atomic and isotopic changes induced by ultrasounds in iron”. J Radioanal Nucl Chem. 2016;307:13–14.
