Introduction to Radioisotope Geochronology/Part 7 - Dates to Ages: Considering Interpretation and Uncertainties in Geochronology

Combining Dates to Derive an Age


There are known knowns. These are things we know that we know. There are known unknowns. That is to say, there are things that we now know we don’t know. But there are also unknown unknowns. These are things we do not know we don’t know.

—United States Secretary of Defense Donald Rumsfeld

We have learned in the previous sections that radio-isotopic dating is capable of producing very precise dates; however, full exploitation of these data requires consideration of the associated uncertainties. All measurements, regardless of precision and accuracy, have an uncertainty associated with them. This uncertainty arises from two factors: the limitations of the measuring instrument (systematic uncertainty) and the reproducibility of the measurement process itself (random/external uncertainty). During the past two decades analytical precision has increased substantially due to improvements in both mass spectrometry and laboratory protocols.

A significant proportion of age constraints for global strata are derived from U-Pb dates on zircons from volcanic rocks. The final reported date and associated uncertainty are often weighted mean dates derived from a number (n) of individual dates on different zircons (or zircon sub-domains). This is the case for data acquired using both ID-TIMS and microbeam techniques. The weighted mean weights each individual analysis (such as a single SIMS spot or a single-grain ID-TIMS analysis) according to its precision, so analyses with low uncertainty contribute more to the weighted mean than those with high uncertainty. Importantly, the use of a weighted mean algorithm (or other averaging) is underpinned by the expectation of a single population with normally distributed errors (i.e., there is no correlation between the precision of analyses and age). If the errors on the individual analyses are approximately equal (as is typical for microbeam U/Pb data), then the weighted mean uncertainty is proportional to 1/√n; therefore high-n datasets can be used to reduce the overall age uncertainty for data collected on a single population with normally distributed errors. If the uncertainties on the individual analyses are variable, then the weighted mean uncertainty is controlled by the most precise analyses, making the weighted mean uncertainty less dependent on n. High-n datasets are critical for assessing analytical vs. geological scatter; however, the real limit on precision is the analytical uncertainty of single (spot or grain) analyses, as this controls our ability to resolve real variation within a series of analyses due to 'open-system' behaviour.

Some definitions

  • Systematic uncertainty: a bias in a measurement whereby the mean of individual measurements differs from the true value.
  • Random/external uncertainty: scatter in measurements that does not reproduce a fixed value.
  • Accuracy: the closeness of measurements to a quantity's true/accepted value.
  • Precision: the degree of reproducibility and repeatability of measurements.

Sources and types of uncertainty


Without an accurate estimation of total uncertainty, the radio-isotopic age of a given rock or mineral is of limited value.

It is necessary for users of geochronological data to understand the various sources of error and to know when one must consider the total uncertainty of a given date as opposed to its constituent parts. Although the total uncertainty of each date contains an internal/random component, there are also components that are systematic, such as the uncertainties in the decay constants. When comparing ages determined with the same isotopic system, these systematic components can be ignored, offering a potential increase in resolving power. In this section we review the different sources of uncertainty and the assumptions that underlie the often quoted (or unquoted) errors. For a more detailed treatment of uncertainties in geochronology the following articles are recommended: Ireland and Williams (2003), Stern and Amelin (2003), Schmitz and Schoene (2007), and various papers by Ludwig (1980, 1991, 1998, 2003).

Random/internal uncertainties


Random/internal uncertainties are those relating to the measurement of the isotopic ratios of the sample, standards and blank, and are used in the derivation of the errors on the radiogenic ratios. These uncertainties are intrinsic to each analysis and represent the minimum uncertainty that must be considered. Most of these sources of random uncertainty relate to the mass spectrometry measurements and our ability to measure and reproduce isotopic ratios. Factors such as the electronic noise of detectors place a theoretical limit on the precision that can be achieved by detecting a certain number of ions over a finite period of time (counting statistics). However, for almost all geochronological applications, other factors, such as the correction for the mass-dependent fractionation that occurs during sample ionisation and the correction for common and/or initial parent and daughter nuclides, dominate the analytical uncertainty budget. It is possible to reduce the uncertainty in the mass-dependent fractionation via 'double-spiking', where two tracer isotopes of the same element (202Pb-205Pb or 233U-235U, for example) are used for real-time mass fractionation correction.
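The idea behind the real-time correction can be illustrated with a minimal sketch, assuming a simple linear fractionation law and entirely hypothetical ratio values; real data reduction is considerably more involved:

```python
def frac_per_amu(measured, true, delta_m):
    """Per-amu mass fractionation inferred from a double-spike ratio pair
    whose true value is known independently (simple linear law)."""
    return (true / measured - 1.0) / delta_m

def correct_ratio(measured, alpha, delta_m):
    """Apply the per-amu fractionation factor to another measured ratio."""
    return measured * (1.0 + alpha * delta_m)

# Hypothetical values: the spike's true 205Pb/202Pb is known independently
alpha = frac_per_amu(measured=0.9970, true=1.0000, delta_m=3)   # 205 - 202 = 3 amu
corrected = correct_ratio(measured=18.000, alpha=alpha, delta_m=2)  # 206 - 204 = 2 amu
print(alpha, corrected)  # a ratio measured low is corrected upward
```

Because the two spike isotopes are of the same element and are measured in the same run as the sample, the fractionation factor is determined for every analysis rather than assumed from standards.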

Systematic/external uncertainties


Systematic uncertainties are those related to the uncertainty in the absolute value of the various constant parameters used in the calculation of either an isotopic ratio or the date itself. As outlined below, these uncertainties are systematic for a given parameter and have to be considered when comparing dates calculated using different determinations of that parameter. If data are generated using the same values for these constants, then these uncertainties can be ignored when assessing the difference between dates.

Decay constants


One source of systematic uncertainty that affects all radio-isotopic dates is that related to the uncertainty in the decay constants (Table 1). Three approaches have been taken to determine the decay constants (the probability that a given atom will decay per unit of time) of the long-lived radionuclides: (1) direct counting; (2) ingrowth; and (3) geological comparison. Direct counting involves the detection of alpha, beta or gamma activity relative to the total number of radioactive atoms. Ingrowth relies upon the quantification of a decay product that accumulates from a quantity of high-purity parent nuclide over a well-defined period of time. Geological comparison involves the analysis of cogenetic materials with multiple chronometers, knowing that each chronometer should yield the same date. This approach has the potential for relative intercalibration of the decay constants, but accurate intercalibration requires that at least one decay constant be both accurate and precisely known. This is usually assumed to be true of the 238U and 235U decay constants, owing to the precision with which they have been determined (Jaffey et al., 1971) and the internal check provided by closed-system zircon analyses (Mattinson, 2000; Schoene et al., 2006).

The counting experiments of Jaffey et al. (1971) determined the 238U and 235U decay constants with uncertainties of 0.11% and 0.14%, respectively. These values have been adopted for use in geochronology (Steiger and Jäger, 1977). The 187Re and 176Lu decay constants have been determined by both direct counting experiments and geological comparison with the U-Pb system, and their uncertainties are estimated at ca. 0.4 to 0.5% (Scherer et al., 2001; Selby et al., 2007).
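The impact of a decay-constant uncertainty on a date is easy to estimate: since t = ln(1 + D/P)/λ, the derivative dt/dλ = −t/λ, so the decay-constant contribution to the age uncertainty is simply the relative decay-constant uncertainty times the age. A minimal sketch using the 238U decay constant and a hypothetical radiogenic 206Pb/238U ratio:

```python
import math

LAMBDA_238 = 1.55125e-10   # 238U decay constant in 1/yr (Jaffey et al., 1971)
REL_SIGMA_LAMBDA = 0.0011  # the 0.11% relative uncertainty quoted above

def pb206_u238_age(ratio):
    """Date in years from a radiogenic 206Pb*/238U ratio: t = ln(1 + D/P) / lambda."""
    return math.log(1.0 + ratio) / LAMBDA_238

t = pb206_u238_age(0.08)   # hypothetical ratio, giving a date of ~496 Ma
# dt/dlambda = -t/lambda, so the decay-constant contribution to the age
# uncertainty scales linearly with the age itself
sigma_t_from_lambda = t * REL_SIGMA_LAMBDA   # ~0.55 Myr at ~496 Ma
print(t / 1e6, sigma_t_from_lambda / 1e6)
```

This is why decay-constant uncertainties, negligible for imprecise dates, become a dominant part of the error budget once analytical uncertainties fall below the per mil level.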

The incorporation of decay constant uncertainties is becoming increasingly important as the internal precision of dates improves and as multiple geochronometers are applied to the same time intervals. Decay constant uncertainties are typically <20% of the total uncertainty budget for isochron dates; in contrast, the uncertainties in the U decay constants are often >50% of the total uncertainty budget of U/Pb ID-TIMS dates (Fig. 2). The ID-TIMS U/Pb community now often generates 206Pb/238U and 207Pb/206Pb dates that do not overlap within analytical precision, so the U decay constant uncertainties must be considered (Begemann et al., 2001; Ludwig, 2000; Schoene et al., 2006). Because 'users' often treat these dates interchangeably, 206Pb/238U and 207Pb/206Pb age uncertainties are now commonly presented as ± X/Y/Z and ± X/Z respectively, where X is the analytical/internal uncertainty, Y is the analytical uncertainty plus the systematic tracer calibration uncertainty, and Z is the total uncertainty including X, Y and the decay constant uncertainties. This permits use of the data at the level of uncertainty appropriate to the problem being addressed. Mattinson (2000, 2008) and Schoene et al. (2006) pointed out that sets of high-precision dates from zircons of different ages indicate a difference of approximately 0.2% between 206Pb/238U and 207Pb/206Pb dates, due to probable inaccuracy in the 235U decay constant. This has led to the suggestion that the 235U decay constant be recalculated to achieve concordance between 206Pb/238U, 207Pb/235U and 207Pb/206Pb dates. While not officially adopted, the authors predict this will become common practice, and users of data must be aware of this trend.
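Because the analytical, tracer-calibration and decay-constant components are independent, the X/Y/Z levels combine in quadrature. A minimal sketch with hypothetical 1-sigma components (in Myr):

```python
import math

def xyz_uncertainties(sigma_analytical, sigma_tracer, sigma_lambda):
    """X/Y/Z uncertainty reporting: X = analytical only; Y = X plus tracer
    calibration; Z = Y plus decay-constant uncertainty. Independent terms
    are combined in quadrature."""
    x = sigma_analytical
    y = math.sqrt(sigma_analytical**2 + sigma_tracer**2)
    z = math.sqrt(sigma_analytical**2 + sigma_tracer**2 + sigma_lambda**2)
    return x, y, z

# Hypothetical 1-sigma components (Myr) for an ID-TIMS 206Pb/238U date
x, y, z = xyz_uncertainties(0.03, 0.05, 0.11)
print(f"+/- {x:.3f}/{y:.3f}/{z:.3f} Myr")  # the decay constants dominate Z
```

Comparing two dates from the same lab and tracer only requires X; comparing U-Pb dates against, say, 40Ar/39Ar dates requires the full Z.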

Excess Variance


Weighted means and Reduced chi-squared (MSWD)


There are a variety of ways in which a series of dates can be combined into a single age and uncertainty. There are also ways in which we can evaluate the viability of these ages.

Arithmetic Mean


The arithmetic mean is the sum of a collection of numbers divided by the number of values from which an average is being calculated. It is defined by the formula

\bar{x} = \frac{1}{n} \sum_{i=1}^{n} x_i


Although the arithmetic mean is often used as a measure of central tendency, it is strongly affected by outliers and disregards the uncertainty associated with each datum.

Weighted Mean


The weighted mean is similar to the arithmetic mean, but the mean is weighted by the constraint on each datum (i.e. its uncertainty). The weighted mean is calculated with the following equation:

\bar{x} = \frac{\sum_{i=1}^{n} x_i / \sigma_i^2}{\sum_{i=1}^{n} 1 / \sigma_i^2}

where x_i are the data and \sigma_i is the absolute uncertainty (1 sigma) of each datum.
The weighted mean uncertainty (1 sigma) is calculated as

\sigma_{\bar{x}} = \sqrt{\frac{1}{\sum_{i=1}^{n} 1 / \sigma_i^2}}.

For the weighted mean calculation of correlated data see McLean et al., 2011[1].
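The two formulas above can be sketched directly (the dates and uncertainties here are hypothetical):

```python
import math

def weighted_mean(dates, sigmas):
    """Weighted mean and its 1-sigma uncertainty; each weight is 1/sigma_i^2."""
    weights = [1.0 / s**2 for s in sigmas]
    mean = sum(d * w for d, w in zip(dates, weights)) / sum(weights)
    sigma_mean = math.sqrt(1.0 / sum(weights))
    return mean, sigma_mean

# Equal per-analysis errors: the uncertainty shrinks as 1/sqrt(n)
print(weighted_mean([100.0, 100.0, 100.0, 100.0], [1.0, 1.0, 1.0, 1.0]))  # -> (100.0, 0.5)

# Variable errors: the most precise analysis dominates the mean
print(weighted_mean([10.0, 12.0], [1.0, 2.0]))  # mean 10.4, pulled toward the precise datum
```

The second call illustrates the point made earlier: with variable uncertainties, the result is controlled by the most precise analyses rather than by n.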

Reduced Chi-Squared (Mean Squared Weighted Deviation)


The reduced chi-squared statistic (also known as the mean square weighted deviation or MSWD; Wendt and Carl, 1991) is a popular goodness-of-fit test for model assessment and comparison. In geochronology, this statistic is used to assess the degree of coherence within a given dataset. It is the chi-squared statistic divided by the number of degrees of freedom:

\mathrm{MSWD} = \frac{1}{\nu} \sum_{i=1}^{n} \frac{(x_i - \bar{x})^2}{\sigma_i^2}

where ν is the degrees of freedom (n − 1), x_i are the observed data, \bar{x} is the theoretical or expected value (i.e. the weighted mean of the observed data, or a model that represents the data), and \sigma_i^2 is the variance of the observed data. The reduced chi-squared statistic is most generally used to assess whether random and systematic uncertainties have been propagated appropriately (Wendt and Carl, 1991; Ludwig, 2003). A value of approximately 1 indicates that the scatter in the data can be explained by analytical uncertainties alone; values much less than 1 indicate that analytical uncertainties have been overestimated; and values greater than 1 indicate either that the uncertainties have been underestimated or that another source of scatter, often called "geological" scatter, is present. Furthermore, the reduced chi-squared value for which the scatter of the data can be considered due to analytical factors alone is not restricted to exactly 1, but varies with the number of data points in the calculation (Wendt and Carl, 1991). To be 95% confident that the scatter of the data is analytical when n = 5, an acceptable reduced chi-squared range is 0.2-2.2, but for n = 25 it is 0.6-1.5 (Wendt and Carl, 1991). Although not often explicitly stated, a reduced chi-squared of 1 does not necessarily mean there is a single (age) population. Rather, it indicates that if real (age) variation is present, it cannot be resolved within the precision of the individual analyses.
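A minimal implementation of the MSWD about the weighted mean (the dates and uncertainties here are hypothetical):

```python
def mswd(values, sigmas):
    """Reduced chi-squared (MSWD) of the data about their weighted mean."""
    weights = [1.0 / s**2 for s in sigmas]
    mean = sum(v * w for v, w in zip(values, weights)) / sum(weights)
    nu = len(values) - 1  # degrees of freedom for a weighted mean
    return sum((v - mean)**2 / s**2 for v, s in zip(values, sigmas)) / nu

# Scatter comparable to the assigned 1-sigma errors gives an MSWD near 1
print(round(mswd([10.0, 10.1, 9.9, 10.05], [0.1, 0.1, 0.1, 0.1]), 2))  # -> 0.73
```

Inflating the assigned errors in this example drives the MSWD well below 1 (overestimated uncertainties), while shrinking them drives it above 1 (underestimated uncertainties or geological scatter), mirroring the interpretation given above.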

Isochrons and Linear Regression


Uncertainties as a result of geologic complexity


Uncertainty as a result of geologic complexity is the most difficult to quantify. The most common cause of excess scatter is open-system behaviour resulting from inheritance of older zircon and/or Pb loss. For U-Pb zircon analyses, reduced errors on single analyses often expose fine-scale variability that may reflect protracted or punctuated crystallisation of zircon in a magma chamber, or the effects of very subtle open-system behaviour, such that high-precision analyses do not always translate into reduced uncertainties in calculated weighted mean dates.

Complex U-Pb zircon systematics

In the past decade, errors associated with ID-TIMS analyses have been reduced by almost an order of magnitude. These reduced errors offer unprecedented precision but also expose geological complexity at the <0.1% level, sometimes resulting in scatter that exceeds analytical uncertainties. It is now common for a geochronologist to be faced with a population of zircon analyses that do not form a coherent cluster, and the crucial question is how to interpret the data to arrive at a depositional age. The advent of CA-TIMS pre-treatment for the elimination of Pb loss has been extremely important, as it gives confidence that in many cases Pb loss need not be considered as a cause of excess scatter. Furthermore, for Neoproterozoic rocks the concordia curve has a shallow enough slope, and the 207Pb/235U dates are measured precisely enough, to evaluate discordance at the per mil level; however, this is not the case for microbeam U/Pb dates. As outlined above, microbeam U/Pb dates on volcanic rocks rely upon the averaging of a relatively high-n dataset (10-20) of relatively imprecise (ca. 2 to 4%) U/Pb determinations to obtain a weighted mean date with a precision of ca. 1% or less. Underpinning these lower uncertainties is the assumption of a single population with normally distributed errors. However, it is the low precision of each analysis, combined with the variability of the standard analyses that bracket the unknowns, that often precludes the detection of subtle amounts of Pb loss or inheritance. Stated another way, if the amount of Pb loss or inheritance is less than the precision of a single spot analysis, then it cannot be detected via normal statistical proxies (such as the MSWD), and therefore the assumption of a normal distribution may be invalid (see Fig. 4). If Pb loss is the main source of open-system behaviour, it will have the effect of lowering the 206Pb/238U dates of some analyses, as well as the weighted mean 206Pb/238U date.

  1. McLean, N.M., Bowring, J.F., and Bowring, S.A., 2011, An algorithm for U-Pb isotope dilution data reduction and uncertainty propagation: Geochemistry, Geophysics, Geosystems, v. 12, no. 6.