Answer to Question #8067 Submitted to "Ask the Experts"
Category: Instrumentation and Measurements — Surveys and Measurements (SM)
The following question was answered by an expert in the appropriate field:
My standard GM (Geiger Mueller) probe has these features (from the technical data sheet):
Measuring range: 0.5 µSv/h to 10 mSv/h
Display range: 0.05 µSv/h to 10 mSv/h
Accuracy: ±15%
Sensitivity (137Cs): 1,800 cps per mSv/h
When the display reads 0.10–0.20 µSv/h (e.g., when checking background), can I trust this as a correct evaluation of the radiation field, or are such values only rough approximations because they fall below the measuring range for which the manufacturer certifies the 15% accuracy?
It seems a fairly simple question, but we have a lot of discussions about it at work.
Because of the apparent discrepancy between the measuring range and the display range in the technical data sheet, I can understand why you might have questions about the validity of measurements at low field intensities, such as background. While I do not know what specific instrument you are using, I shall assume that it shares most of the characteristics of typical GM instruments commonly used to perform dose-related measurements in gamma radiation fields.
Such instruments are often calibrated using 137Cs sources, in which case the instrument is exposed to a field of known intensity so that a reasonable number of calibration points are evaluated on each range of the detector. For example, for an instrument with selectable ranges (e.g., a range selected by a multiplier knob on the face of the instrument), two distinctly different points on each scale might be checked during calibration. Such measurements help to demonstrate the linearity of the instrument: if it behaves linearly, the count rate per unit dose rate is constant, within statistical uncertainties (for your instrument, this value is apparently 1,800 cps per mSv/h).

At high dose rates the linearity may fail because of count losses associated with increased dead time. At low dose rates (in the background region or below) this response factor should still apply, but it is difficult to make precise readings because of the large inherent statistical variability at low count rates. This latter fact may be at least part of the reason why the measuring range cited for your instrument begins at a higher value than the display range. It does not necessarily mean that the instrument will not provide legitimate readings at low dose rates; rather, the statistical uncertainty may exceed what is considered desirable, which likely led the manufacturer to specify the higher measuring range. We can demonstrate this for one or two simple cases using the numbers you have included.
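To make the linear-response idea concrete, here is a minimal sketch (in Python) of the conversion implied by the quoted sensitivity of 1,800 cps per mSv/h. The function name and structure are illustrative only, not taken from any instrument's software, and the sketch assumes the response is linear over the range of interest.

```python
# Sketch: convert a measured count rate to dose rate, assuming the linear
# response factor quoted in the data sheet (1,800 cps per mSv/h for 137Cs).
# Function and constant names are illustrative, not from any real instrument.

SENSITIVITY_CPS_PER_MSV_H = 1800.0  # 137Cs response factor from the data sheet

def dose_rate_usv_h(count_rate_cps: float) -> float:
    """Return the dose rate in uSv/h implied by a count rate in cps."""
    msv_h = count_rate_cps / SENSITIVITY_CPS_PER_MSV_H
    return msv_h * 1000.0  # convert mSv/h to uSv/h

# A background-level count rate of 0.18 cps corresponds to about 0.1 uSv/h.
print(dose_rate_usv_h(0.18))
```

This also shows where the 0.18 cps figure used below comes from: 1,800 cps per mSv/h is 1.8 cps per µSv/h, so 0.1 µSv/h yields 0.18 cps.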
If your instrument is a conventional analog ratemeter, the fractional standard deviation in the output voltage, V, is given by
σ/V = 1/(2rRC)^(1/2)
(see Knoll, Radiation Detection and Measurement, 4th ed., Wiley & Sons, 2000, pp. 624–625), where r is the average pulse rate being processed and RC is the circuit time constant. For the instrument you specify, a dose rate of 0.1 µSv/h would yield an expected count rate of 0.18 cps; we shall take this as r in the above equation. Time constants can vary widely, but at low dose rates a large time constant is often desirable because it damps out rapid variations in the needle movement. If we assume a moderately long time constant of 10 seconds, then we obtain
σ/V = 1/((2)(0.18)(10))^(1/2) = 0.53,
implying that the relative standard deviation is about 53%; at the 95% confidence level the relative uncertainty would then be about 106%, confirming the difficulty of obtaining a precise reading at these lower dose rates. For a dose rate of 0.5 µSv/h the expected count rate would be 0.9 cps, and the relative standard deviation would be about 0.24, or 24%; the 95% confidence value would be about 47%, less than half of the value at the 0.1 µSv/h dose rate.
If you are using a digital ratemeter, the same results do not apply directly, although Knoll (above reference, p. 626) summarizes a very similar approach in which the time constant RC is replaced by a quantity formed from the accumulation time, T (the period for which a gate is open to accumulate counts in the register), and F, the fixed fraction of the accumulated register counts that is subtracted from the total accumulated counts.
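As a simpler related illustration (this is ordinary Poisson counting statistics for a scaler that integrates counts over a fixed time, not the digital-ratemeter formula from Knoll), one can ask how long an integrating measurement must run at background rates to reach a given precision: the relative standard deviation of N accumulated counts is 1/sqrt(N).

```python
# Sketch: simple integrating (scaler-type) measurement. For Poisson
# counting, the relative standard deviation of N = r*T accumulated counts
# is 1/sqrt(N), so reaching a target precision p requires N = 1/p**2
# counts. This is basic counting statistics, not Knoll's digital-ratemeter
# treatment.

def time_for_precision(count_rate_cps: float, rel_sd: float) -> float:
    """Counting time in seconds so that 1/sqrt(r*T) equals rel_sd."""
    n_required = 1.0 / rel_sd ** 2
    return n_required / count_rate_cps

# At 0.18 cps (about 0.1 uSv/h for this probe), a 15% relative standard
# deviation requires about 44 counts, i.e., roughly four minutes:
t = time_for_precision(0.18, 0.15)
print(f"{t:.0f} s (about {t / 60:.0f} min)")
```

This makes the same practical point as the ratemeter calculation: at background-level count rates, good statistical precision demands long effective averaging times.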
The bottom line is that low dose-rate measurements are generally subject to greater statistical uncertainty than higher dose-rate measurements, and this is likely why the manufacturer specifies a higher lower limit for the measuring range over which the stated accuracy applies.
Hope this is helpful.
George Chabot, PhD, CHP