Answer to Question #13343 Submitted to "Ask the Experts"
Category: Instrumentation and Measurements — Instrument Calibration (IC)
The following question was answered by an expert in the appropriate field:
My partner has been using GM tubes for more than a decade now. He has been calibrating them once per year. The calibration data reveal that his GM tubes would have remained within the allowed error range even if he had never calibrated them after the first calibration. Can I use this operating experience to calibrate my GM tubes (same manufacturer, same type, same working environment) after a decade? I know that the calibration frequency is recommended by the manufacturer and adopted by the end user based on their own experience, since different users might have different operating environments. What is the correct method of defining calibration frequency without risking safety, while also keeping economic considerations in view? Can I safely reduce the frequency to once per 10 years based on operating experience?
Although your partner's experience might appear to validate increasing the calibration interval from one year to 10 years, I do not believe that doing so is appropriate. Additionally, if use of the GM detector is governed by a licensing/regulating agency, such as the US Nuclear Regulatory Commission (USNRC) or an agreement state in the United States, I doubt whether the licensee would receive approval to make such a change. In fact, such licensees are commonly required to calibrate at intervals specified by, or approved by, the regulating/licensing agency. As you have inferred, the intent of specifying a particular calibration interval is to pick one that provides an acceptable level of certainty that, for a particular instrument employed by authorized users in expected environments, the instrument response under routine use conditions will not have changed significantly since the last calibration. In some instances, the licensee may attempt to make a case for changing the calibration frequency; we shall say a bit more about this later in the discussion. Of course, if you are using the instrument simply as a private citizen under your own authority, with no obligation or responsibility to an accepted regulating or licensing group, you are free to do whatever you wish regarding calibration. If you are concerned that your measurements be acceptably accurate, however, you still need to perform calibrations at acceptable intervals. When such intervals become excessively long, the likelihood of response changes increases, and I believe the commonly invoked one-year calibration interval has been accepted and promulgated as such an acceptable interval.
The rationale for the typical annual calibration is to determine whether any response changes have occurred since the last calibration. Depending on how frequently an instrument is used, who is using it, and the conditions of use, the performance of an instrument may change as a consequence of incidental (or occasionally purposeful) damage/abuse, partial or complete component failure, environmental influences, and/or normal wear and tear. At some facilities in which instrument use is heavy and sometimes physically severe, calibration may be required more frequently than annually. The annual frequency seems to be the most commonly agreed-upon routine that provides a reasonable level of confidence in the readings when instruments are subject to average use; for example, the annual frequency is recommended by the NRC for some licensees (e.g., Regulatory Guide 8.24 for Part 70 licensees and 10 CFR 35.61(a) for medical licensees) and by the US Department of Energy (DOE rule 10 CFR 835.401(c)). The American National Standards Institute has recommended a one-year calibration interval in its nuclear standards ANSI N323A-1997 and ANSI N323AB-2013, "Radiation Protection Instrumentation Test and Calibration, Portable Survey Instruments." The Code of Federal Regulations 10 CFR 20.1501(c) does not specify a calibration interval but states "The licensee shall ensure that instruments and equipment used for quantitative radiation measurements (e.g., dose rate and effluent monitoring) are calibrated periodically for the radiation measured." The most common expectation is that the calibration interval for typical use by most licensees would be one year; for at least one particular application, industrial radiography, however, the NRC specifies a shorter portable instrument calibration period of six months (10 CFR 34.25(b)(1)).
In theory, a licensee might present information to the NRC, an agreement state, or another regulating/licensing group in an attempt to justify a longer calibration interval, although I don't believe an extension much beyond a year would be likely. Licensees have at times applied to regulating/licensing groups for approval to increase the intervals between calibrations, but the instances I am aware of have been requests to lengthen an interval of less than one year to the standard one year, since the one-year interval has already been accepted by responsible groups. At any rate, such applications must contain sufficient data to show that the longer period will not produce adverse effects, and they should preferably include descriptions of tests/methods that will provide interim validations of proper instrument performance. One way of demonstrating this, at least partially, is to make the check source use more rigorous and reproducible. Portable instrument use by a licensee typically requires at least daily use of an appropriate check source to confirm reasonable operation. Frequently such check source use is not very precise, sometimes because of such things as inexact positioning of the source, inversion of the source, and even possible use of different sources at different times. It is possible to establish a more demanding procedure for use of a check source that provides a higher level of precision, making it more likely that one would identify a change in detector response in the interval between calibrations. An attempt at such improvement is desirable even if the request for a longer calibration interval is not granted.
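To illustrate what a more demanding check-source procedure might look like, here is a minimal control-chart sketch in Python. It assumes (hypothetically) a fixed, reproducible source-detector geometry, so that a set of baseline readings taken shortly after calibration defines control limits, and a daily reading falling outside those limits flags a possible change in detector response. The data values and the three-standard-deviation criterion are invented for illustration only, not taken from any standard or regulation.

```python
# Hypothetical sketch: control-chart test on daily check-source readings.
# Assumes a fixed, reproducible source-detector geometry so that the
# baseline scatter reflects only counting statistics and normal drift.
import statistics

def baseline_limits(readings, k=3.0):
    """Compute (mean, lower, upper) control limits from baseline readings,
    using mean +/- k sample standard deviations (k=3 is a common choice)."""
    mean = statistics.mean(readings)
    sd = statistics.stdev(readings)
    return mean, mean - k * sd, mean + k * sd

def reading_in_control(reading, lower, upper):
    """True if today's check-source reading falls within the control limits."""
    return lower <= reading <= upper

# Invented example: ten baseline readings (counts per minute) recorded
# with the same source and geometry immediately after calibration.
baseline = [1205, 1198, 1212, 1189, 1201, 1195, 1208, 1190, 1203, 1199]
mean, lo, hi = baseline_limits(baseline)

print(reading_in_control(1200, lo, hi))  # within limits -> instrument OK
print(reading_in_control(1100, lo, hi))  # outside limits -> investigate
```

The point of the sketch is simply that fixing the geometry and recording the readings makes the daily check quantitative: a drop of roughly 8 percent, as in the second example, would be flagged immediately rather than noticed only at the next annual calibration.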
If you decide to seek approval from a regulating authority for longer intervals between calibrations, it is desirable to bolster your request with legitimate statistical evidence and a rationale for making the change. It will likely be necessary to present the data you currently have, as well as additional information verifying that operating conditions and user behavior will not change in a manner that negatively affects instrument performance. You can find various references on changing calibration intervals that employ particular statistical/test criteria by searching the internet. A couple of references that might be worth reviewing are a paper prepared under contract to the DOE (Simplified Calibration Interval Analysis by Allen Bare, Savannah River National Laboratory) and a foundational document available from the National Conference of Standards Laboratories, NCSL RP-1, Establishment and Adjustment of Calibration Intervals (unfortunately, I don't believe a free version is available).
George Chabot, PhD, CHP