When to Calibrate Radiation Detectors: A Brief Guide

By Dr. Zoomie

Hi, Dr. Zoomie – I just started working as an RSO and I’m trying to figure out what sort of calibration program we need to have. Some people tell me I need to calibrate everything annually, others tell me it’s only as often as the instrument manufacturer recommends, and others say I don’t need to calibrate at all. I’ve got to admit I’m not quite sure who to believe and I’m hoping you can shed some light on the matter. You wrote a piece about a decade ago, but I’m afraid it doesn’t have the information I need (sorry).

Yeah – going back and reading that piece I can see how maybe it might not have been as helpful as it could have been. Sorry about that…and let’s see if I can do better!

The most pressing reason to calibrate your radiation detectors is that it can be a regulatory requirement (e.g., 10 CFR 20.1501(c): “The licensee shall ensure that instruments and equipment used for quantitative radiation measurements (e.g., dose rate and effluent monitoring) are calibrated periodically for the radiation measured.”) The periodicity mentioned here is typically annual, and this is generally stated outright in your license application or as a license condition. Annual calibration is also fairly standard for instruments used to verify compliance with regulatory requirements. So if you’ve got a radioactive materials license, you can’t go wrong with having your instruments calibrated annually, although this can run $100 or so per instrument, depending on who’s doing your calibration.

If you don’t have a license then you really have no requirement to calibrate your instruments – although you might still wish to do so. For example – say you work for a police department that performs radiological sweeps of storage areas from time to time and one of your officers gets a “hit” when they’re walking through a storage facility, identifying a particular unit as producing the hit. You might want to investigate the unit, but will a judge issue a warrant if the instrument that recorded the hit wasn’t calibrated? Maybe…but if I were an expert witness for the other side, I could suggest reasons that the judge should look askance at the readings – especially if they weren’t very high to begin with. Being able to show the judge a calibration certificate could serve to allay concerns about issuing a search warrant based on incorrect readings.

Another possibility occurs to me as well. Say you have some firefighters responding to a fire in a university “hot” lab or at a radiopharmacy. It’s not unlikely that, over the next few decades, one of those firefighters might come down with cancer and might request compensation, suggesting that the radiation exposure they received during the response caused their cancer. Here, a solid set of radiation readings taken with a calibrated instrument would make a strong case either way. Readings from an uncalibrated instrument, on the other hand, might not be quite as compelling. These are just two concrete examples; the bottom line is that the more important it is that readings be acknowledged to be accurate, the more important it is that you be able to demonstrate that your instrument was giving accurate readings – which typically means having it calibrated within the last year.

Having said that…if there is no specific regulatory or license requirement – for instance, if you don’t have a radioactive materials license – the default calibration periodicity is frequently whatever interval the instrument manufacturer recommends. The rationale here is that the manufacturer knows more than anyone about the instruments they design and manufacture, so their recommendations should be considered appropriate. Thus, if a manufacturer recommends calibrating an instrument only every 2-3 years, this is frequently (and reasonably) considered an appropriate calibration periodicity, especially for non-licensees. If you’re not calibrating instruments annually (and, to be honest, if you don’t have a license, it’s hard to justify annual calibrations), you really can’t go wrong by following the manufacturer’s recommendations.

The National Council on Radiation Protection and Measurements (NCRP) came out with a somewhat more nuanced view of instrument calibration in Statement 14 (Instrument Response Verification and Calibration for Use in Radiation Emergencies). NCRP identified three levels of rigor for evaluating an instrument’s performance: confirming that an instrument responds to radiation via a “bump” test; confirming that it responds appropriately to a radioactive source of known strength via a “field response check”; and performing a full-blown calibration. In general, the more potentially important the reading(s) you plan to obtain, the more important it is to be able to demonstrate that the readings can be trusted.

Performing the first two levels of these checks is fairly simple.

  • For a “bump test,” simply expose your detector to a radiation source and confirm that the meter responds. If you hold a source up to your meter and the reading increases, you’ve demonstrated that the meter can actually detect elevated radiation levels.
    • Every day that you use your instrument you should turn it on and check that the background radiation level is about what you expect to see. Then you should “bump” it with a radioactive source (I use thoriated welding electrodes) simply to make sure it responds to radiation.
  • A field response check involves exposing the instrument to a source of known strength and showing that the instrument responds the same way today that it did the last time(s), as well as showing that it reads the same background levels that it did the last time (in the same location). For this, all you really need is a source that’s not going to decay away quickly – a thoriated gas lantern mantle, a radioactive source you buy from a vendor, thoriated welding electrodes, a radioactive mineral, a radioactive antique plate or glass beads, or the like.
    • Every 3-6 months you should hold your “source of known strength” close to the detector and confirm that your instrument’s responses to the source and to background radiation are within 20% of what you expect to see.
    • Note: If you’re doing field response checks, you need to make sure that the geometry between the source and the detector is as close as possible to exactly the same each time you do the check (e.g., holding the source 1 cm from the center of the detector).
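The ±20% acceptance criterion above can be sketched in a few lines of Python. This is just an illustration of the comparison – the function names and the readings (in counts per minute) are hypothetical, and your own acceptance criterion should come from your procedures:

```python
def within_tolerance(reading, reference, tolerance=0.20):
    """Return True if a reading is within +/- tolerance of its reference value."""
    return abs(reading - reference) <= tolerance * reference

def field_response_check(source_reading, source_reference,
                         background_reading, background_reference,
                         tolerance=0.20):
    """Pass only if BOTH the source response and the background response
    agree with their reference values to within the stated tolerance."""
    return (within_tolerance(source_reading, source_reference, tolerance)
            and within_tolerance(background_reading, background_reference, tolerance))

# Hypothetical readings in counts per minute (cpm):
print(field_response_check(4800, 5000, 52, 50))  # passes: both within 20%
print(field_response_check(3700, 5000, 52, 50))  # fails: source response off by 26%
```

If the check fails, the appropriate response is the one described below: take the instrument out of service and send it for calibration rather than adjusting anything yourself.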

You can do either (or both) of these yourself; if you want to go to the next level (calibration), you’ll need to either send your instruments out to a licensed calibration facility or develop your own calibration capability.

  • If an instrument fails the bump test or the field response check, it’s time to have it calibrated. In addition, consider calibrating your instrument at whatever periodicity is recommended by the manufacturer, or annually if you have a radioactive materials license.

And having said this, you’re not likely to be able to calibrate your instruments yourself; if you want to do so, you’ll almost certainly need to have this stated explicitly in your radioactive materials license, and most regulators will require you to demonstrate that you know how to do so. This means taking a class in how to calibrate radiation instruments (and showing regulators a copy of your course completion certificate), having the appropriate calibration sources and equipment, developing a set of procedures, and so forth – all in compliance with ANSI Standard N323AB. I’ve worked at two places (both universities) that calibrated their own instruments, and I set up a new calibration program; each needed all of those components, including the regulatory approval. And, for what it’s worth, setting up the one calibration facility cost about $250,000 and took the better part of a year – had we had a smaller variety of instruments, I could likely have done it for about $100,000 – but that’s still a lot of money and time to expend unless you have a lot of instruments to send out for calibration every year.

I hope this helps to fill in the blanks from my earlier post on the subject; if not, let me know what additional questions you have that I can help with.