Dose Reconstruction – What Happens if a Rad Worker Loses a Dosimeter?

Dear Dr. Zoomie – one of my rad workers lost his dosimeter. He normally gets some radiation exposure from working with our equipment – some gives off x-rays and some uses radioactive sources. What should I do?

From the information that you’ve provided I think this is something that has to be taken care of sooner rather than later – having said that, I’m guessing that this is important rather than urgent. In other words, you have to make sure you meet regulatory requirements, but the worker’s health is probably not at risk (at least, based on the fact that you said the worker normally gets exposure, but you didn’t say that he normally is close to dose limits – and there’s a huge gap between reaching a dose limit and facing potential health effects). So – if my assumptions are correct – what you need to do is to come up with a reasonable dose estimate that you can provide to your dosimetry vendor so you can ask them to assign that dose to the worker in his dosimetry record. Here’s how you can go about performing a dose estimate. And remember – make sure you document everything!

What I’m going to do is to lay out a bunch of techniques you can use – in some cases, a single one might do the trick; in other cases you might have to try several or even all of these.

One thing you can do is to look at the worker’s past dosimetry reports to see what level of exposure he’s received in the past. If, for example, his exposure has typically been between 50-100 mrem monthly AND if his workload for the month with missing dosimetry was typical then you can make a safe guess that, during the month in question, he probably received no less than 50 mrem and no more than 100 mrem. In this case, as the RSO, I could justify assigning a dose of 100 mrem to the worker for the month. But it would be nice (if possible) to come up with a second estimate to justify the first. So what else can you do?

You didn’t say anything about co-workers, but if you have more than just this one person who does this sort of work, you can check to see what dose his colleagues received during the month with missing dosimetry. If their doses are normally comparable to the dose of the man with the missing dosimetry then you can see how much exposure they received and use that as the basis for a dose assignment. Say you have three other rad workers who received 50, 80, and 60 mrem for the month in question – being conservative, you can assign a dose of 80 mrem to the worker.
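If it helps, this kind of bookkeeping is easy to capture in a short script or spreadsheet. Here’s a minimal sketch in Python of the conservative logic described above – take the highest of the available estimates (the historical upper bound and the highest co-worker dose). All of the numbers are made up for illustration:

```python
# Minimal sketch of a conservative dose assignment; all values are
# hypothetical examples, not real dosimetry data.

def conservative_assignment(historical_range_mrem, coworker_doses_mrem):
    """Return the highest (most conservative) of the available estimates."""
    estimates = {
        "historical upper bound": max(historical_range_mrem),
        "highest co-worker dose": max(coworker_doses_mrem),
    }
    basis = max(estimates, key=estimates.get)
    return estimates[basis], basis

assigned, basis = conservative_assignment(
    historical_range_mrem=(50, 100),   # worker's typical monthly range (mrem)
    coworker_doses_mrem=[50, 80, 60],  # colleagues' doses for the month (mrem)
)
print(f"Assign {assigned} mrem (basis: {basis})")
```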

You have yet another option – and this one will require a little more work, but it’s likely to be most accurate. First, take a look at what the worker did during the month in question. For example, if he’s a radiographer you’re going to have to see how many jobs he went out on, which source he used for each job, and how many shots each job entailed. Or if he’s a nuclear medicine technologist, you need to see how many patients he saw and how much dose each received. Once you’ve figured out what the worker did you can start making radiation measurements. For example, if the worker did three radiography jobs with a 50 Ci source of Co-60, you can set up a mock radiography shot (in a safe location) that mimics the jobsites he would have worked at. Then you can perform a dummy shot in which you run the source out and retract it while you measure the radiation exposure – this will tell you how much dose your worker received during a single shot; this can be used to calculate his total dose for ALL of the shots he performed during that month. For example, if you measure a dose of 5 mrem for one shot and he performed a total of 15 shots during the month you can assign a dose of 75 mrem for his radiography duties for that month. Alternately, you can perform this same measurement on a real shot and use the results to make the same calculation.

There’s another way to accomplish this, say, for someone who works in radiation areas or around radioactive materials. In this case, you need to try to estimate how much time the worker spent in each radiation area and then go to these areas yourself to measure the radiation levels. Say (for example) he spent a half hour performing sealed source leak tests – you’d need to go to your source storage and measure radiation levels. If you measure 20 mR/hr then you multiply this by a half hour and determine that his source leak tests gave him a dose of 10 mrem. Repeat this for every other task he performed and add these numbers together – this is the worker’s assigned dose for that month.
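Since this is just “dose rate times time” summed over tasks (with any per-shot measurements added in), it’s also easy to tabulate. Here’s a brief sketch – the tasks, dose rates, and times below are invented purely to show the arithmetic:

```python
# Task-by-task dose reconstruction: measured dose rate x time spent in each
# area, plus any per-event measurements (e.g., mrem per shot x number of shots).
# All values are hypothetical, for illustration only.

area_tasks = [
    # (task, measured dose rate in mR/hr, hours spent)
    ("sealed source leak tests", 20.0, 0.5),
    ("waste room inventory", 2.0, 1.0),
]

per_event_tasks = [
    # (task, measured mrem per event, number of events)
    ("radiography shots", 5.0, 15),
]

total_mrem = sum(rate * hours for _, rate, hours in area_tasks)
total_mrem += sum(per_event * n for _, per_event, n in per_event_tasks)

print(f"Reconstructed dose for the month: {total_mrem:.0f} mrem")
```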

If time permits – and if the estimated dose is fairly high (more than 10% of a dose limit is my general rule of thumb, although others will do this for any calculated dose greater than 100 mrem) – you should use more than one technique and compare the results. Say the worker’s typical dose has been 50-100 mrem monthly for the last several years, that his co-workers received doses of 50-80 mrem for the same month, and that your measurements suggest a dose of 120 mrem for that month. These are all in the general ballpark – close enough that you could justify assigning a dose of anywhere between 80 and 120 mrem. If you want to use the average you’d pick 100 mrem, but I’d be a little more conservative and would likely go with 120 mrem. Where you have to really think, however, is if there’s a wide disparity – say your calculations, instead of showing 120 mrem, suggested he might have received 3000 mrem during that month. You can always go with the measurements and assign a dose of 3000 mrem, but I’d want to do some more checking first. In this case, I’d re-run the dose measurements and calculations to see if I made a mistake. If my measurements and math seem to work out, I might even call a consultant to do an independent dose assessment as well as to review the work I’d done to see if these numbers could somehow be reconciled. You might have to go through this process a few times until you’ve got an answer you can feel comfortable about – once you reach this point you’ll need to contact your dosimetry vendor to ask them to assign this dose to the worker for that month. Oh – something to remember – if you base any part of your dose assignment on radiation surveys you performed, you need to file your survey results away with your dosimetry records and then hang onto them for as long as your company has a radioactive materials license.

There’s one related topic I’d like to mention before signing off – what to do if the dosimetry report comes back showing a dose that seems way too high. Say a worker’s badge reads 45 rem (45,000 mrem) one month. This is a high dose – not dangerously high, but far higher than the worker’s annual dose limit. Unless the worker was responding to a radiological emergency there’s no acceptable reason to have so high a dose – you need to try to figure out if the dose is real. With some dosimeters, you can ask for an assessment to see if the badge was attached to the worker during the exposure (this is called “static/dynamic imaging” by one vendor). But this should be verified by a dose reconstruction – you should do everything described above: compare his dose with previous months, compare his dose with co-workers, look at his work schedule, AND make dose measurements in all the areas where he worked (and under the same conditions under which he was exposed); maybe consider calling in a consultant as well.

But with a dose this high you have an option that a lower dose doesn’t give you – you can send a blood sample off for biodosimetry; most likely a procedure called chromosome aberration analysis. This looks at the chromosomes to see if they’ve been damaged by the radiation; if so, the amount of damage can be used to estimate the dose. The body is the ultimate arbiter of dose – no matter how high a dose the dosimeter shows, if the body shows it received a small dose then the dosimeter must be wrong.

OK – having said all of this, I have to acknowledge that you can get even deeper into dose reconstruction than this, but if what I’ve described above doesn’t solve your problem then you really need to bring in a consultant to help you out. In addition, if your worker had any sort of an uptake (inhalation or ingestion) then you should really consider bringing a consultant in as soon as possible. But barring one of these possibilities, this should stand you in good stead. Good luck!

North Korea – What Did They Detonate?

Dear Dr. Zoomie – I just heard that North Korea claims to have developed a hydrogen bomb, but our experts say it’s probably just a regular fission bomb – or maybe a boosted device. This is all Greek to me – what’s the difference between these?

Good question! And I know the terminology can be a bit confusing, so let me see if I can help shed some light. But first, some basics.

First, where the energy comes from. Conventional explosives get their energy from breaking chemical bonds – breaking or rearranging chemical bonds releases a few electron volts each (the electron volt is a unit of energy that makes sense on an atomic or molecular level). By comparison, nuclear reactions involve rearranging the nuclear structure of an atom and nuclear reactions release millions of electron volts. So a single nuclear reaction releases as much energy as at least a million chemical reactions.

Next – where the energy comes from in a nuclear reaction. Some atoms are so big that they barely hold themselves together; hit them with a neutron and they’ll split apart, releasing all that energy. They also release additional neutrons, and if those neutrons are absorbed (and cause fissions in) additional nuclei then the reaction will grow, as will the energy release. And since all of this happens in the merest fraction of a second (timescales are on the order of nanoseconds), the power output grows…well…explosively. Fission weapons (using uranium-235 or plutonium-239) make use of this process exclusively. But fission weapons can be horribly inefficient – it’s not uncommon for over 90% of the fissionable material to be blown apart before it can be fissioned. We’ll get back to that in a moment.

Another way of producing nuclear energy is by slamming light atoms together hard enough that they stick, forming a larger atom. This is how the sun makes energy – hydrogen atoms stick together to form helium; three helium atoms can fuse to form carbon, and so forth. But fusion can only happen under extraordinary conditions – specifically the conditions that we see in the center of the sun. In a weapon, these conditions are generated using a fission explosion – a fission bomb goes off, igniting the fusion reaction. Now the question becomes how much fusion takes place and how much energy it produces.

Most importantly, this fusion also generates neutrons, and these neutrons are vitally important in a boosted weapon. In a boosted weapon, enough fusion fuel is put in the center of the bomb to produce copious numbers of neutrons, but not enough to produce a lot of energy. But these neutrons are crucial because they can be captured by some of that 90% of the fuel that normally is untouched – if you can simply double the number of U-235 or Pu-239 atoms that fission you’ll double the weapon’s yield. So this is a boosted weapon – a “typical” fission bomb with a smidgeon of fusion fuel in the center – but the fusion fuel is there to produce neutrons. You can think of this as the nuclear equivalent of blowing on a fire – you’re not directly adding significant energy to the fire, but you’re helping the existing fuel to burn more efficiently.

Of course, if you put in more fusion fuel (this can be a mixture of the hydrogen isotopes deuterium and tritium, often combined with lithium to form lithium deuteride or lithium tritide) then you get more energy from fusion – at some point the fusion is not only producing a ton of neutrons, but a significant amount of energy as well. This is where we transition from a boosted fission weapon to an out-and-out thermonuclear (or hydrogen) bomb. And this, too, is where the weapons designers have to decide what to do with all of the fusion neutrons – they can use them to cause still more fission, to produce a great deal of radioactivity (for example, adding stable cobalt to the weapon can produce radioactive cobalt-60), or they can let them escape to make a “neutron bomb” that produces high levels of radiation while sparing the infrastructure. Since fusion doesn’t result in radioactive fission products it’s considered to be fairly clean – especially compared to fission weapons.

Ivy Mike was the codename given to the first test of a full-scale thermonuclear device, in which part of the explosive yield comes from nuclear fusion. It was detonated on November 1, 1952 by the United States on Enewetak, an atoll in the Pacific Ocean, as part of Operation Ivy. The device was the first full test of the Teller-Ulam design, a staged fusion bomb, and was the first successful test of a hydrogen bomb.

Finally, there’s one more point I’d like to address – North Korea’s claims that their weapons have been miniaturized. First, this is potentially important because unless a device can be delivered, it can’t really be considered to be a weapon. The smaller (physically) a weapon is, the more easily it can be used. And to be used on a missile, where every cubic inch and every ounce matters, the smaller a weapon can be made, the better. The problem is that it’s not easy to make a compact nuclear weapon – physics itself places some constraints (you have to have a critical mass of fuel, in addition to the explosives to set it off, the electronics, the casing, and so forth). You can trim a lot of this somewhat – for example, a boosted weapon will require less uranium to achieve the same yield – but there are limits. It takes a lot of science and engineering to press up against these limits, as well as a lot of testing to make sure that, when you’re pushing the limits of the science, your weapon will actually work the way you intend. Think of electronics – it’s easy to make something that’s large, but making something tiny can be hard. So North Korea’s claims to have developed miniaturized nuclear weapons are potentially alarming – but also somewhat dubious.

There’s a LOT more discussion we could have here, but to go beyond this level would take up the better part of a book (rather than a blog posting). But if you’re interested in how nuclear weapons work (at an unclassified level) you can go online to the Nuclear Weapons Archive; if you’re interested in the effects, try reading The Effects of Nuclear Weapons – these are all unclassified documents. In addition, Richard Rhodes’ books The Making of the Atomic Bomb and Dark Sun: The Making of the Hydrogen Bomb are outstanding histories of these two projects. In addition, the magazine The Progressive published a piece titled The H-Bomb Secret in their November, 1979 issue; this can still be downloaded at no charge.

How Do I Protect Myself from Background Radiation?

Dear Dr. Zoomie – I just heard that we’re always exposed to radiation and it sort of scares me. And on top of that, there’s the radiation from Fukushima. Can I just line my walls with lead? And how much lead do I need to cut down this radiation to something safe?

Well…I think you might be overly concerned here. First, you’ve got to remember that radiation from natural sources has been a part of life on Earth since it first formed – it’s safe to say that every single organism that has ever lived on Earth has been exposed to radiation. Not only that, but there’s ample evidence that these natural background radiation levels today are the lowest they have ever been. This means that today we are being exposed to less radiation (from natural sources) than were our distant ancestors.

It’s also important to realize that natural radiation varies considerably from place to place on Earth. Some places – the American Gulf Coast, for example, most of Hawaii, Japan, and so forth – have unusually low levels of natural radiation (more on this in a moment) while other places (Ramsar, Iran; the American Colorado Plateau; Guarapari, Brazil; and others) have elevated levels of natural radiation. Interestingly, cancer rates don’t seem to be at all correlated with natural radiation levels. For example, the parts of the US with the highest natural radiation levels also have among the lowest cancer rates in the country. Some people have actually used this fact to suggest that low levels of radiation might make us healthier – I’m not sure I can agree with that, though, because there are so many other things that can affect cancer rates. For example, the population along the Gulf Coast is generally older than in the Rocky Mountains, there are more people who smoke, and the area has a significant petrochemical industry – all of these could serve to increase cancer rates compared to the Rocky Mountain states. The most we can say is that the effect of these different radiation levels is fairly small – small enough to be swamped by the other factors.

Did you know bananas contain naturally occurring radioactive material in the form of potassium-40?

As far as where this natural radiation comes from, there are four main sources, and these all change from place to place. They are:

  • Radiation from the rocks and soils. All rocks and soil contain potassium (about 0.01% of potassium is naturally radioactive) along with trace amounts of uranium and thorium. Uranium and thorium, in turn, decay through a number of radioactive progeny until they reach stable lead – these progeny include radium, radon, and polonium. So any rock or soil that you analyze is going to have radioactivity in it and we’re all exposed to radiation from this. On average, we receive 20-30 mrem annually from radioactivity in the rocks and soils as well as from building materials made from them (for example, bricks are made of clay that can contain radioactive potassium).
  • Radon in our homes and buildings. Radon comes from the decay of uranium in the rocks and soils. Radon is a gas; once it forms it will seep through the soil and into our homes – since it’s heavier than air it collects in basements, subway tunnels, caves, and so forth. Places with higher levels of uranium and thorium in the rocks and soils will have higher concentrations of radon in the air. On the other hand, since radon is such a heavy gas it’s not a concern above the first floor of any building. We receive about 150-250 mrem annually from radon, depending on the local geology and the amount of time we spend in areas where the radon might collect.
  • Radiation from our own bodies. As mentioned earlier, a tiny fraction of potassium is naturally radioactive, including the potassium in our own bodies. In addition, we’re always exposed to radioactive carbon (carbon-14) and hydrogen (H-3, also called tritium) that’s formed in the atmosphere from interactions with cosmic rays. Finally, we’re always ingesting or inhaling trace amounts of dust that can contain uranium or thorium. All told, we receive about 40 mrem each year from all of these sources of radiation in our own bodies.
  • Cosmic radiation from the sun and stars. The sun gives off high-energy particles as the solar wind. In addition, distant stars give off their own stellar winds and some of those particles escape into space as well – some of these stars, by the way, are far more active and energetic than our sun and the particles they give off have far more energy than what our own sun emits. As if that weren’t enough, every few decades a massive star explodes in our galaxy in one of the most energetic events in the universe – a supernova. The particles from these explosions also permeate space. When any of these particles – whether from our sun or from elsewhere in the galaxy – slam into our atmosphere they can generate what are called cosmic ray air showers; cascades of particles and gamma rays that percolate down to the ground to expose us to radiation. In general we receive from 20-40 mrem each year from cosmic radiation.

Putting all of this together, we receive about 250-300 mrem each year from natural radiation, with some variability depending on where we live and what we eat. And remember – first, there’s no way to get away from this and second, we’ve been exposed to this radiation for as long as there has been life on Earth.
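If you want to see how those pieces add up, here’s a quick back-of-the-envelope sum using nominal mid-range values from the bullets above (your own numbers will vary with local geology, altitude, and lifestyle):

```python
# Rough annual background dose using nominal mid-range values from the
# categories above; actual values vary by location and lifestyle.
background_mrem_per_year = {
    "rocks, soils, and building materials": 25,
    "radon in homes and buildings": 200,
    "radionuclides in our own bodies": 40,
    "cosmic radiation": 30,
}

total = sum(background_mrem_per_year.values())
print(f"Approximate total: {total} mrem per year")  # roughly 300 mrem
```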

Background Radiation Pie Chart

So – as far as your questions go – lining your walls with lead might help to shield you from cosmic rays, but that’s only a minor source of radiation. You might also line your floors with lead to reduce dose from the rocks and soils, but that’s also somewhat minor. Sealing your basement might remove radon as a source – but there’s no way to eliminate the radiation exposure from the radionuclides in your own body. But really, this doesn’t matter – again, this is radiation that our bodies have the ability to deal with. The bottom line is that I really don’t see a reason to try to shield your home from natural radiation. Ah (I hear you say) but what about the radiation from Fukushima? For that, read on!

There’s no denying that the Fukushima meltdowns put a lot of radioactivity into the environment. A lot of it ended up in the Pacific Ocean and a lot settled out on the ground in Japan, but a lot of radioactivity ended up in the atmosphere. This was measurable in the US – just as fallout from the Chernobyl nuclear accident could be measured in the US. In fact, after the Chernobyl accident I was on a nuclear submarine that was stationed off the coast of the Soviet Union and we could measure the radioactivity in the air whenever we came up to ventilate. But what I learned then – and what is important to realize now – is that just because we can measure the radioactivity doesn’t mean that it’s a threat. With the right equipment, for example, I can measure the radioactivity in my own body, in the granite countertop in my kitchen, or in a bunch of bananas. That’s because we’re really good at measuring radioactivity. But, again, just because we can measure it doesn’t mean that it’s dangerous. Consider – I can measure the weight of a single grain of sand or a single particle of dust. But does that make it dangerous to have a grain of sand land on my head? Not really – because the weight of the sand grain is so small that, although measurable, it’s insignificant. Similarly, while we could measure radioactivity in the air after the Fukushima accident, the radiation dose to anyone in the US was trivial – even in Alaska and Hawaii. Some of the Fukushima radioactivity undoubtedly settled on the ground here, but the majority of that was relatively short-lived iodine (I-131 to be precise) and the remainder was present in such low quantities that it simply poses no more threat than the natural potassium that’s already in the soil.

Finally, I wanted to mention something else briefly – about environmental sampling. Every now and again you’ll read a report that somebody has, say, sampled rainwater or soil, or even made radioactivity measurements on a beach and they’ve found elevated readings. To be honest with you, I always take these with a grain of salt because making these measurements isn’t as easy as one might think. It’s not uncommon to take a quick reading and to get counts that are higher than background readings. But just because you get a reading above background doesn’t necessarily mean anything. To get into a full discussion would require talking about counting statistics, and I won’t subject you to that! For now, let it suffice to say that radiation is random – if you take a radiation detector and push the button to record every count that comes in during the next minute you’re going to get a number. Push it again and you’ll get a different number; push it a third time and you’ll get something else. Over time you’ll notice that the readings are all clustered together in a shape that’s called a bell curve. Sixty-eight percent of the time the readings will fall within one standard deviation of the average value and 95% of the time the readings will be within two standard deviations. As long as a particular reading is within two standard deviations of the average then you probably don’t have anything unusual. But there are a lot of people who think that any reading that’s above background is significant, when it could simply be that your detector was hit by a cosmic ray air shower while you were counting.
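To make that a little more concrete, here’s a tiny sketch of the sort of check you can do: compute the mean and standard deviation of a set of repeated background counts, then flag a new reading only if it sits more than two standard deviations above that mean. The counts below are invented for illustration:

```python
import statistics

# Hypothetical one-minute background counts taken with the same detector.
background_counts = [42, 38, 45, 40, 44, 39, 41, 43, 37, 46]

mean = statistics.mean(background_counts)
stdev = statistics.stdev(background_counts)

def is_notable(reading, n_sigma=2):
    """Flag a reading only if it exceeds background by more than n_sigma."""
    return reading > mean + n_sigma * stdev

for reading in (45, 60):
    verdict = "worth a closer look" if is_notable(reading) else "consistent with background"
    print(f"{reading} counts per minute: {verdict}")
```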

Another mistake that people make is taking too small a sample and counting it for too short a period of time. If you have elevated readings after counting, say, a 1 liter sample of rainwater for an hour, it’s a lot more compelling than having an elevated reading following a 1-minute count of a one-milliliter sample. This is another problem that people run into – they’ll take a small sample, count it for a minute, and if the count rate is a little higher than background then they start worrying about contamination. Most of the time, though, counting a larger sample for a longer period of time (many environmental samples are counted for over 12 hours) will show us that there’s really no contamination. OK – so let me put all of this together!
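The reason a longer count (or a bigger sample) is more convincing comes straight from counting statistics: the uncertainty on N counts is roughly the square root of N, so the relative uncertainty shrinks as the count goes on. A quick sketch of that, using an invented count rate:

```python
import math

count_rate_cpm = 50  # hypothetical sample count rate, counts per minute

for minutes in (1, 10, 60, 720):
    counts = count_rate_cpm * minutes
    relative_uncertainty = math.sqrt(counts) / counts  # roughly 1/sqrt(N)
    print(f"{minutes:4d}-minute count: {counts:6d} counts, "
          f"relative uncertainty about {relative_uncertainty:.1%}")
```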

Did you know we receive anywhere from 20 to 40 mrem each year from cosmic radiation? – Graphic by W. Kent Tobiska

First, every creature that has ever lived on Earth has been exposed to background radiation and we’ve evolved to cope with this level of exposure. Background radiation comes from radionuclides in the rocks and soils, in our own bodies, from radon, and from cosmic rays. Background radiation varies around the world, but these variations don’t seem to be correlated with health or cancer risks. That being the case, you really don’t need to worry about shielding your home to reduce exposure from background radiation. And with regards to radiation from Fukushima, some of that was detected in the air shortly after the accident and some might even have settled to the ground in the US. But the radiation dose from this fallout was so low that it’s not worth worrying about – less exposure than if you were to move from the Gulf Coast to the Rocky Mountains. Finally, there are a number of people who have reported measuring higher levels of radioactivity at various times. But unless the samples were obtained and analyzed by someone with experience in environmental sampling they’re likely to be small samples counted for a short period of time – as such, the results are likely to be meaningless. And even if an increase was seen, unless it’s more than a few standard deviations higher than background levels, the results are most likely nothing more than a statistical anomaly rather than an indication of contamination.

What Should Be Included in a Training Program for Radiation Workers?

Hello, Dr. Z – I’m a Radiation Safety Officer (RSO). I am putting together a training program for my radiation workers. What do I need to include to make sure I cover all the bases? Thanks!

Wow – great question! And very near and dear to my heart – I’ve been teaching in one way or another since the early 1980s. So let’s see how much I can tackle here.

First, let’s start off with who is considered to be a radiation worker. And the fact is that there is no regulatory definition of what a radiation worker is – this is something that you will have to define for yourself based on your understanding of your facility. The bottom line is that anybody who is NOT a radiation worker is limited to 100 mrem of exposure annually, so if you have anybody who you can reasonably expect to reach or exceed this level of exposure then you need to designate them as a rad worker and give them appropriate training. You might do this based on radiation dose rate measurements (if a person works for 2000 hours a year in a radiation field of 0.05 mrem/hr they’ll receive 100 mrem in a year).
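That screening calculation is just dose rate multiplied by hours, but if you have several areas to evaluate it can be handy to keep it in a small helper. Here’s a minimal sketch – the areas and dose rates are made up for illustration:

```python
# Screen work areas: could a full-time occupant plausibly reach 100 mrem/year?
PUBLIC_DOSE_LIMIT_MREM = 100
WORK_HOURS_PER_YEAR = 2000

areas = {
    # area: measured dose rate in mrem/hr (hypothetical examples)
    "x-ray cabinet room": 0.05,
    "general office space": 0.01,
}

for area, dose_rate in areas.items():
    projected = dose_rate * WORK_HOURS_PER_YEAR
    status = ("consider rad worker designation"
              if projected >= PUBLIC_DOSE_LIMIT_MREM
              else "below 100 mrem/year")
    print(f"{area}: about {projected:.0f} mrem/year -> {status}")
```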

Alternately, you might decide to designate everyone who works in a particular area or who performs specific tasks as a rad worker – everyone who operates an x-ray cabinet for QA/QC measurements for example, or everyone who works in a veterinary x-ray suite. Something else to consider is that some types of workers are likely never going to receive 100 mrem in a year – but they should be considered rad workers nevertheless. Think, for example, of laboratory technicians who handle radionuclides every day – even though these workers almost never receive any measurable dose at all (at most universities, anyhow) they’re working directly with radioactive materials in a form that can easily cause contamination.

Pictured here, a Radiation Safety Officer (RSO) gives a short demonstration on how to properly use a survey meter.

OK – moving on…training is a regulatory requirement, in addition to being simple good sense. Think about it – if you’re going to ask someone to work with anything potentially hazardous, you owe it to them to teach them how to do so safely. So we have, say, electrical safety training for those working with electrical gear, we give driver’s training before we can operate vehicles…and we give radiation safety training to people before they’re allowed to work with radiation or radioactivity. And if that’s not enough of a reason for you, radiation safety regulations (specifically 10 CFR 19.12) require training for “all individuals who… are likely to receive in a year an occupational dose in excess of 100 mrem (1 mSv)….” Not only that, but all of your radiation workers are required to receive annual refresher training.

Oh – I should also point out that you might need to give training to non-radiation workers if they routinely work around radiation or radioactivity. For example – your maintenance or housekeeping staff might regularly enter radiological areas to clean up, empty the trash, or fix equipment. Even though these people might never receive any appreciable exposure they’re still working around radioactive materials and they deserve to know how to do so safely. These are called ancillary workers and of course they should receive training! Luckily, this training need not be very in-depth – mine would usually only last about 10 minutes and I’d briefly make sure they could recognize the radiation symbol, that they knew not to remove radioactive waste, I’d make sure to let them know that the health effects were minimal (given the very low exposure they were receiving), and would let them know how to get in touch with Radiation Safety or Security if they ran into any problems.

The founder of Nevada Technical Associates, Inc., the late Dr. Robert Holloway, frequently conducted Radiation Safety Officer Training courses, and other short courses on Radiation Safety.

How the training (both initial and refresher) is given is not specified in the regs – I will always give the initial training in person, but I’ve found a number of ways to give the refresher training. I’ve required rad workers to watch instructional DVDs, had them read the refresher training materials, offered on-line refresher training, and given annual classes. I’ll also accept (only for refresher training) certificates from related radiation training – for example, if a person attends a class on responding to radiation emergencies. Whenever possible I prefer to give all training in person, but sometimes this just isn’t possible. The bottom line is to make sure that you can document that all of your rad workers have received the training and that the training meets the regulatory requirements. As to what those requirements are…keep reading!

The same regulation mentioned earlier also tells us what has to be included in the training – specifically, workers must be:

  • Kept informed of the storage, transfer, or use of radiation and/or radioactive material;
  • Instructed in the health protection problems associated with exposure to radiation and/or radioactive material, in precautions or procedures to minimize exposure, and in the purposes and functions of protective devices employed;
  • Instructed in, and required to observe, to the extent within the workers control, the applicable provisions of Commission regulations and licenses for the protection of personnel from exposure to radiation and/or radioactive material;
  • Instructed of their responsibility to report promptly to the licensee any condition which may lead to or cause a violation of Commission regulations and licenses or unnecessary exposure to radiation and/or radioactive material;
  • Instructed in the appropriate response to warnings made in the event of any unusual occurrence or malfunction that may involve exposure to radiation and/or radioactive material; and
  • Advised as to the radiation exposure reports which workers may request.

In addition to these, you should also consider discussing program-specific topics; what these are is up to you. For example, I would include a discussion of the most common problems we found when we inspected our radiation laboratories, I’d go over a few instructive case studies, and would also show the rad workers (most of them were laboratory technicians or graduate students) how to work safely with radioactivity in their labs. Oh – and at the end of the rad worker training you should give an exam to the students to confirm that they’ve been paying attention. Typically your regulators will be looking to make sure that they receive a score of 70% or higher – anyone who can’t get this score should be required to re-take the training, hoping the information will stick a little better the second time around.

Now a few odds and ends….

  • You need to keep records of everything. Students should sign in for the training (keep the sign-in sheets) and their exam grades should be recorded; these records should be kept for at least a year, until they take their refresher training (and pass the exam for that).
  • You, as Radiation Safety Officer (RSO), also need to have periodic refresher training, presumably to keep you from just talking to yourself for a half hour or so every year. And since you have to have a far higher level of knowledge than any of your rad workers, you should seriously consider taking a 2-3 day class for your refresher training, but this class need not be called “RSO refresher training.” Many regulators will accept any radiation-related class you find – I’ve had RSO refresher students in classes I’ve taught on radiological terrorism, radiation instruments, Naturally-Occurring Radioactive Materials, and more.
  • You might also have to receive more specialized training. For example, if you’re transporting or shipping radioactive materials then you have to receive training in this every three years. If you have (for example) an irradiator then you’ll probably have to teach people how to safely use the irradiator before they’re allowed to do so. And so forth….
  • And there’s no specific requirement as to how long the training should last – I’ve been to rad worker training that lasted as little as a half hour and to training that took three full days – you’re going to have to balance what the regulations require, the site-specific materials you’d like to go over, as well as the amount of time you’re able to pry your workers away from their normal duties.

That’s about it on training – I can go into more details but, really, from here it gets to be very program-specific. I hope it helps – and good luck!

Are Power Lines and Other Electromagnetic Fields Dangerous?

Dear Dr. Zoomie – I keep hearing about how dangerous electromagnetic fields are. Should I be worried about the power lines near my house? And the wiring in my house? And my electric razor? And all that other stuff? I don’t want to be Amish – but I don’t want to get sick either!

The first time I came across this question was over 20 years ago – and it was about a decade old even then. In my case, my father was trying to sell a house that was close to some high-voltage power lines and he couldn’t understand why people didn’t want to buy it. Someone finally told him that they were worried about electromagnetic fields around the power lines. My dad asked me what I could tell him about the science behind these worries. The short version is that these fears were unfounded and the risks from power lines – and from electromagnetic fields in general – are vastly overstated. Here’s the longer version of the story.

Are Power Lines Dangerous?

This whole story starts in the late 1970s with the publication of a paper suggesting that overhead power lines (and the electromagnetic fields they produce) were associated with cases of childhood leukemia. Although nobody was ever able to show how these fields could cause this disease, some scientific studies received a lot of publicity. There were dramatic videos of people holding up fluorescent light bulbs near high-voltage power lines – the electromagnetic field was enough to cause the bulbs to light up. Ever since these first studies came out people have been worried about electromagnetic fields and their possible health effects.

The problem is that the initial studies have all been discredited and subsequent studies have very clearly shown that electromagnetic fields aren’t dangerous – at least, not at the levels we find near power lines or in our homes. There’s a great summary article online (written by physics professor John Farley) about this on the Quackwatch website, summarizing the history of this debate – it also features prominently in a great book called Voodoo Science by physicist Robert Park. And there have been any number of reports in the intervening years – including some by the National Research Council – that have shown this to be a non-issue. Part of the problem, though, is that one side shows photos of kids dying of cancer while the other side shows calculations and academic studies – the emotional impact of the one side far outweighs the scientific impact of the other. But first, let’s look a little at the science.

First, you’ve got to understand that the Earth has its own electromagnetic field and every creature that has ever lived on Earth – including humans – has been exposed to these fields from birth. As with radiation, we have to remember that any exposure to man-made fields is in addition to our exposure to natural fields – if the magnitude of the man-made fields is small compared to the natural ones then we have to consider that the man-made fields might not be that dangerous.

Earth’s Magnetic Field – The Earth’s magnetic field varies from about 300-500 milliGauss (a unit of measurement of magnetic field strength) while the magnetic fields from power lines are only a few milliGauss.

According to both Park and Farley, the strength of the electromagnetic fields produced by power lines is very small compared to natural fields. For example, the human body is electrically conductive – we’re pretty much filled with salt water and salt water conducts electricity quite well (pure water, by comparison, is a lousy conductor). If you move any conductor through a magnetic field you’ll induce an electrical current – as we walk (or drive or fly) through the Earth’s magnetic field we induce electrical currents in our own bodies. The fact is that the electrical and magnetic fields induced by high-voltage power lines are much smaller in magnitude than are the natural fields we’re exposed to on a regular basis. To put some numbers on it – the Earth’s magnetic field varies from about 300-500 milliGauss (unit of measurement of magnetic field strength) while the magnetic fields from power lines are only a few milliGauss. If small variations in magnetic field strength can cause health effects then we’d expect to see much greater health effects among people moving from place to place on Earth. The fact that we don’t see these changes (for example, cancer rates are about the same in the North and in mountainous states as they are in the South and in low-lying states) suggests that the much smaller variability from power lines won’t be harmful.

Something else to consider is what I touched on earlier – there is no plausible mechanism for how external magnetic fields can cause cells to become cancerous. Think, for example, if someone gave you a can of gas and told you it could get you home. Without a vehicle of some sort the gas isn’t going to get you anywhere – you need a car to turn gasoline into mileage. At present we can’t find any way to turn external electrical or magnetic fields into genetic damage – we’ve seen nothing in human experience or in animal studies, including those of mice exposed to as much as 10,000 milliGauss.

It also turns out that the original study had some problems, the biggest one being that the authors of the study never actually measured magnetic field strength. Once follow-up studies were done that did make this rather important measurement it turned out that there was no correlation at all. Not only that, but the initial studies looked at only relatively small numbers of people. When larger studies were performed, including measuring magnetic fields, the apparent correlations melted away.

On top of all this, we’ve got to look at what’s possibly the most important piece of information – age-adjusted cancer incidence rates have been dropping steadily for several decades in spite of the fact that our exposure to electromagnetic fields has increased astronomically in those same decades. Think about it – we live our entire lives surrounded by electromagnetic fields – the wiring in our homes, overhead power lines, our computers and monitors, electric razors, televisions and entertainment systems, microwave ovens, and so forth and so on. Seventy years ago, many Americans didn’t even have electricity in their home, and those who did used it primarily for lighting – today, we use it for everything.

All this being said, are there things about the health effects of electromagnetic fields that we don’t know? Of course. We don’t – we can’t – know everything. And, let me add, the fact that we don’t know everything is often used as rationale for continuing to be frightened while further studies are being performed. The question, though, shouldn’t be “Do we know everything?” so much as “Do we know enough?” All of the evidence to date tells us that we know enough to conclude that these levels of electromagnetic fields are not causing harm. There is an awful lot of scientific evidence and scientific reasoning that tells us that electromagnetic fields aren’t nearly the hazard they’re portrayed to be.

At the same time, we know that we get a lot of benefits from the use of electricity – if we’re going to look at the potential downside then we also have to look at the benefits and the billions of lives that have been made better by its use. Let’s think about it for a moment – even if there’s a small risk from using electricity, it makes possible things like street lights, x-ray machines, computer control systems, aluminum and steel manufacture, air conditioners, elevators, and so much more. Electrical power makes our lives better, longer, and healthier – stopping (or even scaling back) its use would certainly add risk to all of our lives. We know that driving puts us all at risk – in the US, almost 1% of us will die in a traffic accident – but we accept this risk because of the benefits we derive from cars and trucks. Similarly, even if (against all scientific evidence) electromagnetic fields turn out to carry with them a small risk, I would argue that we derive far more benefit than risk from their use – if our goal is to make our society as safe as it can be then we should continue our use of electricity.

Finally, for what it’s worth…I have read up on this topic, if only to find out if my father (and those who eventually bought his house) faced any risk. While I’m neither a physicist nor a physician, I understand the science well enough to follow the scientific papers I’ve read and they make sense to me. After taking a look at the science and the epidemiological evidence and after reading the conclusions of scientists I respect (Park and Farley) I’ve decided that this is something I’m not going to be worried about. So I use my electric razor, my computers, my microwave, and all of the other electrical and electronic stuff in my apartment. I guess you’ve got to decide for yourself what you feel comfortable with, but I’d suggest that your concerns about electromagnetic fields might be misplaced.

How to Do Radiation Surveys and Contamination Surveys

Dear Dr. Zoomie – I’m trying to figure out what goes into a radiation survey program and I have to admit I’m drawing a blank. Can you tell me how often I should be doing radiological surveys? Also, can you tell me how to do a survey? Thanks!

I should start by saying that I can often make a pretty good guess about the quality of a radiation safety program by looking at the quality of their surveys. In general, if I audit your facility and see that you’re doing your surveys properly then it’s pretty safe to say that the rest of your radiation safety program is likely to be squared away; at the same time, if you’re missing surveys or being slipshod in your survey technique I’m probably going to find other problems as well. Now, let’s talk about what goes into a survey program and how surveys are performed.

When to survey

We’ll start with when you should be surveying. First, there is no regulatory requirement on this – your survey requirements will be set by your internal procedures, your license application, and your license conditions. You’ll have to use your own judgement as to how often various areas need to be surveyed – if you don’t have the experience to make this decision on your own it’s not a bad idea to ask a consultant for suggestions, or even to ask your regulators what they recommend. Here are a few things to think about:

  • Will people be using unsealed sources of radioactivity? For example, are they working with radiopharmaceuticals or radiolabeled compounds that can cause a spill? If so, you might want to ask people to survey workbenches or fume hoods for contamination daily when the area is in use and to survey the entire room (laboratory, hot lab, etc.) monthly.
  • Are there activities taking place that can be expected to cause contamination? For example, in a rad waste storage room, are you compacting waste, crushing vials, or moving a lot of packages? If so then you should consider surveying for contamination at least weekly, as well as after any potentially contaminating activities.
  • Are you storing radioactive materials in the area? If so then you should consider surveying for radiation at least every six months, as well as after any movement of radioactivity into or out of the area.
  • Do you have radiation-generating machines (x-ray machines, electron microscopes, etc.)? If so, you might be required to survey annually for radiation leakage, scattered radiation, and/or the effectiveness of your shielding. If the device will be used for medical diagnosis or treatment there will be other requirements as well, including routine quality assurance checks on a daily, weekly, monthly, quarterly, or semi-annual basis.
  • Have you had maintenance on anything that could affect your radiation shielding? Have you had an earthquake, flood, or anything else that could damage your shielding? If so, you should perform a radiation survey as soon as conditions stabilize to make sure the shielding is still intact and doing its job.

These are some of the most common circumstances and your facility might not fit into any of these categories. The important things are to think about how frequently people are using radiation and radioactivity, how potentially dangerous it can be, and what can happen that could cause radiation or contamination to be higher than what you’d normally expect (or want) them to be.

The next part of this discusses (briefly) how to perform radiation and contamination surveys. But please note – this is not the same as receiving formal training; it’s a guide, but you should really receive appropriate training and develop a formal survey procedure before you do your own surveys. And remember – before you start ANY survey, make sure your instrument is in calibration, check that the batteries are OK, ensure the meter, probe, and cable are all in good condition, and (for count-rate instruments) perform a response check against a source of known strength.

Performing radiation surveys

Performing radiation surveys isn’t too difficult – mostly you’ll be walking around with your radiation dose-rate meter, watching the dial; make a note of the dose rate on your survey map from place to place, especially where dose rates are higher than in the rest of the area being surveyed. As a default, hold your meter about waist-high unless you’re measuring a specific location (say, in front of a source storage safe or a low-temperature freezer). Finally, you’re most interested in dose rates in “accessible areas” – that’s about one foot (30 cm) from any surface, and only in areas where a person could actually be expected to enter. So you don’t need to survey inside of refrigerators or fume hoods unless you expect people to spend a lot of time inside of them. Oh – and make sure that your meter has been calibrated within the last year so that your survey counts! All of your meters have to be calibrated annually (according to regulations) and you can’t meet a legal requirement with an out-of-calibration meter.

Chuck Surre, a University of Rochester radiation safety technician, is shown here performing a radiation survey around some drums of just-compacted radioactive waste in the waste storage room.

Performing contamination surveys

The key to any contamination survey is “low and slow.” You want to keep the detector as close as possible to whatever it is that you’re surveying without being so close that you contaminate the detector. And you want to move the detector no more than about 2-3 inches (about 5-8 cm) per second. If you hold the detector too far away you can miss some contaminated areas, and if you’re surveying too quickly then the probe might not be over a contaminated area long enough to pick up any counts. You should also try to survey as much of the surface as possible – 100% of the surface if you can – to avoid missing any contamination.

Alternately, you might want to perform a smear wipe survey to look for removable contamination (contamination that could come off on your hands or feet) – especially if you’re looking for radionuclides (such as tritium or carbon-14) that aren’t easily detected by hand-held radiation detectors. For a smear wipe survey you’ll need to use a piece of dry filter paper (a Whatman or Millipore filter will do the trick) and you’ll need to wipe an area of 100 square centimeters (about 4”x4”). Apply enough pressure to the wipe to pick up any loose contamination, but not so much that you tear the wipe.

After you’re done with your survey you’ll have to record your results on a survey map; your survey map will have to be filed and maintained for three years (under normal circumstances) or longer if your survey is used to reconstruct the radiation dose to one of your workers. And – again – remember that this is just a start; there’s a lot more to doing surveys than what’s written here.

Sample Forms

  1. radiation_and_contamination_survey_procedure (PDF)
  2. radiation_and_contamination_survey_procedure (DOC)
  3. sample_radiological_survey_report_form (DOC)
  4. sample_radiological_survey_report_form (PDF)

Smugglers Selling Radioactive Materials to ISIS – What If They Succeed?

Dear Dr. Zoomie – I just read in the news today that they caught criminals trying to sell radioactive and nuclear materials to ISIS in Moldova. What gives? I thought it took a nation to make a nuclear weapon – I’m assuming I should be scared; how scared should I be? Do I need to update my life insurance?

The threat of radiological and nuclear terrorism has been a concern for a number of years and this article is only the latest of a number of articles on this threat. There’s a lot in here so let’s take things one at a time, starting with the easiest.

The article mentions the isotope Cs-135 (cesium-135) as possibly being available for purchase by terrorist organizations to use to make a radiological dispersal device (the so-called “dirty bomb”).  There’s been a lot written about dirty bombs and I won’t add to that here except to note that ISIS has mentioned their pursuit (or possession) of radioactive materials to use in making such a device. This particular isotope (Cs-135) is an odd one to mention – it’s neither as widely used nor as dangerous as Cs-137 – it’s produced in nuclear reactors but is usually disposed of as radioactive waste. It also has a fairly long half-life (so it’s not intensely radioactive) and it emits beta radiation, which is not terribly dangerous. So this nuclide could be used as a nuisance, but it poses much less health risk than Cs-137. The fact that this nuclide is being offered suggests that either the person doesn’t know exactly which nuclide they have, that they’re making something up, or that they have access to spent (or reprocessed) reactor fuel or nuclides that can be scavenged from it.

Members from the Oregon National Guard’s 102nd Civil Support Team approach a mock crash scene, ready to take readings and assess levels of contamination by a “dirty bomb” at a joint-agency exercise held Feb 15, 2006 at the Portland Fire Department’s training facility in north Portland.

The article also mentions plutonium. There is a fissionable isotope of plutonium that’s used in nuclear weapons (Pu-239) but this is produced in nuclear reactors and then must be chemically separated from the spent fuel using some fairly complex chemistry. Pu-239 is too valuable as a fissionable material to be used in a dirty bomb, and it’s a lot harder to fashion into a nuclear weapon than is uranium – chances are that, if plutonium is being sold as a potential RDD material, it’s more likely to be Pu-238. Pu-238 is fairly common – although it’s not fissionable (so it can’t be used to make a nuclear weapon) it is highly radioactive. Its primary use is in spacecraft in the form of radioisotope thermoelectric generators (RTGs) due to the high levels of heat emitted by the decay of Pu-238 atoms. It’s also toxic, although (contrary to common misconception) it’s not the most toxic substance known to science. In any event, an RDD made with Pu-238 would be a great way to make a mess, but it’s not likely to put many (if any) people at risk since the radiation it gives off (alpha radiation) is fairly innocuous unless it’s inhaled or ingested. It would, however, make a huge mess – it could contaminate a large area, cause a huge financial impact, and would likely cause a degree of panic – but the perceived health risk would very likely far outstrip the actual danger.

Finally we get to the weapons-grade uranium that was mentioned. The nuclear weapon that was dropped on Hiroshima contained a little over 60 kg of highly enriched uranium; this story mentions that the seller had 100 grams of weapons-grade uranium with him and promised to deliver it in 1-kg batches at a cost of about $32 million per batch. To make a Little Boy-type device would require over 60 transactions and a total cost of about two billion dollars. The financial part of it is beyond the reach of most (but not all) of our enemies – this narrows the list of potential buyers. Equally important is the sheer number of transactions that would need to take place for a terrorist group (or rogue nation) to get their hands on a bomb’s worth of material. Sixty transactions greatly increase the odds of being caught by law enforcement or intelligence agencies and having the whole plot unwind. This isn’t to say that this would be impossible – but the more complex a plan is, the less likely it is to be successful. It’s certainly possible that a criminal organization might have access to large quantities of weapons-grade uranium, but it’s also possible (perhaps more likely) that the seller has just enough of the material to get rich and that the supply would dry up after just a handful of sales. So, while we have to take this report seriously, I’m not sure that it’s time to start buying nuclear bomb insurance.

Taking all of this together, what have we got?

Well, first, we know that there's a continuing interest in putting radiological or nuclear materials into the hands of groups that wish us ill. And if criminal organizations really have found sources of radiological or nuclear materials – and if they have found buyers – the possibility of an attack of some sort may well be higher today than in the past. On the other hand, there have been attempts at radiological and nuclear smuggling for at least a decade – what makes this report different is the apparent involvement of organized crime. The biggest unknowns are whether this group has access to enough uranium to make a weapon and whether a terrorist group has access to enough money to buy it all. Right now we just don't know the answers to these questions.

Is Nuclear Energy the Solution to Global Warming, and is it Safe?

Dear Dr. Zoomie – I'm worried about global warming because I live near the beach. The pro-nukes tell me that nuclear reactors don't give off any greenhouse gases, but the anti-nukes tell me that you can't make a safe reactor and we'll just keep having more meltdowns that will contaminate the whole world. What's the real story?

Thanks – it's been a while since I've been asked a question that deals with the fate of the world. I guess, for starters, I have to say that I don't think nuclear energy is the solution to our problems – but I do think it's part of the solution. I should also say up front that, while I have never worked for a commercial nuclear power plant (or for any sort of commercial power plant), I did spend eight years in Naval Nuclear Power, including two years as an instructor and four years on a nuclear submarine. I also spent nearly two weeks in Japan after the Fukushima accident, helping to train emergency responders and medical responders in how to care for patients coming from the contaminated areas. So I think I've seen some of both the good and the bad that nuclear power can offer – I hope this helps me to be objective.

The Fukushima I Nuclear Power Plant after the 2011 Tōhoku earthquake and tsunami. Reactors 1 to 4 from right to left.

So – this is one of those “Do you want the good news or the bad news first?” sort of things. I normally go for the bad news first, so let's start there. The bottom line is that, in the roughly 70 years since Fermi oversaw the world's first reactor criticality, there have been three serious nuclear accidents as well as a number of near-misses and less serious accidents. The result of these accidents is that large swathes of land in Japan, Ukraine, and Belarus are currently evacuated and might not be reoccupied for decades. Huge amounts of radioactivity were put into the environment – the fallout clouds from Chernobyl and Fukushima blanketed the Northern Hemisphere, and Fukushima also dumped large amounts of radioactivity into the ocean. The accidents themselves have cost hundreds of billions of dollars and have had a global impact on the way that we view nuclear energy. And – at least in the case of Chernobyl – people have died. I should add that other accidents have killed people as well.

Chernobyl Disaster Aftermath: very extensive damage to the main reactor hall (image center) and turbine building (image lower left).

On the other hand, I should also point out that the World Health Organization conducted an extensive study of the people around Chernobyl in 2006 – on the 20th anniversary of the accident – and concluded that fewer than 100 people had died of radiation sickness or radiation-induced cancer from that accident; a later WHO assessment concluded that nobody in Japan is expected to die of radiation-induced disease (including cancer) from Fukushima. Nobody got sick or died from the Three Mile Island accident, and only a handful of people have died in any of the other reactor accidents that have taken place. When we compare the number of deaths per gigawatt-hour of energy produced, nuclear actually stacks up quite favorably against other forms of energy – coal mining, hydrocarbon extraction and processing, and the transportation of fossil fuels are dangerous activities that cause hundreds to thousands of deaths annually, far more than the deaths from these nuclear accidents. And that doesn't even get into the health effects of smokestack emissions, acid rain, and so forth. I'm not trying to say that nuclear power is harmless – just that we have to remember that no form of energy is without risk.

The Diablo Canyon Power Plant in San Luis Obispo County, California. This electricity-generating nuclear power plant near Avila Beach has operated safely since 1985. Photo by marya from San Luis Obispo, USA: https://flickr.com/photos/35237093637@N01

Ah – I hear you say – but if we build more nuclear reactors it's only a matter of time until the next accident, and the more reactors we build the more accidents there will be. Well – yes and no. If we assume that the risk of an accident is the same for every reactor built, then yes, this logic makes sense. But reactors have been getting safer with time – reactors built now are much less likely to suffer catastrophic accidents than those built in the 1980s. And – in case you're wondering – factors that make reactors less likely to melt down or to have catastrophic accidents include operating at lower pressures (we can't avoid high temperatures), simplifying the engineering designs, and making use of basic laws of physics to provide cooling in an emergency rather than pumps and other engineered safeguards that require power to operate properly. Reactors powered by thorium, for example, have a number of features that make them almost impossible to melt down (sorry – this is far too much to get into here, but maybe at a later time; in the interim, https://energyfromthorium.com/ is a site where you can find some information if you're interested). So if today's reactors are safer than those built in the past, doubling the number of reactors won't necessarily double the number of accidents – some of the newest designs are very nearly meltdown-proof. The bottom line is that the overall risk of a meltdown might increase as more reactors are built, but not as much as one might expect – and if we replace older reactors with the newer designs (in addition to building new ones), the risk might actually drop somewhat.

One last thing that's touted as a risk from nuclear reactors is the radioactive waste they produce – more specifically, where it can be stored safely. At the moment the US has no long-term radioactive waste solution, but this is more a function of politics than of science. Again, there's not enough space here to go into a full-blown discussion of radioactive waste disposal, but here's something to consider: about two billion years ago a particularly rich uranium deposit achieved criticality and operated as a natural reactor for at least 100,000 years in what's now Gabon, in Africa. In the time since then, virtually all of the fission products (the nuclear waste) have stayed put, in spite of the fact that the reactor zones are in porous and fractured sandstone and sat below the water table for most of that time. This bodes well for our ability to safely contain radioactive waste in an engineered facility set in much less porous rock well above the water table.

The bottom line is that nuclear energy has its drawbacks, as does every form of energy. But these drawbacks are understood and can largely be managed – they certainly don't call for abandoning nuclear energy altogether. Incidentally, you might also be interested to know that fossil fuels (coal, oil, natural gas) are associated with natural radioactivity due to the geochemistry of uranium – the radiation dose to the public from burning these fuels is actually higher than the dose from nuclear energy per gigawatt-hour of energy produced.

Now let’s look at the other side – the part of your question regarding nuclear energy and global warming. Here the argument is pretty simple – nuclear energy produces a lot of power and virtually no greenhouse gas at all; pretty attractive in a world increasingly worried about global warming. And – yes – some greenhouse gases are produced by the mining, processing, and transportation of uranium. But this pales in comparison to the greenhouse gas produced by fossil fuel plants.

There are other forms of low-carbon energy – solar, wind, tidal, and geothermal power come to mind. But each of these has its limitations – solar power doesn't work well at night, for example; the strongest winds are usually far from the cities that use the most power; geothermal is only useful in a few locations; and tidal power isn't much help in a continent's interior. As of today, nuclear energy is the only form of non-fossil-fuel power that can be used anywhere at any time. That's not to say that nuclear energy is the answer to all of our problems (it isn't) or that there's no place for alternative energy sources (there is) – just that for reliable baseload power that can be sited virtually anywhere on Earth and that doesn't emit greenhouse gases, we just can't do any better than nuclear at the moment. One of these days we might have fusion reactors, solar power satellites, or other exotic forms of energy – but not today and not in the foreseeable future.

How Do Sodium Iodide (Scintillation) Detectors Work?

Dear Dr. Zoomie – your last posting discussed how gas-filled detectors work, but I’ve got a different type. I think it’s called a sodium iodide detector. Can you tell me how this works and when I should use it? Thanks!

You've got one of my favorite detectors (and yes, I know that this puts me way up on the geek scale)! There are two fundamental families of detectors – the gas-filled detectors I wrote about last time, and the scintillation detectors, of which the sodium iodide (abbreviated NaI) detector is one. The vast majority of radiation detectors out there – and virtually every detector used in general situations – fall into one of these two families. So since I wrote about the gas-filled detectors last time, this is a good chance to write about the other major family. Here's how scintillation-type detectors work.

First, an important point. Most scintillation detectors are only sensitive to one type of radiation. So NaI detectors will pick up gamma radiation, but not alpha or beta, zinc sulfide (ZnS) will only pick up alpha radiation, and so forth. In actuality, there might be some sensitivity to other radiations – NaI, for example, will sometimes pick up high-energy betas – but you should only use a detector for the type of radiation it’s designed to pick up. Now, with that out of the way, on to how the things work!


The basic principle is the same for every scintillation-type detector: when radiation strikes the scintillator it causes it to give off photons of visible light (that's the scintillation part). These photons pass through the crystal and strike a thin metal film called a photocathode, which sits at the front of the second part of the detector, the photomultiplier tube (PMT). When a photon hits the photocathode it causes an electron to be ejected. Just past the photocathode there is a series of metal cups (called dynodes), each with a voltage applied to it (typically several hundred to a thousand volts) – the electron is accelerated by this voltage and strikes the first cup with enough energy to knock loose a number of other electrons. Each of those, in turn, is accelerated towards the next cup, where each of the “new” electrons knocks loose a number of additional ones – by the end of the PMT the initial signal has been multiplied by a factor of a million or so. From there, it's up to the instrument manufacturer to figure out how to use that electrical signal.
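To put a rough number on that multiplication: if each of those cups (the dynodes) knocks loose a few electrons for every electron that hits it, the gain compounds stage by stage. Here's a minimal sketch in Python; the per-stage gain and the number of stages are typical illustrative values, not the specifications of any particular tube:

# Rough sketch of photomultiplier gain: one photoelectron leaves the
# photocathode, and each dynode stage multiplies the electron count.
# The per-stage gain and number of stages below are illustrative only.
electrons = 1           # a single photoelectron ejected from the photocathode
gain_per_dynode = 4     # electrons knocked loose per incoming electron (assumed)
dynode_stages = 10      # a typical number of dynode stages (assumed)

for stage in range(dynode_stages):
    electrons *= gain_per_dynode

print(f"Electrons arriving at the end of the PMT: {electrons:,}")   # about a million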

As one example, every time a gamma hits the crystal it starts this whole process, which culminates in a pulse of electrons arriving at the far end of the detector. The simplest way to deal with this is to count the pulses as they arrive at a counting circuit – this is a great way to measure contamination (which we normally record in terms of counts per minute or counts per second). We can also use this mode to measure radiation dose rates, but only by assuming that every count carries with it a specific amount of energy. If you remember the posting on gas-filled detectors, this is the same way that Geiger counters work, and it's one of the reasons that Geiger counters aren't always accurate for measuring radiation dose rates. We'll get back to that in a moment. Oh – one other thing to be careful about with an NaI detector is that the larger crystals (say, 2”x2”) can be sensitive to drastic temperature changes – these can thermally stress the crystal, eventually breaking it. So if you're working outside on a blazing hot (or bitingly cold) day, you probably want to leave the detector outside, rather than bringing it into a much cooler (or warmer) office a few times a day.

Something else we can use NaI detectors for is identifying specific radionuclides by measuring the energy of each individual gamma that enters the crystal – this process is called gamma spectroscopy or multi-channel analysis (and the instrument set up for this purpose is called a gamma spectrometer or a multi-channel analyzer, abbreviated MCA). The basic principle behind gamma spectroscopy is that every gamma-emitting radionuclide emits a gamma ray (or a few gammas) with very specific energies – like a fingerprint – and if we can measure the gamma energies precisely enough then we can identify the radionuclide(s) present. For example, cesium-137 (Cs-137) gives off a gamma with an energy of 662 thousand electron volts (abbreviated keV) – if we analyze a gamma-ray spectrum and find a peak at 662 keV then we know that Cs-137 is present. Along the same lines, seeing twin gamma peaks at about 1.17 and 1.33 million electron volts (MeV) tells us that we've found cobalt-60 (Co-60).
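In software, the "fingerprint matching" part can be as simple as comparing each measured peak energy against a small library of known gamma lines, within some energy window. Here's a minimal sketch in Python; the three-nuclide library and the energy tolerance are purely illustrative, since a real spectroscopy program uses a much larger database and accounts for detector resolution, branching ratios, and so on:

# Minimal illustration of matching measured gamma peaks to a nuclide library.
GAMMA_LINES_KEV = {
    "Cs-137": [662],
    "Co-60":  [1173, 1332],
    "Am-241": [59.5],
}

def identify(peaks_kev, tolerance_kev=20.0):
    """Return the nuclides whose known lines all show up among the measured peaks."""
    matches = []
    for nuclide, lines in GAMMA_LINES_KEV.items():
        if all(any(abs(peak - line) <= tolerance_kev for peak in peaks_kev) for line in lines):
            matches.append(nuclide)
    return matches

print(identify([661.7]))            # ['Cs-137']
print(identify([1173.2, 1332.5]))   # ['Co-60']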


The problem is figuring out how much energy is in each gamma photon – luckily we can do this with a scintillation detector. When a gamma ray interacts with the NaI crystal it deposits energy – this energy is what causes the photons to be given off. Not only that, but the number of photons emitted is predictable and depends on the energy deposited – in a sodium iodide crystal, depositing 1 MeV of energy will cause about 42,000 scintillation photons to be emitted. We know roughly how many photons it takes to eject a single electron from the photocathode, and we know the amount of amplification (and the size of the electrical pulse) for each electron ejected. So if we can measure the size of the output pulse then we know how much energy was deposited in the crystal – and from that we can tell which radionuclide emitted the gamma we just detected.
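Here's a rough sketch of that chain of proportionality in Python. The 42,000 photons per MeV figure is the one quoted above; the light-collection fraction, photocathode efficiency, and PMT gain are placeholder assumptions for illustration, not the numbers for any real detector:

# Rough chain from deposited gamma energy to the size of the anode pulse
# in an NaI detector. Only the photon yield comes from the text above;
# the other parameters are assumed values for illustration.
energy_mev = 0.662           # a Cs-137 gamma fully absorbed in the crystal
photons_per_mev = 42_000     # scintillation photons per MeV deposited
light_collection = 0.5       # fraction of photons reaching the photocathode (assumed)
quantum_efficiency = 0.25    # photoelectrons per photon at the photocathode (assumed)
pmt_gain = 1e6               # electron multiplication inside the PMT (assumed)

photons = energy_mev * photons_per_mev
photoelectrons = photons * light_collection * quantum_efficiency
anode_electrons = photoelectrons * pmt_gain

# Because every step is proportional to the energy deposited, the pulse
# size tells us the gamma energy, and the gamma energy points to the nuclide.
print(f"{photons:.0f} photons -> {photoelectrons:.0f} photoelectrons "
      f"-> {anode_electrons:.2e} electrons at the anode")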

So these are two uses of NaI detectors – measuring contamination levels (in counts per minute or counts per second) and identifying radionuclides by measuring gamma energies – now for a third, measuring radiation dose rate.

Radiation dose rate is a measure of the rate at which energy is deposited in an object. You'd think that measuring it would be fairly simple with an NaI detector, since the number of electrons coming out of the detector is proportional to the energy deposited in the crystal. But that's not how most manufacturers handle things – most of the time they simply count the pulse rate and, as with a Geiger counter, assume that the pulses are all coming from Cs-137. So, for example, the manufacturer might determine that 175 counts per second are equivalent to a dose rate of 1 mR/hr. This means that NaI detectors set up to measure dose rate this way have the same limitation as a Geiger counter – unless you're measuring exactly the nuclide they were calibrated with, the dose rates you measure are going to be wrong. Thus, if you're using a scintillation detector to measure radiation dose rate, you need to make sure either that you're measuring the same nuclide it was calibrated to or that you have a set of correction factors that will let you convert the meter reading to the correct dose rate (assuming you know which nuclide is actually present). The graphic below, which is for a sodium iodide scintillation detector, shows that the meter will read only half the actual dose rate if the radiation is from a Co-60 source, but will over-respond by a factor of 6 or 7 if the radiation is coming from the Am-241 in, say, a box of smoke detectors.

Energy response of a sodium iodide scintillation survey meter (Ludlum Model 19).
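If you do have your instrument's energy-response data, applying the correction is simple arithmetic. Here's a minimal sketch in Python, using the 175 cps per mR/hr calibration figure from the example above; the correction factors are illustrative values in the range just described, not numbers from any manufacturer's manual:

# Converting an NaI count rate to a dose rate, with a nuclide-specific
# energy correction. The calibration constant is the example value from
# the text; the correction factors are illustrative only.
CPS_PER_MR_HR_CS137 = 175     # counts per second that correspond to 1 mR/hr (Cs-137)

# Multiply the meter's indicated dose rate by these to get the true dose rate.
CORRECTION_FACTOR = {
    "Cs-137": 1.0,    # the nuclide the meter was calibrated with
    "Co-60":  2.0,    # meter under-responds by about a factor of 2
    "Am-241": 0.15,   # meter over-responds by a factor of 6 to 7
}

def true_dose_rate_mr_hr(counts_per_second, nuclide):
    indicated = counts_per_second / CPS_PER_MR_HR_CS137
    return indicated * CORRECTION_FACTOR[nuclide]

print(true_dose_rate_mr_hr(350, "Cs-137"))   # 2.0 mR/hr, right on the money
print(true_dose_rate_mr_hr(350, "Co-60"))    # 4.0 mR/hr, even though the meter shows 2.0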

OK – so that covers how the NaI detector works and what it can be used for (again, to measure gamma contamination, for nuclide identification, or – with caveats – to measure radiation dose rate). Now a little on other types of scintillators.

Zinc sulfide (ZnS) is used to measure alpha contamination. The ZnS crystals are razor-thin – only about as thick as a single human hair (give or take a little). But since alpha particles can’t penetrate very far into any materials the crystals don’t need to be any thicker than this. For a number of reasons we don’t worry about dose rate from alpha radiation, so the only thing we need to measure is count rate – a fairly simple matter of counting pulses. The biggest problem with ZnS detectors is that they can be fragile (remember how thin the crystals are). It can also take a long time to do a proper alpha contamination survey since alphas are so easily shielded and have such a short range in air. But for alpha counting, ZnS is about as good as it gets, in spite of its limitations.

Finally, there are also beta scintillators. Liquid scintillation counting is normally performed in the laboratory using fairly expensive (and large) machines – chances are that you won't have to use one of these unless you work in a laboratory. There are also beta scintillation detectors that you use the same way you use an NaI detector – these tend to be made of plastic (so-called organic scintillators). While plastic scintillators aren't as fragile as ZnS or NaI crystals, the photomultiplier tube is the same in all of these detectors and isn't very sturdy – so no matter what kind of scintillation detector you're using, you need to treat it gently.

So to sum all of this up, scintillation detectors have their limitations, but they are essential pieces of equipment for just about anyone making radiation measurements. The big things are to treat them gently to keep from breaking them, and to use them properly and within their limitations.

How Do Geiger Counters Work?

Dear Dr. Zoomie – I’ve got all these Geiger counters but I have no idea how they work. Plus, somebody told me that they can only be used for a few types of measurements. Is it true that I can’t use my Geiger counter to measure radiation dose rate? And how DO they work, anyhow?

Geiger counters are part of a family of radiation detectors called “gas-filled detectors.” These detectors – as the name suggests – are filled with gas. They have an electrode in the middle of the chamber and are set up so that there's an electrical voltage between that electrode and the metal wall of the chamber – in a Geiger counter, for example, the voltage difference is about 900 volts. When radiation hits the gas molecules in the tube it strips electrons off of the atoms – this process is called ionization. The electron is attracted to the positively charged central electrode (the anode) while the rest of the atom (now a positively charged ion) drifts towards the wall of the tube. The electron then travels through the wires that make up the electrical circuit and recombines with the ion – and part of that circuit is a device that measures the flow of electrons.

Geiger-Mueller Tube

Of course, it’s hard to measure a single electron – luckily the instruments don’t have to do so. When the electron and ion are accelerated towards the electrode and chamber walls they gain a LOT of energy because of the high voltage – they bump into other atoms and knock electrons off of them in a process called secondary ionization. Those electrons and ions, in turn, cause even more ionizations and so on – this amplifies the original signal by a huge degree, to a point where it can be measured.

Creation of Discrete Avalanches in a Proportional Counter

In a Geiger counter, the voltage is so high that a single ionizing event causes the entire volume of gas in the tube to become ionized – this gives the highest sensitivity to incoming radiation. In other detectors (called ion chambers and proportional counters) the voltage is lower and the amount of gas amplification is lower as well; only some of the gas atoms are ionized in these detectors. The amount of amplification you get as the voltage increases is shown in the graphic below.

Geiger-Mueller Region

To make sense of this graph – especially when it comes to understanding some of the limitations of Geiger counters – it helps to know that alpha particles have a lot more energy than beta particles do. It also helps to know that radiation dose is a measure of the amount of energy that radiation deposits in an object – more energy means more dose. Looking at the graph, we can see that high-energy radiation hitting a detector leaves a bigger signal than low-energy radiation does. Now look to the right, in the Geiger-Mueller region – here, high-energy radiation produces exactly the same signal as low-energy radiation. So we can't tell the difference between high-energy and low-energy radiation, which means we can't necessarily tell how much radiation dose a person received if we're measuring with a Geiger counter alone. This is one reason that we can't always use a Geiger counter to measure radiation dose rate accurately. If a Geiger counter, for example, is calibrated to measure radiation dose rate from the radionuclide Cs-137, it will be right on the money as long as you're measuring radiation from that nuclide. But what if you're trying to measure radiation from cobalt-60 (Co-60)? Well, then you're in a bit of trouble – the gammas from Co-60 have roughly twice the energy of those from Cs-137, so whatever your detector reads will be only about half the actual radiation dose rate. On the other hand, a lot of radionuclides emit lower-energy radiation; in that case, your meter is going to read a higher dose rate than is actually the case. The bottom line is that a Geiger counter will only give an accurate dose-rate reading if it's measuring the same radioactive material it was calibrated with. This is why Geiger counters aren't always the best instruments for measuring radiation dose rates.
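As a quick sanity check on that "factor of two," here's a short Python sketch. A GM tube registers one count per detected photon regardless of its energy, so a meter calibrated on Cs-137 effectively treats every count as if it were worth 662 keV; comparing that to the average Co-60 gamma energy gives roughly the under-response described above (this ignores how detection efficiency changes with energy, so treat it as a ballpark estimate, not a calibration):

# Rough estimate of why a Cs-137-calibrated Geiger counter under-reads Co-60.
# One count per photon, but each Co-60 photon carries more energy (and so
# delivers more dose) than the 662 keV the calibration assumes.
cs137_gamma_mev = 0.662
co60_avg_gamma_mev = (1.173 + 1.332) / 2    # Co-60 emits two gammas per decay

under_response = co60_avg_gamma_mev / cs137_gamma_mev
print(f"Dose per count for Co-60 vs. Cs-137: about {under_response:.1f}x")   # ~1.9, i.e. roughly 2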

It turns out that there is one type of Geiger tube that will give fairly accurate readings from a wide range of radiation energies – they’re called energy-compensated Geiger counters. These are designed to give a fairly constant reading across a wide range of radiation energies, so if you’re using an energy-compensated GM you actually can make accurate dose-rate readings. Otherwise, it’s probably best to make your dose-rate measurements with an ion chamber.

One more thing about Geiger counters – they are great general-purpose radiation detectors because they can measure alpha, beta, and gamma radiation. If you’re measuring gamma radiation you’re probably most interested in measuring dose-rate (measured in mR/hr or, for really low levels, in µR/hr – microR per hour); if you’re measuring alpha or beta radiation then you’re probably looking for contamination and you should be measuring counts per minute (cpm) or counts per second (cps), depending on the type of meter you’re using.

So let’s put all of this together:

  • Gas-filled detectors (such as Geiger counters and ion chambers) are filled with a gas that has an electrical voltage applied across it.
  • When radiation interacts with the gas it causes ionizations, and this small signal is amplified by the electrical voltage – the amount of amplification depends on the voltage.
  • Ion chambers produce a signal that's proportional to the energy deposited by the radiation – for this reason they are ideal for measuring radiation dose rate.
  • Geiger counters, on the other hand, produce the same full-sized pulse no matter what the energy of the radiation that strikes them. For this reason, they don't always give accurate dose-rate readings, especially if the energy of the radiation is different from what they were calibrated with.
  • Having said that, energy-compensated GM detectors are designed to help accommodate this problem – they give fairly accurate dose-rates across a wide range of radiation energies.
  • And finally, Geiger counters can measure not only gamma radiation dose rate (in mR/hr or µR/hr) but also alpha and beta contamination (in cps or cpm).

I know this is a long and fairly complicated answer – I hope it helps!