To answer this question we need to think about why radioactivity decreases with time. The simple answer is that every time a nucleus decays and releases a particle (like an alpha or beta), there's one fewer undecayed nucleus left. For a given isotope, every nucleus has the same chance of decaying each second: it doesn't matter how long it's already been around or what its neighbours are doing.
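To see this with numbers, here's a minimal Python sketch (the per-step decay chance and the starting count are made-up illustrative values, not those of any real isotope): every undecayed nucleus rolls the same dice each time step, regardless of its age.

```python
import random

P_DECAY = 0.05   # assumed chance that any one nucleus decays in a time step
nuclei = 20_000  # assumed starting number of undecayed nuclei

for step in range(61):
    if step % 10 == 0:
        print(f"step {step:2d}: {nuclei:5d} undecayed")
    # Every nucleus independently "rolls the dice" this step; its age is irrelevant.
    decays = sum(random.random() < P_DECAY for _ in range(nuclei))
    nuclei -= decays
```

Run it and the undecayed count drops by half roughly every ln(2)/0.05 ≈ 14 steps, no matter when you start counting, which is exactly the behaviour described above.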
If the chance of decay is high, lots of nuclei decay each second (so lots of radiation is given off), but you quickly run out of undecayed nuclei. This means the half-life is short.
If the chance of decay is low, very few nuclei decay each second (so very little radiation is given off), and plenty of undecayed nuclei are still left a long time later. This means the half-life is long.
The key point is that isotopes with a very long half-life are only very weakly radioactive, and isotopes that are very radioactive don't stay radioactive for long.
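Behind this trade-off sits the standard textbook relation: a sample's activity is A = λN, where N is the number of undecayed nuclei and λ = ln(2)/t½ is the decay constant. For a fixed N, activity is inversely proportional to half-life. A quick sketch with rough illustrative numbers:

```python
import math

N = 1e20  # assume the same number of nuclei in each sample

# Half-lives in seconds (rough illustrative values).
samples = {
    "short half-life (1 hour)":            3600.0,
    "long half-life (~4.5 billion years)": 1.4e17,
}

for name, t_half in samples.items():
    lam = math.log(2) / t_half  # decay constant: lambda = ln(2) / t_half
    activity = lam * N          # activity in decays per second: A = lambda * N
    print(f"{name}: about {activity:.1e} decays per second")
```

Same number of nuclei in each sample, but the short-lived one is around thirteen orders of magnitude more radioactive.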
It turns out that the most problematic half-life for the environment is a few decades or so. Isotopes such as strontium-90 (with a half-life of about 30 years) are pretty radioactive and stick around for a time comparable to a human lifetime, which gives them plenty of opportunity to cause genetic damage.
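Using the roughly 30-year figure quoted above, a quick back-of-the-envelope (assuming simple exponential decay) shows why this regime is so awkward:

```python
import math

HALF_LIFE = 30.0  # years; the rough strontium-90 figure quoted above
lam = math.log(2) / HALF_LIFE  # decay constant per year

for years in (30, 60, 90, 300):
    fraction = math.exp(-lam * years)
    print(f"after {years:3d} years: {fraction:6.1%} of the Sr-90 remains")
```

A quarter of it is still around after 60 years, a span comparable to a human lifetime, and it stays radioactive enough to be dangerous for the whole of that time.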