Epidemiological studies of medical radiation workers have found excess risks of leukemia, skin and female breast cancer in those employed before 1950 but little consistent evidence of cancer risk increases subsequently. Occupational radiation-related dose–response data and recent and lifetime cancer risk data are limited for radiologists and radiologic technologists and lacking for physicians and technologists performing fluoroscopically guided procedures. Survey data demonstrate that occupational doses to radiologists and radiologic technologists have declined over time. Eighty mostly small studies of cardiologists and fewer studies of other physicians reveal that effective doses to physicians per interventional procedure vary by more than an order of magnitude. For medical radiation workers, there is an urgent need to expand the limited information on average annual, time-trend and organ doses from occupational radiation exposures and to assess lifetime cancer risks of these workers. For physicians and technologists performing interventional procedures, more information about occupational doses should be collected and long-term follow-up studies of cancer and other serious disease risks should be initiated. Such studies will help optimize standardized protocols for radiologic procedures, determine whether current radiation protection measures for medical radiation workers are adequate, provide guidance on cancer screening needs, and yield valuable insights on cancer risks associated with chronic radiation exposure.
The annual number of CT scans in the U.S. is now over 70 million. The concern is that organ doses from CT are typically far larger than those from conventional X-ray examinations, and there is epidemiological evidence of a small but significant increased cancer risk at typical CT doses. Because CT is a superb diagnostic tool and because individual CT risks are small, when a CT scan is clinically indicated, the benefit/risk balance is by far in the patient's favor. Nevertheless, CT should operate under the ALARA (As Low As Reasonably Achievable) principle, and opportunities exist to reduce the significant population dose associated with CT without compromising patient care. The first opportunity is to reduce the dose per scan, and improved technology has much potential here. The second is selective replacement of CT with other modalities, such as MRI for many head and spinal examinations and selective use of ultrasound for diagnosing appendicitis. Finally, a fraction of CT scans could be avoided entirely: clinical decision rules for CT use are a powerful approach to slowing the increase in CT use, because they can overcome some of the major factors that lead to CT scans being undertaken when they are not clinically helpful. In the U.S. and potentially elsewhere, legislative approaches are a possible option for improving quality control and reducing clinically unneeded CT use, and upcoming changes in health care economics may also tend to slow the increase in such CT use.
Studies of Mayak workers and people who lived along the Techa River have demonstrated significant associations between low-dose-rate radiation exposure and increased solid cancer risk. It is of interest to use the long-term follow-up data from these cohorts to describe radiation effects for specific types of cancer; however, statistical variability in the site-specific risk estimates is large. The goal of this work is to describe this variability and provide Bayesian adjusted risk estimates. We assume that the site-specific estimates can be viewed as a sample from some underlying distribution and use Bayesian methods to produce adjusted excess relative risk per gray estimates in the Mayak and Techa River cohorts. The impact of the adjustment is compared to that seen in similar analyses of the atomic bomb survivors. Site-specific risk estimates in the Mayak and Techa River cohorts have large uncertainties. Unadjusted estimates vary from implausibly large decreases to large increases, with a range that greatly exceeds that found in the A-bomb survivors. The Bayesian adjustment markedly reduced the range of the site-specific estimates for the Techa River and Mayak studies. The extreme variability in the site-specific risk estimates is largely a consequence of the small number of excess cases. The adjusted estimates provide a useful perspective on variation in the actual risks. However, additional work on interpreting the adjusted estimates, extending the methods used to describe effect modification, and making fuller use of prior knowledge is needed to make these methods useful.
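The shrinkage at the heart of this approach can be sketched concretely. Below is a minimal empirical-Bayes illustration in Python, assuming a simple normal hierarchical model in which each site-specific ERR/Gy estimate scatters around a common mean; the estimates and standard errors are hypothetical placeholders, not values from the Mayak, Techa River, or A-bomb analyses.

    # Empirical-Bayes shrinkage of site-specific ERR/Gy estimates.
    # Hypothetical inputs; a sketch, not the authors' actual model.
    import numpy as np
    from scipy.optimize import minimize_scalar

    err = np.array([-0.8, 0.1, 0.3, 0.5, 1.2, 2.5])  # unadjusted ERR/Gy by cancer site
    se = np.array([1.0, 0.4, 0.3, 0.6, 0.9, 1.5])    # their standard errors

    def neg_marginal_loglik(tau2):
        # Marginal model: err_i ~ N(mu, tau^2 + se_i^2), with mu profiled out.
        w = 1.0 / (tau2 + se**2)
        mu = np.sum(w * err) / np.sum(w)
        return 0.5 * np.sum(np.log(tau2 + se**2) + w * (err - mu) ** 2)

    tau2 = minimize_scalar(neg_marginal_loglik, bounds=(1e-6, 10.0),
                           method="bounded").x
    w = 1.0 / (tau2 + se**2)
    mu = np.sum(w * err) / np.sum(w)

    # Posterior mean pulls each estimate toward the common mean mu;
    # noisier estimates (larger se) are shrunk more strongly.
    shrunk = (tau2 * err + se**2 * mu) / (tau2 + se**2)
    print("common mean ERR/Gy:", round(mu, 3), "adjusted:", np.round(shrunk, 3))

As the abstract describes, the noisiest site-specific estimates, including the implausibly large decreases and increases, move furthest toward the common mean, which is what narrows the overall range of adjusted estimates.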
Here I consider whether radiation-sensitive individuals might exist in the population and, if so, what the impact of low-dose/low-dose-rate radiation exposure on them might be. Radiation induces DNA double-strand breaks (DSBs), which cause lethality if they are unrepaired and enhance genomic instability if they are misrepaired. DNA damage response (DDR) mechanisms play a vital role in protecting cells from the harmful effects of DSB formation. The DDR encompasses DSB repair pathways, of which DNA nonhomologous end joining is the most significant, and a signal transduction process involving ATM. Patients defective in DDR proteins have been described, and some have shown clinical radiosensitivity. However, such patients are rare and belong to defined syndromes. The critical question is whether heterozygosity or mild defects in DDR proteins confer low-dose radiosensitivity. While it is unlikely that low-dose radiation will dramatically enhance cell killing in such patients, it is possible that there could be an impact on stem cell turnover, leading to stem cell depletion with age. More importantly, it is likely that such patients could have increased misrepair of radiation damage and hence an elevated risk of radiation-induced carcinogenesis. Evidence in support of this and the potentially important genes in this context are discussed.
Radiation research is founded on the target and hit theories, which assume that the initial stochastic deposition of energy on a sensitive target in a cell determines the final biological outcome. This assumption is rather static in nature, but it underlies the linear no-threshold (LNT) model of radiation carcinogenesis. The stochastic treatment of radiation carcinogenesis by the LNT model enables easy calculation of radiation risk, and this has made the LNT model an indispensable tool for radiation protection. However, the LNT model sometimes fails to explain biological and epidemiological data, suggesting the need for deeper insight into the mechanisms of radiation carcinogenesis. Recent studies have identified unique characteristics of tissue stem cells and their roles in tissue turnover. In the present report, some important issues in radiation protection, such as the risk of low-dose-rate exposures and in utero exposures, are discussed in light of recent advances in stem cell biology.
In the last four decades, advances in therapies for primary cancers have improved overall survival for childhood cancer. Currently, almost 80% of children will survive beyond 5 years from diagnosis of their primary malignancy. These improved outcomes have resulted in a growing population of childhood cancer survivors. Radiation therapy, while an essential component of primary treatment for many childhood malignancies, has been associated with risk of long-term adverse outcomes. The Childhood Cancer Survivor Study (CCSS), a retrospective cohort of over 14,000 survivors of childhood cancer diagnosed between 1970 and 1986, has been an important resource to quantify associations between radiation therapy and risk of long-term adverse health and quality of life outcomes. Radiation therapy has been associated with increased risk for late mortality, development of second neoplasms, obesity, and pulmonary, cardiac and thyroid dysfunction as well as an increased overall risk for chronic health conditions. Importantly, the CCSS has provided more precise estimates for a number of dose–response relationships, including those for radiation therapy and development of subsequent malignant neoplasms of the central nervous system, thyroid and breast. Ongoing study of childhood cancer survivors is needed to establish long-term risks and to evaluate the impact of newer techniques such as conformal radiation therapy or proton-beam therapy.
The incidence of and mortality from cerebrovascular diseases (CVD) have been studied in a cohort of 12,210 workers first employed at one of the main plants of the Mayak nuclear facility during 1948–1958 and followed up to 31 December 2000. Information on external γ-ray doses is available for virtually all of these workers (99.9%); the mean total γ-ray dose (± SD) was 0.91 ± 0.95 Gy (99th percentile 3.9 Gy) for men and 0.65 ± 0.75 Gy (99th percentile 2.99 Gy) for women. In contrast, plutonium body burden was measured for only 30.0% of workers; among those monitored, the mean cumulative liver dose from plutonium α-particle exposure (± SD) was 0.40 ± 1.15 Gy (99th percentile 5.88 Gy) for men and 0.81 ± 4.60 Gy (99th percentile 15.95 Gy) for women. A total of 4418 cases of CVD, including 665 cases of stroke, and 753 deaths from CVD, including 404 deaths from stroke, were identified in the study cohort. After adjustment for non-radiation factors, there were statistically significant increasing trends in CVD incidence, but not mortality, with both total external γ-ray dose and internal liver dose. Much of the evidence for increased incidence in relation to external dose arose from workers with cumulative doses above 1 Gy. Although the dose response is consistent with linearity, the statistical power to detect non-linearity at external doses below 1 Gy was low. CVD incidence was statistically significantly higher among workers with a plutonium liver dose above 0.1 Gy. There was a statistically significant increasing trend in incidence with increasing internal dose, even after adjusting for external dose, although the trend estimates differed between workers at different plants. The risk estimates for external radiation are generally compatible with those from other large occupational studies, although the incidence data point to higher risk estimates than those from the Japanese A-bomb survivors.
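For readers unfamiliar with how such trend estimates are obtained, the following is a minimal sketch of fitting a linear excess-relative-risk model, rate = ρ0(1 + β·d), to grouped person-year data by maximum likelihood. The stratified counts and doses are hypothetical, and the published analysis additionally stratified and adjusted for non-radiation factors not modeled here.

    # Linear ERR Poisson model on grouped person-year data (hypothetical numbers).
    import numpy as np
    from scipy.optimize import minimize

    dose = np.array([0.0, 0.25, 0.75, 1.5, 3.0])   # mean external dose per stratum (Gy)
    cases = np.array([900, 420, 310, 250, 120])    # CVD cases per stratum
    pyr = np.array([9e4, 4e4, 2.8e4, 2e4, 8e3])    # person-years per stratum

    def neg_loglik(params):
        log_rho0, beta = params
        rr = 1.0 + beta * dose                    # linear excess relative risk
        if np.any(rr <= 0):
            return np.inf                         # keep all fitted rates positive
        mu = np.exp(log_rho0) * rr * pyr          # expected case counts
        return -np.sum(cases * np.log(mu) - mu)   # Poisson log-likelihood (up to a constant)

    fit = minimize(neg_loglik, x0=[np.log(0.01), 0.1], method="Nelder-Mead")
    print(f"baseline rate: {np.exp(fit.x[0]):.4f} per person-year; ERR/Gy: {fit.x[1]:.2f}")

The parameter β is the quantity reported as the ERR per gray trend estimate; profile-likelihood bounds on β give the confidence intervals, and the low power below 1 Gy noted above reflects how flat this likelihood is when the low-dose strata contribute few excess cases.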
Radiation is an independent risk factor for cardiovascular and cerebrovascular disease in cancer patients. Modern radiotherapy techniques reduce the volume of the heart and major coronary vessels exposed to high doses, but some exposure is often unavoidable. Radiation damage to the myocardium is caused primarily by inflammatory changes in the microvasculature, leading to microthrombi and occlusion of vessels, reduced vascular density, perfusion defects and focal ischemia. This is followed by progressive myocardial cell death and fibrosis. Clinical studies also demonstrate regional perfusion defects in non-symptomatic breast cancer patients after radiotherapy. The incidence and extent of perfusion defects are related to the volume of left ventricle included in the radiation field. Irradiation of endothelial cells lining large vessels also increases expression of inflammatory molecules, leading to adhesion and transmigration of circulating monocytes. In the presence of elevated cholesterol, invading monocytes transform into activated macrophages and form fatty streaks in the intima, thereby initiating the process of atherosclerosis. Experimental studies have shown that radiation predisposes to the formation of inflammatory plaque, which is more likely to rupture and cause a fatal heart attack or stroke. This paper presents a brief overview of the current knowledge on mechanisms for development of radiation-induced cardiovascular and cerebrovascular damage. It does not represent a comprehensive review of the literature, but reference is made to several excellent recent reviews on the topic.
In this paper we summarize the long-term effects of A-bomb radiation on the T-cell system and discuss the possible involvement of attenuated T-cell immunity in the disease development observed in A-bomb survivors. Our previous observations of such effects include impaired mitogen-dependent proliferation and IL-2 production, decreases in naive T-cell populations, and increased proportions of anergic and functionally weak memory CD4 T-cell subsets. In addition, we recently found a radiation dose-dependent increase in the percentage of CD25+/CD127− regulatory T cells in the CD4 T-cell population of the survivors. All these effects of radiation on T-cell immunity resemble the effects of aging on the immune system, suggesting that ionizing radiation might direct the T-cell system toward a compromised phenotype and thereby contribute to enhanced immunosenescence. Furthermore, there are significant inverse associations between plasma levels of inflammatory cytokines and the relative number of naive CD4 T cells, suggesting that the elevated levels of inflammatory markers found in A-bomb survivors can be ascribed in part to T-cell immunosenescence. We suggest that radiation-induced T-cell immunosenescence may result in activation of inflammatory responses and may be partly involved in the development of the aging-associated and inflammation-related diseases frequently observed in A-bomb survivors.
The thyroid gland is one of the most radiosensitive human organs. While it is well known that radiation exposure increases the risk of thyroid cancer, less is known about its effects on non-malignant thyroid diseases. The aim of this review is to evaluate the effects of high- and low-dose radiation on benign structural and functional diseases of the thyroid. We examined the results of major studies of cancer patients treated with high-dose radiotherapy, thyrotoxicosis patients treated with high doses of iodine-131, patients treated with moderate- to high-dose radiotherapy for benign diseases, persons exposed to low doses from environmental radiation, and survivors of the atomic bombings, who were exposed to a range of doses. We evaluated radiation effects on structural (tumors, nodules), functional (hyper- and hypothyroidism), and autoimmune thyroid diseases. Across a wide range of doses of ionizing radiation, an increased risk of thyroid adenomas and nodules was observed in a variety of populations and settings. The dose response appeared to be linear at low to moderate doses, but in one study there was some suggestion of a reduction in risk above 5 Gy. The elevated risk for benign tumors continues for decades after exposure. Considerably less consistent findings are available for functional thyroid diseases, including autoimmune diseases. In general, associations for these outcomes were fairly weak, and significant radiation effects were most often observed after high doses, particularly for hypothyroidism. A significant radiation dose–response relationship was demonstrated for benign nodules and follicular adenomas. The effects of radiation on functional thyroid diseases are less clear, partly due to the greater difficulties encountered in studying these diseases.
The prevailing belief for some decades has been that human radiation-related cataract occurs only after relatively high doses; for instance, the ICRP estimates that brief exposures of at least 0.5–2 Sv are required to cause detectable lens opacities and at least 5 Sv to cause vision-impairing cataracts. For protracted exposures, the ICRP estimates the corresponding dose thresholds as 5 Sv and 8 Sv, respectively. However, several studies, especially in the last decade, indicate that radiation-associated opacities occur at much lower doses. Several studies suggest that medical or environmental radiation exposure of the lens confers a risk of opacities at doses well under 1 Sv. Among Japanese A-bomb survivors, risks for cataracts necessitating lens surgery were seen at doses under 1 Gy. The confidence interval on the A-bomb dose threshold for cataract surgery prevalence indicated that the data are compatible with a dose threshold ranging from none up to only 0.8 Gy, similar to the dose threshold for minor opacities seen among Chernobyl clean-up workers with primarily protracted exposures. Findings from various studies indicate that the radiation risk estimates are probably not due to confounding by other cataract risk factors and that risk is seen after both childhood and adult exposures. The recent data are prompting various radiation protection bodies to reassess their guidelines on permissible levels of radiation to the eye. Among future epidemiological research directions, the most important need is for adequate studies of vision-impairing cataract after protracted radiation exposure.