The goal of radiation therapy is to provide the highest probability of tumor control while minimizing normal tissue toxicity. Recently, it has been discovered that ultra-high dose rates of ionizing radiation may preferentially spare normal tissue over tumor tissue. This effect, referred to as FLASH radiotherapy, has been observed in various animal models as well as, more recently, in a human patient. This effect may be related to the cell sparing found in vitro at ultra-high dose rates of photons and electrons dating back to the 1960s. Conditions representative of physiologic oxygen were found to be essential for this process to occur. However, there is no conclusive data on whether this effect occurs with protons, as all results to date have been in cells irradiated at ambient oxygen conditions. There have been no ultra-high dose-rate experiments with heavy ions, which would be relevant to the implementation of FLASH to carbon-ion therapy. These basic science results are critical in guiding this rapidly advancing field, since clinical particle therapy machines capable of FLASH dose rates have already been promoted for protons. To help ensure FLASH radiotherapy is reliable and maximally effective, the radiobiology must keep ahead of the clinical implementation to help guide it. In this context, in vitro and in vivo proton and heavy ion experiments involving FLASH dose rates need to be performed to evaluate not only short-term consequences, but also sequelae related to long-term health risks. Critical to these future studies is consideration of relevant oxygen tensions at the time of irradiation, as well as appropriate in silico modeling to assist in understanding the initial physicochemical events.
Evaluating the risk for central nervous system (CNS) effects after whole-body or partial-body irradiation presents challenges due in part to the varied exposure scenarios in the context of occupational, accidental or wartime releases. Risk estimations are further complicated by the fact that robust changes in brain function are unlikely to manifest until significantly late postexposure times. Collectively, the current data regarding CNS radiation risk in humans are conflicting, and a survey of the animal model data shows that it is similarly inconsistent. Given the sparseness of such data, the current study was conducted using male and female mice to evaluate the brain for the delayed effects of a 2 Gy whole-body exposure to γ rays, starting six months postirradiation. Behavioral testing indicated sex-specific differences in the induction of anxiety-like behaviors and in the ability to abolish fear memories. Molecular analyses showed alterations in post-synaptic protein levels that might affect synaptic plasticity and increased levels of global DNA methylation, suggesting a potential epigenetic mechanism that might contribute to radiation-induced cognitive dysfunction. These data add to the understanding of the CNS response to whole-body irradiation and may lead to improved risk assessment and provide guidance in the development of effective radiation countermeasures to protect military personnel and civilians alike.
The goal of this work was to determine whether hydrogen-rich water (HRW) could attenuate radiation-induced cognitive dysfunction in rats and to explore the underlying mechanisms. Rats received 30 Gy whole-brain irradiation using a 6-MeV electron beam. Either purified water or HRW (0.8–0.9 ppm) was administered 10 min prior to irradiation, followed by daily HRW treatment for 30 consecutive days after irradiation. The Morris water maze was used to test spatial memory in the rats. The concentrations of glutathione (GSH), malondialdehyde (MDA) and 8-hydroxydeoxyguanosine (8-OHdG) and the superoxide dismutase (SOD) activity in the cerebral cortex, as well as the brain-derived neurotrophic factor (BDNF) level in serum, were measured. Immunofluorescence staining was used to detect proliferating cells. The expression of BDNF-TrkB pathway-related genes and proteins was detected using qRT-PCR and Western blot. Models of cognitive dysfunction were successfully established using a 30 Gy dose of ionizing radiation. Compared to the radiation-treated group, the radiation-HRW-treated group showed significantly decreased escape latency (P < 0.05), but increased retention time and swimming distance in the original platform quadrant (P < 0.05) and number of platform crossings (P < 0.05). Furthermore, the SOD, GSH (P < 0.05) and BDNF (P < 0.05) levels in the radiation-HRW-treated group were higher compared to the radiation-treated group. The MDA and 8-OHdG levels (P < 0.05) were decreased in the radiation-HRW-treated group compared to the radiation-treated group. Additionally, treatment with HRW increased the number of BrdU+NeuN+ cells in the radiation-treated group. The mRNA and protein levels of BDNF and TrkB (P < 0.05) in the radiation-HRW-treated group were higher than those in the radiation-treated group.
Collectively, our study indicates that HRW has a protective effect against radiation-induced cognitive dysfunction, and that the possible mechanisms mainly involve antioxidative and anti-inflammatory actions, as well as protection of newborn neurons through regulation of the BDNF-TrkB signaling pathway.
In the aftermath of a nuclear incident, survivors will suffer the deleterious effects of acute radiation exposure. The majority of those affected would have received heterogeneous radiation exposure, reflected in hematological metrics and blood chemistry. Here, we investigated the acute and long-term changes in the kinetics and magnitude of pancytopenia and blood chemistry in rats irradiated using varying degrees of body shielding. We hypothesized that, although a single blood count may not be able to differentiate the degree of radiation exposure, a combination of measurements from complete blood cell counts (CBCs) and blood chemistry tests is able to do so. Male Sprague Dawley rats, 8–10 weeks of age, received single-dose 7.5 Gy (160 kVp, 25 mA, 1.16 Gy/min) whole-body irradiation (WBI, LD100/30) or partial-body irradiation (PBI), as follows: one leg shielded (1LS, LD0/30), two legs shielded (2LS, LD0/30) or the upper half of the body shielded (UHS, LD0/30). Animal morbidity and weights were measured. Blood was drawn at 1, 5, 10, 20 and 30 days postirradiation (n = 4–11). CBC and blood chemistry analyses, including kidney and liver function measurements, were performed. WBI animals survived on average 9 ± 0.4 days postirradiation. In contrast, all PBI animals survived the 30-day study period. CBC analysis revealed that both white blood cell (WBC) and platelet counts were most affected after irradiation. While WBC counts were significantly lower in all irradiated groups on days 1, 5 and 10, platelets were only significantly lower on days 5 and 10 postirradiation. In addition, on day 5 postirradiation, both WBC and platelet counts were able to differentiate WBI (non-survivors) from PBI 2LS and UHS animals (survivors).
Using four blood parameters (platelets, percentage lymphocytes, percentage neutrophils and percentage monocytes) on day 5 after 7.5 Gy irradiation and linear discriminant analysis (LDA), we were able to predict the degree of body exposure in animals with 95.8% accuracy. Alkaline phosphatase (ALP) was significantly lower in all groups on days 5 and 10 postirradiation compared to baseline. Furthermore, ALP was significantly higher in the UHS than in the WBI animals. The AST:ALT ratio was significantly higher than baseline in all irradiated groups on day 1 postirradiation. In conclusion, four CBC parameters measured on day 5 after a 7.5 Gy dose of radiation can be employed in an LDA to differentiate various degrees of exposure (shielding). The characterization presented in this work paves the way for further studies of the differences caused by heterogeneous body exposure to radiation and provides a new metric for biodosimetry.
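The four-parameter discrimination described above can be sketched numerically. The following is a minimal illustration only, using invented synthetic values rather than the study's rat data, and a two-class Fisher linear discriminant separating WBI-like from PBI-like CBC profiles (the study's actual LDA distinguished several shielding groups):

```python
import numpy as np

# Hypothetical illustration, NOT the study's data or code: a two-class
# Fisher linear discriminant on four CBC-style features
# [platelets, %lymphocytes, %neutrophils, %monocytes].
rng = np.random.default_rng(0)

# Synthetic day-5 profiles: depressed counts for WBI, milder changes for PBI
wbi = rng.normal([150, 20, 70, 5], [20, 4, 5, 1], size=(20, 4))
pbi = rng.normal([400, 45, 45, 8], [20, 4, 5, 1], size=(20, 4))
X = np.vstack([wbi, pbi])
y = np.array([0] * 20 + [1] * 20)   # 0 = WBI-like, 1 = PBI-like

# Fisher LDA direction: w = Sw^-1 (mu1 - mu0), Sw = summed within-class covariance
mu0, mu1 = wbi.mean(axis=0), pbi.mean(axis=0)
Sw = np.cov(wbi, rowvar=False) + np.cov(pbi, rowvar=False)
w = np.linalg.solve(Sw, mu1 - mu0)

# Classify by projecting onto w and thresholding at the midpoint of class means
threshold = w @ (mu0 + mu1) / 2
pred = (X @ w > threshold).astype(int)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.1%}")
```

With well-separated synthetic means, the classes are essentially linearly separable, which mirrors why a small set of CBC parameters can carry enough signal to distinguish exposure groups.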
Alina L. Bendinger, Lisa Seyler, Maria Saager, Charlotte Debus, Peter Peschke, Dorde Komljenovic, Jürgen Debus, Jörg Peter, Ralf O. Floca, Christian P. Karger, Christin Glowa
We collected initial quantitative information on the effects of high-dose carbon (12C) ions compared to photons on vascular damage in anaplastic rat prostate tumors, with the goal of elucidating differences in response to high-LET radiation, using dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI). Syngeneic R3327-AT1 rat prostate tumors received a single dose of either 16 or 37 Gy 12C ions or 37 or 85 Gy 6 MV photons (iso-absorbed and iso-effective doses, respectively). The animals underwent DCE-MRI prior to, and on days 3, 7, 14 and 21 postirradiation. The extended Tofts model was used for pharmacokinetic analysis. At day 21, tumors were dissected and histologically examined. The results of this work showed the following: 1. 12C ions led to stronger vascular changes compared to photons, independent of dose; 2. Tumor growth was comparable for all radiation doses and modalities until day 21; 3. Nonirradiated, rapidly growing control tumors showed a decrease in all pharmacokinetic parameters (area under the curve, Ktrans, ve, vp) over time; 4. 12C-ion-irradiated tumors showed an earlier increase in area under the curve and Ktrans than photon-irradiated tumors; 5. 12C-ion irradiation resulted in more homogeneous parameter maps and histology compared to photons; and 6. 12C-ion irradiation led to an increased microvascular density and decreased proliferation activity in a largely dose-independent manner compared to photons. Postirradiation changes related to 12C ions and photons were detected using DCE-MRI, and correlated with histological parameters in an anaplastic experimental prostate tumor. In summary, this pilot study demonstrated that exposure to 12C ions increased the perfusion and/or permeability faster and led to larger changes in DCE-MRI parameters resulting in increased vessel density and presumably less hypoxia at the end of the observation period when compared to photons. 
Within this study, no differences were found between curative and sub-curative doses for either modality.
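For readers unfamiliar with the pharmacokinetic analysis mentioned above, the extended Tofts model expresses tissue contrast-agent concentration as Ct(t) = vp·Cp(t) + Ktrans·∫ Cp(u)·exp(−(Ktrans/ve)·(t−u)) du. A minimal numerical sketch follows; the arterial input function and all parameter values are invented assumptions for illustration, not the study's fitted values:

```python
import math

# Assumed (invented) pharmacokinetic parameters: transfer constant Ktrans
# [1/min], extravascular extracellular volume fraction ve, plasma fraction vp
ktrans, ve, vp = 0.25, 0.3, 0.05
dt = 0.01                                   # time step [min]
t_axis = [i * dt for i in range(int(5 / dt))]

def cp(t):
    # Toy mono-exponential arterial input function (an assumption, not measured)
    return 5.0 * math.exp(-0.5 * t) if t >= 0 else 0.0

def ct(t):
    # Extended Tofts model: vascular term plus discretized convolution integral
    integral = sum(cp(u) * math.exp(-(ktrans / ve) * (t - u)) * dt
                   for u in t_axis if u <= t)
    return vp * cp(t) + ktrans * integral

# Peak tissue concentration over the simulated 5-minute window
peak = max(ct(t) for t in t_axis)
print(f"peak Ct: {peak:.3f}")
```

In a DCE-MRI analysis, this forward model is fitted voxel-wise against measured concentration curves, yielding the Ktrans, ve and vp maps whose changes the study tracked after irradiation.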
D-dimer plasma levels were evaluated to determine whether they are altered by radiation. D-dimer levels were measured in radiation oncology patients, who were diagnosed with prostate, breast or lung cancer, or leukemia, as well as in healthy subjects serving as controls. Blood samples from radiotherapy patients were taken at three different time points: pre-, on- and post-radiotherapy. For the patients, considered together, differences between the D-dimer levels at these three time points compared to controls were statistically significant. Compared to the pre-radiotherapy measurements, radiation exposure was associated with a significant increase in the D-dimer levels at the on- and post-radiotherapy time points. At the post-radiotherapy time point, D-dimer levels in the patients were not significantly reduced compared to the on-radiotherapy levels, indicating that the risk for developing disseminated intravascular coagulation (DIC) may be increased in some radiation oncology patients. Of particular concern are the post-radiotherapy results observed for the D-dimer levels in the leukemia patients, in which the average fold increase in the D-dimer levels was 5.43 (compared to the pre-radiotherapy levels). These results suggest that leukemia patients might benefit from frequent assessment of their D-dimer levels after their total-body irradiation-conditioning regimen to detect early signs of DIC development. It is hoped that the results described here will lead to heightened awareness in the radiation oncology community that the risk of DIC development is greatly increased in some of these patients.
We monitored a physiological response in a neutron-exposed normal mouse brain using two imaging tools, [18F]fluoro-deoxy-D-glucose positron emission tomography ([18F]FDG-PET) and diffusion-weighted magnetic resonance imaging (DW-MRI), as an imaging biomarker. We measured the apparent diffusion coefficient (ADC) of DW-MRI and standardized uptake value (SUV) of [18F]FDG-PET, which indicated changes in the cellular environment after neutron irradiation. This approach was sensitive enough to detect cell changes that were not confirmed in hematoxylin and eosin (H&E) results. Glucose transporters (GLUT) 1 and 3, indicators of the GLUT capacity of the brain, were significantly decreased after neutron irradiation, demonstrating that the change in blood-brain-barrier (BBB) permeability affects the GLUT, with changes in both SUV and ADC values. These results demonstrate that combined imaging of the same object can be used as a quantitative indicator for in vivo pathological changes. In particular, the radiation exposure assessment of combined imaging, with specific integrated functions of [18F]FDG-PET and MRI, can be employed repeatedly for noninvasive analysis performed in clinical practice. Additionally, this study demonstrated a novel approach to assess the extent of damage to normal tissues as well as therapeutic effects on tumors.
It is well known that mitochondria and the endoplasmic reticulum (ER) play important roles in radiation response, but their functions in the radiation-induced bystander effect (RIBE) are largely unclear. In this study, we found that when a small portion of cells in a population of human lung fibroblast MRC-5 cells were precisely irradiated through either the nuclei or cytoplasm with counted microbeam protons, the yield of micronuclei (MN) and the levels of intracellular reactive oxygen species (ROS) in nonirradiated cells neighboring irradiated cells were significantly increased. Mito/ER-tracker staining demonstrated that the mitochondria were clearly activated after nuclear irradiation and ER mass approached a higher level after cytoplasmic irradiation. Moreover, the radiation-induced ROS was diminished by rotenone, an inhibitor of mitochondrial activation, but it was not influenced by siRNA interference of BiP, an ER regulation protein. For nuclear irradiation, rotenone enhanced radiation-induced ER expression and BiP siRNA eliminated radiation-induced activation of mitochondria; these phenomena were not observed for cytoplasmic irradiation. Bystander MN was reduced by rotenone but enhanced by BiP siRNA. When the cells were treated with both rotenone and BiP siRNA, the MN yield was reduced for nuclear irradiation but was enhanced for cytoplasmic irradiation. Our results suggest that the organelles of mitochondria and ER have different roles in RIBE with respect to nuclear and cytoplasmic irradiation, and that the function of ER is a prerequisite for mitochondrial activation.
Radiation-induced lymphopenia (RIL) is associated with worse survival in patients with solid tumors, as well as lower response rates to checkpoint inhibitors. While single-fraction total-body irradiation is known to result in exponential decreases in the absolute lymphocyte count (ALC), the kinetics of lymphocyte loss after focal fractionated exposures have not previously been characterized. In the current study, lymphocyte loss kinetics were analyzed among patients undergoing focal fractionated radiotherapy for clinical indications. This registry-based study included 419 patients who received either total-body irradiation (TBI; n = 30), stereotactic body radiation therapy (SBRT; n = 73) or conventionally fractionated chemoradiation therapy (CFRT; n = 316). For each patient, serial ALCs were plotted against radiotherapy fraction number. The initial three weeks of treatment for CFRT patients and the entirety of treatment for SBRT and TBI patients were fit to exponential decay of the form ALC(x) = a·e^(−bx), where ALC(x) is the ALC after x fractions. From those fits, fractional lymphocyte loss (FLL) was calculated as FLL = (1 − e^(−b)) × 100, and multivariable regression was performed to identify significant correlates of FLL. The median linearized R² when fitting the initial fractions was 0.98, 0.93 and 0.97 for patients receiving TBI, SBRT and CFRT, respectively. In CFRT patients, the apparent ALC loss rate slowed after week 3; fitting ALC loss over the entire CFRT course therefore required the addition of a constant term, "c". For TBI and SBRT patients, treatment ended during the pure exponential decay phase. Initial FLL varied significantly with treatment technique. Mean FLL was 35.5%, 24.3% and 10.77% for patients receiving TBI, SBRT and CFRT, respectively (P < 0.001). Significant correlates of FLL varied by site and included field size, dose per fraction, mean spleen dose, chemotherapy backbone and age.
Finally, total percentage ALC loss during radiotherapy was highly correlated with FLL (P < 0.001). Lymphocyte depletion kinetics during the initial phase of fractionated radiotherapy are characterized by pure exponential decay. Initial FLL is strongly correlated with radiotherapy planning parameters and total percentage ALC loss. The two groups with the highest FLL received no concurrent chemotherapy, suggesting that ALC loss can be a consequence of radiotherapy alone. This work may assist in selecting patients for adaptive radiotherapy approaches to mitigate RIL risk.
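The fitting procedure described above can be sketched as follows; the decay parameter and ALC series here are invented for illustration, not the registry data. A log-linear least-squares fit recovers b, from which the per-fraction fractional lymphocyte loss follows:

```python
import math

# Invented decay parameters (NOT the study's values): ALC(x) = a * exp(-b * x)
a_true, b_true = 1.8, 0.12
fractions = list(range(11))                      # fraction numbers 0..10
alc = [a_true * math.exp(-b_true * x) for x in fractions]

# Log-linearize: ln(ALC) = ln(a) - b * x, then ordinary least squares for the slope
n = len(fractions)
xs, ys = fractions, [math.log(v) for v in alc]
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
b_fit = -slope

# Fractional lymphocyte loss per fraction, as defined in the abstract
fll = (1 - math.exp(-b_fit)) * 100
print(f"fitted b = {b_fit:.3f}, FLL = {fll:.1f}% per fraction")
```

On noiseless synthetic data the fit recovers b exactly; on real serial ALCs the same log-linear regression yields the per-patient b from which FLL is derived.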
Radiation-resistant hypoxic tumor areas continue to present a major limitation for successful tumor treatment. To overcome this radiation resistance, an oxygen-independent treatment is proposed using UVC-emitting LuPO4:Pr3+ nanoparticles (NPs) and X rays. The uptake of the NPs as well as their effect on cell proliferation was investigated in A549 lung cancer cells using inverted time-lapse microscopy and transmission electron microscopy. Furthermore, the cytotoxicity of the combined treatment of X rays and LuPO4:Pr3+ NPs was assessed under normoxic and hypoxic conditions using the colony formation assay. Transmission electron microscopy (TEM) images showed no NP uptake after 3 h, whereas after 24 h of incubation, NP uptake was documented. LuPO4:Pr3+ NPs alone caused a concentration-independent cell growth delay within the first 60 h of incubation. The combined treatment with UVC-emitting NPs and X rays reduced the radiation resistance of hypoxic cells by a factor of two, to the level of cells under normoxic conditions. LuPO4:Pr3+ NPs cause an early growth delay but no cytotoxicity at the tested concentrations. The combination of these NPs with X rays increases the cytotoxicity in both normoxic and hypoxic cancer cells, and hypoxic cells become sensitized to normoxic cell levels.
William E. Fahl, Frank Jermusek, Thomas Guerin, Dawn M. Albrecht, Carol J. Sarabia Fahl, Emma Dreischmeier, Chelsea Benedict, Susan Back, Jens Eickhoff, Richard B. Halberg
Radiation-induced cancer is an ongoing and significant problem, with sources that include clinics worldwide in which 3.1 billion radiology exams are performed each year, as well as a variety of other scenarios such as space travel and nuclear cleanup. These radiation exposures are typically anticipated, and the exposure is typically well below 1 Gy. When radiation-induced (actually ROS-induced) DNA mutation is prevented, then so too are downstream radiation-induced cancers. Currently, there is no protection available against the effects of such <1 Gy radiation exposures. In this study, we address whether the new PrC-210 ROS scavenger is effective in protecting p53-deficient (p53–/–) mice against X-ray-induced accelerated tumor mortality; this is the most sensitive radiation tumorigenesis model currently known. Six-day-old p53–/– pups received a single intraperitoneal PrC-210 dose [0.5 maximum tolerated dose (MTD)] or vehicle, and 25 min later, pups received 4.0 Gy X-ray irradiation. At 5 min postirradiation, blood was collected to quantify white blood cell γ-H2AX foci. Over the next 250 days, tumor-associated deaths were recorded. Findings revealed that when administered 25 min before 4 Gy X-ray irradiation, PrC-210 reduced DNA damage (γ-H2AX foci) by 40% and, in a notable coincidence, caused a 40% shift in tumor latency/incidence, and the 0.5 MTD PrC-210 dose had no discernible toxicities in these p53–/– mice. Essentially, the moles of PrC-210 thiol within a single 0.5 MTD PrC-210 dose suppressed the moles of ROS generated by 40% of the 4 Gy X-ray dose administered to p53–/– pups, and in doing so, eliminated the lifetime leukemia/lymphoma risk normally residing "downstream" of that 40% of the 4 Gy dose. In conclusion: 1. PrC-210 is readily tolerated by the 6-day-old p53–/– mice, with no discernible lifetime toxicities; 2. PrC-210 does not cause the nausea, emesis or hypotension that preclude clinical use of earlier aminothiols; and 3.
PrC-210 significantly increased survival after 4 Gy irradiation in the p53–/– mouse model.