This review aims to trace the evolution of dosimetry, highlight its significance in the advancement of radiation research, and identify current trends and methodologies in the field. Key historical milestones, starting with the first publications in the journal in 1954, will be synthesized before contemporary practices in radiation medicine and radiobiological investigation are addressed. Finally, future opportunities in dosimetry will be outlined. The overarching goal is to emphasize the indispensability of accurate and reproducible dosimetry to the quality of radiation research and to the practical application of ionizing radiation.
Radiobiological data, whether obtained at the clinical, biological or molecular level, have contributed significantly to a better description and prediction of individual dose-response to ionizing radiation and to a better estimation of radiation-induced risks. In particular, over the last seventy years the amount of radiobiological data has increased considerably, permitting the mathematical formulas describing dose-response to become less empirical. A better understanding of basic radiobiological mechanisms has also helped establish quantitative inter-correlations between clinical, biological and molecular biomarkers, further refining the descriptive mathematical models. Today, big data approaches and, more recently, artificial intelligence may finally complete and consolidate this long progression from the multi-scale description of radiation-induced events to their prediction. Here, we review the major dose-response models applied in radiobiology for quantifying molecular and cellular radiosensitivity and aim to explain their evolution. Specifically, we highlight advances in the target theory underlying the cell survival models and the progressive introduction of DNA repair processes into the mathematical models. Furthermore, we describe how technological advances have changed the description of DNA double-strand break (DSB) repair kinetics by introducing the important notion of DSB recognition, distinct from that of DSB repair. Initially developed separately, target theory on the one hand and DSB recognition and repair on the other may now be fused into a unified model involving the cascade of phosphorylations mediated by the ATM kinase in response to any genotoxic stress.
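For readers less familiar with the models discussed above, the two canonical forms are the single-hit, multi-target survival curve of classical target theory and the linear-quadratic (LQ) model (standard textbook expressions, not formulas quoted from the article):

\[ S(D) = 1 - \left(1 - e^{-D/D_0}\right)^{n}, \qquad S(D) = e^{-(\alpha D + \beta D^2)} \]

where \(S\) is the surviving fraction after dose \(D\), \(D_0\) is the dose that reduces survival per target to \(1/e\), \(n\) is the number of targets per cell, and \(\alpha\) and \(\beta\) weight lethal damage from single-track and two-track action, respectively.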
Radiation research is a multidisciplinary field, and among its many branches, mathematical and computational modelers have played a significant role in advancing the boundaries of knowledge. A fundamental contribution is the modelling of cellular response to ionizing radiation, which is key to understanding not only how radiation can kill cancer cells but also how it can cause cancer and other health issues. The invention of microdosimetry in the 1950s by Harold Rossi paved the way for brilliant scientists to study the mechanisms of radiation action at cellular and sub-cellular scales. This paper reviews some snippets of ingenious mathematical and computational models published in microdosimetry symposium proceedings and in publications of the radiation research community. Among these are simulations of radiation tracks at atomic and molecular levels using Monte Carlo methods, models of cell survival, quantification of the energy required to create a single-strand break, and models of DNA damage repair. These models can broadly be categorized into mechanistic, semi-mechanistic, and phenomenological approaches, and this review seeks to provide the historical context of their development. We salute the pioneers of the field and the great teachers who supported and educated the younger members of the community and showed them how to build upon their work.
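As a purely illustrative sketch of the simplest member of this model family (not code from any of the reviewed papers), the following Python snippet applies the Monte Carlo approach to single-hit cell killing: lethal lesions per cell are drawn from a Poisson distribution with mean \(\alpha D\), and the surviving fraction is the proportion of cells with zero lesions, which converges to the analytic result \(e^{-\alpha D}\). The parameter value \(\alpha = 0.3\ \mathrm{Gy}^{-1}\) is an arbitrary choice for demonstration.

    import numpy as np

    rng = np.random.default_rng(seed=42)

    def survival_fraction(dose_gy, alpha=0.3, n_cells=100_000):
        """Single-hit Monte Carlo: lethal lesions per cell are Poisson
        with mean alpha * dose; a cell survives only if it has none."""
        lesions = rng.poisson(alpha * dose_gy, size=n_cells)
        return np.mean(lesions == 0)

    for dose in [0, 2, 4, 8]:
        print(f"D = {dose} Gy: simulated S = {survival_fraction(dose):.3f}, "
              f"analytic exp(-alpha D) = {np.exp(-0.3 * dose):.3f}")

Mechanistic and semi-mechanistic models of the kind reviewed here replace this one-parameter hit statistic with explicit track structure, energy deposition geometry, and damage-repair kinetics.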
Numerous dose rate effects have been described over the past 6–7 decades in the radiation biology and radiation oncology literature, depending on the dose rate range under discussion. This review focuses on the impact and understanding of altered dose rates in the context of radiation therapy; dose rate effects relevant to radiation protection are not discussed. The review begins with a brief historical account of early studies on dose rate effects, considers the mechanisms thought to underlie dose rate dependencies, then discusses current issues in clinical findings with altered dose rates, the importance of dose rate in brachytherapy, and the timely topic of very high dose rates, so-called FLASH radiotherapy. The discussion includes dose rate effects in vitro in cultured cells, in in vivo experimental systems, and in the clinic, in both tumors and normal tissues. Gaps in the understanding of dose rate effects are identified, as are opportunities for improving the clinical use of dose rate modulation.
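The quantitative backbone for much of this discussion is the LQ model extended with the Lea-Catcheside dose-protraction factor \(G\) (a standard textbook formulation, not one introduced by this review):

\[ S(D) = e^{-(\alpha D + G \beta D^2)}, \qquad G = \frac{2}{(\mu T)^2}\left(\mu T - 1 + e^{-\mu T}\right) \]

for irradiation at a constant dose rate over time \(T\) with sublesion repair rate \(\mu\). Here \(G \to 1\) for acute exposure and \(G \to 0\) for very protracted delivery, so lowering the dose rate progressively removes the quadratic component of cell killing.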
The relative biological effectiveness (RBE) is a mathematical quantity first defined in the 1950s; more than 4,000 scientific papers on it have been published to date. Yet defining the correct value of the RBE to use in clinical practice remains a challenge. A bibliometric analysis of the radiation research literature can provide an understanding of how this quantity has evolved. The purpose of this study is to investigate documents published since 1950 using bibliometric indicators and network visualization, providing an assessment of global research activity, key themes, and RBE research within the radiation-related field. It highlights the top-performing authors, organizations, and nations that have made major contributions to this research domain, as well as their interactions. The Scopus Collection was searched for articles and reviews pertaining to RBE in radiation research from 1950 through 2023. Scopus and Bibliometrix analytic tools were used to investigate the most productive countries, researchers, collaboration networks, and journals, along with citation analysis of references and keywords. A total of 4,632 documents were retrieved, produced by authors originating from 71 countries. Publication trends can be separated into 20-year groupings: slow accrual from 1950 to 1970, an early rise from 1970 to 1990, and a sharp increase from the 1990s to the 2010s that matched the development of charged particle therapy in clinics worldwide and opened discussion on the true value of the RBE in proton beam therapy. Since the 2010s, a steady 200 papers, on average, have been published per year. The United States produced the most publications overall (N = 1,378), and Radiation Research was the journal most likely to have published articles related to the RBE (606 publications during this period). J. Debus was the most prolific author (112 contributions, with 2,900 citations). The RBE has captured the research interest of over 7,000 authors in the past decade alone. This study supports the notion that the growth of the body of evidence surrounding the RBE, which started 75 years ago, is far from reaching its end. Applications to medicine have continuously dominated the field, with Physics competing with Biochemistry, Genetics and Molecular Biology for second place over the decades, and research on the RBE can be expected to continue.
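For reference, the quantity at the center of this analysis is conventionally defined as a ratio of isoeffective doses (the standard definition, independent of this bibliometric study):

\[ \mathrm{RBE} = \frac{D_{\mathrm{reference}}}{D_{\mathrm{test}}} \]

i.e., the dose of a reference radiation (historically 250 kVp X rays or \(^{60}\)Co gamma rays) divided by the dose of the test radiation that produces the same biological effect. This is why a single radiation quality can carry many RBE values, depending on the endpoint and dose level chosen.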
A multiple-parameter approach using radiation-induced clinical signs and symptoms, hematology changes, cytogenetic chromosomal aberrations, and molecular biomarker changes after radiation exposure is used for biodosimetry-based dose assessment. In the current article, relevant milestones from Radiation Research are documented that form the basis of the current consensus approach for diagnostics after radiation exposure. For example, the use of cytogenetic chromosomal aberrations for biodosimetry, in the form of the lymphocyte metaphase-spread dicentric assay, was first published in Radiation Research in 1962. This assay is now complemented by other cytogenetic chromosomal aberration assays (i.e., chromosomal translocations, cytokinesis-blocked micronuclei, premature chromosome condensation, γ-H2AX foci, etc.). Changes in blood cell counts represent an early-phase biomarker for radiation exposure. Molecular biomarker changes have evolved to include panels of organ-specific plasma proteomic and blood-based gene expression biomarkers for radiation dose assessment. The maturation of these assays is shown by efforts toward automated processing and scoring, the development of point-of-care diagnostic devices, inter-comparison exercises among service laboratories, and applications for dose and injury assessment in radiation accidents. An alternative and complementary approach has been advocated that de-emphasizes dose and instead focuses on predicting acute or delayed health effects. The same biomarkers used for dose estimation (e.g., lymphocyte counts) can be used to directly predict the severity of later-developing acute health effects without performing dose estimation as an additional or intermediate step. This review illustrates the contributing steps toward these developments published in Radiation Research.
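In practice, dose estimation with the dicentric assay described above rests on a calibration curve of the standard linear-quadratic form (the textbook relation used in cytogenetic biodosimetry, not a result specific to this article):

\[ Y = C + \alpha D + \beta D^2 \]

where \(Y\) is the dicentric yield per cell, \(C\) the background frequency, and \(\alpha\) and \(\beta\) are coefficients fitted for each radiation quality; an observed yield from a blood sample is then inverted through this curve to estimate the absorbed dose \(D\).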
The aim of this paper is to review the history surrounding the discovery of lethal mutations, later described as delayed reproductive death. Lethal mutations were suggested very early on to be due to a generalised instability in a cell population and are now considered one of the first demonstrations of “radiation-induced genomic instability,” which later led to the establishment of the field of “non-targeted effects.” The phenomenon was first described by Seymour et al. in 1986 and was confirmed by Trott's group in Europe and by Little and colleagues in the United States, before being extended by Mendonca et al. in 1989, who showed conclusively that the distinguishing feature of lethal mutation occurrence was that it happened suddenly, after about 9–10 population doublings, in progeny that had survived the original dose of ionizing radiation. However, many authors then suggested that lethal mutations were in fact implicit in the original experiments by Puck and Marcus in 1956 and were described in the extensive work by Sinclair in 1964, who followed clonal progeny for up to a year after irradiation and described “small colony formation” as a persistent consequence of ionizing radiation exposure. In this paper, we examine the history from 1956 to the present, using the period 1986–1989 as an anchor point to reach into the past and to go forward through the evolution of the field of low-dose radiobiology, where non-targeted effects predominate.
Extracellular vesicles (EVs) have been recognized over the last several decades as a novel means of cell-to-cell communication. EVs are believed to exert their functions on nearby or distant cells through transfer of the cargo they carry. In this review, we focus on EVs produced by endothelial cells, with emphasis on their role in hematopoiesis. We first describe how endothelial cells interact with hematopoietic stem/progenitor cells during development and in disease conditions. We then discuss EVs themselves, from their subtypes to methods for their isolation and analysis. With this background, we next review the literature on endothelial cell-derived EVs (ECEVs), including their physiological functions and clinical uses. In the final sections, we summarize current findings on the effects of ECEVs on hematopoiesis under physiological and stress conditions.
Radiation cytogenetics has a rich history seldom appreciated by those outside the field. Early radiobiology was dominated by physics and biophysical concepts that borrowed heavily from the study of radiation-induced chromosome aberrations. From such studies, quantitative relationships between biological effect and changes in absorbed dose, dose rate and ionization density were codified into key concepts of radiobiological theory that have persisted for nearly a century. This review aims to provide a historical perspective on some of these concepts, including evidence supporting the contention that chromosome aberrations underlie the development of many, if not most, of the biological effects of concern for humans exposed to ionizing radiation, including cancer induction on the one hand and tumor eradication on the other. The significance of discoveries originating from these studies has widened and extended far beyond their original scope. Chromosome structural rearrangements viewed in mitotic cells were first attributed to the production of breaks by the radiations during interphase, followed by the rejoining or mis-rejoining among ends of other nearby breaks. These relatively modest beginnings eventually led to the discovery and characterization of DNA double-strand break repair by non-homologous end joining, whose importance to various biological processes, V(D)J recombination and speciation being two examples among many, is now widely appreciated. Rapid technological advancements in cytogenetics and the burgeoning fields of molecular radiobiology and third-generation sequencing have served as a point of confluence between the old and the new. As a result, the emergent field of cytogenomics is now uniquely positioned to more fully elucidate the mechanisms underlying the biological effects of ionizing radiation exposure.
When the environmental impact and risks associated with radioactive contamination of ecosystems are assessed, the source term and deposition must be linked to ecosystem transfer, biological uptake, and effects in exposed organisms. Thus, a well-defined source term is the starting point for transport, dose, impact and risk models. After the Chornobyl accident, 3–4 tons of spent nuclear fuel were released, and radioactive particles were an important constituent of the actual source term. As Chornobyl particles were observed in many European countries, some scientists suggested that radioactive particles were a peculiarity of the Chornobyl accident. In contrast, research over the years has shown that a major fraction of refractory elements such as uranium (U) and plutonium (Pu) released to the environment has been released as particles, following a series of past events such as nuclear weapons tests, non-criticality accidents involving nuclear weapons, military use of depleted uranium ammunition, and nuclear reactor accidents. Radioactive particles and colloids have also been observed in discharges from nuclear installations to rivers or regional seas and are associated with nuclear waste dumped at sea. Furthermore, radioactive particles have been identified at uranium mining and tailing sites as well as at other NORM sites such as phosphate or oil and gas industrial facilities. Research has also demonstrated that particle characteristics such as elemental composition depend on the emitting source, while characteristics such as size distribution, structure, and oxidation state, which influence ecosystem transfer, also depend on the release scenario. Thus, access to advanced particle characterization techniques is essential within radioecology. After deposition, localized heterogeneities such as particles will be unevenly distributed in the environment. Thus, inventories can be underestimated, and impact and risk assessments of particle-contaminated areas may suffer from unacceptably large uncertainties if radioactive particles are ignored. The present paper focuses on key sources contributing to the release of radioactive particles to the environment, as well as on linking particle characteristics to ecosystem behavior and potential biological effects.
This paper starts with a brief history of the birth of the field of radioecology during the Cold War, with a focus on US activity. We review the establishment of the international system for radiation protection and the science underlying the guidelines. We then discuss the famous ICRP 60 statement that if Man is protected, so is everything else, and show how this led to a focus in radioecology on pathways to Man rather than concern about impacts on environments or ecosystems. We then review the contributions of Radiation Research Society members and papers published in Radiation Research that contributed to the knowledge base about effects on nonhuman species. These fed into international databases and computer-based tools such as ERICA and RESRAD-Biota to guide regulators. We then examine the origins of the concern that ICRP 60 is not sufficient to protect ecosystems and discuss the establishment of ICRP Committee 5 and its recommendation to establish reference animals and plants (RAPs). The review finishes with current concerns that RAPs are not sufficient to protect ecosystems, given the complexity of interacting factors such as the climate emergency, and discusses the efforts of the ICRP, the International Union of Radioecology and other bodies to capture the concepts of ecosystem services and ecosystem complexity modelling in radioecology.
Strontium-90 is a radionuclide found in high concentrations in nuclear reactor waste and in fallout from reactor accidents and atomic bomb explosions. In the 1950s, little was known regarding the health consequences of strontium-90 internalization. To assess the health effects of strontium-90 ingestion from infancy through adolescence, the Atomic Energy Commission and the Department of Energy funded large-scale beagle studies at the University of California, Davis. Conducted from 1956 to 1989, the strontium-90 ingestion study followed roughly 460 beagles throughout their lifespans after they were exposed to strontium-90 in utero (through feeding of the mother) and then given strontium-90-containing feed at varying doses from weaning to age 540 days. The extensive medical data and formalin-fixed, paraffin-embedded tissues were transferred from UC Davis to the National Radiobiology Archive in 1992 and subsequently to the Northwestern University Radiobiology Archive in 2010. Here, we summarize the design of the strontium-90 ingestion study and give an overview of its most frequently recorded findings. As shown before, radiation-associated neoplasias (osteosarcoma, myeloproliferative syndrome and select squamous cell carcinomas) were almost exclusively observed in the highest dose groups, while the incidence of the neoplasias most frequent in controls decreased as dose increased. The occurrence of congestive heart failure in each dose group, not previously assessed by UC Davis researchers, showed a non-significant increase between the controls and the lower dose groups that might have reached significance had sample sizes been larger. Detailed secondary analyses of these data and samples may uncover health endpoints that were not evaluated by the team that conducted the study.
Several scientific themes are reviewed in the context of the 75-year period relevant to this special platinum issue of Radiation Research. Two criteria were considered in selecting the themes. One is the visibility of the associated research activity at the annual meetings of the Radiation Research Society (RRS) and in the publications of the Society's journal, reflecting the interests of RRS members. The second criterion is a focus on contributions from Australian members of the RRS. The first theme is the contribution of radiobiology to radiation oncology, featuring two prominent Australian radiation oncologists, the late Rod Withers and his younger colleague, Lester Peters. Two other themes are also linked to radiation oncology: preclinical research aimed at developing the experimental radiotherapy modalities of microbeam radiotherapy (MRT) and Auger endoradiotherapy. The latter has a long history, in contrast to MRT, especially in Australia, given that the associated medical beamline at the Australian Synchrotron in Melbourne only opened in 2011. Another theme is DNA repair, which has a trajectory parallel to the 75-year period of interest, given the birth of molecular biology in the 1950s. The low-dose radiobiology theme has a similar timeline, prompted predominantly by the nuclear era, and is connected to the radioprotector theme, although radioprotectors also have a long-established potential utility in cancer radiotherapy. Finally, two themes are associated with biodosimetry: the micronucleus assay, highlighting the pioneering contribution of Michael Fenech in Adelaide, South Australia, and the γ-H2AX assay and its widespread clinical applications.
This historical review of extracellular vesicles in the setting of exposure to ionizing radiation traces our understanding of how vesicles were initially examined and reported in the literature, from the late 1970s (for secreted exosomes) and early 1980s (for plasma membrane-derived, exfoliated vesicles) to where we are now and where we may be headed in the next decade. Emphasis is placed on the biophysical properties of extracellular vesicles, energy consumption, and the role of vesiculation as an essential component of membrane turnover. The impact of intercellular signal trafficking by vesicle-surface and intra-vesicular lipids, proteins, nucleic acids and metabolites is reviewed in the context of biomarkers for estimating individual radiation dose after exposure, the pathogenesis of disease, and the development of cell-free therapeutics. Since vesicles express both growth-stimulatory and growth-inhibitory molecules, a hypothesis is proposed that considers superposition in a shared space and entanglement of molecules by energy sources external to human cells. Implications of this approach for travel in deep space are briefly discussed in the context of clinical disorders observed after space travel.
In vitro and in vivo observations accumulated over several decades have firmly established that the biological effects of ionizing radiation can spread from irradiated cells/tissues to non-targeted cells/tissues. Redox-modulated intercellular communication mechanisms, including secreted factors and gap junctions, can mediate these non-targeted effects. Clearly, the expression of such effects and their transmission to progeny cells has implications for radiation protection. Their elucidation is also relevant to enhancing the efficacy of cancer radiotherapy and reducing its impact on the development of normal tissue toxicities. In addition, the study of non-targeted effects is pertinent to our basic understanding of intercellular communication under conditions of oxidative stress. This review traces the history of the non-targeted effects of radiation, starting with early reports of abscopal effects, which described radiation-induced effects in tissues distant from the site of radiation exposure. A related effect involved the production of clastogenic factors in plasma following irradiation, which can induce chromosome damage in unirradiated cells. Despite these early reports suggesting non-targeted effects of radiation, the classical paradigm that direct deposition of energy in the nucleus was required for biological effect still dominated. This paradigm was challenged by papers describing radiation-induced bystander effects. The review covers the mechanisms of radiation-induced bystander effects and their potential impact on radiation protection and radiation therapy.
At the dawn of the 20th century, the underlying chemistry that produced the observed effects of ionizing radiation, e.g., X rays and radium salts, on aqueous solutions was either unknown or restricted to products found postirradiation. For example, the Curies noted that sealed aqueous solutions of radium inexplicably decomposed over time, even when kept in the dark. By 1928 there were numerous papers describing the phenomenological effects of ionizing radiation on a wide variety of materials, including the irradiated hands of early radiologists. One scientist who became intensely interested in these radiation effects was Hugo Fricke (of Fricke dosimetry), who in 1928 established a laboratory dedicated to studies of the chemical effects of radiation, the results of which he believed were necessary to understand observed radiobiological effects. In this Platinum Issue of Radiation Research (70 years of continuous publication), we present the early history of the development of radiation chemistry and its contributions to all levels of mechanistic radiobiology. We summarize its development as one of the four disciplinary pillars of the Radiation Research Society and its journal, Radiation Research, founded during the period 1952–1954. In addition, we present the work of scientists who contributed substantially to the discipline of radiation chemistry and to the birth, life and culture of the Society and its journal. For the years following 1954, the increasing knowledge of the underlying temporal and spatial properties of the species produced by various types of radiation is summarized and related to radiobiology and to the modern technologies (e.g., pulse radiolysis, electron paramagnetic resonance) that became available as the discipline of radiation chemistry developed. A summary of important results from studies of radiation chemistry/biochemistry in the 20th and 21st centuries up to the present is given. Finally, we look to the future to see what directions radiation chemistry studies might take, based on promising current research. We find at least two directions that will need radiation chemistry expertise to ensure proper experimental design and interpretation of data: FLASH radiotherapy, and the mechanisms underlying the effects of low doses of radiation delivered at low dose rates. Examples of how radiation chemists could provide beneficial input to these studies are provided.
The concept of radiation-induced clustered damage in DNA has grown over the past several decades to become a topic of considerable interest across the scientific disciplines involved in studies of the biological effects of ionizing radiation. This paper, prepared for the 70th anniversary issue of Radiation Research, traces the historical development of the three main threads (physics, chemistry, and biochemical/cellular responses) that led to the hypothesis and demonstration that a key component of the biological effectiveness of ionizing radiation is its characteristic of producing clustered DNA damage of varying complexity. The physics thread has roots reaching back to the 1920s; it grew to identify critical nanometre-scale clusterings of ionizations relevant to biological effectiveness and, by the turn of the century, had produced an extensive array of quantitative predictions on the complexity of clustered DNA damage from different radiations. Monte Carlo track structure simulation techniques played a key role throughout these developments, and they are now incorporated into many recent and ongoing studies modelling the effects of radiation. The chemistry thread was seeded by water-radiolysis descriptions of events in water as radical-containing “spurs,” the demonstration of the important role of the hydroxyl radical in the radiation inactivation of cells, and the difficulty of protection by radical scavengers. This led to the concept and description of locally multiply damaged sites (LMDS): DNA double-strand breaks and other combinations of DNA base damage and strand breakage that could arise from a spur overlapping, or created in very close proximity to, the DNA. In these ways, the physics and chemistry threads, largely in parallel, put out the challenge to the experimental research community to verify these predictions of clustered DNA damage from ionizing radiations and to investigate their relevance to DNA repair and subsequent cellular effects. The third thread, biochemical and cell-based research, responded strongly to the challenge by demonstrating the existence and biological importance of clustered DNA damage. Investigations have included repair of a wide variety of defined constructs of clustered damage, evaluation of mutagenic consequences, identification of clustered base damage within irradiated cells, and identification of co-localized repair complexes indicative of complex clustered damage after high-LET irradiation, as well as extensive studies of the repair pathways involved in the repair of simple double-strand breaks. A great deal more remains to be learned, however, because of the diversity of clustered DNA damage and of the biological responses to it.
Preparation for medical responses to major radiation accidents, further driven by increases in the threat of nuclear warfare, has led to a pressing need to understand the underlying mechanisms of radiation injury (RI) alone or in combination with other trauma (combined injury, CI). The identification of these mechanisms suggests molecules and signaling pathways that can be targeted to develop radiation medical countermeasures. Thus far, the United States Food and Drug Administration (U.S. FDA) has approved seven countermeasures to mitigate hematopoietic acute radiation syndrome (H-ARS), but no drugs are available for prophylaxis and no agents have been approved to combat the other sub-syndromes of ARS, let alone delayed effects of acute radiation exposure or the effects of combined injury. From its inception, Radiation Research has significantly contributed to the understanding of the underlying mechanisms of radiation injury and combined injury, and to the development of radiation medical countermeasures for these indications through the publication of peer-reviewed research and review articles.
Mark P. Little, Dimitry Bazyka, Amy Berrington de Gonzalez, Alina V. Brenner, Vadim V. Chumak, Harry M. Cullings, Robert D. Daniels, Benjamin French, Eric Grant, Nobuyuki Hamada, Michael Hauptmann, Gerald M. Kendall, Dominique Laurier, Choonsik Lee, Won Jin Lee, Martha S. Linet, Kiyohiko Mabuchi, Lindsay M. Morton, Colin R. Muirhead, Dale L. Preston, Preetha Rajaraman, David B. Richardson, Ritsu Sakata, Jonathan M. Samet, Steven L. Simon, Hiromi Sugiyama, Richard Wakeford, Lydia B. Zablotska
In this article we review the history of key epidemiological studies of populations exposed to ionizing radiation. We highlight historical and recent findings regarding radiation-associated risks for incidence and mortality of cancer and non-cancer outcomes, with emphasis on study design and methods of exposure assessment and dose estimation, along with brief consideration of sources of bias for a few of the more important studies. We examine findings from epidemiological studies of the Japanese atomic bomb survivors, of persons exposed to radiation for diagnostic or therapeutic purposes, of those exposed to environmental sources including Chornobyl and other reactor accidents, and of occupationally exposed cohorts. We also summarize the results of pooled studies. These summaries are necessarily brief, but we provide references to more detailed information. We discuss possible future directions of study, including assessment of susceptible populations and of possible new populations, data sources, study designs and methods of analysis.