
Editorial: Five grand challenges in toxicology

Frontiers in Toxicology, 2025
Bengt Fadeel, Jan Alexander, Sara C. Antunes, Kim Dalhoff, Ellen Fritsche, Antonio Pietroiusti, Helena T. Högberg, François Huaux, Stina Oredsson, Terje Svingen, Martin F. Wilks

Summary

This editorial identifies five grand challenges facing toxicology, including how to assess risks from mixtures of chemicals, engineered nanomaterials, and emerging pollutants. It calls for a shift toward predictive toxicological science that moves beyond studying one substance at a time.


Introduction

The well-known dictum “the dose makes the poison” is familiar to every student of toxicology, though it is important to add that there is no risk of adverse effects without exposure. This essential point is sometimes overlooked when new and emerging risks such as engineered nanomaterials are being discussed (1). The question of the dose of a poisonous substance or toxicant relates to the question of toxicology as a scientific discipline – is toxicology the science of poisons, or has toxicology evolved into a “science of safety”? (2). Furthermore, are we ready to shed the predominantly observational science of the past and usher in a new predictive toxicological science of the future? Fifteen years ago, Thomas Hartung lamented that toxicological studies search for rare events using imperfect models – usually addressing one substance at a time (3). He then listed the scientific and strategic developments required to implement a new regulatory toxicology, including the need for standardization and validation of new approaches, as well as the systematic integration of these approaches into testing strategies (3). The question is what should serve as the benchmark, given that conventional validation processes often rely on animal tests that may lack adequate validation themselves (4). Nevertheless, it is evident from these discussions that toxicology is both a basic and an applied science. The objectives are thus two-fold: to uncover mechanisms of adverse effects of substances on biological systems while also leveraging this knowledge to protect human health and the environment. To facilitate this translation of knowledge into practice, the flow of information should be bidirectional.
Indeed, to promote the field, a dialogue between researchers and regulators is required to identify the kind of knowledge that is needed.

Grand challenges

The authors of this editorial are editors of the different sections of Frontiers in Toxicology, a journal that was launched 5 years ago (in 2019). Frontiers in Toxicology is a multidisciplinary journal committed to presenting the latest research on the adverse effects of substances, particles, and mixtures on living organisms and ecosystems, from molecular mechanisms to population effects. Frontiers in Toxicology received its first impact factor in 2024. However, we believe that the true impact of the journal lies in providing a forum for the exchange of knowledge in the toxicological community. Here, we address five grand challenges, with the aim of stimulating further discussion.

Challenge 1: investigation – towards a mechanistic toxicology

Toxicology is sometimes viewed as a descriptive science tasked with cataloguing the effects of substances or drug candidates. However, toxicology as a basic science devoted to the understanding of mechanisms of toxicity covers the spectrum of adverse effects of chemicals, which in turn may unearth fundamental biological mechanisms. Thus, toxicology is not an afterthought. For instance, mechanistic toxicological studies have delivered critical insights in biology and toxicology, including an improved understanding of the role of the gut microbiome in xenobiotic metabolism (5), as well as the role of xenobiotic receptors or environmental sensors such as the aryl hydrocarbon receptor (AhR) in health and disease (6). Furthermore, so-called investigative (or mechanistic) toxicology is gaining momentum in the pharmaceutical sciences (7).
Some of the “game-changing” technologies in investigative toxicology, according to a recent survey conducted among medium-sized to large pharmaceutical companies, are organ-on-a-chip microphysiological systems, genomic profiling, and high-content imaging-based assays, while the perception of other technologies, including the use of stem cells, had barely changed when compared to a previous survey (7). It should be emphasized, however, that the ultimate goal of toxicological investigations differs between the pharmaceutical and public health disciplines. In the first case, the goal is to minimize the adverse effects of the drug so that the balance between beneficial and adverse effects is in favor of its use (which implies that some degree of toxicity is considered acceptable), whereas in the second case, society does not tolerate any toxicity. Indeed, the challenge in occupational medicine is to identify the highest level of exposure at which no adverse effect is elicited. Nevertheless, important lessons can be learned from the pharmaceutical sciences. Computational approaches are being leveraged in drug discovery (8), and physics-driven (quantum mechanics) models as well as data-driven (artificial intelligence) approaches have been gaining a foothold in toxicology too (9).

Challenge 2: integration – a problem of multiscale comparisons

Biological pathways are usually defined as a series of interactions among molecules that lead to a certain product or a change in a cell (though it is noted that there are also extracellular biological pathways). Pathway “thinking”, i.e., the view that the adverse effects of substances may be best understood in terms of their disruption of biological (or molecular) pathways, has come to the fore in contemporary toxicology, and it is explicit in the development and use of adverse outcome pathways (discussed below).
Indeed, it has been suggested in a recent perspective that the basic tenet of toxicology be recontextualized as “the dose disrupts the pathway” in order to better guide and understand the results of mechanistic toxicological investigations (10). The challenge is thus to design experiments that allow for the investigation of pathways, as opposed to reductionist studies of one gene or one protein at a time, while also making sure that the model systems are relevant for our understanding of adverse effects in humans. In addition, simultaneous exposure to several chemicals is common, and it is not currently clear how the overall level of toxicity is affected in this situation (for instance, it remains to be understood whether exposure to a mixture in which the individual components are present at concentrations below their no-effect levels may induce toxic effects as a result of so-called “dose addition”, wherein each component acts on the same target). Considerable advances have been made in recent years with respect to organ-on-a-chip models, including a living, breathing microphysiological lung mimic (11). Using this bio-inspired device, the authors demonstrated that cyclic mechanical strain accentuated inflammatory responses of the lung to silica nanoparticles. The model is promising, though it is noted that a human adenocarcinoma cell line, NCI-H441, was used to stand in for lung epithelial cells. The use of stem cells (see below) instead of cell lines could potentially bridge the gap between in vitro models and the human situation.
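The “dose addition” scenario mentioned above can be made concrete with a toy calculation. This is only a sketch with entirely hypothetical numbers, not data from any cited study: under the concentration-addition assumption, the "toxic units" of the components (each component's concentration divided by its individual effect concentration) are summed, so a mixture can reach an effect level even though every component is below its own threshold.

```python
# Toy illustration of "dose addition" (concentration addition) for a
# mixture of chemicals assumed to act on the same target.
# All concentrations and EC50 values below are hypothetical.

def toxic_units(concentrations, ec50s):
    """Sum of toxic units: each component's concentration divided by its EC50.

    Under concentration addition, a sum of 1 or more predicts a combined
    effect at or above the 50%-effect level, even when every component is
    below its own EC50.
    """
    return sum(c / e for c, e in zip(concentrations, ec50s))

# Three components, each at one third of its own (hypothetical) EC50,
# i.e., each individually below its effect level:
conc = [1.0, 2.0, 5.0]    # exposure concentrations (arbitrary units)
ec50 = [3.0, 6.0, 15.0]   # assumed EC50 of each component

tu = toxic_units(conc, ec50)
print(f"Sum of toxic units: {tu:.2f}")
```

With these assumed numbers, each component sits at one third of its own EC50, yet the summed toxic units reach 1.0, the level associated with a 50% combined effect under concentration addition.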
Moreover, while microphysiological systems are amenable to high-throughput screening (11), the additional cost and complexity could hamper their use.

This brings us to one of the major challenges in toxicological testing, namely, how to bridge the divide between in vitro models and the in vivo situation (which is not to say that human in vitro models need to mimic animal models; instead, they should be predictive of relevant biological outcomes in humans, which is, after all, the main goal). Furthermore, the impacts of substances play out across multiple spatial and temporal scales, yet we tend to test substances using static model systems which may or may not replicate the biological pathways that come into play in a living organism. Computational approaches can be used to address this problem (12). The challenge is to integrate a multiscale approach wherein biological effects occur at multiple levels of biological organization with input from toxicokinetic and toxicodynamic modeling to better understand what happens when a substance or toxicant is introduced into the body.

The devil is in the details. In toxicology, human cell culture techniques have been adopted for a variety of purposes, including the reduction or replacement of animal experiments. However, animal-derived products are often used, meaning that these methods are not completely animal-free. For instance, fetal bovine serum (FBS), which is used to supplement cell culture medium, not only gives rise to animal welfare concerns, but also raises questions related to batch-to-batch variation and the ill-defined nature of FBS, not to mention that species differences may call into question the relevance of the results obtained using standard cell culture medium.
However, there is a push towards fully humanized cell culture protocols using chemically defined media (13).

Challenge 3: computation – beware of drowning in a sea of data

The term “omics” in toxicology refers to technologies such as genomics, transcriptomics, proteomics, metabolomics, and epigenomics, coupled with conventional toxicological assays, to investigate the underlying molecular mechanisms of action of toxicants. Sydney Brenner famously complained that molecular biology had become, in his words, “low input, high throughput, no output science”, and it is true that omics-based approaches are sometimes equated with hypothesis-free research, but this is a misguided view, as is the suggestion that lists of genes or proteins or metabolites can, by themselves, shed light on biological mechanisms. The methodologies listed above are merely tools, albeit very sophisticated ones, and computational deconvolution is needed to make sense of the sea of data. Hence, omics-based approaches will not necessarily reveal a (novel) toxicological mechanism – the results also need to be anchored in a firm understanding of the biology in question. On the other hand, omics-based approaches yield a global view of the perturbations triggered by a substance or mixture of substances and could thus overcome reductionist tendencies in toxicology. It is important to note that the portfolio of omics techniques is rapidly expanding: single-cell transcriptomics and proteomics have greatly increased the granularity of toxicological investigations, while the recent introduction of spatially resolved transcriptomics heralds a new era in the analysis of gene expression at the tissue level (14).
The torrential increase in the amount of data generated requires new computational approaches (15), and a new breed of toxicologists well versed in such methods.

Predictive toxicology can be viewed, fundamentally, as the prediction of biological activity from chemical structure; this is also known as structure-activity relationship modeling (16). To this end, the data have to be useful, i.e., of good quality, as well as usable (meaning that other investigators must also be able to access the data). However, predictive toxicology may also refer to the prediction of outcomes in complex (living) systems from observations in simple ones (17). The challenge is to ensure that in vitro models are predictive of in vivo outcomes, and that such models can be used to address the impact of substances (toxicants) quantitatively as well as qualitatively. If these challenges are met, this could significantly overhaul human health risk assessment (17). The advent of artificial intelligence (AI) has further expanded the toolbox of toxicologists. Thus, machine learning approaches could promote predictive toxicology and could perhaps be used to support risk assessment and decision making (18). Machine learning gives computers the ability to learn without being explicitly programmed. However, “learning” may be too anthropomorphic a term; instead, AI algorithms enable pattern recognition, which could guide (human) toxicologists and risk assessors. Understanding the reliability of AI-based tools in toxicology is, indeed, a challenge. Machine learning also paves the way from a deterministic towards a probabilistic risk assessment (see below).
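The shift from deterministic towards probabilistic reasoning can be illustrated with a minimal Bayes'-theorem update. This is a sketch only, with hypothetical numbers for the prior hazard rate and for the sensitivity and specificity of a screening assay:

```python
# A minimal Bayes'-theorem update: the probability that a substance is
# hazardous given a positive result in a screening assay.
# The prior, sensitivity, and specificity below are hypothetical.

def posterior_hazard(prior, sensitivity, specificity):
    """P(hazard | positive assay) via Bayes' theorem."""
    # Total probability of a positive result: true positives + false positives
    p_positive = sensitivity * prior + (1 - specificity) * (1 - prior)
    return sensitivity * prior / p_positive

prior = 0.10   # assumed base rate of hazardous substances in the set screened
sens = 0.90    # assumed P(positive assay | hazardous)
spec = 0.80    # assumed P(negative assay | not hazardous)

print(f"P(hazard | positive) = {posterior_hazard(prior, sens, spec):.2f}")
```

With these assumed numbers, a positive screen raises the probability of hazard from 10% to about 33% – informative, but far from certainty, which is exactly the kind of graded output a probabilistic assessment provides in place of a binary verdict.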
Probabilistic risk assessment and Bayesian analysis (named after the English statistician Thomas Bayes, whose posthumous paper “An Essay Towards Solving a Problem in the Doctrine of Chances” forms the basis of what is now called Bayes’ theorem) are methodologies that incorporate uncertainties and variability, providing estimates of the range and likelihood of a hazard or risk in the assessment (19).

Challenge 4: regulation – rise of new approach methodologies

Historically, hazard assessment has been based mainly on animal data, which do not necessarily provide a mechanistic understanding of the observed effects. However, considerable attempts have been made in regulatory toxicology to transition from in vivo (animal) studies to mechanism-based, human-relevant in vitro studies (4). The US Environmental Protection Agency (EPA) initiated the ToxCast, or “toxicity forecaster”, program to develop decision support systems based on in vitro screening results to aid in the prioritization of environmental chemicals for further investigation (20). Indeed, conventional cell cultures serve as a useful screening tool. However, to better mimic the complexity of a living organism, multicellular organotypic models are being developed (21). Here, advances in stem cell biology, especially the development of human induced pluripotent stem cell (iPSC) technology, have provided a significant boost. Developmental neurotoxicology, with its battery of stem cell- and primary cell-based NAMs, can be viewed as a test pilot for NAMs for assessing complex endpoints (21). The challenge is to implement these new technologies in a regulatory setting. Making predictions regarding human health risks based on cell models is non-trivial. However, adverse outcome pathways (AOPs) provide a conceptual framework for the organization of toxicological knowledge, from molecular-level perturbations of a biological system to adverse outcomes at levels of biological organization of regulatory relevance (22).
Toxicogenomics data can be used to enrich AOPs (23). It is important, however, to distinguish between an adverse response and an adaptive response, as the distinction between the two is critical to regulatory decision making (24).

Overall, testing approaches that are reliable and relevant for human biology are needed, and so-called new approach methodologies (NAMs) are being developed with this in mind. This challenge is addressed, for instance, in the Partnership for the Assessment of Risks from Chemicals (PARC), a multinational project aligned with the European Union's Chemicals Strategy for Sustainability and the European Green Deal (www.eu-parc.eu). Similarly, the NIH Common Fund’s Complement Animal Research in Experimentation (Complement-ARIE) program aims to speed up the development, standardization, validation, and use of NAMs (www.commonfund.nih.gov). NAMs are defined as any technology or methodology that can provide information on chemical hazard and risk assessment without the use of animals. However, NAMs are not necessarily newly developed methods; instead, it is their implementation in regulatory decision making that may be considered new (25). Thus, regulatory acceptance is key, and a dialogue with regulatory agencies is needed. Ultimately, the integration of NAMs into chemical risk assessment practices will depend not only on demonstrating their scientific merits, but also on addressing the perspectives of risk assessors and the public on how to assess risk and the associated uncertainties (26).

The translation of the results of in vitro or animal studies to humans is further complicated by factors that may not be systematically considered in such studies but that may strongly affect the final biological effect. For instance, pre-existing diseases may cause increased susceptibility to the effects of chemicals. Co-exposures and cumulative exposures also need to be considered, as well as the sex and life stage (age) of the exposed individual (27).
Moreover, certain minority groups may display an unpredictable response to chemicals. Indeed, sex-divergent immune responses were observed in a study of individuals undergoing gender-affirming hormone treatment (28). Traditional risk assessment has used default uncertainty factors to address inter-individual and inter-species differences, but new probabilistic methods and Bayesian approaches promise to support risk assessment that better protects all populations at risk (29).

Ecological risk assessment is a stepwise evaluation approach wherein assessment parameters at the different stages rely on sensitive toxicological assays. The major challenge has been the identification and validation of biomarkers capable of reporting issues before they manifest in effects at the individual or population level, thus allowing decision makers to act and implement measures to safeguard ecosystems. In this context, concerted efforts are being made to identify the gaps in the current mammalian-centric high-throughput screening assay landscape with respect to ecologically relevant endpoints (30). Human health and environmental risk assessments are often performed in isolation, and a key challenge is therefore to implement an integrated risk assessment in which the different aspects of risk assessment are consolidated into a “one health” paradigm (23). This is also emphasized in the safe-and-sustainable-by-design (SSbD) and circular economy vision, which guides the innovation process towards creating chemicals and materials that are more sustainable and safer for humans and the environment (31).

Challenge 5: education – fostering the future toxicologists

Paracelsus, the Father of Toxicology and one of the first physicians to recognize the role of chemistry in medicine, believed that only those who practiced an art knew it: “the patients are your textbook”.
He sought the kind of knowledge that cannot be achieved solely through scholastic disputations, and therefore embarked on a series of extensive travels around Europe, including spending a year as a physician and university professor in Basel, where he taught medicine based on his own observations rather than on the classical authorities. This reminds us that toxicology thrives on new knowledge and new approaches, and that toxicologists must be prepared to challenge established views. The fostering of critical thinking should therefore be first and foremost in the education of those who are the toxicologists of the future. It is also important to convey to students that toxicology is a scientific discipline in its own right. Toxicology draws on knowledge from a range of disciplines, including biology, chemistry, and medicine, and toxicology education is accordingly interdisciplinary, providing toxicologists with a broad foundation. However, education also requires time for reflection. Furthermore, contemporary toxicology relies on advanced tools and technologies, and an appreciation of chemistry, high-throughput biology, and computational and data science is required to enable the next generation of toxicologists to meet the challenges of the field. Toxicology is, after all, a science in which the aim is to identify hazards before they cause harm. The predictive toxicology paradigm refers to the use of models and methods, including computational ones, to complement or replace descriptive approaches, and it also implies a shift from the conventional science of the past towards a science that considers the totality of exposures over the life course of an individual: the exposome. The exposome has been described by some as a grand challenge in its own right; it may also be viewed as a bridge between exposure science and toxicological research. Indeed, a focus on the totality of environmental exposures could support efforts to unearth adverse effects at the individual or population level. Another important branch of toxicology is the assessment of the effects of substances on ecosystems and the animals and plants that they comprise, and such ecotoxicological work can provide an early warning of more complex environmental problems. Furthermore, rising levels of pollution in the environment, together with climate change, make it essential to understand and address issues that extend beyond human health and to implement protective measures. Indeed, one of the notable developments of the past decades has been the evolution of toxicology from a predominantly observational science towards a predominantly predictive science that takes toxicity pathways into account. Computational approaches are central to this new predictive toxicology, as discussed above.
However, it is important to heed the lessons of the past as we turn to the future. In this editorial, we have outlined five grand challenges in an effort to stimulate discussion on contemporary toxicology, and we hope that this will further establish Frontiers in Toxicology as a forum for the community of toxicologists. The first specialty section launched in the journal was devoted to engineered nanomaterials, while subsequent sections span topics ranging from conventional and novel detection techniques to the use of new approach methodologies (NAMs) in regulatory toxicology and the impacts of substances on human health and the environment. Frontiers in Toxicology currently draws its editorial board from a large number of countries, and we welcome editorial board members and authors from all regions. We remain committed to the advancement of toxicology, to fundamental scientific discoveries, and to their translation into practice to promote human health.
