Papers (2016-Present) by Andrew P Worth
Briefings in Bioinformatics, 2024
The assessment of the allergenic potential of chemicals, crucial for ensuring public health safety, faces challenges in accuracy and raises ethical concerns due to reliance on animal testing. This paper presents a novel bioinformatic protocol designed to address the critical challenge of predicting immune responses to chemical sensitizers without the use of animal testing. The core innovation lies in the integration of advanced bioinformatics tools, including the Universal Immune System Simulator (UISS), which models detailed immune system dynamics. By leveraging data from structural predictions and docking simulations, our approach provides a more accurate and ethical method for chemical safety evaluations, especially in distinguishing between skin and respiratory sensitizers. Our approach integrates a comprehensive eight-step process, beginning with the meticulous collection of chemical and protein data from databases like PubChem and the Protein Data Bank. Following data acquisition, structural predictions are performed using cutting-edge tools such as AlphaFold to model proteins whose structures have not been previously elucidated. This structural information is then utilized in subsequent docking simulations, leveraging both ligand–protein and protein–protein interactions to predict how chemical compounds may trigger immune responses. The core novelty of our method lies in the application of UISS, an advanced agent-based modelling system that simulates detailed immune system dynamics. By inputting the results from earlier stages, including docking scores and potential epitope identifications, UISS meticulously forecasts the type and severity of immune responses, distinguishing between Th1-mediated skin and Th2-mediated respiratory allergic reactions. This ability to predict distinct immune pathways is a crucial advance over current methods, which often cannot differentiate between the sensitization mechanisms. To validate the accuracy and robustness of our approach, we applied the protocol to well-known sensitizers: 2,4-dinitrochlorobenzene for skin allergies and trimellitic anhydride for respiratory allergies. The results clearly demonstrate the protocol’s ability to differentiate between these distinct immune responses, underscoring its potential for replacing traditional animal-based testing methods. The results not only support the potential of our method to replace animal testing in chemical safety assessments but also highlight its role in enhancing the understanding of chemical-induced immune reactions. Through this innovative integration of computational biology and immunological modelling, our protocol offers a transformative approach to toxicological evaluations, increasing the reliability of safety assessments.
Toxicological Sciences, 2024
The European regulatory framework on chemicals is at a crossroads. There are calls for the framework to be more effective, by better protecting people and the environment. There is also room for it to be more efficient and cost-effective, by harmonizing assessment practices across sectors and avoiding the need for unnecessary testing. At the same time, there is a political commitment to phase out animal testing in chemical safety assessments. In this commentary, we argue that these needs are not at odds with each other. On the contrary, the European Commission's roadmap to phase out animal testing could also be the transition pathway to a more efficient, effective, and sustainable regulatory ecosystem. Central to our proposal is a framework based on biological reasoning in which biological questions can be answered by a choice of methods, with non-animal methods progressively becoming the only choice. Within this framework, a tiered approach to testing and assessment allows for greater efficiency and effectiveness, while also introducing considerations of proportionality and cost-effectiveness. Testing strategies, and their component methods, should be developed in tandem and judged in terms of their outcomes, and the protection levels they inform, rather than their ability to predict the outputs of animal tests.
Regulatory Toxicology and Pharmacology, 2023
The body of EU chemicals legislation has evolved since the 1960s, producing the largest knowledge base on chemicals worldwide. Like any evolving system, however, it has become increasingly diverse and complex, resulting in inefficiencies and potential inconsistencies. In the light of the EU Chemicals Strategy for Sustainability, it is therefore timely and reasonable to consider how aspects of the system could be simplified and streamlined, without losing the hard-earned benefits to human health and the environment. In this commentary, we propose a conceptual framework that could be the basis of Chemicals 2.0 – a future safety assessment and management approach that is based on the application of New Approach Methodologies (NAMs), mechanistic reasoning and cost-benefit considerations. Chemicals 2.0 is designed to be a more efficient and more effective approach for assessing chemicals, and to comply with the EU goal to completely replace animal testing, in line with Directive 2010/63/EU. We propose five design criteria for Chemicals 2.0 to define what the future system should achieve. The approach is centered on a classification matrix in which NAMs for toxicodynamics and toxicokinetics are used to classify chemicals according to their level of concern. An important principle is the need to ensure an equivalent, or higher, protection level.
Toxics, 2023
A physiologically-based pharmacokinetic (PBPK) model represents the structural components of the body with physiologically relevant compartments connected via blood flow rates described by mathematical equations to determine drug disposition. PBPK models are used in the pharmaceutical sector for drug development and precision medicine, and in the chemical industry to predict safe levels of exposure during the registration of chemical substances. However, one area of application where PBPK models have been scarcely used is forensic science. In this review, we give an overview of PBPK models successfully developed for several illicit drugs and environmental chemicals that could be applied for forensic interpretation, highlighting the gaps, uncertainties, and limitations.
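To make the compartmental idea concrete, the following is a minimal sketch (not a model from the review) of a flow-limited PBPK structure with two tissue compartments connected to blood; all physiological and chemical-specific values (flows, volumes, partition coefficients, clearance) are illustrative placeholders.

```python
import numpy as np

def simulate_pbpk(dose_mg, hours=24.0, dt=0.001):
    # Illustrative physiology and chemical-specific inputs (placeholders)
    Q_liver, Q_rest = 90.0, 260.0                 # blood flows (L/h)
    V_blood, V_liver, V_rest = 5.0, 1.8, 63.0     # compartment volumes (L)
    P_liver, P_rest = 2.0, 1.5                    # tissue:blood partition coefficients
    CL_int = 30.0                                 # hepatic clearance (L/h)

    A_blood, A_liver, A_rest = dose_mg, 0.0, 0.0  # amounts (mg), i.v. bolus to blood
    times = np.arange(0.0, hours, dt)
    c_blood = np.empty_like(times)

    for i in range(times.size):
        C_b, C_l, C_r = A_blood / V_blood, A_liver / V_liver, A_rest / V_rest
        flux_liver = Q_liver * (C_b - C_l / P_liver)            # flow-limited exchange
        flux_rest = Q_rest * (C_b - C_r / P_rest)
        A_blood += -(flux_liver + flux_rest) * dt               # mass balance in blood
        A_liver += (flux_liver - CL_int * C_l / P_liver) * dt   # exchange plus metabolism
        A_rest += flux_rest * dt
        c_blood[i] = C_b
    return times, c_blood

t, cb = simulate_pbpk(dose_mg=100.0)
print(f"Cmax = {cb.max():.2f} mg/L, C at 24 h = {cb[-1]:.4f} mg/L")
```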
Alternatives to Laboratory Animals, 2022
Prediction of chemical toxicity is very useful in risk assessment. With the current paradigm shift towards the use of in vitro and in silico systems, we present herein a theoretical mathematical description of a quasi-diffusion process to predict chemical concentrations in 3-D spheroid cell cultures. By extending a 2-D Virtual Cell Based Assay (VCBA) model into a 3-D spheroid cell model, we assume that cells are arranged in a series of concentric layers within the sphere. We formulate the chemical quasi-diffusion process by simplifying the spheroid with respect to the number of cells in each layer. The system was calibrated and tested with acetaminophen (APAP). Simulated predictions of APAP toxicity were compared with empirical data from in vitro measurements using a 3-D spheroid model. The results of this first attempt to extend the VCBA model are promising: they show that the VCBA model simulates a close correlation between compound concentration and the viability of the HepaRG 3-D cell culture. The 3-D VCBA model provides a complement to current in vitro procedures to refine experimental setups, to fill data gaps and to help in the interpretation of in vitro data for the purposes of risk assessment.
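As a purely illustrative companion to the abstract, the sketch below shows one way to discretise a quasi-diffusion process over concentric spheroid layers: the chemical exchanges between adjacent layers at a rate proportional to the concentration difference. The layer count, rate constant and the assumption of a constant medium concentration are placeholders, not the parameterisation used in the 3-D VCBA.

```python
import numpy as np

def spheroid_quasi_diffusion(c_medium=100.0, n_layers=5, k=0.5, hours=48.0, dt=0.01):
    """Concentration in each concentric layer (index 0 = outermost) over time."""
    c = np.zeros(n_layers)
    for _ in range(int(hours / dt)):
        new_c = c.copy()
        new_c[0] += k * (c_medium - c[0]) * dt           # uptake from the culture medium
        for i in range(n_layers - 1):                    # exchange between adjacent layers
            flux = k * (c[i] - c[i + 1]) * dt
            new_c[i] -= flux
            new_c[i + 1] += flux
        c = new_c
    return c

print(spheroid_quasi_diffusion())   # concentration gradient from outer layer to core after 48 h
```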
Computational Toxicology, 2022
In this editorial we reflect on the past decade of developments in predictive toxicology, and in particular on the evolution of the Adverse Outcome Pathway (AOP) paradigm. Starting out as a concept, AOPs have become the focal point of a community of scientists, regulators and decision-makers. AOPs provide the mechanistic knowledge underpinning the development of Integrated Approaches to Testing and Assessment (IATA), including computational models now referred to as quantitative AOPs (qAOPs). With reference to recent and related works on qAOPs, we take a brief historical perspective and ask what the next stage is in modernising chemical toxicology beyond animal testing.
Computational Toxicology, 2022
The adverse outcome pathway (AOP) is a conceptual construct that facilitates organisation and interpretation of mechanistic data representing multiple biological levels and deriving from a range of methodological approaches including in silico, in vitro and in vivo assays. AOPs are playing an increasingly important role in the chemical safety assessment paradigm, and quantification of AOPs is an important step towards a more reliable prediction of chemically induced adverse effects. Modelling methodologies require the identification, extraction and use of reliable data and information to support the inclusion of quantitative considerations in AOP development. An extensive and growing range of digital resources are available to support the modelling of quantitative AOPs, providing a wide range of information, but also requiring guidance for their practical application. A framework for qAOP development is proposed based on feedback from a group of experts and three qAOP case studies. The proposed framework provides a harmonised approach for both regulators and scientists working in this area.
Frontiers in Toxicology, 2022
Editorial on the Research Topic: Advances and Refinements in the Development and Application of Threshold of Toxicological Concern. The Threshold of Toxicological Concern (TTC) is an exposure threshold below which there is no appreciable risk to human health. There are two main approaches: TTC values based on cancer potency data, from which a one in a million excess lifetime risk is estimated, and TTC values based on non-cancer effects. For the latter approach, a distribution is typically fitted to the No Observed Adverse Effect Levels (NOAELs) from repeat dose toxicity studies, from which a 5th percentile value is taken and adjusted using an uncertainty factor (usually 100). Established TTC values are those based on oral chronic studies, first developed by Munro et al. (1996), who subcategorised chemicals into one of three Cramer structural classes (Cramer et al., 1978). Kroes et al. (2004) presented a tiered TTC approach that established several human exposure thresholds spanning four orders of magnitude, ranging from the lowest TTC for substances presenting structural alerts for genotoxicity (0.15 μg/d), through the next tier for organophosphates/carbamates (18 μg/d), to the remaining higher TTC values representing the same three Cramer classes originally derived by Munro et al. (1996). The World Health Organization (WHO) and the European Food Safety Authority (EFSA) (European Food Safety Authority and World Health Organization, 2016; EFSA et al., 2019) have determined that the TTC approach is a sound and fit-for-purpose risk assessment tool, with a number of caveats, in cases where chemical-specific repeat dose toxicity data are not available.
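A hedged sketch of the tiered decision logic described above is given below. The 0.15 and 18 μg/day thresholds are taken from the editorial; the Cramer class values (1800, 540 and 90 μg/person/day) are the commonly cited Munro-derived figures and should be verified against the original sources before any real use.

```python
# Tiered TTC screen (illustrative). Thresholds in ug/person/day.
CRAMER_TTC = {"I": 1800.0, "II": 540.0, "III": 90.0}   # commonly cited Munro-derived values

def ttc_screen(exposure_ug_day, genotoxic_alert=False, op_or_carbamate=False, cramer_class="III"):
    if genotoxic_alert:
        threshold = 0.15          # structural alerts for genotoxicity
    elif op_or_carbamate:
        threshold = 18.0          # organophosphates / carbamates
    else:
        threshold = CRAMER_TTC[cramer_class]
    verdict = "below TTC (low concern)" if exposure_ug_day < threshold else "above TTC (assess further)"
    return threshold, verdict

print(ttc_screen(5.0, cramer_class="III"))   # -> (90.0, 'below TTC (low concern)')
```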
Computational and Structural Biotechnology Journal, 2022
Immunotoxicity hazard identification of chemicals aims to evaluate the potential for unintended effects of chemical exposure on the immune system. Perfluorinated alkylate substances (PFAS), such as perfluorooctane sulfonate (PFOS) and perfluorooctanoic acid (PFOA), are persistent, globally disseminated environmental contaminants known to be immunotoxic. Elevated PFAS exposure is associated with lower antibody responses to vaccinations in children and in adults. In addition, some studies have reported a correlation between PFAS levels in the body and lower resistance to disease, in other words an increased risk of infections or cancers. In this context, modelling and simulation platforms could be used to simulate the human immune system with the aim of evaluating the adverse effects that immunotoxicants may have. Here, we show the conditions under which a mathematical model developed for one purpose and application (e.g., in the pharmaceutical domain) can be successfully translated and transferred to another (e.g., in the chemicals domain) without undergoing significant adaptation. In particular, we demonstrate that the Universal Immune System Simulator was able to simulate the effects of PFAS on the immune system, introducing entities and new interactions that are biologically involved in the phenomenon. This also revealed a potentially exploitable pathway for assessing immunotoxicity through a computational model.
Computational and Structural Biotechnology Journal, 2022
In many domains regulating chemicals and chemical products, there is a legal requirement to determine skin sensitivity to allergens. While many in vitro assays to detect contact hypersensitivity have been developed as alternatives to animal testing over the past ten years, and significant progress has been made in this area, there is still a need for continued investment in the creation of techniques and strategies that will allow accurate identification of potential contact allergens and their potency in vitro. In silico models are promising tools in this regard. However, none of the state-of-the-art systems seems to function well enough to serve as a stand-alone hazard identification tool, especially in evaluating the possible allergenicity effects in humans. The Universal Immune System Simulator, a mechanistic computational platform that simulates the human immune system response to a specific insult, provides a means of predicting the immunotoxicity induced by skin sensitisers, enriching the collection of computational models for the assessment of skin sensitization. Here, we present a specific disease layer implementation of the Universal Immune System Simulator for the prediction of allergic contact dermatitis induced by specific skin sensitizers.
Regulatory Toxicology and Pharmacology, 2022
Structure-activity relationships (SARs) in toxicology have enabled the formation of structural rules which, when coded as structural alerts, are essential tools in in silico toxicology. Whilst other in silico methods have approaches for their evaluation, there is no formal process to assess the confidence that may be associated with a structural alert. This investigation proposes twelve criteria to assess the uncertainty associated with structural alerts, allowing for an assessment of confidence. The criteria are based around the stated purpose, description of the chemistry, toxicology and mechanism, performance and coverage, as well as corroborating and supporting evidence of the alert. Alerts can be given a confidence assessment and score, enabling the identification of areas where more information may be beneficial. The scheme to evaluate structural alerts was placed in the context of various use cases for industrial and regulatory applications. The analysis of alerts, and consideration of the evaluation scheme, identifies the different characteristics an alert may have, such as being highly specific or generic. These characteristics may determine when an alert can be used for specific uses such as identification of analogues for read-across or hazard identification.
Regulatory Toxicology and Pharmacology, 2022
New Approach Methodologies (NAMs) are considered to include any in vitro, in silico or chemistry-based method, as well as the strategies to implement them, that may provide information that could inform chemical safety assessment. Current chemical legislation in the European Union is limited in its acceptance of the widespread use of NAMs. The European Partnership for Alternative Approaches to Animal Testing (EPAA) therefore convened a 'Deep Dive Workshop' to explore the use of NAMs in chemical safety assessment, the aim of which was to support regulatory decisions, whilst intending to protect human health. The workshop recognised that NAMs are currently used in many industrial sectors, with some considered as fit for regulatory purpose. Moreover, the workshop identified key discussion points that can be addressed to increase the use and regulatory acceptance of NAMs. These are based on the changes needed in frameworks for regulatory requirements and the essential needs in education, training and greater stakeholder engagement, as well as the gaps in the scientific basis of NAMs.
Computational Toxicology, 2022
In a century where toxicology and chemical risk assessment are embracing alternative methods to animal testing, there is an opportunity to understand the causal factors of neurodevelopmental disorders such as learning and memory disabilities in children, as a foundation to predict adverse effects. New testing paradigms, along with the advances in probabilistic modelling, can help with the formulation of mechanistically-driven hypotheses on how exposure to environmental chemicals could potentially lead to developmental neurotoxicity (DNT). This investigation aimed to develop a Bayesian hierarchical model of a simplified AOP network for DNT. The model predicted the probability that a compound induces each of three selected common key events (CKEs) of the simplified AOP network and the adverse outcome (AO) of DNT, taking into account correlations and causal relations informed by the key event relationships (KERs). A dataset of 88 compounds representing pharmaceuticals, industrial chemicals and pesticides was compiled, including physicochemical properties as well as in silico and in vitro information. The Bayesian model was able to predict DNT potential with an accuracy of 76%, classifying the compounds into low, medium or high probability classes. The modelling workflow achieved three further goals: it dealt with missing values; accommodated unbalanced and correlated data; and followed the structure of a directed acyclic graph (DAG) to simulate the simplified AOP network. Overall, the model demonstrated the utility of Bayesian hierarchical modelling for the development of quantitative AOP (qAOP) models and for informing the use of new approach methodologies (NAMs) in chemical risk assessment.
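The sketch below is not the published model; it only illustrates the general idea of propagating probabilities through a small DAG of key events to an adverse outcome, here with a noisy-OR combination and the low/medium/high binning mentioned in the abstract. The KER weights and input probabilities are invented.

```python
def noisy_or(parent_probs, weights):
    """P(child) assuming independent parent causes (noisy-OR combination)."""
    p_not = 1.0
    for p, w in zip(parent_probs, weights):
        p_not *= 1.0 - w * p
    return 1.0 - p_not

def predict_dnt(p_cke1, p_cke2, p_cke3):
    # Placeholder KER strengths linking the three CKEs to the adverse outcome
    p_ao = noisy_or([p_cke1, p_cke2, p_cke3], weights=[0.6, 0.5, 0.8])
    label = "low" if p_ao < 0.33 else "medium" if p_ao < 0.66 else "high"
    return round(p_ao, 2), label

# e.g. in vitro/in silico evidence mapped to CKE activation probabilities
print(predict_dnt(0.9, 0.4, 0.7))
```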
Computational Toxicology, 2021
Physiologically Based Kinetic (PBK) models are valuable tools to help define safe external levels of chemicals based on internal doses at target organs in experimental animals, humans and organisms used in environmental risk assessment. As the toxicity testing paradigm shifts to alternative testing approaches, PBK model development has started to rely (mostly or entirely) on model parameters quantified using in vitro or in silico methods. Recently, the Organisation for Economic Cooperation and Development (OECD) published a guidance document (GD) describing a scientific workflow for characterising and validating PBK models developed using in vitro and in silico data. The GD provides an assessment framework for evaluating these models, with emphasis on the major uncertainties underlying model inputs and outputs. To help end-users submit or evaluate a PBK model for regulatory purposes, the GD also includes a template for documenting the model characteristics, and a checklist for evaluating the quality of a model. This commentary highlights the principles, criteria and tools laid out in the OECD PBK model GD, with the aim of facilitating the dialogue between model developers and risk assessors.
Computational Toxicology, 2021
New approaches in toxicology based on in vitro methods and computational modelling offer considerable potential to improve the efficiency and effectiveness of chemical hazard and risk assessment in a variety of regulatory contexts. However, this presents challenges both for developers and regulatory assessors because often these two communities do not share the same level of confidence in a new approach. To address this challenge, various assessment frameworks have been developed over the past 20 years with the aim of creating harmonised and systematic approaches for evaluating new methods. These frameworks typically focus on specific methodologies and technologies, which has proven useful for establishing the validity and credibility of individual methods. However, given the increasing need to compare methods and combine their use in integrated assessment strategies, the multiplicity of frameworks is arguably becoming a barrier to their acceptance. In this commentary, we explore the concepts of model validity and credibility, and we illustrate how a set of seven credibility factors provides a method-agnostic means of comparing different kinds of predictive toxicology approaches. It is hoped that this will facilitate communication and cross-disciplinarity among method developers and users, with the ultimate aim of increasing the acceptance and use of predictive approaches in toxicology.
Computational Toxicology, 2021
With current progress in science, there is growing interest in developing and applying Physiologically Based Kinetic (PBK) models in chemical risk assessment, as knowledge of internal exposure to chemicals is critical to understanding potential effects in vivo. In particular, a new generation of PBK models is being developed in which the model parameters are derived from in silico and in vitro methods. To increase the acceptance and use of these "Next Generation PBK models", there is a need to demonstrate their validity. However, this is challenging in the case of data-poor chemicals that are lacking in kinetic data and for which predictive capacity cannot, therefore, be assessed. The aim of this work is to lay down the fundamental steps in using a read-across framework to inform modellers and risk assessors on how to develop, or evaluate, PBK models for chemicals without in vivo kinetic data. The application of a PBK model that takes into account the absorption, distribution, metabolism and excretion characteristics of the chemical reduces the uncertainties in the biokinetics and biotransformation of the chemical of interest. A strategic flow-charting application, proposed herein, allows users to identify the minimum information to perform a read-across from a data-rich chemical to its data-poor analogue(s). The workflow analysis is illustrated by means of a real case study using the alkenylbenzene class of chemicals, showing the reliability and potential of this approach. It was demonstrated that a consistent quantitative relationship between model simulations could be achieved using models for estragole and safrole (source chemicals) when applied to methyleugenol (target chemical). When the PBK model code for the source chemicals was adapted to utilise input values relevant to the target chemical, simulation was consistent between the models. The resulting PBK model for methyleugenol was further evaluated by comparing the results to an existing, published model for methyleugenol, providing further evidence that the approach was successful. This can be considered as a "read-across" approach, enabling a valid PBK model to be derived to aid the assessment of a data-poor chemical.
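The snippet below sketches the parameter-substitution idea behind the read-across: the same model code is run first with inputs for a data-rich source chemical and then with inputs relevant to the data-poor target. For brevity it uses a generic one-compartment oral model rather than the alkenylbenzene PBK models, and all parameter values are placeholders.

```python
import math

def oral_one_compartment(dose_mg_per_kg, p, hours=24.0, dt=0.01):
    """Cmax and approximate AUC from first-order absorption/elimination (ka 1/h, V L/kg, CL L/h/kg)."""
    ke = p["CL"] / p["V"]
    amount = dose_mg_per_kg * p["f_abs"]
    conc = lambda t: amount * p["ka"] / (p["V"] * (p["ka"] - ke)) * (math.exp(-ke * t) - math.exp(-p["ka"] * t))
    cs = [conc(i * dt) for i in range(int(hours / dt))]
    return max(cs), sum(c * dt for c in cs)

source = {"f_abs": 0.9, "ka": 1.2, "V": 2.0, "CL": 1.5}   # data-rich source chemical (placeholder values)
target = dict(source, V=2.4, CL=0.9)                      # target-relevant inputs substituted

print("source Cmax/AUC:", oral_one_compartment(10.0, source))
print("target Cmax/AUC:", oral_one_compartment(10.0, target))
```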
Archives of Toxicology, 2021
In view of the need to enhance the assessment of consumer products called for in the EU Chemicals Strategy for Sustainability, we developed a methodology for evaluating hazard by combining information across different systemic toxicity endpoints and integrating the information with new approach methodologies. This integrates mechanistic information with a view to avoiding redundant in vivo studies, minimising reliance on apical endpoint tests and ultimately devising efficient testing strategies. Here, we present the application of our methodology to carcinogenicity assessment, mapping the available information from toxicity test methods across endpoints to the key characteristics of carcinogens. Test methods are deconstructed to allow the information they provide to be organised in a systematic way, enabling the description of the toxicity mechanisms leading to the adverse outcome. This integrated approach provides a flexible and resource-efficient means of fully exploiting test methods for which test guidelines are available to fulfil regulatory requirements for systemic toxicity assessment, as well as identifying where new methods can be integrated.
Archives of Toxicology, 2021
The EU Directive 2010/63/EU on the protection of animals used for scientific purposes and other EU regulations, such as REACH and the Cosmetic Products Regulation, advocate for a change in the way toxicity testing is conducted. Whilst the Cosmetic Products Regulation bans animal testing altogether, REACH aims for a progressive shift from in vivo testing towards quantitative in vitro and computational approaches. Several endpoints can already be addressed using non-animal approaches, including skin corrosion and irritation, serious eye damage and irritation, skin sensitisation, and mutagenicity and genotoxicity. However, for systemic effects such as acute toxicity, repeated dose toxicity and reproductive and developmental toxicity, evaluation of chemicals under REACH still heavily relies on animal tests. Here we summarise current EU regulatory requirements for the human health assessment of chemicals under REACH and the Cosmetic Products Regulation, considering the more critical endpoints and identifying the main challenges in introducing alternative methods into regulatory testing practice. This supports a recent initiative taken by the International Cooperation on Alternative Test Methods (ICATM) to summarise current regulatory requirements specific for the assessment of chemicals and cosmetic products for several human health-related endpoints, with the aim of comparing different jurisdictions and coordinating the promotion and ultimately the implementation of non-animal approaches worldwide. Recent initiatives undertaken at European level to promote the 3Rs and the use of alternative methods in current regulatory practice are also discussed.
Computational Toxicology, 2021
The COSMOS Database (DB) was originally established to provide reliable data for cosmetics-related chemicals within the COSMOS Project, funded as part of the SEURAT-1 Research Initiative. The database has subsequently been maintained and developed further into COSMOS Next Generation (NG), a combination of database and in silico tools, essential components of a knowledge base. COSMOS DB provided a cosmetics inventory as well as other regulatory inventories, accompanied by assessment results and in vitro and in vivo toxicity data. In addition to data content curation, much effort was dedicated to data governance – data authorisation, characterisation of quality, documentation of meta information, and control of data use. Through this effort, COSMOS DB was able to merge and fuse data of various types from different sources. Building on the previous effort, the COSMOS Minimum Inclusion (MINIS) criteria for a toxicity database were further expanded to quantify the reliability of studies. COSMOS NG features multiple fingerprints for analysing structure similarity, and new tools to calculate molecular properties and screen chemicals with endpoint-related public profilers, such as DNA and protein binders, liver alerts and genotoxic alerts. The publicly available COSMOS NG enables users to compile information and execute analyses such as category formation and read-across. This paper provides a step-by-step guided workflow for a simple read-across case, starting from a target structure and culminating in an estimation of a NOAEL confidence interval. Given its strong technical foundation, inclusion of quality-reviewed data, and provision of tools designed to facilitate communication between users, COSMOS NG is a first step towards building a toxicological knowledge hub leveraging many public data systems for chemical safety evaluation. We continue to monitor the feedback from the user community at [email protected].
Although considerable progress has been made, and is ongoing, towards overcoming these shortcomings, there are still gaps in the knowledge of nanomaterial behavior, the scientific results remain fragmented, and more harmonization and standardization are needed. A lack of public access to the results and tools is preventing uptake and use of the models in regulatory decision-making. More guidance is needed and, in particular, more coordination is required to direct model development towards elucidating relevant knowledge gaps and addressing questions and endpoints relevant for regulatory applications.
Overall, a number of recommendations are made with a view to increasing the availability and uptake of computational methods for regulatory nanosafety assessment. They concern inherent scientific uncertainties in the understanding of nanomaterial behavior; data quality and variability; standardization and harmonization; the availability of data and predictive models; and the accessibility and visibility of models, including the need for infrastructures.
Two other consultations conducted under this Fitness Check, one focused on stakeholder organisations and the other on small and medium-sized enterprises, will be reported separately. Responses will provide an essential input to the Fitness Check analysis carried out by the JRC. A more detailed analysis of the responses from the three consultations will be published in a synopsis report at the end of the process.
In this report we propose that in vitro studies could contribute to the identification of potential triggers for DNT evaluation, since existing cellular models permit the evaluation of a chemical's impact on key neurodevelopmental processes, mimicking different windows of human brain development, especially if human models derived from induced pluripotent stem cells are applied. Furthermore, the battery of currently available DNT alternative test methods, anchored to critical neurodevelopmental processes and key events identified in DNT Adverse Outcome Pathways (AOPs), could be applied to generate in vitro data useful for various regulatory purposes. Incorporation of in vitro mechanistic information would increase scientific confidence in decision making, by decreasing uncertainty and leading to refinement of chemical grouping according to biological activity. In this report, the development of AOP-informed IATA (Integrated Approaches to Testing and Assessment) based on key neurodevelopmental processes is proposed as a tool not only for speeding up chemical screening, but also for providing mechanistic data in support of hazard assessment and the evaluation of chemical mixtures. Such mechanistically informed IATA for DNT evaluation could be developed by integrating various sources of information (e.g., non-testing methods, in vitro approaches, as well as in vivo animal and human data), contributing to screening for prioritization, hazard identification and characterization, and possibly safety assessment of chemicals, speeding up the evaluation of the thousands of compounds present in industrial, agricultural and consumer products that lack safety data on DNT potential. It is planned that the data and knowledge generated from such testing will be fed into the development of an OECD guidance document on alternative approaches to DNT testing.
For 178 chemicals previously tested in vitro with the 3T3 BALB/c cell line using the Neutral Red Uptake cytotoxicity assay, physicochemical parameters were retrieved and curated. Of these chemicals, 83 were run in the VCBA to simulate a 96-well microplate set-up with 5% serum supplementation, and their no effect concentration (NEC) and killing rate (Kr) were optimised against the experimental data. Analysis of the partitioning results shows a strong relation with lipophilicity, expressed here as the logarithm of the octanol/water partitioning coefficient, with highly lipophilic chemicals binding mostly to medium lipid. Among the chemicals analysed, only benzene and xylene were modelled to evaporate by more than 10%, and these were also the chemicals with the highest degradation rates during the 48-hour assay. Chemical degradation is dependent not only on the air and water degradation rates but also on the extent of binding of the chemical.
Due to the strong binding of some chemicals to medium lipids and proteins, we analysed the impact of different serum supplementations (0%, 5% and 10%) on the dissolved chemical concentrations. As expected, for the more lipophilic chemicals, different serum levels result in different dissolved concentrations, with lipid and protein binding reducing chemical loss by evaporation. Still, the lack of saturation modelling might mislead results for the 0% supplementation, since the lipids coming solely from cell exudates are able to sequester chemical to a large extent; e.g. after 48 hours, 63% (1.2E-5 M) of dimethyldioctadecylammonium chloride was bound to lipid from the cells. Although highly lipophilic chemicals have a very small bioavailable fraction, the cellular uptake rate is also dependent on logKow, which compensates for this lack of bioavailability to some extent.
Based on the relevance of lipophilicity to in vitro chemical bioavailability, we have developed an alert system based on logKow, creating four classes of chemicals for the experimental condition with 10% serum supplementation: logKow 5-10 (A), logKow <5 (B), logKow <2.5 (C), and logKow <2 (D). New chemicals from Classes A and B, which will in future be tested in vitro, were first run in the VCBA without considering toxicity (NEC and Kr set to 0). VCBA simulations indicated that these chemicals are more than 50% bound to medium proteins, lipids and plastic. Therefore, for chemicals with logKow falling in these classes, special care should be taken when extrapolating the obtained in vitro toxic concentrations to in vivo relevant doses.
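A minimal sketch of the alert classification is given below. As listed, the class boundaries overlap (<5, <2.5, <2), so the sketch interprets them as nested cut-offs and assigns the most restrictive class that applies; chemicals falling in Classes A and B would then be flagged for careful nominal-to-dissolved extrapolation.

```python
def logkow_alert_class(logkow):
    """Assign a VCBA-style logKow alert class (10% serum condition), most restrictive first."""
    if logkow < 2.0:
        return "D"
    if logkow < 2.5:
        return "C"
    if logkow < 5.0:
        return "B"
    if logkow <= 10.0:
        return "A"
    return "outside scheme (logKow > 10)"

for lk in (1.5, 2.3, 4.0, 7.8):
    print(lk, "->", logkow_alert_class(lk))
```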
A comparison of the VCBA-predicted dissolved concentrations corresponding to nominal IC50 values with the available rat oral LD50 values did not improve the previously obtained correlations. This is probably because other in vivo kinetic processes play an important role but were not modelled; they should be considered in future in vitro-in vivo extrapolations.
A local sensitivity analysis showed the relatively low impact of Molar Volume and Molecular Diffusion Volume on the final dissolved concentration, supporting the use of approximated values obtained through the QSARs created herein. As expected, logKow and the Henry's Law Constant showed a high impact on partitioning. The killing rate was also shown to have a relatively low impact on the final chemical concentration, indicating that although its optimisation is important, finding the Kr that leads to the absolute best correlation between experimental and predicted concentration-viability curves is not imperative. The VCBA can be applied to virtually any chemical as long as the physicochemical data (for the fate model) and the experimental toxicity data (that include cell growth/death) are available. However, being such a generic model, several assumptions had to be made: i) no distinction of chemical classes (inorganic, polar organic chemicals), ii) no consideration of metabolism, iii) saturation kinetics and iv) external in vitro conditions.
The advantage of having a generic model is that the VCBA can fit several experimental set-ups and can be used in an exploratory manner, to help refine experimental conditions. The partitioning results obtained here with the VCBA should be double-checked experimentally with a set of chemical compounds, to better understand to what extent the VCBA represents chemicals of different properties.
In future developments, it would be important to reduce the uncertainties of the model, such as binding saturation, and to consider including other endpoints such as metabolic activity.
The validity of the two-year bioassay has nonetheless been questioned in the last decade. Uncertainty is associated with the extrapolation of data from rodents to humans. Furthermore, these studies are extremely time- and resource-consuming, and the high animal burden has raised ethical concerns. For all these reasons, there is a strong demand for alternative strategies and methods. The development of new in vitro methods for carcinogenicity testing, however, has progressed slowly and those available are far from being accepted for regulatory decision making, especially when evaluating the carcinogenicity of non-genotoxic chemicals or specific classes of compounds such as biologicals, microorganisms and nanomaterials.
The European Union Reference Laboratory for alternatives to animal testing (EURL ECVAM) has carried out an analysis of carcinogenicity testing requirements and assessment approaches across different sectors. This consisted of: a systematic review of the different testing requirements; a review of the number of animals used per sector; an estimation of the number of carcinogenicity and genotoxicity studies conducted or waived in respect of the number of substances authorized per sector per year; a review of the type of justifications for waiving the two-year bioassay and how opportunities for waiving are being used.
Results from this analysis will provide context for initiatives aimed at: 1) reducing the need for animal use where animal testing is still a requirement; 2) ensuring an adequate carcinogenicity assessment in sectors where animal use is banned or limited; and 3) improving the process of cancer hazard identification where existing methods are not suitable.
For the purposes of this work, data on acute and chronic aquatic toxicity (Daphnia and fish) from several databases (US EPA Ecotox database, Aquatic ECETOC, Aquatic OASIS, Aquatic Japan MoE databases and the ECHA database, as implemented in the OECD QSAR Toolbox Version 2.3) were collated and analysed. Simple linear relationships and interspecies sensitivity ratios were calculated using either acute Daphnia data (48h LC50) or chronic Daphnia data (14 days NOEC) and chronic fish data (>21 days NOEC). Acute-to-chronic relationships and acute-to-chronic ratios (ACR) were also calculated based on acute fish data (96h LC50) and chronic fish data. These analyses were carried out on the whole set of chemicals and on subgroups of chemicals classified according to the Verhaar mode of action (MOA) scheme, which attributes a general mode of acute aquatic toxic action based on the chemical structure of the molecule. Outliers were identified by applying the Robust regression and Outlier removal (ROUT) method.
Our results show that the best-fitted relationships for predicting chronic fish toxicity are obtained from acute fish data (r2 = 0.87) and acute Daphnia data (r2 = 0.64) when dealing with the whole set of chemicals regardless of the MOA. The quality of the relationships was improved by using the geometric mean (calculated across all the values extracted for a given chemical and a given endpoint) instead of the lowest value for a given endpoint.
When considering the MOA, MOA 3 and MOA 1 chemicals give the strongest acute Daphnia to chronic fish and chronic Daphnia to chronic fish relationships; however, the relationships obtained with acute Daphnia data (r2 = 0.83 and 0.69 for MOA 3 and MOA 1, respectively) are better than those obtained with chronic Daphnia data (r2 = 0.66 and 0.65 for MOA 1 and 3, respectively). When considering acute fish data, all the MOA classes give strong relationships (r2 = 0.88 for MOA 3 and MOA 5 chemicals, 0.85 for MOA 4 chemicals and 0.83 for MOA 1 and MOA 2 chemicals). Therefore, when acute toxicity data on fish are available, they may provide a reliable basis for extrapolating chronic fish toxicity, as a first-tier assessment or within a weight-of-evidence approach.
There is a correlation between chemicals with high ACR values or interspecies sensitivity ratios and the outliers identified in the above-mentioned relationships. Among chemicals with a high interspecies sensitivity ratio for which Daphnia is more sensitive than fish, several aniline derivatives and pesticides acting through cholinesterase inhibition were identified. Among chemicals with a high interspecies sensitivity ratio for which Daphnia is less sensitive than fish, we found pesticides and known endocrine disruptors such as ethinyl oestradiol and 17β-oestradiol. Extreme (i.e. <1 or >100) interspecies sensitivity ratios were mainly evident for MOA 2, 4 and 5 chemicals. Regarding the ACR for fish, around 50% of the chemicals in each MOA class have an ACR within a factor of 10, whereas 100% of MOA 3, 90.9% of MOA 2, 88.3% of MOA 4 and 85.5% of MOA 1 chemicals have an ACR within a factor of 100. Therefore, the safety factor of 100 commonly applied in environmental risk assessment does not seem to be equally protective for every MOA.
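The sketch below illustrates the two ratio types discussed above, with endpoint values aggregated per chemical as geometric means (the aggregation the report found preferable to taking the lowest value). The exact ratio definitions used in the report may differ, and the toy values are not from the underlying databases.

```python
import math

def geomean(values):
    return math.exp(sum(math.log(v) for v in values) / len(values))

def acute_to_chronic_ratio(acute_lc50s, chronic_noecs):
    """Fish ACR: geometric-mean acute 96h LC50 over geometric-mean chronic NOEC."""
    return geomean(acute_lc50s) / geomean(chronic_noecs)

def interspecies_ratio(daphnia_values, fish_values):
    """One plausible interspecies sensitivity ratio: Daphnia value over fish value."""
    return geomean(daphnia_values) / geomean(fish_values)

# toy endpoint values in mg/L
print("ACR:", round(acute_to_chronic_ratio([12.0, 9.5, 15.0], [0.8, 1.1]), 1))
print("Interspecies ratio:", round(interspecies_ratio([4.2, 5.0], [0.6, 0.9]), 1))
```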
Twenty-one case studies were identified, which included human and environmental risk assessments. Several compound classes and environmental media were covered, i.e. pesticides, phthalates, parabens, PBDEs, pharmaceuticals, food contact materials, dioxin-like compounds, anti-androgenic chemicals, contaminants in breast milk, mixtures of contaminants in surface water, ground water and drinking water, and indoor air. However, the selection of chemical classes is not necessarily representative, as many compound groups have not been covered. The selection of these chemical classes is often based on data availability, recent concerns about certain chemical classes, or legislative requirements. Several of the case studies revealed a concern due to combined exposure for certain chemical classes, especially when considering specific vulnerable population groups. This is very relevant information, but it needs to be interpreted with caution, considering the underlying assumptions, model parameters and related uncertainties. Several parameters that could lead to an over- or underestimation of risks were identified. However, there is clear evidence that chemicals need to be addressed not only in single-substance risk assessments, and that mixtures should also be considered across chemical classes and legislative sectors.
Furthermore, several issues hampering mixture risk assessments were identified. In order to perform a mixture risk assessment, the composition of the mixture in terms of chemical components and their concentrations needs to be known, and relevant information on their uptake and toxicity is required. Exposure data are often lacking and need to be estimated based on production and use/consumption information. Relevant toxicity data are also not always available. Toxicity data gaps can be filled, e.g. using the Threshold of Toxicological Concern approach. Reference values used in single-substance risk assessments can be found for several chemical classes; however, they are usually derived based on the lowest endpoint. If a refined toxicity assessment of a mixture for a specific effect/cumulative assessment group is envisaged, this is often hampered by a lack of specific toxicity and mode of action information.
In all case studies, concentration addition-based assessments were made, mainly applying the Hazard Index. To further characterise the drivers of the mixture risk, the maximum cumulative ratio was calculated in several case studies. This showed that the scientific methodologies to address mixtures are largely agreed upon and lead to reasonable predictions. However, especially for some groups of compounds that are designed as active substances, it cannot be excluded that interactions occur, and these should therefore be addressed on a case-by-case basis.
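For readers unfamiliar with the metrics, the sketch below shows how the Hazard Index (HI, the sum of exposure/reference-value hazard quotients) and the Maximum Cumulative Ratio (MCR, the HI divided by the largest single hazard quotient) are computed under concentration addition; all exposure and reference values are invented for illustration.

```python
def mixture_metrics(exposures, reference_values):
    """Hazard quotients, Hazard Index and Maximum Cumulative Ratio under concentration addition."""
    hqs = [e / r for e, r in zip(exposures, reference_values)]
    hi = sum(hqs)
    mcr = hi / max(hqs)
    return hqs, hi, mcr

exposures = [0.02, 0.005, 0.012]    # mg/kg bw/day (illustrative)
references = [0.05, 0.01, 0.10]     # e.g. ADIs (illustrative)

hqs, hi, mcr = mixture_metrics(exposures, references)
print("HQs:", [round(q, 2) for q in hqs])
print("HI :", round(hi, 2), "(potential concern if > 1)")
print("MCR:", round(mcr, 2), "(values near 1 mean one chemical drives the risk)")
```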
Most of the mixtures addressed in the identified case studies examined specific chemical groups. Only a few of them looked at mixtures comprising chemicals regulated under different legislative frameworks. The examples indicated that there is evidence for combined exposure to chemicals regulated under different legislation, as well as evidence that such chemicals can elicit similar effects or have a similar mode of action. A mixture risk assessment across regulatory sectors should therefore be further investigated.
The results of this preliminary analysis show that the Munro dataset is broadly representative of the chemical space of cosmetics, although certain structural classes are missing, notably organometallics, silicon-containing compounds, and certain types of surfactants (non-ionic and cationic classes). Furthermore, compared with the Cosmetics Inventory, the Munro dataset has a higher prevalence of reactive chemicals and a lower prevalence of larger, long linear chain structures. The COSMOS TTC dataset, comprising repeat dose toxicity data for cosmetics ingredients, shows a good representation of the Cosmetics Inventory in terms of physicochemical property ranges, structural features and chemical use categories. Thus, this dataset is considered to be suitable for investigating the applicability of the TTC approach to cosmetics. The results of the toxicity data analysis revealed a number of cosmetic ingredients in Cramer Class I with No Observed Effect Level (NOEL) values lower than the Munro threshold of 3000 µg/kg bw/day. The prevalence of these "false negatives" was less than 5%, which is the percentage expected by chance, resulting from the use of the 5th percentile of the cumulative probability distribution of NOELs in the derivation of TTC values. Furthermore, the majority of these false negatives do not arise when structural alerts for DNA-binding are used to identify potential genotoxicants, to which a lower TTC value of 0.0025 µg/kg bw/day is typically applied. Based on these preliminary results, it is concluded that the current TTC approach is broadly applicable to cosmetics, although a number of improvements can be made, through quality control of the underlying TTC datasets, modest revisions/extensions of the Cramer classification scheme, and the development of explicit guidance on how to apply the TTC approach.
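To make the statistical origin of the 5% false-negative expectation explicit, the sketch below derives a Munro-type threshold as the 5th percentile of a NOEL distribution divided by a 100-fold uncertainty factor; the NOEL values are randomly generated placeholders, not the Munro or COSMOS data.

```python
import numpy as np

rng = np.random.default_rng(0)
# placeholder NOELs for a structural class, mg/kg bw/day (log-normally distributed)
noels = rng.lognormal(mean=np.log(300.0), sigma=1.5, size=500)

p5_noel_mg = np.percentile(noels, 5)      # 5th percentile NOEL of the class
ttc_ug = p5_noel_mg * 1000.0 / 100.0      # mg -> ug, then 100-fold uncertainty factor

print(f"5th percentile NOEL: {p5_noel_mg:.0f} mg/kg bw/day")
print(f"Derived TTC value:   {ttc_ug:.0f} ug/kg bw/day")
# by construction, about 5% of class members have NOELs below the 5th percentile
```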
Objectives: The aim of the suggested strategy for chemical read-across is to show how a traditional read-across based on structural similarities between source and target substance can be strengthened with additional evidence from new approach data (for example, information from in vitro molecular screening, "-omics" assays and computational models) to reach regulatory acceptance.
Methods: We identified four read-across scenarios that cover typical human health assessment situations. For each such decision context, we suggested several chemical groups as examples to demonstrate when read-across between group members is possible, considering both chemical and biological similarities.
Conclusions: We agreed to carry out the complete read-across exercise for at least one chemical category per read-across scenario in the context of SEURAT-1; the exercise will be completed and the results presented by the end of the research initiative in December 2015.
New approach methodologies encompassing a range of innovative technologies, including in vitro methods employing 3D tissues and cells, organ-on-chip technologies, computational models (including machine learning and artificial intelligence), and 'omics (transcriptomics and metabolomics), are developed, evaluated, and integrated into assessment frameworks in order to enhance the efficiency and effectiveness of hazard and risk assessment of chemicals and products across various regulatory contexts. Furthermore, substantial efforts are directed at promoting the development and utilisation of non-animal approaches in fundamental and applied research, where the majority of animal testing occurs, as well as for educational purposes. The achievements and accomplishments documented in this report are the culmination of collaborative efforts with EURL ECVAM's dedicated partners and stakeholders.
The principle of the Three Rs, i.e. Replacement, Reduction and Refinement of animal use in basic, applied and translational research, as well as for regulatory purposes, is firmly anchored in EU legislation, with full replacement of animal testing being the ultimate goal.
New Approach Methodologies, including a variety of innovative technologies such as in vitro methods using 3D tissues and cells, organ-on-chip, computational models (including AI) and 'omics (genomics, proteomics, metabolomics) technologies, are developed, evaluated and integrated in assessment frameworks with a view to improving the efficiency and effectiveness of chemical and product hazard and risk assessment in a variety of regulatory contexts. Important activities to promote the development and use of non-animal approaches are also pursued in the areas of basic and applied research, where most of the animals are used, as well as for education purposes.
EU policies and legislation call for innovative and more efficient ways of safety testing and chemical risk assessment that do not depend on animal testing. Advanced technologies such as computational models, in vitro methods and organ-on-chip devices are being developed, evaluated and integrated to translate mechanistic understanding of toxicity into safety testing strategies. The ultimate goal is to achieve better protection of human health and the environment while supporting EU innovation and industrial competitiveness, without the use of animals.
The development and use of non-animal models and methods are also essential for advancing basic, applied and translational research. Education also plays an essential role in enabling a shift to non-animal methods through the introduction of the Three Rs (Replacement, Reduction and Refinement of animal use in science) into secondary school curricula and programmes of higher education.
Replacement, reduction and refinement of animal testing (the Three Rs) in basic, applied and translational research, as well as for regulatory purposes, is anchored in EU legislation. Innovative technologies (e.g., computational models, in vitro methods, organ-on-chip devices) are developed, evaluated and integrated in assessment strategies for chemical and product safety testing in order to preserve a high level of protection for human health and the environment. Important activities to promote the development and use of alternative methods and approaches are also pursued in the areas of basic and applied research, as well as for education purposes.