Abstract New approach methodologies (NAMs) are an integral part of Next Generation Risk Assessment (NGRA). NAMs comprise in vitro, in silico and in-chemistry test methods. These can be linked to external exposure concentrations by physiologically based kinetic (PBK) modelling and in vitro-in vivo extrapolation (IVIVE); NAM-based test methods therefore require translation from external exposure to internal exposure and vice versa. Additionally, aggregated external human exposure via the inhalation and dermal routes is often based on measurements or estimated with tools commonly applied under REACH. PBK modelling allows chemical and metabolite concentrations to be predicted over time in relevant body compartments (human plasma and/or tissues) for given exposure scenarios (forward dosimetry). By inverting the approach (reverse dosimetry), acceptable external exposure levels, such as occupational exposure limits (OELs), can be derived from internal thresholds (in vitro effect data, compound-specific threshold values, human biomonitoring data). In this way, maximal external exposure concentrations below which human exposure is considered safe can be derived. The presentation describes the approach for deriving acceptable OELs from internal threshold values and discusses a case study to illustrate its applicability. The approach is particularly beneficial for compounds such as NMP, where aggregated exposure is determined by inhalation and by dermal uptake from air.
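The forward/reverse dosimetry logic described above can be sketched with a hypothetical one-compartment, steady-state model; the function names and all parameter values below are illustrative assumptions, not taken from the abstract or specific to NMP.

```python
def forward_dosimetry(external_dose_mg_kg_d, absorbed_fraction, clearance_l_h_kg):
    """Steady-state internal (plasma) concentration for a given external dose.

    One-compartment approximation: Css = dose_rate * F / CL.
    Units: dose in mg/kg/day, clearance in L/h/kg -> Css in mg/L.
    """
    dose_rate_mg_kg_h = external_dose_mg_kg_d / 24.0
    return dose_rate_mg_kg_h * absorbed_fraction / clearance_l_h_kg


def reverse_dosimetry(internal_threshold_mg_l, absorbed_fraction, clearance_l_h_kg):
    """Acceptable external dose (mg/kg/day) for a given internal threshold.

    Inverts the forward relationship: dose_rate = Css * CL / F.
    """
    dose_rate_mg_kg_h = internal_threshold_mg_l * clearance_l_h_kg / absorbed_fraction
    return dose_rate_mg_kg_h * 24.0


# Illustrative numbers only:
css = forward_dosimetry(external_dose_mg_kg_d=10.0,
                        absorbed_fraction=0.5, clearance_l_h_kg=0.2)
acceptable = reverse_dosimetry(internal_threshold_mg_l=css,
                               absorbed_fraction=0.5, clearance_l_h_kg=0.2)
```

Applying reverse dosimetry to the concentration produced by forward dosimetry recovers the original external dose, which is the self-consistency a real PBK implementation would also need to satisfy; actual PBK models add multiple compartments, route-specific absorption and time-resolved kinetics.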
Abstract This project evaluated the applicability of existing alternative data, such as chemical, biological and metabolite similarity, to improve the selection of relevant source compounds (SCs). This information was modularly integrated into read-across (RAX) case studies addressing systemic toxicity after repeated exposure or developmental toxicity. For this purpose, data-rich reference classes of pesticides were defined, with propiconazole and iodosulfuron methyl sodium as target compounds (TCs). For the TC propiconazole, the combination of chemical and biological similarity identified mostly relevant SCs from the reference class compounds. Biological similarity was calculated using binary hit calls from the ToxCast dataset and is highly dependent on data density; low data density was used as a measure of uncertainty. In the case of the TC iodosulfuron methyl sodium, ToxCast data confirmed overall low activity. The second case study started with biological similarity calculated from the ToxCast dataset. This approach resulted in an overwhelming number of candidate SCs, indicating that the biological hit-call data are relatively unspecific, as many compounds activate the same assays. The integration of shared metabolites, coupled with chemical and/or biological similarity, can efficiently restrict the selection of SCs to the most relevant compounds. In the absence of observed in vivo data, metabolites can be predicted using available tools, which generated comparable results. Based on apical findings from in vivo legacy studies, compound classes could not be discerned, primarily because hepatotoxicity was observed in about 60% of all repeated-dose oral exposure studies. Overall, a RAX assessment framework that modularly integrates existing information on metabolites and biological properties to identify SCs is recommended. The case studies presented suggest increased confidence in SC identification when metabolite similarity is used.
This suggestion complements the workflow proposed by EU-ToxRisk, which focuses on targeted testing and assessment of SCs upon their identification.
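Biological similarity from binary assay hit calls, as used in the case studies above, is commonly computed as a Jaccard (Tanimoto) coefficient over the assays measured for both compounds; the sketch below is a generic illustration with made-up hit-call vectors, not the project's actual pipeline.

```python
def jaccard_similarity(hits_a, hits_b):
    """Jaccard similarity of two binary hit-call vectors.

    Only assays measured for both compounds should be compared;
    here we assume the vectors are already aligned on shared assays.
    """
    both = sum(1 for a, b in zip(hits_a, hits_b) if a == 1 and b == 1)
    either = sum(1 for a, b in zip(hits_a, hits_b) if a == 1 or b == 1)
    return both / either if either else 0.0  # 0.0 if neither compound is active


# Made-up hit calls over six shared assays (1 = active, 0 = inactive):
target = [1, 0, 1, 1, 0, 0]
source = [1, 0, 1, 0, 0, 1]
score = jaccard_similarity(target, source)  # 2 shared actives, 4 actives in either
```

The number of shared assays itself matters: with few overlapping measurements the score is unstable, which is one way the low-data-density uncertainty noted above can be made explicit.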
Abstract In the European Union, all chemical substances that are manufactured, imported or used in tonnages equal to or higher than one tonne per year must be registered under the REACH regulation. This regulatory framework requires the collection and dissemination of detailed information on the properties, uses, and potential health and environmental impacts of chemicals. However, some of these chemicals may pose emerging risks in the food chain, necessitating the development of approaches to identify and assess such risks. The SCREENER project addressed this critical issue by analysing food items for the presence of chemicals that may pose unrecognised hazards. Building on the findings of the previous REACH 1 and REACH 2 projects, which identified a suspect list of 212 chemicals, qualitative multi-residue high-resolution mass spectrometry (HRMS) methods were developed. Diverse food items such as wheat flour, kale, carrots, potatoes, peas, strawberries, oranges, meat, hen's eggs, and trout were pooled from 3 samples each, while cow's milk and other fish types (salmon and herring) were pooled from 4 samples each, to form single samples. This pooling procedure resulted in 194 analytical samples, which were screened for the presence of the chemicals in the suspect list. Additionally, non-target analysis (NTA) was performed on the same samples, focusing on detecting halogenated compounds. The aim was to detect substances potentially introduced into the food chain unintentionally through industrial and anthropogenic activities, which were previously unrecognised in the food chain. In the final stage of the project, 15 chemicals were further prioritised for identification and quantitative analysis. Quantitative methods were developed and then applied to the same 194 samples, allowing for the identification and quantification of those chemicals. Subsequently, hazard characterisation, exposure assessment, and risk characterisation were conducted.
A preliminary characterisation of the potential risk posed by the chemicals found in the samples during quantitative confirmatory analysis indicated no risk to human health in all but three cases; for those three, additional analysis of occurrence and a detailed evaluation of the hazard can be considered as possible follow-up actions.
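Suspect screening by HRMS, as performed above, typically works by matching measured accurate masses against the theoretical masses in the suspect list within a parts-per-million tolerance. The sketch below illustrates that matching step with a made-up two-entry suspect list; the names, m/z values and the 5 ppm tolerance are illustrative assumptions, not the project's actual list or criteria.

```python
def ppm_error(measured_mz, theoretical_mz):
    """Signed mass error in parts per million."""
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6


def suspect_matches(measured_mz, suspects, tol_ppm=5.0):
    """Return suspect-list entries whose theoretical m/z matches within tol_ppm."""
    return [name for name, mz in suspects.items()
            if abs(ppm_error(measured_mz, mz)) <= tol_ppm]


# Made-up suspect list (name -> theoretical [M+H]+ m/z):
suspects = {"chemical_A": 230.0943, "chemical_B": 230.0975}
hits = suspect_matches(230.0944, suspects, tol_ppm=5.0)  # only chemical_A matches
```

In practice a match on accurate mass is only a tentative identification; retention time, isotope pattern and fragmentation data are needed before a chemical is confirmed and quantified, as in the final stage of the project.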
Phenols are regarded as highly toxic chemicals. Their effects are difficult to study in in vitro systems because of their ambiguous fate (degradation, auto-oxidation and volatility). In the course of in vitro studies of a series of redox-cycling phenols, we found evidence of cross-contamination in several in vitro high-throughput test systems, in particular by trimethylbenzene-1,4-diol/trimethylhydroquinone (TMHQ) and 2,6-di-tert-butyl-4-ethylphenol (DTBEP), and investigated in detail the physicochemical basis of this phenomenon and how to prevent it. TMHQ has fast degradation kinetics, followed by significant diffusion of the resulting quinone to adjacent wells; other degradation products are able to air-diffuse as well. DTBEP showed slower degradation kinetics but a higher diffusion rate. In both cases the in vitro toxicity was underestimated because of a decrease in concentration, in addition to cross-contamination of neighbouring wells. We identified four degradation products for TMHQ and five for DTBEP, indicating that the effects currently measured on cells are not attributable solely to the parent phenolic compound. To overcome these drawbacks, we investigated in detail the physicochemical changes occurring in the course of the incubation and made use of gas-permeable and gas-impermeable plastic seals to prevent them. Diffusion was greatly reduced by the use of both plastic seals, as revealed by GC-MS analysis. Gas-impermeable plastic seals reduced compound diffusion and oxidation to a minimum and did not affect the biological performance of cultured cells. Hence, no toxicological cross-contamination was observed in neighbouring wells, allowing a more reliable in vitro assessment of phenol-induced toxicity. The authors wish to acknowledge the support of the European Union's Horizon 2020 research and innovation programme under Grant Agreement No 681002 (EU-ToxRisk).
We are indebted to CIBEREHD (ISCIII) for supporting our research program on hepatotoxicity of xenobiotics. L.T. was ...
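The contrast drawn above between fast and slow degradation kinetics can be illustrated with a generic first-order decay model; the rate constants below are made-up values for illustration, not the kinetics measured in the study.

```python
import math


def remaining_fraction(k_per_h, hours):
    """Fraction of parent compound remaining after first-order degradation."""
    return math.exp(-k_per_h * hours)


def half_life_h(k_per_h):
    """Half-life corresponding to a first-order rate constant."""
    return math.log(2) / k_per_h


# Made-up rate constants for a fast- and a slow-degrading phenol over a
# 24 h incubation:
fast = remaining_fraction(k_per_h=0.5, hours=24)   # almost fully degraded
slow = remaining_fraction(k_per_h=0.02, hours=24)  # most of the parent remains
```

Under such a model, a nominally dosed well can lose most of the parent compound during a standard incubation, which is exactly why in vitro toxicity is underestimated when degradation and diffusion are ignored.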
The use of new approach methodologies (NAMs) in support of read-across (RAx) approaches for regulatory purposes is a main goal of the EU-ToxRisk project. To bring this forward, EU-ToxRisk partners convened a workshop in close collaboration with regulatory representatives from key organizations including European regulatory agencies, such as the European Chemicals Agency (ECHA) and the European Food Safety Authority (EFSA), as well as the Scientific Committee on Consumer Safety (SCCS), national agencies from several European countries, Japan, Canada and the USA, as well as the Organisation for Economic Cooperation and Development (OECD). More than a hundred people actively participated in the discussions, bringing together diverse viewpoints across academia, regulators and industry. The discussion was organized starting from five practical cases of RAx applied to specific problems that offered the opportunity to consider real examples. There was general consensus that NAMs can improve confidence in RAx, in particular in defining category boundaries as well as characterizing the similarities/dissimilarities between source and target substances. In addition to describing dynamics, NAMs can be helpful in terms of kinetics and metabolism that may play an important role in the demonstration of similarity or dissimilarity among the members of a category. NAMs were also noted as effective in providing quantitative data correlated with traditional no observed adverse effect levels (NOAELs) used in risk assessment, while reducing the uncertainty on the final conclusion. An interesting point of view was the advice on calibrating the number of new tests that should be carefully selected, avoiding the allure of "the more, the better". Unfortunately, yet unsurprisingly, there was no single approach befitting every case, requiring careful analysis delineating the optimal approach. Expert analysis and assessment of each specific case is still an important step in the process.
In: Rovida, C., Escher, S.E., Herzler, M., Bennekou, S.H., Kamp, H., Kroese, D.E., Maslankiewicz, L., Moné, M.J., Patlewicz, G., Sipes, N., Van Aerts, L., White, A., Yamada, T. and Van de Water, B. (2021) 'NAM-supported read-across: From case studies to regulatory guidance in safety assessment', ALTEX - Alternatives to Animal Experimentation, vol. 38, no. 1, pp. 140-150. https://doi.org/10.14573/altex.2010062
Abstract OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict, which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate, which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way.
In: Hardy, B., Douglas, N., Helma, C., Rautenberg, M., Jeliazkova, N., Jeliazkov, V., Nikolova, I., Benigni, R., Tcheremenskaia, O., Kramer, S., Girschick, T., Buchwald, F., Wicker, J., Karwath, A., Gütlein, M., Maunz, A., Sarimveis, H., Melagraki, G., Afantitis, A., Sopasakis, P., Gallagher, D., Poroikov, V., Filimonov, D., Zakharov, A., Lagunin, A., Gloriozova, T., Novikov, S., Skvortsova, N., Druzhilovsky, D., Chawla, S., Ghosh, I., Ray, S., Patel, H. and Escher, S. (2010) 'Collaborative development of predictive toxicology applications', Journal of Cheminformatics, vol. 2, no. 7, pp. 1-29. https://doi.org/10.1186/1758-2946-2-7
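A REST service architecture of the kind described in the OpenTox abstract can be exercised with any HTTP client. The offline sketch below only illustrates the general pattern of composing resource URIs and parsing a JSON response; the host name, URI layout and response payload are hypothetical, not the actual OpenTox API.

```python
import json
from urllib.parse import urljoin, quote


def compound_uri(base, inchikey):
    """Build a RESTful compound resource URI (hypothetical service layout)."""
    return urljoin(base, "compound/" + quote(inchikey))


def parse_predictions(payload):
    """Extract endpoint -> predicted value pairs from a made-up JSON response."""
    data = json.loads(payload)
    return {p["endpoint"]: p["value"] for p in data.get("predictions", [])}


base = "https://services.example.org/opentox/"            # hypothetical host
uri = compound_uri(base, "LFQSCWFLJHTTHZ-UHFFFAOYSA-N")   # ethanol InChIKey

# A made-up response body of the kind a prediction service might return:
payload = '{"predictions": [{"endpoint": "LC50", "value": 12.3}]}'
preds = parse_predictions(payload)
```

Because each resource (compound, dataset, model, validation) has its own URI, clients can combine services from different providers simply by exchanging URIs, which is the interoperability benefit the Framework design aims at.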