Symposium and Workshop Sessions are three hours each and Mini-Symposia are scheduled for 75 minutes each.
To ensure the safety of all attendees, we recommend that each individual select a seat and remain in it for the entire duration of the session, including upon returning from the break.
Symposia and Workshops are scheduled for Monday, November 14; Tuesday, November 15; and Wednesday, November 16, 2022. On each day, sessions run from 9:00 AM–12:00 Noon and, in the afternoon, from 2:00 PM–5:00 PM.
The following learning levels have been identified for ACT educational offerings.
Focuses on core skills or fundamental understanding of a topic
Builds beyond the basic level; assumes familiarity with the fundamentals of the topic (advanced)
How to do it—procedures, design, reporting, use of tools, tips, tricks, experiences, and/or advice
New, unusual, or uncommon techniques, modalities, routes of administration (ROA), guidance, tools/equipment (hardware and software), and/or changes to standard practice
Session Chairs: Timothy J. McGovern, ACT Past President, Crownsville, MD; and
Alisa Vespa, Health Canada, Ottawa, ON, Canada
Educational Support Provided by: Instem and Lhasa Limited
An Addendum to the International Council for Harmonisation (ICH) S1B(R1) Guideline (Rodent Carcinogenicity Studies for Human Pharmaceuticals) was published in 2021 and is expected to be finalized in 2022. This Addendum results from a prospective study initiated in 2012 and expands the testing scheme for assessing the human carcinogenicity risk of small molecule pharmaceuticals. The conclusion from this prospective evaluation confirmed that an integrated weight of evidence (WOE) approach could be used to adequately assess the human carcinogenic risk for certain pharmaceuticals in lieu of conducting a two-year rat study. The approach provides the specific WOE criteria that inform whether or not a two-year rat study adds value in completing a human carcinogenicity risk assessment. In addition, a comprehensive analysis was conducted to assess exposures and outcomes in rasH2-Tg mouse studies. The results of this analysis indicate that there is no value in exceeding a 50-fold exposure ratio for high dose selection in this model. This Symposium will discuss the history of the effort behind the development of the WOE criteria approach, including retrospective and prospective studies that were conducted, an overview of the ICH S1B(R1) Addendum, presentations of approaches for generating data to support a WOE assessment, and a discussion of the data supporting the 50-fold exposure ratio for high dose selection in the rasH2-Tg mouse model.
Assessment of carcinogenic potential is necessary for the nonclinical safety evaluation of investigational pharmaceuticals. The recognized limitations of long-term rodent bioassays have driven efforts to redefine the current WOE approach to better balance the investment in terms of animal use and other resources in supporting the safety assessment of small molecule pharmaceuticals. In 2013, an expert working group of the ICH initiated a prospective study based on the premise that a set of WOE criteria centered on drug target pharmacology and general toxicology could, in certain cases, provide an adequate assessment of carcinogenic potential without the conduct of a long-term rat bioassay. Over the course of several years, prospective WOE assessments were generated to predict both the tumor outcome and value of the two-year rat bioassay before its conduct or completion. This effort supported the creation of an addendum to the ICH S1B guidance whereby an integrative WOE presentation of key biologic, pharmacologic, and toxicologic information is used to determine whether or not a two-year rat study would add value in completing a human carcinogenicity risk assessment. This proposed addendum does not replace but rather expands the testing scheme described in the original S1B guideline and progresses the field toward a more mechanism-based carcinogenicity assessment for small molecule pharmaceuticals.
To assess the human carcinogenic risk of a small molecule pharmaceutical, the carcinogenic potential is typically evaluated in a two-year rat carcinogenicity study in addition to a second rodent carcinogenicity study conducted in mice (two-year or short-term). As a result of an independent prospective study conducted under the auspices of ICH “Proposed Change to Rodent Carcinogenicity Testing of Pharmaceuticals—Regulatory Notice Document,” the testing scheme is being expanded to include a WOE approach, which is based on a comprehensive assessment of the various WOE factors, to inform if the conduct of a two-year rat study will or will not add value to the assessment of human carcinogenic risk. This session will provide an overview of the WOE approach described in the ICH S1B Addendum. Case examples will be used to illustrate how the value of a two-year rat study can be ascertained using a WOE approach.
The pharmacokinetic endpoint of a 25-fold multiple of human exposure is one of the specified criteria for high-dose selection for two-year carcinogenicity studies in rodents outlined in ICH S1C(R2). However, this has not been universally accepted for six-month carcinogenicity studies in rasH2-Tg mice. To evaluate an appropriate pharmacokinetic multiple-based endpoint for rasH2-Tg mice, we identified study data available for 53 compounds. From the results of our survey, we confirmed that there is no added value in selecting a high dose exceeding a 50-fold multiple of human exposure in rasH2-Tg mouse carcinogenicity studies. When the dose is not limited by the maximum tolerated dose (MTD), pushing exposures past a 50-fold multiple of human exposure does not improve the utility of the model for human carcinogenicity risk assessment.
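As a back-of-the-envelope illustration of the exposure-multiple criterion discussed above, the multiple is typically computed from systemic exposure (AUC) in the animal model relative to the clinical exposure at the maximum recommended human dose. The function name and AUC values below are hypothetical, not taken from the survey:

```python
def exposure_multiple(animal_auc, human_auc):
    """Exposure multiple: animal AUC at the high dose divided by human AUC
    at the maximum recommended human dose (same units, e.g., ng*h/mL)."""
    return animal_auc / human_auc

# Hypothetical values: a rasH2-Tg mouse high dose giving an AUC of
# 120,000 ng*h/mL versus a clinical AUC of 2,000 ng*h/mL.
multiple = exposure_multiple(120_000, 2_000)
print(f"Exposure multiple: {multiple:.0f}-fold")  # prints "Exposure multiple: 60-fold"
```

Under the survey's conclusion, a multiple above 50-fold in this model, as in the hypothetical case above, would add no value to high dose selection.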
The recently published addendum of the ICH S1B guideline introduces a WOE approach to determine whether a two-year rat carcinogenicity study would add value in assessing human carcinogenic potential. A working group has been established to develop a protocol to support such an assessment in a transparent, consistent, and defensible manner. This consortium workgroup is building upon the extensive work already completed as part of the in silico toxicology protocol project that includes a visual framework for combining experimental and in silico results to establish an overall assessment and confidence score for any toxicological endpoint (e.g., carcinogenicity). Multiple protocols have already been developed (e.g., genetic toxicology and endocrine activity) along with a position paper describing the state-of-the-art in the prediction of carcinogenicity. Leveraging this work, the working group is mapping the key characteristics of carcinogens onto the ICH S1B WOE factors and developing transparent WOE rules and principles for combining information and deriving a confidence score for the assessment as part of a new in silico toxicology protocol. This presentation will review the progress towards developing a protocol to support carcinogenicity assessment based on the ICH S1B WOE factors.
Adverse outcome pathways (AOPs) are an efficient way of organizing knowledge of modes of action to support safety assessment. This knowledge framework has been used as the basis for the development of defined approaches and integrated approaches to testing and assessment (IATAs) for different endpoints. This presentation will address the application of a carcinogenicity AOP network to conduct the WOE assessment proposed in the ICH S1B addendum and to decide on the need to conduct rodent assays for pharmaceuticals. This framework allows the contextualization and reasoning of different sources of evidence, including in silico models and experimental data, to support decision-making in a consistent and transparent way, which can facilitate the implementation and regulatory acceptance of the concepts proposed in the addendum.
Session Chairs: Julie Douville, n-Lorem Foundation, Carlsbad, CA; and
Archit Rastogi, Ionis Pharmaceuticals, Carlsbad, CA
It has been estimated that 300 million people worldwide are living with one of the approximately 7,000 rare or orphan diseases. By definition, rare and orphan diseases individually affect a small percentage of the overall population, with patient populations ranging from a few thousand patients to one or two patients. Oftentimes the number of patients affected is not sufficient to generate commercial interest to develop a drug for the specific indication. Rising to fill this gap, foundations and families have taken the lead in developing drugs for family members. In this symposium, we will explore how to design IND-supporting research and provide examples of rare and orphan disease drug development programs based on the size of the target patient population (n = 1–3, n = 10–100, and n ≥ 1,000). Additionally, we will discuss how to navigate the regulatory landscape to design lean, cost-effective, time-sensitive studies without compromising patient safety.
An efficient and successful drug development program is key in the pursuit to find treatment or prevention of a disease. Although the regulatory requirements for marketing approval for drugs to treat rare and common diseases are the same, the issues encountered are frequently more difficult to address in the context of a rare disease, for which there is often limited medical and scientific knowledge, natural history data, and drug development experience. However, regulations provide some flexibility in applying regulatory standards, one example being the customization of nonclinical development programs based on the size of the target disease population. In this presentation, the focus will be on the nonclinical development approach for the larger rare disease populations, such as Angelman’s Syndrome (AS), which would need a more comprehensive approach but perhaps not as comprehensive as a common disease indication. AS is a rare, monogenic neurodevelopmental disorder affecting approximately 1 in 12,000 to 1 in 20,000 individuals and is commonly diagnosed at an age of two to five years. ION582, targeting UBE3a, is an antisense oligonucleotide currently being developed jointly by Ionis and Biogen for the treatment of Angelman’s Syndrome.
Jacifusen is an experimental ASO for patients with FUS-associated amyotrophic lateral sclerosis (ALS). Mutations in the FUS gene are present in about 5% of familial ALS and about 1% of sporadic ALS cases and may lead to some of the most aggressive forms of ALS, including a type that begins during adolescence. Jacifusen was named for Jaci Hermstad, a 26-year-old Iowa woman who was diagnosed with FUS-associated ALS in February 2019. This presentation will focus on how the program, which was initially designed for an n=1 clinical trial with a very urgent need to initiate therapy, was later scaled up to allow the enrollment of additional patients sharing the same mutation. The initial nonclinical support was limited to rodent studies and was considered appropriate for the risk-benefit treatment of a single patient. As additional patients with this mutation were identified, the nonclinical safety program was expanded to include a monkey study but still reflected the rare and urgent nature of this particular patient population.
Michael Pirovolakis was diagnosed on April 2, 2019, when he was just 18 months old, with Spastic Paraplegia 50 (SPG50), a disease caused by a mutation in the AP4M1 gene. Michael’s parents were told by doctors that there was no known cure for his disease and that he would deteriorate very quickly, leaving him completely paralyzed and with little brain function. Within weeks of his diagnosis, Michael's father, Terry, read hundreds of thousands of articles, spoke with hundreds of doctors from around the world, and developed a contingency plan of what he needed to do to give his son the best possible quality of life. He attended conferences and met with several leading experts in the gene therapy field as well as experts from the US National Institutes of Health and the US Food and Drug Administration. Terry flew around the world seeking answers from medical experts before finally meeting Dr. Steven Gray of UT Southwestern Medical Center, who agreed to create a gene therapy treatment to help replace Michael’s missing gene. The following presentation gives a parent's perspective on the scientific, financial, logistical, and regulatory challenges encountered when developing a tailored therapy for a single patient.
The availability of annotated complete human genome databases, and the ability to conduct whole-genome sequencing rapidly and cheaply, have given clinicians tools that allow them to pinpoint the exact molecular cause of a particular patient’s disease in some instances. This, coupled with the relative ease with which high-quality oligonucleotides can be manufactured, has created a situation where it is possible for clinicians to develop an antisense oligonucleotide drug candidate that is tailored to an individual patient’s specific genetic variant. This represents an important advance in treatment for those with exceedingly rare genetic diseases, especially those for which there are no adequate therapies available to treat their disease. Often, these exceedingly rare diseases are rapidly progressing, debilitating, and in many cases, can lead to premature death if left untreated. To support research clinicians seeking to develop these potentially life-saving therapies, the FDA has issued draft guidances that provide our current thinking on how best to balance the immediacy of the need for treatment with our mandate to ensure that clinical studies are conducted safely and ethically. This presentation will focus on the recommendations regarding the nonclinical safety assessment for these types of programs.
Session Chairs: Vincent Reynolds, Eli Lilly and Company, Indianapolis, IN; and
June Hope, TwinStrand Biosciences, Seattle, WA
Educational Support Provided by TwinStrand Biosciences
This forward-looking symposium will provide attendees with an increased awareness of the capabilities and power of Duplex Sequencing (DS), an ultra-accurate form of error-corrected next-generation DNA sequencing (NGS), and a clear understanding of how DS can be harnessed to solve problems faced routinely by toxicologists. DS has emerged as an enormously powerful tool that permits the detection and quantification of mutations in individual cells of multicellular organisms, including humans and standard mammalian toxicology test systems. The technology relies on DNA sequence information obtained from both complementary strands of duplex DNA and can identify and correct polymerase errors that arise during the acquisition of the DNA sequence information. Importantly, DS can define mutational signatures to help identify the specific causative mutagenic agent(s) responsible for the mutation(s). The session will open with an introductory presentation giving a high-level overview of the key conceptual, technical, and logistical points that underpin DS. This will be followed by detailed descriptions of how/where DS can be fitted into regulatory frameworks currently defined for genetic toxicology by ICH S2(R1) and how DS may be used to assess carcinogenicity (including nongenotoxic carcinogenicity) and thereby offer a potential alternative to the carcinogenicity assessments currently specified by ICH S1A/S1B/S1C(R2). Additional presentations will highlight the value DS can bring to evaluations for off-target mutagenesis in gene editing (CRISPR-Cas9 and other methods) as well as potential applications for DS in addressing problems in the realm of genotoxic impurities, including nitrosamines, in pharmaceutical development.
DS is a recently developed technology that relies on obtaining DNA sequence information from both complementary strands of duplex DNA. An error-correction algorithm permits errors and mutations introduced by DNA sample handling and by DNA polymerase infidelity during clonal expansion to be identified and corrected. As a result, extremely accurate DNA sequence and mutation information can be acquired from individual cells from any desired genomic site in any organism, including humans and commonly used toxicology rodent and nonrodent test systems. This session will (1) provide a schematic overview of the concepts and techniques involved in NGS and DS; (2) describe linkages between mutagenesis and carcinogenesis; and (3) illustrate with examples and data how DS can be used for genetic toxicology applications.
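As a rough sketch of the kind of quantity such studies report, a per-base mutation frequency can be computed as the number of unique mutants observed divided by the total number of duplex-consensus bases interrogated. The counts below are hypothetical, chosen only to show the arithmetic:

```python
def mutation_frequency(unique_mutations, duplex_bases):
    """Per-base mutation frequency: unique single-nucleotide variants
    divided by total duplex-consensus bases sequenced."""
    return unique_mutations / duplex_bases

# Hypothetical example: 45 unique mutants detected across 3.0e9 duplex bases.
mf = mutation_frequency(45, 3.0e9)
print(f"Mutation frequency: {mf:.1e} per bp")  # prints "Mutation frequency: 1.5e-08 per bp"
```

Frequencies in this range are far below the raw error rate of standard NGS, which is why the duplex error-correction step described above is essential.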
Evaluation of a substance’s potential to cause mutagenicity is a critical component of human and environmental health risk assessment. Regulatory agencies worldwide have developed test guidelines to determine whether chemicals cause mutations. While these tests have served the regulatory community well, they have important limitations. First, we continue to rely on bacterial mutagenicity as the gold standard. Second, current in vivo tests generally measure mutations in one reporter gene and/or require transgenic rodent models and standalone experiments. Third, the present assays do not generally provide mechanistic insight into mutagenicity without extensive follow-up studies. DS is an error-corrected next-generation sequencing (ecNGS) technology that enables highly precise quantification of mutation frequency and characterization of spectrum in potentially any species, tissue, and cell culture model. Foundational studies from our laboratory and others using in vivo and in vitro models are (1) exploring mutagenic responses by DS to confirm performance across different genotoxic modes of action and diverse tissues/models; (2) establishing the degree of qualitative and quantitative concordance relative to conventional mutagenicity endpoints; (3) defining optimal experimental designs; and (4) investigating the added value of the mechanistic information produced by DS. The work thus far indicates a robust ability to detect mutations by DS for prototypical mutagenic exposures in numerous models, concordance of DS mutation frequencies with conventional endpoints, and remarkable cross-laboratory agreement. Altogether, these efforts set the stage for the modernization of mutagenicity testing to protect human and environmental health.
There has been much earnest debate within the toxicology community regarding the value of rodent two-year carcinogenicity bioassays. This talk will address the purpose of a two-year rodent bioassay and briefly summarize the perspectives of toxicologists, veterinary pathologists, regulatory scientists, and other stakeholders on costs in time and money, animal usage, and the ultimate value of the data obtained from these studies. The focus will then shift to applications of DS carcinogenicity assay(s) for the assessment of carcinogenesis (including nongenotoxic carcinogenesis). The underlying proposal is that new approaches using DS may be used to detect clonal expansions marked by cancer driver gene mutations as the phenotype of early carcinogenesis. The possibility of establishing a new carcinogenicity test battery and waiver to skip a two-year cancer bioassay will be discussed.
The use of genome editing to make better cellular therapies opens a whole range of potentially curative medicines. However, genome editing comes with risks, one of which is the risk of off-target edits. Several methods have been proposed to identify off-target DNA double-strand breaks, but many lack the required sensitivity, and the true mutational consequence of any identified DNA breaks is often unclear. With the support of HESI CT TRACS, we have instigated an international validation project to assess the innovative technology INDUCE-seq, which has the promise of improving the sensitivity and economy of off-target assessment. The project is progressing to investigate the true mutational consequence of any identified breaks using the highly sensitive sequencing technology Duplex-seq. With support from Broken-String (for INDUCE-seq) and Twin-Strand (for Duplex-seq), data will be presented on how INDUCE-seq has been used to “find” off-targets and the plans for the use of Duplex-seq to “confirm” whether they translate to real mutations of potential toxicological consequence.
Managing and controlling genotoxic impurities in pharmaceutical lots during development and in the postmarketing phase is an important concern for human safety. Multiple examples exist where mutagenic contaminants had the potential to jeopardize the well-being of patients. This presentation will explore current problems with nitrosamine contamination in postmarket drugs. These include the need for testing to approve lot release and challenges in meeting and maintaining control limits for contaminants. Of particular concern is the possibility that relatively simple genetic toxicology tests such as the bacterial gene mutation test (Ames assay) may not be suitable for the detection of nitrosamines as mutagens, while other tests such as transgenic rodent mutation assays are cumbersome and not feasible for rapid screening. This talk will suggest proposals for using DS as a new approach to help in the management of genotoxic impurities. Because DS can identify mutational signatures that can be associated with specific causative mutagens, there is a clear potential for DS to establish valuable point-of-departure (POD) and no-observed-effect level (NOEL) information to help guide risk management decisions when genotoxic impurities are detected in lots of pharmaceutical agents.
Session Chairs: David Ponting, Lhasa Limited, Leeds, United Kingdom; and
Kevin Cross, Instem, Columbus, OH
Educational Support Provided by: Instem and Lhasa Limited
The discovery of a highly potent N-nitrosamine impurity, N-nitrosodimethylamine (NDMA), in several widely prescribed marketed pharmaceuticals such as angiotensin II receptor blockers (e.g., Valsartan), used to treat high blood pressure, and Metformin, used to treat type 2 diabetes, led to a requirement for an investigation into the potential for N-nitrosamine contamination and the mutagenic and carcinogenic risk to patients. The US FDA announced a voluntary recall of Valsartan in 2018, and regulations were updated to require marketing authorization holders for medicines that contain chemically synthesized active substances to review them for the presence of nitrosamines and to test products at risk. The US FDA has published a Guidance for Industry, Control of Nitrosamine Impurities in Human Drugs, which recommends steps manufacturers should take to detect and prevent unacceptable levels of nitrosamine impurities in pharmaceuticals. In parallel to product review, pharmaceutical industry members have worked alongside academia and expert toxicologists in a concerted effort to identify methods of risk evaluation and control strategies to eliminate increased carcinogenic risk to patients. Proposed methods for addressing the issue of an identified N-nitrosamine impurity in drug products include assessing the carcinogenic risk based on structure-activity relationships and read-across, in vitro and in vivo assessment, and purging to reduce quantities to an acceptable level. This newly defined knowledge and methodology may contribute to shaping future regulations for the control of N-nitrosamines as mutagenic impurities in pharmaceuticals. This topic is rapidly evolving, and this session will present developments in the field in the past year.
Nitrosamine impurities have been an issue of concern since the Zhejiang Huahai Valsartan incident. Following this incident, nitrosamine impurities have resulted in recalls, withdrawals, and delays in clinical trials involving other drugs such as ranitidine, nizatidine, metformin, rifampin/rifapentine, varenicline, etc. ICH M7(R1) provides context on how to control the risk of mutagenic impurities; however, several of its principles have been excluded in recent Health Authority nitrosamine guidances. Nitrosamines, as a class, are considered a cohort of concern (COC), for which the limits set forth by ICH M7(R1) for mutagenic impurities would not be health protective. For many nitrosamines, there are carcinogenicity data sufficient to develop an acceptable intake (AI). Two of the most potent and well-studied nitrosamines are N-nitrosodimethylamine (NDMA) and N-nitrosodiethylamine (NDEA). However, for other types of nitrosamines, the carcinogenicity data may be limited. This session will discuss current strategies and the evolving landscape for AI setting for nitrosamine impurities, including instances where there are limited data for a nitrosamine.
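For context on how an AI is derived when carcinogenicity data exist, ICH M7(R1) describes linear extrapolation from a rodent TD50 (the dose associated with a 50% tumor incidence) to a 1-in-100,000 lifetime excess cancer risk, scaled by a default 50 kg body weight. The sketch below applies that convention; for NDMA, a TD50 of approximately 0.096 mg/kg/day reproduces the published regulatory AI of 96 ng/day:

```python
def acceptable_intake_ug_per_day(td50_mg_per_kg_day, body_weight_kg=50.0):
    """Linear extrapolation per the ICH M7(R1) convention:
    risk-specific dose at 1-in-100,000 lifetime risk = TD50 / 50,000,
    scaled by the default 50 kg body weight; result in ug/day."""
    return td50_mg_per_kg_day / 50_000 * body_weight_kg * 1_000  # mg -> ug

# NDMA: TD50 ~0.096 mg/kg/day -> ~0.096 ug/day, i.e., 96 ng/day.
ai = acceptable_intake_ug_per_day(0.096)
print(f"AI: {ai * 1000:.0f} ng/day")  # prints "AI: 96 ng/day"
```

The session's point stands out against this arithmetic: when no compound-specific or read-across TD50 is available, a conservative class default must be applied instead.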
The principles of nitrosamine formation will be considered along with how these principles can be and have been applied to assess the risk of nitrosamine contamination in drug substance (DS), drug product (DP), and drug product packaging (DPP) during typical pharmaceutical processing. While the solution phase chemistry for the formation of nitrosamines has been well-studied, characterized, and published, the solid-solution phase and gas phase characteristics are not as well understood. Some general analytical challenges will be described for each area (DS, DP, and DPP). Improved definitions of the molecular attributes that lead to a COC molecule, and realization of the requirements for remediation, will help industry and regulators secure medicines to maintain the health and safety of patients.
The default acceptable intake (AI) for nitrosamines in pharmaceuticals with unknown carcinogenic potential is currently based on structurally simple and highly potent nitrosamines. In contrast, a significant proportion of N-nitrosamine drug product impurities are unique/complex structures that may be less potent carcinogens than simple nitrosamines. An alternative to applying a default limit is to conduct a read-across assessment, in which case structurally similar nitrosamines that have been tested in carcinogenicity studies are used to estimate the potency of the nitrosamine impurity of interest. In principle, this approach has the potential to be quite useful in differentiating highly potent nitrosamines from those with structural features that diminish potency. However, for a read-across approach to be applied in practice, there needs to be an agreed framework. Several case studies will be shared to illustrate the current challenges associated with establishing acceptable intakes using a read-across approach.
ICH M7 provides guidance on the hazard identification and risk assessment process for genotoxic impurities in pharmaceuticals. QSAR followed by Ames testing are cornerstones of this guidance. A negative QSAR/Ames result allows control of impurities according to ICH Q3A/B principles, while a positive Ames result leads to controlling the mutagenic impurity at the TTC of 1.5 µg/day. Nitrosamine impurities are exceptions to this rule, as they are part of the Cohort of Concern. In the last few years, concerns were raised as to whether the Ames test as described in ICH M7 is sufficiently sensitive and specific to detect all potentially carcinogenic nitrosamines. Most of the studies raising this concern did not follow the OECD 471 standard protocol, and some results changed when this protocol was used. To further investigate this topic, a HESI nitrosamine subgroup was formed, bringing together scientists from regulatory agencies, academia, and industry to work on these questions together. The goal of the HESI nitrosamine subgroup is to (1) identify a robust Ames protocol predictive of the carcinogenic potential of nitrosamines as part of the hazard identification process; (2) fill data gaps to support better SAR; (3) identify/verify in vitro assays with metabolically competent cells supporting nitrosamine risk identification; and (4) develop an in vivo strategy to verify Ames data within the frame of ICH M7 for risk assessment, including setting acceptable limits. The presentation will give an overview of this program and show its first results.
Session Chairs: Laura Crawford, Nirogy Therapeutics, Framingham, MA; and
Bettina Donato, SciLucent, San Diego, CA
Educational Support Provided by: Biomere
The development and validation of robust bioanalytical methods is a critical component of drug development. Toxicologists ensure the necessary components (e.g., assays, validations, and reports) are included in the nonclinical development program; however, toxicologists are not always equipped to participate in the assay development and troubleshooting efforts. Specifics such as the upper and lower limits of quantification (ULOQ and LLOQ) and stability timepoints are high-level requirements for validation, but the assay performance, reproducibility, specificity, and accuracy are not always well understood by nonclinical toxicology experts. As assays become more sensitive, equipment changes occur, programs transition from nonclinical to clinical, and as the need for incurred sample reproducibility (ISR) has been added to regulatory requirements, bioanalytical efforts are labor-intensive. The goal of this session is to help toxicologists navigate the requirements and challenges of bioanalysis during drug development. Speakers will cover and share experiences regarding (1) small molecule validations; (2) large molecule validations; (3) antidrug antibody assays; (4) qPCR; (5) regulatory expectations (including ISR and ISR failures); and (6) optimizing bioanalysis for the overall program (nonclinical to clinical). It is the goal of this session to expand the knowledge base for experienced nonclinical toxicologists and to introduce newer toxicologists to the world of bioanalysis as a critical component of drug development.
Small molecule drugs have been a mainstay for drug developers for decades. Although a larger percentage of growth is expected in the large molecule arena, small molecules still make up 90% of marketed pharmaceutical drugs. The toxicokinetics and pharmacokinetics of small molecule drugs are critical components of regulatory submissions. The quantification of these small molecule drugs in biological matrices, or bioanalysis, is largely conducted by LC-MS (liquid chromatography interfaced with mass spectrometry). The validation of these methods is governed by international guidelines and regulations. The US FDA issued its final guidance on Bioanalytical Method Validation in 2018, and the International Council for Harmonisation (ICH) is expected to issue its final guidance in 2022. This talk will focus on key components of bioanalysis in toxicology studies, for example, sample size and volume, limits of quantitation, dynamic concentration range, incurred sample reproducibility (ISR), use of stable isotope-labeled internal standards, sensitivity, and stability of the small molecule in bioanalytical fluids, as well as areas of potential failure such as cross-contamination, incorrect anticoagulants, sample clotting, and sample storage conditions. Additionally, there will be a discussion on the translation and use of nonclinical bioanalytical methods as clinical methods and the continuity of the bioanalytical method development process between drug development phases. Microsampling will be introduced as a new technology. Finally, case studies will be presented for discussion.
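To illustrate the ISR concept mentioned above: under the 2018 FDA guidance, reanalysis of incurred samples passes for chromatographic assays when the percent difference, computed relative to the mean of the original and repeat results, falls within ±20% for at least two-thirds of the repeats (±30% applies to ligand-binding assays). The function and concentrations below are hypothetical:

```python
def isr_passes(original, repeat, limit_pct=20.0, min_fraction=2 / 3):
    """ISR acceptance check (chromatographic assays): percent difference
    = |repeat - original| / mean(original, repeat) * 100 must be within
    limit_pct for at least min_fraction of the repeated samples."""
    within = 0
    for o, r in zip(original, repeat):
        pct_diff = abs(r - o) / ((r + o) / 2) * 100
        if pct_diff <= limit_pct:
            within += 1
    return within / len(original) >= min_fraction

# Hypothetical concentrations (ng/mL): first analysis vs. ISR reanalysis.
orig = [10.2, 55.1, 230.0, 8.9, 120.0, 61.0]
rept = [10.8, 49.0, 250.0, 9.1, 118.0, 90.0]  # last pair deviates by ~38%
print(isr_passes(orig, rept))  # prints "True" (5 of 6 within 20%)
```

A single outlier does not fail ISR, which is by design: the criterion assesses overall assay reproducibility in real study samples, not individual results.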
Building a sound strategy for characterizing the pharmacokinetics and immunogenicity of a biologic therapy is crucial during every stage of the drug development process. Identifying the impact of immunogenicity on safety and efficacy as well as the incidence of immunogenicity in nonclinical studies and clinical trials is a requirement of regulatory agencies. There are also lasting benefits in developing robust TK, PK, and ADA assays early in a drug development program with knowledgeable partners. Having appropriate and sensitive TK, PK, and ADA assays in place early prevents study delays and allows for the reliable interpretation of pharmacokinetic and safety results, in addition to saving on the costs of repeat sample analysis and validation. Selecting the ideal format for your ADA assay that provides optimal drug tolerance is also critical to the interpretation of the results. A greater understanding of TK, PK, and ADA assays and the regulatory requirements will guide drug development teams in developing better and more effective strategies for designing toxicology studies and translating the data and the assays to the clinic. This talk will discuss these requirements as well as anticipating the regulatory needs and considerations for characterizing pharmacokinetics and immunogenicity at the various stages of drug development (preclinical, IND, BLA, and postmarketing), which help ensure that a biologic is best positioned for success.
With many new modalities such as CAR-T cells, oligonucleotides (antisense oligos/siRNAs), and adeno-associated virus (AAV) gene therapy drugs, there is an increased need for scientific expertise to develop new bioanalytical strategies as well as to use nontraditional platforms such as qPCR, RT-qPCR, and ddPCR within the regulated bioanalysis space. The 2019 ICH M10 Bioanalytical Method Validation draft guidance provides recommendations for the validation of bioanalytical assays for chemical and biologic drug quantification and their application in the analysis of study samples. This guidance provides little or no direction on how to approach assay development and validation when evaluating exposure to these complex cell and gene therapy drugs, which may require multiple assays to evaluate the distribution, presence, and clearance of the dosed therapy. This talk will provide case examples of various approaches when using molecular platforms to develop fit-for-purpose assays used to support preclinical toxicology programs.
Session Chairs: Kathy Derakhchan, Takeda Pharmaceutical Company Limited, Cambridge, MA; and
Derek J. Leishman, Eli Lilly and Company, Columbus, OH
Educational Session Provided by: Safety Pharmacology Society
Most people are familiar with the classic five senses of sight, hearing, touch, smell, and taste. However, the concept of a sense can be extended to include sensations such as hunger or thirst, as well as thermoception, equilibrioception, proprioception, nociception, or even a sense of the passage of time. These “senses” are all important in determining our quality of life. We are also aware of small acute changes in some of these “senses.” Given the potential impact of these “senses” on our overall quality of life, there is a question of how a safety pharmacologist characterizes and quantifies an impact on them. Following an overview of the senses and some examples of drug effects on them, focused examples will be presented. We will start with the example of temperature, since this is a relatively simple signal that can be recorded in our animal species. We are all well enough aware of seasonal fevers and chills to know that it is something we can be acutely aware of. We will then look at assessing an impact on vision (and how we can take advantage of equilibrioception in measuring vision). We will close the session with consideration of the ear and ototoxicity and the impact on hearing and balance.
Changes in body temperature outside of normal homeostasis can have major impacts on body function and survival. Mammalian body temperature is tightly regulated by the central nervous system. Peripheral and central thermosensory neural information is integrated in the preoptic area (POA) of the brain, which then provides signals to efferent organs to initiate involuntary processes that regulate temperature, such as shivering and vascular changes. Transient receptor potential (TRP) channels in the peripheral nervous system are Ca2+-permeable ion channels that are involved in thermosensing. A subset of TRP channels is activated by warm or cold temperatures, some at innocuous temperatures and some at noxious levels. TRPV1 is activated by heat and other ligands (capsaicin), and TRPM8 is activated by cool temperatures and other ligands (menthol). In addition to thermosensing, TRPV1 and TRPM8 channels are also expressed in other organs and tissues and have implications for gastrointestinal, cardiovascular, and respiratory diseases as well as pain sensation. As such, the modulation of TRPM8 and TRPV1 has been a target for drug development. This presentation will include background on thermoregulation physiology, implications of dysregulation, and case studies that demonstrate how altered thermosensing through modulation of TRPM8 or TRPV1 can impact thermoregulation and other intricately connected physiology, as well as potentially other sensory perceptions.
Drug-induced retinal toxicity can be a showstopper in preclinical development programs, particularly when irreversible. Such toxicity is usually detected as histopathological findings during repeat-dose regulatory toxicology studies, prior to first-time-in-human investigations. It would be useful to have a convenient method of assessing visual function that could be applied earlier in the preclinical safety evaluation program. The electroretinogram (ERG) provides a reliable assessment of broadscale retinal function, with the a- and b-waves indicating photoreceptor and second-order cell responses, respectively, and is thereby a useful tool in investigating any deficits in retinal function. However, this is a highly specialized method requiring anesthesia and a dedicated laboratory set-up, which does not lend itself to incorporation within repeat-dose toxicology studies. Therefore, ERG is not used routinely either by safety pharmacologists or pharmaceutical toxicologists but is restricted to the investigation of known or suspected actions of drugs, or to generating functional correlates of retinal histopathology. One method of measuring visual acuity that has been applied in rodents utilizes the optomotor reflex, whereby a rodent is surrounded by four computer screens that display a vertical moving grating. The moving grating triggers distinctive, reflexive head-tracking movements, and the threshold of visual acuity is taken as the maximum grating frequency to which the animal responds. Recent advances in in vitro microphysiological technology have also enabled evaluations that complement in vivo studies.
Hearing loss can occur as an unintended consequence of drug administration. Where there is evidence of risk, be it target-based, drug class, or observational, nonclinical ototoxicity studies can be conducted to specifically characterize the target structures associated with hearing loss as well as the exposure-response relationship across species. Available guidelines provide nonclinical approaches to safely support drug development for drugs delivered either directly into the ear or systemically. This presentation will cover the physiology of hearing and highlight potential mechanisms of drug-induced hearing loss. The specialized nonclinical techniques used to evaluate hearing loss both functionally and microscopically will be discussed, in addition to the design of ototoxicity studies. Examples of successful nonclinical ototoxicity packages to support approved products will be covered.
Session Chairs: Bettina Donato, SciLucent, Herndon, VA; and
Laura Crawford, Nirogy Therapeutics, Framingham, MA
Educational Support Provided by: SciLucent, Inc.
The use of toxicity study data to evaluate drugs in humans can be complex. Multiple factors, such as the type of molecule (e.g., small or large), absorption, distribution, metabolism, and excretion (ADME) properties of the drug, clinical route, indication(s), and, of course, biological differences among species used in nonclinical testing can often impact study results and make data interpretation about human relevance difficult. Even with the many challenges that come with designing appropriate toxicology studies, these studies are required by regulation and form the core of the safety assessment of a new drug. Both general and specialized studies can present unique considerations in planning to appropriately characterize the drug. This symposium will discuss scenarios that are often encountered in toxicology study design but do not have a clear decision tree via regulatory guidance. In this symposium, the speakers will share their experiences and case studies regarding special considerations in toxicology study design. Topics include (1) normal versus diseased animal models for testing; (2) use of surrogates in safety evaluations; (3) toxicokinetic best practices and relevant scenarios; and (4) thinking outside the box during development. While it is impossible to eliminate all challenges in toxicology study design, discussion of these considerations is useful to design studies that maximize study utility and reduce the generation of unexpected or clinically irrelevant results. This symposium will benefit regulatory toxicologists who work in nonclinical safety and development and routinely plan toxicology studies in early and late-stage development.
Nonclinical evaluation of drug toxicity in the context of disease has recently become a more widely accepted development strategy. Hybrid pharmacology-toxicology studies in relevant disease models, when available, are encouraged by regulatory agencies as preferred approaches for certain drug modalities and therapeutic areas. The benefits and limitations of using an animal model of disease for toxicology evaluations must be carefully evaluated to determine the value, if any, of predicting similar effects in humans. Drug development for rare genetic diseases requires toxicologists to consider the use of animal disease models that mimic human pathophysiology resulting from monogenic gain-of- or loss-of-function mutations. Therapeutic restoration to a normal phenotype may reduce complications associated with disease, but safety issues related to differential species sensitivities and drug modalities may concurrently arise. Conversely, toxicology studies conducted in normal animals are also acceptable for evaluations of therapies for rare diseases, but results from these toxicology studies can be confounding if exaggerated pharmacology leads to adverse effects that may not manifest in a disease setting. Here, a case study of the nonclinical pharmacology and toxicology evaluations in normal and diseased animals will be presented for an mRNA therapy being developed for the treatment of Glycogen Storage Disease Type III.
A fundamental principle of study design for nonclinical safety assessment is the selection of relevant animal species. A relevant animal species is defined as one in which the drug or biologic is pharmacologically active. Biologics and other advanced modalities (e.g., gene and cell therapies) are designed to have highly targeted biological activity, frequently leading to species and/or tissue specificity that precludes toxicity testing in standard animal models (i.e., rats and dogs). Relevant species selection can employ a variety of techniques such as characterization of target sequence homology, receptor/epitope expression pattern, binding affinity, tissue cross-reactivity, and pharmacodynamic activity. Generating toxicity data in a nonrelevant species can be misleading and is discouraged, particularly since the toxicity of highly targeted biologics can arise from exaggerated pharmacology. A further complication is that many advanced therapeutic modalities target disease-associated pathways that do not play a role in normal physiology and are not active in healthy animals. In this case, alternative approaches such as toxicity testing in animal models of disease can be considered. Other alternative approaches include using a surrogate molecule that is active in animals, conducting studies in human transgenic animals, or generating pivotal safety data in vitro using human blood or tissues. Case examples illustrating these various alternative approaches for species selection, as well as challenges associated with them, such as lack of historical control data, uncertainty in regulatory acceptance, and complexity in study execution and interpretation, will be described.
The speaker will discuss three different drug/device development programs. The first will cover considerations in developing an antibody, including species selection, dosing regimen, and justification for using a single toxicology species. This program will be most relevant to toxicologists who are used to working with small molecules as opposed to biologics, as many of the concepts are quite different. The second program will describe the safety studies designed to assess a human blood product. This was a complicated program that required the design of a surgical model to mimic how the product would be used in the “field.” Administering the product to “normal” animals would have potentially introduced artifacts unrelated to the toxicity of the product. Frequent FDA interactions were required to explain the rationale behind the study design. In addition, there was a learning curve in developing the surgical model. The final program will describe the development of a product that consisted of a device, a biologic, and a gene therapy. Once again, a scientifically based rationale had to be developed to obtain buy-in from regulators, and a clinically relevant surgical animal model was developed.
Session Chairs: Hideo Fukui, Axcelead Drug Discovery Partners, Inc., Fujisawa, Kanagawa, Japan; and
Timothy J. McGovern, ACT Past President, Outreach Committee Chair, Crownsville, MD
Educational Support Provided by: AstraZeneca and Charles River
Novel delivery modalities, including exosomes, antibody-drug conjugates, and lipid nanoparticles, have emerged in recent years to enhance the absorption and improve the targeting of drugs. In this international symposium, developed in association with the Japanese Society of Toxicology, insights into the current status of the therapeutic uses and safety aspects of exosome products will be discussed, including related toxicity concerns. In addition, the challenges of developing successful antibody-drug conjugates (ADCs) will be discussed, including some of the key aspects that impact the preclinical development of ADCs and how the same factors that drive efficacy, such as antibody specificity, linker composition, and payload-specific mode-of-action pharmacology, also drive the toxicity of the ADC. Finally, the regulatory toxicology considerations for the “non-target” portions of novel modalities will be discussed, including evaluation of the full modality and potential off-target toxicities. Case examples will be discussed.
Cell therapies such as mesenchymal stem cells (MSCs) have been widely tested, and more than 1,000 registered clinical trials are ongoing worldwide. It is now apparent that MSCs exert their therapeutic functions in a paracrine manner through the secretion of extracellular vesicles (EVs), which have a lipid bilayer membrane with a diameter of 50 to 150 nanometers (small EVs), are commonly called exosomes, and serve as a communication tool between cells. EVs are composed of a variety of cellular components: (1) the EV membrane comprises lipids such as sphingomyelin, cholesterol, ceramide, and phosphatidylserine; (2) the EV contains various proteins, amino acids, and tricarboxylic acid cycle (TCA) intermediates; (3) abundant nucleic acid molecules, particularly RNA species such as mRNA and microRNA, have been consistently detected in EVs; and (4) DNA has also been detected in EVs, with the potential for horizontal gene transfer. Since EVs are nonliving and nonreplicative and have a transient presence in the body, they are considered biological medicines for cell-free therapy. Taken together, cell-origin EV preparations are potentially safer and easier to translate into the clinic than cellular products. However, there are inherent challenges in the development of EV drugs for regenerative medicine. In particular, quality control metrics to measure key identity and potency features of EV preparations have to be specified during the development of EV therapeutics. Here we will discuss minimal requirements for prospective potency assays that ideally reflect the mechanism of action (MOA) to predict the therapeutic effectiveness of the drug substance, in accordance with International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use guidelines. In particular, we will highlight challenges and mitigation measures to enhance the manufacture of consistent, safe, and reproducibly potent EV preparations for reliable regenerative medicine.
In recent years, it has become clear that exosomes, which are lipid bilayer membrane vesicles secreted by various cells in the body, are present in the blood. Exosomes contain RNA, DNA, and proteins that are cell-specific. For example, tumor cell-specific exosomes have been used as biomarkers with a claimed diagnostic accuracy of over 90%. These indicators are also expected to be useful as novel biomarkers for the evaluation of various diseases and toxicity, i.e., liquid biopsy. In addition to their characteristics as biomarkers, exosomes have been shown to have the ability to transmit materials to distant cells. Taking advantage of this feature, it has been reported that mesenchymal stem cell-derived exosomes are effective for wound healing and that iPS myocardium-derived exosomes improve myocardial infarction (Zhang B. et al., Stem Cells 2015; Gao L. et al., Science Translational Medicine 2020). However, for exosome products, it is necessary to evaluate off-target toxicity due to unintended pharmacological action (off-target action) in addition to the intended pharmacological action (on-target action). In addition, we will introduce a new risk of exosome products due to horizontal gene transfer.
Antibody-drug conjugates have been hailed as a step change in the goal to specifically and safely deliver highly toxic chemotherapeutic agents, and by so doing, increase their anti-tumor efficacy while reducing dose-limiting toxicity. Despite several successful regulatory approvals and impactful clinical outcomes for cancer patients, developing a successful ADC is still challenging. The different classes of cytotoxic warheads employed in ADCs have overlapping and unique dose-limiting toxicities in the clinic, which are not always predicted by nonclinical toxicology studies. This presentation will cover some of the key aspects that impact the preclinical development of ADCs, and outline how the same factors that drive efficacy, such as antibody specificity, linker composition, and payload-specific mode-of-action pharmacology, also drive the toxicity of the ADC. A case example of the development path of an ADC will be shared with an emphasis on the challenges faced by the toxicologist to deliver a well-characterized risk assessment for the clinic.
Biologics have continued to advance over the decades. There are often novel portions of these modalities that, although not part of the “core” therapeutic activity, may confer toxicological challenges as part of their evaluation. These may include portions used to extend the half-life (PEGylation, Fc domains), for targeting (mAbs, Fabs), or to help with cellular uptake (LNPs, exosomes). Toxicological evaluation of the full modality must consider these aspects: What fundamental characteristics of the modality do they change (PK) or modify (cellular uptake)? What is known about the targeting activities? There may be off-target toxicities as well. Regulatory and indication elements should also be considered. This talk will discuss how to integrate these factors into a full toxicological assessment.
Session Chairs: Owen McMaster, US FDA, Silver Spring, MD; and
Marcus Delatte, Allucent, Cary, NC
There is overlap in the procedures used to evaluate risks to the respiratory system following the inhalation of pharmaceuticals and chemical products, but there is no unified approach across regulatory agencies such as the EPA and FDA. This symposium will explore the similarities and differences in approaches to assessing and managing risks related to products regulated by these agencies. The goals of the symposium are to (1) compare and contrast the respiratory system in humans and typical experimental animal species; (2) discuss the strengths and weaknesses of in vitro and in vivo models typically employed for hazard assessment in the respiratory system; (3) review approaches used by US regulatory scientists to assess and manage risks in the respiratory system following the inhalation of pharmaceutical and chemical products; and (4) provide perspectives on how to address current and future challenges related to product-related risks across different regulatory contexts. Speakers include a pharmaceutical consultant with expertise in comparative anatomy and animal models, two US regulators (FDA and EPA) with expertise in assessing and managing risks in the respiratory system to address the differences in risk assessment when trying to protect patients versus large populations, and a consultant with extensive experience evaluating inhaled products to provide perspectives on current and future challenges. Attendees will gain an understanding of the assessment and mitigation of risks to the respiratory system following the inhalation of pharmaceutical and chemical products and the limitations of the current state of the art to inform research in this area.
This talk will focus on approaches to translate nonclinical findings to humans, which helps to ensure that the findings leveraged to inform safety decisions are clinically relevant and appropriate to use in assessing risks. Such approaches rely on sufficiently understanding the anatomy and physiology of the respiratory system in humans and typical experimental animal species, as well as key factors in risk assessment. This understanding provides insight into the usefulness of data from various in vivo and in vitro models across different experimental contexts. This insight is also useful in the discussion of approaches to address the challenges related to leveraging these data within a risk assessment framework and translating these experimental findings to inform safety decisions regarding humans. The speaker will provide insight into selecting relevant experimental models for evaluating hazards in the respiratory system and leveraging these data within a risk assessment framework to set the stage for more in-depth discussions by the EPA and FDA regulators in subsequent talks. Learning objectives of this talk: (1) discuss the fundamental elements of risk assessments; (2) discuss approaches to translate nonclinical data to humans, so that these data may be leveraged to inform clinical safety; and (3) review approaches to address challenges to translating findings from animals to humans.
This talk will discuss key guidance documents and risk assessment frameworks used to evaluate and manage hazards identified in the respiratory system following the inhalation of pharmaceutical products in individual patients. The speaker will briefly discuss the strengths and weaknesses of specific in vitro and in vivo models used to evaluate inhaled pharmaceutical products that produce hazards in the respiratory system, as well as present relevant case studies. The case studies will provide approaches to assess and manage risks identified in the respiratory system following the inhalation of pharmaceuticals, and insight that will be used to highlight how nonclinical data are used to inform the design of clinical protocols. Overall, this talk will inform the attendees on how to leverage key guidance documents and data to inform approaches used to manage potential respiratory risks related to the inhalation of pharmaceuticals in individuals. Learning objectives of this talk: (1) review key nonclinical guidance documents from the FDA and other relevant institutions on the safety evaluation of pharmaceuticals; (2) discuss risk assessment frameworks for pharmaceutical products; (3) review relevant in vitro and in vivo models employed to evaluate pharmaceutical-induced hazards in the respiratory system; and (4) review case studies on assessing and managing respiratory risks in individual patients following the inhalation of pharmaceutical products.
This talk will discuss key guidance documents and risk assessment frameworks used to evaluate hazards identified in the respiratory system following the inhalation of chemical products across populations. Strengths and weaknesses of specific in vitro and in vivo models for evaluating hazards from inhaled chemical products across populations as well as existing general examples will be discussed. These examples will provide approaches to assess risks identified in the respiratory system following the inhalation of chemicals, highlighting how oral and inhalation toxicity data are used to evaluate hazards and risks. Insights gained from these examples will be used to compare and contrast the approaches used to assess risks from exposure to chemicals. This talk will inform the attendees on how to leverage key guidance documents and risk assessment frameworks to inform approaches used to assess potential respiratory risks related to the inhalation of chemicals across populations, as well as to understand how these approaches may be similar or different to those used to assess hazards from pharmaceutical products. Learning objectives of this talk: (1) review key nonclinical guidance documents and risk assessment frameworks from the EPA and other relevant institutions on risk evaluation of chemicals; (2) review relevant in vitro and in vivo models employed to evaluate chemical-induced hazards in the respiratory system; and (3) review examples on assessing respiratory risks across populations.
This talk will focus on identifying and addressing current challenges and future directions expected to impact the regulation of inhaled chemicals and the development of safe aerosolized pharmaceuticals. The value and limitations of animal models, the impact of laws aimed at reducing or eliminating the use of animals, and the human relevance and regulatory acceptance of alternative methods will be discussed. Deposition and PK/PD modeling, computational toxicology, in vitro/ex vivo assays, and other nontraditional approaches will change the way we evaluate inhalation hazards and related risks, but are we there now? How will these approaches be validated to ensure the usefulness of each before gaining regulatory acceptance? As such methods rapidly gain acceptance, how will professionals be trained in their use? Overall, this talk will attempt to provide a picture of the current challenges and future directions sponsors and regulatory agencies face when regulating chemicals and pharmaceuticals, as well as the benefits related to potential approaches to address these challenges. Learning objectives of this talk: (1) identify and address current challenges related to evaluating the risk of inhaled substances; (2) discuss new approaches to better predict risk after inhalation exposure; and (3) review approaches to validate these new models and gain regulatory acceptance.
Session Chairs: David Clarke, Eli Lilly and Company, Indianapolis, IN; and
Helen Prior, National Centre for the Replacement, Refinement, and Reduction of Animals in Research (NC3Rs), London, United Kingdom
Educational Support Provided by: Labcorp
Chronic repeated-dose toxicity studies are required to support late-stage clinical trials, providing data to characterize adverse effects of potential concern for human safety. This workshop will review study designs and flexible approaches, which may provide opportunities to reduce time, cost, compound requirement, and animal use within drug development programs. The outcomes from two recent industry collaborations will be presented to illustrate current practices and the value of chronic toxicity studies for different molecule types. The IQ-DruSafe consortium has new data on the predictivity of clinical adverse events from chronic toxicity studies. A study by the Netherlands Medicines Evaluation Board/UK National Centre for the Replacement, Refinement, and Reduction of Animals in Research (NC3Rs) has evaluated a weight of evidence (WOE) model supporting a three-month study for monoclonal antibodies (mAbs) rather than a six-month study. Speakers and a panel discussion will then cover aspects of study design that challenge some common practices and discuss potential new approaches to minimize animal use. These include potential opportunities for single-species chronic toxicity studies for small molecules, peptides, and oligonucleotides; whether a six-month duration nonrodent study can be used more routinely than a nine-month study (similar to ICH S6(R1) for biological products); opportunities to optimize recovery animal use where warranted; and whether restriction to one study only can be applied more widely within and outside ICH S6(R1). This workshop is applicable to all toxicologists to discuss different approaches for chronic toxicity studies to navigate the regulatory landscape while balancing human risk assessment with the 3Rs.
Results from the IQ consortium nonclinical-to-clinical translational database, composed of both small and large molecules, highlighted the importance of the current regulatory paradigm of animal toxicology studies to support FIH clinical trials.1 While nonclinical studies can demonstrate great value in the positive predictive value (PPV) for certain species and organ categories, the negative predictive value (NPV) showed the stronger predictive performance, indicating that an absence of toxicity in animal studies strongly predicts a similar outcome in the clinic. To determine the potential value of longer-duration toxicology studies, the IQ-DruSafe Leadership Group conducted part two of the translational database to evaluate whether any new safety liabilities were identified in chronic studies and to determine the PPV of such observations. The potential value of chronic toxicology studies conducted in two species and differences between the rodent and nonrodent in identifying safety liabilities will also be addressed. 1Monticello et al., 2017 TAP, 334:100–109.
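The predictive-value metrics described above can be illustrated with a short calculation. The sketch below uses invented counts (they are not drawn from the IQ consortium database) purely to show how PPV and NPV are derived from a 2x2 animal-to-clinic concordance table:

```python
# Illustrative PPV/NPV calculation for animal-to-human toxicity concordance.
# All counts are hypothetical, chosen only to demonstrate the arithmetic.

def predictive_values(tp, fp, fn, tn):
    """Return (PPV, NPV) from a 2x2 concordance table.

    tp: animal finding present, clinical adverse event present
    fp: animal finding present, no clinical adverse event
    fn: no animal finding, clinical adverse event present
    tn: no animal finding, no clinical adverse event
    """
    ppv = tp / (tp + fp)  # P(clinical AE | animal finding present)
    npv = tn / (tn + fn)  # P(no clinical AE | no animal finding)
    return ppv, npv

# Hypothetical counts for one species/organ category:
ppv, npv = predictive_values(tp=18, fp=42, fn=5, tn=135)
print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")  # PPV = 0.30, NPV = 0.96
```

With these hypothetical counts, a modest PPV coexists with a high NPV, which is the pattern the IQ analysis describes: the absence of an animal finding is the stronger predictor of the clinical outcome.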
To support the registration of monoclonal antibodies (mAbs) for chronic indications, six-month toxicity studies are generally sufficient. Experience with mAb development has shown a relatively benign and well-understood safety profile for this class, with most toxicity findings anticipated based on pharmacology. We evaluated whether a six-month toxicity study is optimal to assess the long-term safety of mAbs. Data on First-in-Human (FIH)-enabling and chronic toxicity studies were shared for 142 mAbs submitted by 11 companies. For 71% of mAbs, no toxicities or no new toxicities were noted in chronic studies compared to FIH-enabling study findings. New toxicities of concern or that changed trial design were identified in 13.5% of cases, with 7% being considered critical and 2% leading to program termination. FIH-enabling studies of three-month duration led to better prediction of toxicities in chronic studies. We developed an iterative WOE model which considers factors that influence the overall risk for a mAb to cause toxicity. This model enables an evidence-based justification suggesting when three-month toxicity studies are likely sufficient to support late-stage clinical development and registration for some mAbs.
To support clinical dosing of six months or longer, the ICH M3(R2) guideline outlines recommendations for a study duration of six months in rodents; however, regional differences exist for nonrodent studies, whereby a six-month duration is accepted within the EU while a nine-month duration is generally requested for submissions to Japan and the United States. It is therefore common for nine-month studies to be performed to provide sufficient data within one study for marketing in other regions. Given the experience gained using these dual approaches since the adoption of the original ICH M3(R2) guidance, this discussion will explore whether there could be opportunities to broaden the acceptance of six-month nonrodent studies for small molecules, peptides, and oligonucleotides, similar to current practices for ICH S6(R1) biological products.
An assessment of recovery from adverse findings is required during pharmaceutical development, but there is flexibility around how and when this is performed. Both ICH S6(R1) and ICH M3(R2) guidance outline the option for inclusion of recovery animals in a single study within a package if the use of animals is warranted. Data from recent industry collaborations were reviewed to evaluate current practices for small molecules and monoclonal antibodies (mAbs). There are examples of packages with no recovery animal use, but recovery animals are still routinely included in a high number of toxicity studies. The number of recovery groups is often minimized to control plus one test article-dosed group (usually the high dose), but inclusion in all groups is also common. Group sizes of 2M+2F (nonrodents) or 5M+5F (rodents) are typical. Recovery groups are often included in multiple studies across the FIH-enabling and chronic toxicity package for both small molecules and mAbs. The use of a recovery assessment within a single study and species, along with optimized study designs, remains an opportunity to further reduce animal use within development programs.
In 2020, a large international working group published data¹ reviewing the use of one or two species within regulatory toxicology packages for different drug modalities. The group explored a hypothetical concept for potential expansion of ICH S6(R1) approaches, whereby if two species were used during short-term toxicology studies (generally IND-enabling studies supporting Phase I trials), a single species may be sufficient for longer-term toxicology studies (e.g., up to six months dosing duration) if there were similar toxicity profiles in the two species. The potential for use of a single species for six-month studies across a wider range of drug modalities such as small molecules, peptides, and oligonucleotides will be discussed further, exploring questions around the risk associated with a reduced data package in association with the topics explored elsewhere within the session. ¹Prior et al. 2020, Regul Toxicol Pharmacol 113:104624.
Session Chairs: Bert Haenen, Janssen Pharmaceutical Companies, Beerse, Belgium; and
Cynthia Rohde, Pfizer Inc., Pearl River, NY
Over time, nonclinical safety testing of vaccines has shifted from safety testing of individual vaccine batches to GLP toxicity testing before entry into the clinic. In the last decade, new vaccine modalities have been produced (i.e., viral vector vaccines and mRNA-based vaccines) and a platform approach has been adopted for the creation of vaccines against a number of infectious diseases. These developments and several recent pandemics/outbreaks (Ebola, Zika, COVID-19) have led to changes in the nonclinical testing paradigm for new vaccines. So, when are repeat-dose toxicity evaluations for vaccines still informative? Have they ever prevented the start of an FIH study? Can we do without them, and if so, when? Can we use alternative approaches to nonclinical safety evaluation that would better align with the 3Rs, such as broadening the definition of the platform approach or including some safety parameters in pharmacology studies? This symposium will present the results of a survey among vaccine companies related to these questions and the current approaches in use. The pharma view will be presented by GSK (with a focus on adjuvants) and Pfizer (effects of the platform approach and small formulation changes), and the views of an EU regulator and FDA will also be provided.
Safety testing of vaccines changed rigorously during the 20th century: previously, vaccines were accepted on the basis of the so-called abnormal toxicity test and tests for sensitizing potency and pyrogenicity, all conducted on every batch. When vaccines were brought under EU legislation on medicines, EU experts wrote a guideline formulating the requirements for modern vaccines to support safety, as was requested for all other pharmaceuticals for human use in EU Directive 75/318 (now 2001/83). Since then, many new vaccines have received marketing authorization, including the COVID-19 vaccines during the recent pandemic. A review of GLP repeated-dose toxicity data, as well as results from the reproductive toxicity studies of the COVID-19 vaccines, has thus far revealed that the animals showed only inflammatory signs related to the vaccine and no remaining toxicities. The studies were conducted using the standard low number of animals and were not designed to pick up the (very) rare adverse effects seen in humans. It will be discussed whether the low impact of these studies justifies the use of these animals from an animal welfare point of view. Animals should not be used only to please the regulatory authorities.
In this symposium, the position of vaccine-producing companies on nonclinical safety testing of prophylactic and therapeutic vaccines will be presented. The survey covers all current vaccine modalities, from classical antigen vaccines to vectorized and mRNA vaccines. Among the questions asked in the survey: Have repeat-dose toxicity studies ever prevented the start of an FIH study of a vaccine, and if so, what was the reason? When does vaccine pharma think repeat-dose toxicity studies are informative? Do adjuvanted vaccines always need to be tested in animals before human use? Are repeat-dose toxicity studies performed just because the guideline asks for them? What kind of change in vaccine formulation necessitates a repeat of nonclinical safety testing? Can pharmacology studies with built-in safety parameters replace repeated-dose toxicity studies for vaccines? How many repeat-dose toxicity studies constitute a platform approach for safety testing?
This presentation will focus on two questions: (1) How can platform data be used to inform safety and potentially reduce time to clinical administration versus the normal test paradigm? and (2) Are toxicity studies useful for formulation changes with known variables? To address these questions, the speaker will provide case examples related to taking a platform vaccine approach with mRNA COVID-19 vaccines, developing next-generation vaccines that incorporate additional serotypes targeting the same bacterial strain, and making changes to vaccine formulation excipients (excluding adjuvants) or adding established preservatives (e.g., 2-phenoxyethanol).
An overview of the toxicology programs conducted to support the development of adjuvants and adjuvanted vaccines will be presented. Industry case examples will be shared providing a platform approach that could potentially be utilized for the toxicology evaluation of new adjuvanted vaccines.
One of the FDA's primary objectives is to assure the safety of enrolled patients or healthy volunteers in clinical trials. Stringent nonclinical evaluation of new vaccines, before testing in humans, is a critical factor in the evaluation of the vaccine’s safety. GLP repeat-dose toxicology studies are an important part of the nonclinical safety evaluation of vaccines. In recent years, several platform vaccine systems have been developed with the potential to easily insert new antigens. Such platform vaccines can be tremendously helpful for quickly and safely developing vaccines, especially in the case of a potential pandemic outbreak. Evaluating prior nonclinical and clinical data of highly similar vaccine products can help to develop a tailored and focused nonclinical toxicology program for such platform vaccines. In this presentation, we will discuss the nonclinical safety evaluation of platform vaccines, especially during pandemic outbreak situations.
Session Chairs: Jacquelynn Lucas, BlueRock Therapeutics, Cambridge, MA; and
Zhechu Peng, Boehringer Ingelheim Pharmaceuticals, Inc., Ridgefield, CT
Educational Support Provided by: Boehringer Ingelheim and American College of Toxicology
Over the last few years, multiple gene therapies have been approved by regulatory agencies and numerous clinical-stage assets are approaching regulatory filings. Fueled by the identification and invention of novel gene delivery vectors, gene therapy efforts now hold promise for treating a wide range of diseases and are seen as a crucial part of the growth of the biopharmaceutical industry. The industry has witnessed the use of recombinant adeno-associated virus vectors (rAAVs) for in vivo or systemic treatments, and the use of lentiviral vectors (LVs) for ex vivo gene-modified cell therapies. Together with an increased interest in utilizing recombinant viral vectors, ongoing research continuously reveals new knowledge of the effects of viral vectors on the host genome, which may affect the mutagenic and carcinogenic potential of these therapies. This research further impacts the regulatory requirements for mutagenicity and carcinogenicity risk assessment of gene therapy products. This session focuses on the history and current knowledge of the mutagenicity/carcinogenicity risks associated with rAAVs and LVs and addresses current and future regulatory requirements. Three topics will be discussed in this session: (1) an overview of the history and current scientific knowledge on rAAV- and LV-related mutagenicity/carcinogenicity risk, as well as the current regulatory landscape; (2) two case studies on the mutagenicity/carcinogenicity risk assessment of rAAV and LV products; and (3) the current regulatory perspective on the mutagenicity/carcinogenicity assessment of rAAV and LV gene therapy products.
Adeno-associated viral (AAV) vectors are currently in various stages of clinical trials for the targeted delivery of genes to treat genetic diseases. At present, two AAV-based therapies, Luxturna for treatment of a rare inherited retinal dystrophy and Zolgensma for treatment of spinal muscular atrophy, have been approved for marketing by FDA. In recent years, lentiviral vectors (LVs), particularly self-inactivating (SIN) LVs, have been used for ex vivo modification of T cells to generate chimeric antigen receptor (CAR) T cells for cancer immunotherapy and have revolutionized cancer treatment. Recently, Kymriah, an engineered CAR T cell therapy that uses a lentiviral vector to stably deliver the anti-CD19 CAR gene construct to cells, was approved by FDA for the treatment of pediatric B-cell acute lymphoblastic leukemia. One safety concern with viral vectors is potential vector-mediated genotoxic events that may lead to the insertion of the viral construct into specific sites within the genome. These events are thought to occur primarily through promoter insertion, promoter activation, or gene transcript truncation, with enhancer-mediated activation of an active gene near the site of insertion leading to the upregulation of cellular proto-oncogenes. Both viral and nonviral factors influence the potential genotoxicity/carcinogenicity of viral vectors: viral risk factors include virus type, integration target site, and target cell type, while nonviral risk factors include age, disease, and dosage. In this overview, the current scientific understanding of AAV- and LV-mediated genotoxicity/carcinogenicity in animal models and humans, related risk factors, and the associated risk will be discussed.
Third-generation self-inactivating (SIN) lentiviral vectors (LVs) have now been used in multiple clinical trials to introduce transgenes into hematopoietic stem cells (HSCs) to correct various monogenic diseases, including lysosomal disorders, primary immunodeficiencies, and hemoglobinopathies. Since 2003, these LVs have been used to treat more than 400 patients with mono-inherited genetic diseases (some now more than 10 years post-infusion) using ex vivo LV-transduced CD34+ cells, and several thousand oncology patients using ex vivo LV-transduced T cells to produce either chimeric antigen receptor T cells (CARTs) or T cell receptor (TCR) T cells. Preclinical evaluation of potential adverse mutagenic events and tumorigenicity relies on a combination of in vitro and in vivo assessments. The in vitro immortalization (IVIM) assay and next-generation surrogate assays for genotoxicity assessment (SAGA) aid in evaluating the risk of vector-induced transformation in an in vitro setting and insertion site analysis (ISA) is used to assess the integration site profile after in vivo transplant of the LV-transduced HSCs in animal models. Together with robust toxicology evaluation (e.g., histopathology, vector copy number, hematopoietic cell reconstitution), the results of these assessments may contribute to characterizing the potential risk of vector-induced adverse mutagenicity for these therapies.
Wild-type adeno-associated viruses (WT AAV) can integrate at a low rate into the host genome, either preferentially (human chromosome 19 [q13.4 ter]) or nonspecifically. Integration of the WT AAV genome adjacent to oncogenes in hepatocellular carcinoma has been observed, although a causal relationship has not been established. Recombinant AAV (rAAV) vectors are replication-defective and contain no viral genes; thus, the expression cassette lacks the ability to integrate site-specifically like WT AAV. Nonetheless, low-frequency integration events remain possible following administration of an rAAV and have been demonstrated to occur. Because rAAV vectors, unlike retroviral vectors, do not encode proteins that cause double-stranded DNA breaks and must rely on the host’s cellular machinery, it is currently thought that rAAV vectors preferentially integrate into genomic DNA regions that are already broken or being actively transcribed. The toxicologist must consider these traits of rAAV vectors, as well as clinical dose, biodistribution, target tissue(s), animal model(s), and duration of nonclinical studies, when assessing the carcinogenicity risk of rAAV gene therapy products. Patient risk factors and the current standard of care must also be considered when developing a balanced and data-driven risk assessment for a specific product. This presentation will further describe these considerations, methods for detection, and practical approaches for risk assessment for an rAAV gene therapy.
This talk will provide a CBER/FDA regulatory perspective on preclinical considerations in evaluating the mutagenicity and carcinogenicity risks of rAAV- and LV-based gene therapies, including preclinical study designs, selection of animal species/disease models, the challenges in addressing vector-related genotoxicity concerns, and opportunities to interact with CBER/FDA.
Session Chairs: Smita Salian-Mehta, Gilead Sciences, Foster City, CA; and
James Smith, Boehringer Ingelheim Pharmaceuticals, Inc., Ridgefield, CT
Educational Support Provided by: Charles River and Labcorp
The reduced availability of nonhuman primates (NHPs) for biomedical research has been a growing concern for several years but is now widely viewed as a challenge to the pharmaceutical industry’s ability to deliver innovation. Continuous efforts to evaluate animal usage in preclinical studies, combined with the supply scarcity, have further motivated pharmaceutical companies to reexamine their animal usage practices. Research scientists affiliated with the International Consortium on Innovation and Quality in Pharmaceutical Development (IQ) conducted cross-industry surveys to identify and evaluate how the application of the 3Rs principles to traditional toxicology study designs could reduce and refine animal use without negatively impacting scientific data quality and regulatory acceptability. The presentations in this symposium cover: (1) survey outcomes from an IQ NHP Reuse Working Group (WG) survey on the practice and challenges of reusing NHPs that are not protein naive; (2) survey outcomes from an IQ NHP Husbandry WG survey concerning NHP health issues that occur in the prestudy and dosing phases; (3) cross-industry approaches to utilizing nonterminal recovery controls in nonrodent GLP toxicology studies; (4) benchmarking of the use of recovery rodents and nonrodents from the IQ Recovery Animals WG, with case-study examples challenging standard study design; and (5) the application of historical control databases in place of control rodent and nonrodent animals in the recovery phase. This symposium intends to expand the consideration of reduction/refinement by raising awareness of pharmaceutical industry strategies and experiences that have an impact beyond the current NHP supply issues.
The IQ Consortium NHP Reuse WG was formed due to the ongoing ethical discussion around NHP use within the pharmaceutical industry and comprises members from 15 pharmaceutical and biotechnology companies. In 2020, the WG developed and distributed a detailed questionnaire on protein non-naïve NHP reuse within nonclinical programs at all stages of research and development to the WG member companies. Responses were received from key stakeholders that included principal investigators, facility managers, animal welfare officers, and research scientists. This presentation will focus on the questionnaire responses and discuss the challenges and opportunities surrounding protein non-naïve NHP reuse, and future directions recommended by the WG.
Abnormal fecal changes and stress-related health conditions are common in NHPs in a research setting and significantly impact animal health and well-being. Over the past several years, successful management of these conditions has become even more critical given the severe shortage of NHPs and the use of younger animals. A deeper understanding of the extent and management of these issues is warranted, as they also have the potential to interfere with data quality or interpretation, potentially requiring the use of more animals or the repetition of studies. A cross-industry survey was conducted by the NHP Husbandry WG, cosponsored by the IQ Consortium’s DruSafe and 3Rs Translational and Predictive Sciences Leadership Groups, to determine the frequency of NHP health issues and benchmark current NHP husbandry practices for addressing these issues before placing animals on studies. Key findings from this initial survey indicated that excluding NHPs from study for gastrointestinal health issues and signs of abnormal stress resulted in higher-quality study cohorts with better overall health. In addition, preventative health practices, especially diet modifications, environmental enrichment, and behavior assessment training, have a positive impact on NHP health and study outcomes. This presentation will discuss the results from the prestudy-phase survey and proposed recommendations for husbandry practices during the study phase that can minimize health issues, enhance animal welfare consistent with the 3Rs principles, and improve data quality from toxicity studies utilizing NHPs.
Historical control data are routinely used in the evaluation of reproductive toxicity and carcinogenicity studies. Despite recognized challenges in developing and maintaining comparable data sets, including variability related to animal genetics, husbandry, and anatomic pathology endpoints, the use of control data in these studies is integral. In general toxicity programs, historical data sets are more commonly employed only when there is a need to determine the probability that an unusual finding in a single animal is due to the animal’s background experience versus a test article-related event. Today, with the increasing availability of large datasets and advanced computational modeling tools, the potential to expand the use of historical datasets for the development of virtual controls is being explored. For example, the National Centre for the Replacement, Refinement and Reduction of Animals in Research (NC3Rs) is currently sponsoring a CRACK IT Challenge to develop a Virtual Second Species to model toxicology endpoints. This presentation will review the challenges and limitations of the use of historical control animals in current practice and discuss how these new modeling efforts may address those challenges, with the eventual goal of reducing the number of animals needed as on-study controls.
The inclusion of recovery animals in preclinical studies supporting clinical trials is undertaken with a diversity of approaches despite operating under the same 3Rs principles and guidance. While the use of recovery animals may provide a valuable assessment of delayed toxicity or reversibility of adverse toxicity, there are instances where such an assessment may be confidently made while also minimizing animal use. This presentation uses specific case studies provided by IQ Consortium Recovery Animals WG members (sponsored by the IQ 3Rs Translational and Predictive Sciences Leadership Group) to represent their company strategies and to highlight challenges and opportunities for continuous refinement in the use of recovery animals. Rationales behind the inclusion/exclusion of recovery animals will be shared to increase awareness of successfully progressing drugs to clinical application while minimizing the utilization of recovery animals.
General toxicology studies typically contain a vehicle (control) group and low-, intermediate-, and high-dose test article treatment groups. In addition, to demonstrate the reversibility of toxicities observed during the dosing phase, a treatment-free period (recovery phase) is sometimes included, consisting of a subgroup of animals from the control group and one or more treatment groups. While recovery-phase controls are often not essential for the interpretation of toxicology study results, they are frequently terminated along with test article-treated recovery animals, which can be avoided in routine toxicology studies. Not terminating, and instead reusing, animals designated as recovery controls in nonrodent GLP toxicology studies therefore supports the 3Rs goal of reducing animal usage and helps address the significant shortage in the supply of NHPs. This talk provides opinions on scenarios that may or may not necessitate the termination of control animals from the recovery phase in nonclinical toxicology studies conducted in nonrodent species such as dogs and monkeys.
Session Chairs: Shuangwei Li, Calibr at Scripps Research, San Diego, CA; and
Grace Furman, Paracelsus, Inc., Leucadia, CA
The FDA Animal Rule provides a unique product development pathway for the approval of new drugs and licensure of biological products when human efficacy studies are not ethical and field trials are not feasible, codified in 21 CFR 314.600 through 314.650 for drugs and 21 CFR 601.90 through 601.95 for biological products. This symposium will introduce how to apply the FDA Animal Rule principles to an IND-enabling program and how to integrate pharmacology, toxicology, and clinical pharmacology for registration. This symposium will also provide perspectives on how Animal Rule concepts may be used for human dose selection.
The development of medical countermeasures (MCMs) against chemical, biological, radiological, and nuclear (CBRN) agents, pandemic influenza, and emerging infectious diseases presents unique regulatory challenges. For many of these threats, clinical trials in support of MCM approval are not feasible or ethical to conduct. The Animal Rule is an alternative pathway for US Food and Drug Administration (FDA) approval of MCMs, under which well-characterized animal models are used to assess the efficacy of proposed MCMs. The Biomedical Advanced Research and Development Authority (BARDA), within the Office of the Assistant Secretary for Preparedness and Response (ASPR) in the US Department of Health and Human Services (HHS), along with the Department of Defense (DoD) and the National Institutes of Health (NIH), are key players that promote MCM development together with industry partners to protect Americans and respond to 21st-century health security threats. BARDA supports a diverse portfolio of MCMs, of which 62 products have received FDA approval, licensure, or clearance, some via the Animal Rule development pathway. This interactive presentation will discuss the requirements for using the Animal Rule drug development pathway, BARDA’s experience in developing MCMs under the Animal Rule, practical guidance on when to use the Animal Rule, and the need for early interactions with the appropriate FDA review division to ensure that the Animal Rule is the most appropriate regulatory pathway.
Each nonclinical development program for drug development under the FDA Animal Rule is unique due to its indication, relevant animal models, and the drug modality. This presentation will discuss differences between nonclinical studies typically performed to support traditional drug development and nonclinical studies used to support evidence of efficacy under the Animal Rule. This presentation will also introduce some of the strategies that have been used in support of Animal Rule applications.
Because information from animal efficacy studies is used directly to assess and evaluate the safety and efficacy of candidate medical countermeasures, animal models utilized in Animal Rule programs should be well characterized and may be qualified in accordance with the FDA Drug Development Tools (DDT) program. This presentation will introduce considerations that guide the development of animal models, potential limitations in developing animal models, and an example of an animal model qualified under the FDA DDT program.
This presentation will introduce how a human efficacious dose is determined under the FDA Animal Rule and the challenges in human dose selection under this unique regulatory pathway. In addition, this presentation will also discuss opportunities for integrating clinical pharmacology knowledge and the application of the totality of evidence for human dose selection in other emergent situations.
Smallpox is one of the deadliest infectious diseases in human history, and despite its official eradication in 1980, it is still viewed as a significant threat due to its potential as a biological weapon. TEMBEXA (brincidofovir) is an orally bioavailable lipid conjugate of cidofovir with potent in vitro activity against variola virus, the causative agent of smallpox. Since clinical efficacy studies were not feasible, the efficacy of TEMBEXA was demonstrated in two well-characterized animal models (mousepox and rabbitpox) in accordance with the FDA Animal Rule. The selection of an effective human dose was based on extrapolations from therapeutic exposures in animals using modeling and simulation. Both circulating brincidofovir and peripheral blood mononuclear cell concentrations of the active metabolite, cidofovir diphosphate, were used to establish that adequate exposure would be achieved over the two-week treatment period in humans. This talk will focus on the complexities of drug development under the Animal Rule and the specific challenges faced with TEMBEXA in establishing that brincidofovir should be effective against smallpox.
Session Chairs: Mansi Krishan, Meta Platforms, Inc., Broomfield, CO; and
Douglas A. Donahue, GlaxoSmithKline, Rockville, MD
The European Register of Toxicologists (ERT), maintained by the Federation of European Toxicologists and European Societies of Toxicology (EUROTOX), was established in 1994. It constitutes a list of toxicologists who excel by high standards of education, skills, experience, and professional standing. The intention is to foster competence in practice and science and to provide the public with an authoritative source of information on toxicological competencies. The high professional competence of ERT-registered toxicologists is recognized across Europe and worldwide. This is a joint session between the ACT Early Career Professional Subcommittee and EUROTOX. Speakers in this interactive session (1) will provide an overview of the ERT and its value to early career toxicologists; (2) will present requirements defined by EUROTOX and the National Societies of Toxicology for ERT registration and reregistration; and (3) recently registered ERTs will share the application process through one of the National Registers. The session will conclude with a panel discussion in which recently registered and reregistered ERTs will share their experiences, the applicability of the ERT within the US, and the utility of registration for career advancement, while answering additional questions from the audience.
The need for professional recognition of qualified toxicologists emerged in Europe in the 1980s. The German, British, and Dutch toxicology societies started developing national registers, which led to a joint European registration under EUROTOX in 1994. The aim was to recognize proficiency and provide Europe with qualified toxicologists through harmonized and standardized criteria, although there are slight, specific differences between the current 21 National Registers. In most instances, National Registers are open to application from toxicologists globally. Upon application, each National Registration Board evaluates the applications and admits successful applicants to its register. Subsequently, upon request from the National Register, EUROTOX certifies these individuals as ERTs without further evaluation. The requirements for becoming an ERT encompass an academic degree in a toxicology-relevant subject, basic competence in 14 core (obligatory) and two out of nine specialized (elective) topics of toxicology, at least five years of relevant toxicological experience, documentation of that practical experience, and current professional engagement in the practice of toxicology. The overall history, requirements, and application and recertification processes will be summarized and discussed in this presentation.
The ERT is an accreditation system initiated within Europe but is now recognized worldwide. ERT ensures that practicing toxicologists first achieve and then retain an appropriate level of professional competence to practice toxicology via education, training, and continued professional development. Overall, this advances our community by ensuring that the discipline of toxicology is respected and trusted within the scientific community. Membership in the ERT is increasingly expected within the European toxicology community as an accreditation held by those making key decisions on human and environmental safety. In some instances, European authorities have requested or required membership of the ERT to validate the status and credibility of toxicologists involved in submission work. Thus, it is a considerable advantage for toxicologists at all stages of their careers to consider applying to join the ERT.
The title ERT is accorded to qualified individuals who have been nominated to EUROTOX by their National Register. To begin the application process, the individual needs to determine which National Register will review the application and make the nomination. Not all countries have the same requirements: some have residency requirements, others have nationality requirements, and some require the individual to be a member of the National Register. Once applicants have selected the National Register they plan to apply through, the next step is to develop the application according to the requirements of that register. This presentation will discuss the application process for ERT through the UK Register of Toxicologists.
This will be a panel discussion in which the audience may ask questions, including specific questions about their own ERT applications. Attendees are encouraged to bring their ERT applications, questions, concerns, and thoughts to discuss during the panel.
Session Chairs: Jane Sohn, US FDA, Silver Spring, MD; and
Janice Lansita, ToxAlliance, Kennett Square, PA
A traditional part of the final afternoon of the ACT Annual Meeting is the Hot Topics Symposium. Each year this session includes a variety of topics from leading experts focusing on late-breaking regulatory or scientific advances related to toxicology.
Effective February 1 of this year, the European Medicines Agency (EMA) changed when it releases information about pharmaceuticals in development to the public. The EMA will now post information shortly after first-in-human studies are performed, a major shift from waiting until products are approved. Many companies initiate first-in-human studies in Europe by filing their investigator brochure (IB). The guidance states that the IB will be considered proprietary and not posted for up to five to seven years. Because there is as yet no experience with the release of IBs, companies should assume that all IBs will be available to the public immediately. This presentation will focus on the new guidance document and changes companies may want to consider in their nonclinical IB submissions to Europe.
In this talk, participants will learn about recent findings regarding the human and ecological impacts of microplastics in water, and the application of a novel probabilistic risk assessment framework to inform management of microplastics in an estuarine environment.
T cell-engaging bispecific antibodies (TCBs) have shown promising clinical activity and are currently in development by many companies for various oncology indications. By simultaneously binding CD3 on T cells and a specific tumor-associated antigen on cancer cells, TCBs create a so-called immunological synapse, resulting in cytokine release and potent antitumor activity at relatively low doses in nonclinical models. The in vivo nonclinical safety assessment of these new modalities has proven challenging because of pronounced immunogenicity and/or dose-limiting cytokine release arising from their unique pharmacodynamic mode of action. Part one of this presentation will provide an overview of current industry experience, present case examples outlining the various nonclinical challenges, and discuss opportunities to optimize the use of nonhuman primates for nonclinical safety assessment of these biotherapeutics. Part two will provide a regulatory perspective on nonclinical safety strategies for these modalities based on current agency experience.
Session Chairs: Leanne Bedard, Bedard ADME-Tox Solutions, Kirkland, QC, Canada; and
Grace Furman, Paracelsus, Inc., Leucadia, CA
Educational Support Provided by: Roundtable of Toxicology Consultants and Veloxity Labs, LLC
As a follow-up to last year’s workshop, “So You Want to Be a Consultant,” this interactive mini symposium will address one of the most important (and confusing) components of consulting agreements: conflicts of interest. When first striking out on their own as business owners, many consultants do not know how to effectively negotiate consulting contracts, and new consultants may be swayed to accept less than optimal language in their agreements. Yet companies are eager to enter into consulting agreements with you because of your expertise and reputation; that much-needed experience gives you leverage. Currently practicing toxicology consultants will share their perspectives and practical experiences with different approaches to conflict of interest language in agreements, so that you can collaborate effectively with a potential new client while protecting your consulting business and maintaining the freedom to operate. Specific, real-world examples of contractual conflicts of interest encountered by consultants will be shared, such as working for competing clients, interacting with CROs, and being compensated with equity (stock options or shares). Session participants are encouraged to bring their questions and experiences to the mini symposium. All toxicologists who are currently consulting or considering consulting as a career are encouraged to attend what will no doubt be an engaging and thought-provoking session!
Session Chairs: Lorrene Buckley, Eli Lilly and Co., Inc., Indianapolis, IN; and
Dexter Sullivan, Gad Consulting Services, Raleigh, NC
Educational Support Provided by: Eli Lilly and Company
The final formulated drug product of an orally administered (small molecule) drug may be packaged in various formats—tablets in a bottle or a blister pack, for example. It is the product that the patient buys, uses, and stores in the medicine cabinet. Apart from physicochemical properties, there are safety considerations that influence what packaging is selected for the commercial product. While a product must be accessible to the patient (e.g., older patients), it must often also be inaccessible to young children. The Consumer Product Safety Commission (per the Poison Prevention Packaging Act of 1970) established guidance to determine when child-resistant packaging is required. To determine the most appropriate child-resistant packaging, a toxic dose in children must be estimated. The toxic dose is the amount of drug product that may produce serious personal injury or serious illness in an 11.4 kg child (approximately four years old). The uncertainties inherent in estimating such a dose, and concern for the safety of children, prompt a careful assessment of all available data and consideration of several factors, including bioavailability, the nature and reversibility of the effect, and extrapolation of animal and adult human data to the pediatric setting. Typically, the clinical pharmacologist, in partnership with the toxicologist, will take the lead role in establishing the estimated dose. Attendees will leave this session with a practical understanding of why these assessments are necessary, factors to consider in developing a relevant safety opinion, and the skill sets that inform these opinions.
This talk will introduce the topic, including considerations for appropriate packaging of oral drug products and how safety assessments inform packaging decisions. Speakers will provide the perspectives of the CMC packaging engineer, the toxicologist, and the clinical pharmacologist.
Engineers in Container Closure Packaging groups of pharmaceutical companies are responsible for developing and submitting container closure systems for use with oral drug products to regulatory authorities. This talk will provide the CMC perspective regarding the regulatory basis for packaging requirements and the timing and factors that drive packaging considerations for various oral drug products in development.
Since the enactment of the Poison Prevention Packaging Act, several household substances, including select prescription and over-the-counter pharmaceuticals, have required child-resistant packaging. The goal of child-resistant packaging is to reduce the risk of poisoning in children. Many different types and strengths of packaging are available and are selected based on the potential hazard posed to children. This talk will provide an overview of the regulations for child-resistant pharmaceutical packaging and explore safety assessment approaches for determining the appropriate packaging.
For pharmaceuticals in development, a wealth of data from nonclinical pharmacology and toxicology studies and from clinical trials supports clinical development. An expert evaluation led by the clinical pharmacologist, in partnership with the toxicologist and drug disposition scientist, is required to determine the most relevant data to guide decisions regarding pediatric safety in the context of accidental ingestion.
Session Chairs: Grace M. Furman, Paracelsus, Inc., Leucadia, CA; and
Derek Leishman, Eli Lilly and Company, Indianapolis, IN
Educational Support Provided by: Roundtable of Toxicology Consultants and Veloxity Labs, LLC
In March 2022, the E14/S7B Implementation Working Group (IWG) published initial training materials and ICH E14/S7B Q&As (adopted February 2022) on best practices for the design, conduct, analysis, interpretation, and reporting of in vitro, in silico, and in vivo nonclinical assays, so that these assays can inform nonclinical and clinical evaluations. The training materials include four topics specifically relating to the ICH S7B guideline: (1) Integrated Risk Assessment; (2) Best Practice Considerations for In Vitro Studies; (3) Best Practice Considerations for In Vivo QT Studies; and (4) Principles of Proarrhythmia Models. What impact will ICH S7B have on the nonclinical cardiovascular assessment of new drug products? What approaches should toxicologists recommend to their development teams or clients based on these materials? In this mini symposium, experts will share their interpretations of the emerging ICH S7B-related science and discuss how they recommend implementing current strategies and methodologies. The theme of this mini symposium will be an interactive dialogue among all participants, to share and learn from the collective experience. The session co-chairs, Grace Furman and Derek Leishman, will facilitate the discussion. All toxicologists with an interest in advances in the science and methods related to the assessment of QT prolongation are encouraged to participate in this interactive and thought-provoking forum!