Non-Compartmental Analysis of Pharmacokinetic Data: Considerations to Gain Efficiencies and Meet SEND Requirements
Posted by Dan Lynch (/author/dan-lynch.html)
Published on May 16, 2018, 3:14 PM (https://blog.covance.com/2018/05/non-compartmental-analysis-of-pharmacokinetic-data-considerations-to-gain-efficiencies-and-meet-send-requirements.html)

Pharmacokinetic (PK) data gathered in the early phases of a drug discovery program can provide insights into a
compound’s mechanism of action, identify specific attributes of interest and guide decision points to optimize
downstream development. Selecting the most appropriate analysis technique is essential to computing PK
parameters.

This article discusses how non-compartmental analysis (NCA) of pharmacokinetic data can support regulatory filings, feed predictive simulations and help researchers select lead molecules or formulations. We also explore data handling, as differing approaches and anomalous results can cause delays through investigations and introduce inconsistencies across a program. Finally, we cover unique considerations when working with biologics and the challenges involved in submitting regulatory filings formatted to the Standard for Exchange of Nonclinical Data (SEND) specifications.

Setting up your sampling scheme

Protocol development represents an ideal time to discuss study endpoints with your study director and pharmacokineticist. A robust, efficient sampling scheme is best reverse-engineered from the critical pharmacokinetic (PK) parameters. To start, it is universally important to capture the maximum concentration (Cmax), time of maximum concentration (Tmax), and area under the curve (AUC). Additional samples during the absorption phase (to estimate the absorption rate constant, Ka) and the terminal elimination phase (to estimate half-life, clearance and volume of distribution: t1/2, CL and Vd, respectively) can be equally vital. These parameters support compartmental modeling for translational simulations and predictions, development or refinement of more advanced pharmacokinetic/pharmacodynamic (PK/PD) and physiologically based pharmacokinetic (PBPK) models, and strategic adjustments to dose levels or frequencies.
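As a concrete illustration of the core NCA quantities named above, here is a minimal sketch that computes Cmax, Tmax, AUC (linear trapezoidal) and terminal half-life from a single concentration-time profile. The function and variable names are our own, the example profile is a made-up one-compartment IV curve, and this is not a validated workflow:

```python
import numpy as np

def nca_params(times, conc, n_terminal=3):
    """Estimate basic NCA parameters from one concentration-time profile.

    AUC uses the linear trapezoidal rule up to the last sample; the
    terminal half-life comes from a log-linear fit of the last
    `n_terminal` points (an assumption that those points are in the
    elimination phase).
    """
    t = np.asarray(times, dtype=float)
    c = np.asarray(conc, dtype=float)
    cmax = float(c.max())
    tmax = float(t[c.argmax()])
    # linear trapezoidal AUC over the sampled interval
    auc_last = float(np.sum(np.diff(t) * (c[:-1] + c[1:]) / 2.0))
    # terminal elimination rate: slope of ln(C) vs t over the last points
    slope, _ = np.polyfit(t[-n_terminal:], np.log(c[-n_terminal:]), 1)
    lambda_z = -slope
    t_half = float(np.log(2) / lambda_z)
    return {"Cmax": cmax, "Tmax": tmax, "AUClast": auc_last, "t1/2": t_half}

# Hypothetical one-compartment profile: C(t) = 100 * exp(-0.1 * t)
t = np.array([0.25, 0.5, 1, 2, 4, 8, 12, 24])
c = 100 * np.exp(-0.1 * t)
print(nca_params(t, c))
```

Because the example data are exactly mono-exponential, the log-linear fit recovers the true elimination rate (0.1/h), giving a half-life of about 6.93 h; real profiles require judgment in choosing the terminal points.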

In toxicology studies, the toxicokinetic (TK) parameters are a secondary endpoint and, accordingly, the sampling schemes are less robust. Figure 1 highlights the risk of characterizing elimination with a limited sampling scheme. A stripped-down approach centered on Cmax and AUC should be considered to allow a focus on exposure and, ultimately, safety margin. In addition, including additional trough samples during the dosing phase can provide important information on the magnitude and onset of accumulation, while collections during a recovery phase can help confirm a rate of elimination similar to that seen in pharmacokinetic studies for compounds with longer half-lives, especially biologics.

Figure 1: Concentration-Time profiles showing additional time points in the elimination phase

Managing data handling


In parallel with sampling scheme discussions, a plan for handling data and unexpected study events should be outlined. In the preclinical realm, the limited blood volume available from research models within PK dosing windows and the limited number of research models on a study place a cap on the precision of the results. Therefore, a simplistic data handling approach using nominal doses and sampling times, unless SOP-defined deviations are documented, can provide consistent parameters and, in our experience, is preferred by the majority of the industry.

An area with a wider variety of opinions in the industry is the treatment of concentrations that are below the lower limit of quantitation (BLQ). As bioanalytical methods have become more sensitive, the portion of the AUC affected by BLQ values has decreased, as presented in Figure 2. Taking a conservative approach for toxicology work by unconditionally treating BLQs as zero will minimize the exposure at any dose level and ensure future projections are not based on inflated values. For consistency, we also recommend this approach for pharmacokinetic studies. Ultimately, the most important part of choosing BLQ handling criteria is consistency across the program. This ensures the same set of assumptions is applied when calculating parameters and bringing data together for regulatory filings.

Figure 2: Concentration-time profiles showing impact of different BLQ handling
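The effect of BLQ handling on AUC can be sketched numerically. This hypothetical example (made-up concentrations and an assumed LLOQ of 1 ng/mL) compares the conservative BLQ-to-zero substitution against an alternative LLOQ/2 substitution that some groups use:

```python
import numpy as np

def auc_trapezoid(t, c):
    """Linear trapezoidal AUC over the sampled interval."""
    t, c = np.asarray(t, float), np.asarray(c, float)
    return float(np.sum(np.diff(t) * (c[:-1] + c[1:]) / 2.0))

# Hypothetical profile where the last two samples fall below the LLOQ
t = np.array([0.5, 1, 2, 4, 8, 12, 24])
c = np.array([50.0, 80.0, 60.0, 20.0, 5.0, 0.8, 0.2])  # last two are BLQ
lloq = 1.0

blq_as_zero = np.where(c < lloq, 0.0, c)        # conservative: BLQ -> 0
blq_as_half = np.where(c < lloq, lloq / 2.0, c) # alternative: BLQ -> LLOQ/2

print(auc_trapezoid(t, blq_as_zero))  # smaller AUC (exposure minimized)
print(auc_trapezoid(t, blq_as_half))
```

Setting BLQs to zero always yields the smaller AUC, which is why the text above calls it the conservative choice for safety-margin work; the difference shrinks as assay sensitivity improves and fewer points fall below the LLOQ.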


At bioanalytical data receipt, the pharmacokineticist will perform a full review of the data, including sample reconciliation, assessment of study conduct and a check for potentially anomalous values. If anomalous values are observed, consider investigating any root cause in study conduct (for example, a missed dose, a mistimed sample or vomitus following oral administration) or in bioanalysis (such as an improperly diluted sample or a sample switch).

If no cause is identified, statistical tests such as Dixon’s Q test can be used to determine whether any value is an outlier. Any outlier, or any value impacted by study conduct or bioanalysis, could be excluded from interpretation, while an anomalous value that is not a statistical outlier may be presented with PK results both including and excluding the value, so the reader can judge its overall impact on the PK interpretation.
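As a sketch of how such a test works, here is a minimal Dixon's Q implementation. Q is the gap between the suspect value and its nearest neighbor divided by the full range; the critical values below are the commonly tabulated ~95% confidence levels for n = 3-10, which vary slightly between published tables, so treat this as illustrative rather than a validated procedure:

```python
def dixon_q_test(values, q_crit=None):
    """Return the most extreme value if Dixon's Q flags it as an outlier,
    otherwise None. Valid for small samples (n = 3 to 10)."""
    # Approximate two-tailed 95% critical values (illustrative; published
    # tables differ slightly, especially for larger n)
    Q95 = {3: 0.970, 4: 0.829, 5: 0.710, 6: 0.625, 7: 0.568,
           8: 0.526, 9: 0.512, 10: 0.477}
    x = sorted(values)
    n = len(x)
    rng = x[-1] - x[0]
    if rng == 0:
        return None
    q_low = (x[1] - x[0]) / rng    # is the smallest value suspect?
    q_high = (x[-1] - x[-2]) / rng # is the largest value suspect?
    crit = q_crit if q_crit is not None else Q95[n]
    if q_high >= q_low:
        return x[-1] if q_high > crit else None
    return x[0] if q_low > crit else None

# Example: one anomalously high concentration among replicates
print(dixon_q_test([12.1, 12.4, 12.6, 12.5, 19.8]))  # -> 19.8
print(dixon_q_test([12.1, 12.4, 12.5, 12.6, 12.9]))  # -> None
```

In the first set, Q = (19.8 - 12.6) / (19.8 - 12.1) ≈ 0.94, well above the n = 5 critical value of about 0.71, so the high value is flagged; the second, tighter set produces no flag.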

Incorporating special considerations for biologics

Recombinant protein technology and RNA interference have opened up new classes of molecules that offer high efficacy with minimal off-target toxicity. As the industry continues to shift towards these biologic drugs, a variety of approaches is required to handle the many different classes of therapies, including monoclonal antibodies (mAbs), antibody drug conjugates (ADCs), and small interfering RNAs (siRNAs). We can provide guidance on the additional considerations for PK/TK analysis that these molecules require, such as the presence of potential anti-drug antibodies (ADAs).

During protocol development, the sample collection scheme may need to be altered from a traditional small-molecule approach to ensure that critical PD endpoints are also captured, for example, sample collections at both Cmax and the maximum effect, Emax. For mAbs, the recovery phase may be used to characterize elimination, attempt to gauge the impact of target-mediated drug disposition (TMDD) or ADAs on drug clearance, and quantify the total exposure over the duration of the study. For ADCs, additional time points following administration may be selected to determine exposure to the payload. For siRNAs, it may be critical to collect tissue samples at key organs, such as the liver (1), kidney, or spleen, to confirm uptake, as the plasma data for these compounds tend to be less representative of the overall exposure.

While the industry generally agrees that the presence of ADAs and their potential impact should be discussed in a TK report, there is little consensus on how best to present TK results. Viewpoints range from including all research models regardless of ADA status, to excluding research models with ADAs regardless of exposure impact, excluding research models with both ADAs and impacted exposure, excluding research models based on toxicology findings, presenting descriptive statistics with and without these criteria, and everything in between.

With the variety of indications and pharmacology of biologic therapies along with the variability still observed in ADA assays, we recommend ensuring that the
impact of ADAs is clearly presented and criteria are consistently applied across the program.

Meeting SEND requirements

To ensure consistency of data submitted across different sites or studies within a program, and to allow the FDA to use query and visualization tools across all datasets at once, applicable studies supporting IND filings have been required to include SEND-formatted data submissions since December 2016. With representation on the Pharmacokinetic Concentration (PC) and Pharmacokinetic Parameter (PP) domains through the PC/PP work-stream group, we helped develop the current industry consensus on how best to prepare SEND submissions and can generate PC/PP domains for other vendors.

While SEND-formatted data adds another layer of complexity, as presented in Figure 3, it also provides opportunities for efficiency. Covance has built Phoenix® WinNonlin® workflows using validated objects that can consume bioanalytical data in either PC or traditional linear formats, provided either internally or from external sources.

Figure 3: Considerations for SEND

To facilitate seamless incorporation of the pharmacokinetic parameter (PP) files into the overall SEND submission, Covance also provides a series of supporting files and validates the PP domain results using the Pinnacle 21 Validator tool.

The Pool Def file, which maps which research models belong to each pharmacokinetic profile, is included only in studies where a sparse sampling scheme is used. The PP Define file and the Study Data Reviewer’s Guide (SDRG), which describe the methodology used in analysis, the deviations from the in-life phase incorporated in the analysis, and the results of the PP domain validation, are included for all studies. Taken together, these files allow faithful recreation of the PP domain results directly from the bioanalytical data.
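The idea of recreating PP results from the bioanalytical (PC) data can be sketched as follows. Variable names such as USUBJID, PCSTRESN and PPTESTCD follow SENDIG naming conventions, but the column set, the NOMTIME field and the overall mapping here are illustrative only, not a complete or validated domain implementation:

```python
def pc_to_pp(pc_rows):
    """Collapse one subject's PC-style concentration records into
    PP-style parameter records (Cmax, Tmax, AUC to last sample)."""
    rows = sorted(pc_rows, key=lambda r: r["NOMTIME"])  # NOMTIME: illustrative nominal-time field
    times = [r["NOMTIME"] for r in rows]
    concs = [r["PCSTRESN"] for r in rows]
    cmax = max(concs)
    tmax = times[concs.index(cmax)]
    # linear trapezoidal AUC to the last sampled time point
    auc = sum((t2 - t1) * (c1 + c2) / 2.0
              for t1, t2, c1, c2 in zip(times, times[1:], concs, concs[1:]))
    usubjid = rows[0]["USUBJID"]
    # PPTESTCD codes below follow the CDISC PK parameter codelist style
    return [
        {"USUBJID": usubjid, "PPTESTCD": "CMAX",   "PPSTRESN": cmax},
        {"USUBJID": usubjid, "PPTESTCD": "TMAX",   "PPSTRESN": tmax},
        {"USUBJID": usubjid, "PPTESTCD": "AUCLST", "PPSTRESN": auc},
    ]

# Hypothetical PC records for one subject
pc = [{"USUBJID": "S-001", "NOMTIME": t, "PCSTRESN": c}
      for t, c in [(0.5, 40.0), (1, 75.0), (2, 55.0), (4, 20.0), (8, 4.0)]]
for rec in pc_to_pp(pc):
    print(rec)
```

The point of the supporting files described above is precisely that a reviewer, given the PC data and the documented methodology, can reproduce the PP values by a deterministic transformation like this one.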

In summary, there are many steps to consider when analyzing PK data. Studies often progress without ensuring all the necessary endpoints are being gathered, which can lead to missed critical data, study repeats and delayed timelines. Other challenges may occur along the way, such as misaligned sampling schemes that cause duplication of effort, missed PK parameters, or discrepancies that require further investigation. The earlier conversations about your strategy begin, the better.

With 170 clients in the last year, the Covance modeling and simulation team is in a unique position to survey the industry and provide consistent and efficient
approaches that can meet study objectives across all phases of drug development. Let’s start the conversation to see how we can support your needs
(http://www.covance.com/).

1. Dinkel V, et al. Serial Survival Liver Biopsies in Dogs and Monkeys. Poster presented at: Society of Toxicology, the 52nd Annual SOT Meeting and ToxExpo; 10-14 March 2013.
