Back to the Top
Does anyone know of a reference to 1/2LLOQ as the least biased
estimate of the true concentration to be assigned in AUC calculation
when a BLQ value falls between two quantifiable values?
David Pierce,
Dept of Clinical Pharmacology & PK, Shire Pharmaceuticals
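[For illustration, the numerical effect of the substitution rule David asks about can be sketched in a few lines of Python. The profile, the LLOQ, and the linear trapezoidal rule here are all assumptions chosen for the example, not data from the thread - db]

```python
# Linear trapezoidal AUC with a rule for a BLQ point that falls
# between two quantifiable values (hypothetical data, for illustration).

LLOQ = 1.0  # ng/mL, assumed assay lower limit of quantification

times = [0, 1, 2, 4, 8]            # h
conc  = [0, 5.0, None, 2.0, 1.5]   # ng/mL; None marks the mid-profile BLQ

def auc_linear(ts, cs):
    """Linear trapezoidal AUC over points with known concentrations."""
    return sum((t1 - t0) * (c0 + c1) / 2.0
               for (t0, c0), (t1, c1) in zip(zip(ts, cs), zip(ts[1:], cs[1:])))

def substitute(cs, value):
    """Replace BLQ (None) entries with a fixed value, e.g. LLOQ/2."""
    return [value if c is None else c for c in cs]

for label, value in [("zero", 0.0), ("LLOQ/2", LLOQ / 2), ("LLOQ", LLOQ)]:
    print(label, auc_linear(times, substitute(conc, value)))
```

With this particular profile the three rules differ by a few percent of the total AUC, which is the size of bias the choice of substitution is arguing over.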
Back to the Top
1) If it is a GLP assay, or an assay done to support a clinical trial,
how did you get a value other than BLQ?
2) If you are going to disregard the LLOQ as meaningless, I would
strongly urge that you at least follow Roger Jelliffe's suggestion of
using the Fisher information to justify use of any such point.
Edward F. O'Connor, PhD
78 Marbern Drive
Suffield, CT 06078
efoconnor.-at-.cox.net
Back to the Top
The following message was posted to: PharmPK
David,
I do not think this is correct as a general statement. If this happens
on the declining portion of the curve, the quantifiable level of the
next measurement means that both measurements (LOQ and the next) are in
the range of variability of the assay around LOQ. Therefore, the least
biased value would be LOQ. The same reasoning applies to the rising part
of the curve, with a switch of "next" to "previous". Also, in the
context of estimating the PK parameters (e.g., calculation of AUC), this
would still give a biased result, since the next value (in the case of a
declining curve) should, strictly speaking, also be approximated by LOQ.
To correct for the upward bias of the next value, the LLOQ value should
be approximated by a lower value, say LOQ minus 1/2 the range of
variability of the assay at LOQ. And last, if you want a statistically
rigorous "least biased" value, you'll need to make assumptions (on the
behavior of the curve, on the spread of timepoints, on the distribution
of assay variability). Your result will depend on those assumptions.
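[To make the bias argument concrete, here is a small Monte Carlo sketch. All parameters are assumptions for illustration: a true concentration sitting just below LOQ on a declining curve, Gaussian assay noise, and censoring of measurements below LOQ - db]

```python
import random

random.seed(0)

LOQ = 1.0      # assumed quantitation limit (arbitrary units)
SIGMA = 0.2    # assumed assay SD near LOQ
N = 100_000

# Assumption: on a declining curve, the censored point's true value lies
# somewhere within the assay's range of variability just below LOQ.
truths = []
for _ in range(N):
    true_c = random.uniform(LOQ - 2 * SIGMA, LOQ)
    measured = true_c + random.gauss(0.0, SIGMA)
    if measured < LOQ:                  # the lab reports this as BLQ
        truths.append(true_c)

mean_true = sum(truths) / len(truths)   # average true value behind a BLQ report

# Bias of each candidate substitution relative to that average
biases = {
    "LLOQ/2": LOQ / 2 - mean_true,
    "LOQ": LOQ - mean_true,
    "LOQ - 1/2 range": (LOQ - SIGMA) - mean_true,
}
for label, b in biases.items():
    print(f"{label:>16s}: bias {b:+.3f}")
```

Under these (assumed) conditions, LLOQ/2 is biased low, LOQ is biased high, and LOQ minus half the range of variability lands closest to the truth, as argued above.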
Regards,
Katya
--
Ekaterina Gibiansky
Senior Director, PKPD, Modeling & Simulation
ICON Development Solutions
Ekaterina.Gibiansky.at.iconplc.com
Back to the Top
Dear Ekaterina:
This question comes up over and over again, and never seems
to die its proper death. The thing to do is simply not to censor low
data. There is no need. There are a gazillion files about this in the
PharmPK archives already. Ask David how you can look at them. Also, go
to our web site,
http://www.lapk.org/
Click on PK/PD advances. Click on the new thing there. You will see
the context and the reasoning behind why the ideas of CV% and BLQ are
obsolete approaches which were developed back in the times when lab
data was not fitted quantitatively to make a model. You know, even a
dumb MD like me can figure out what a measurement like 0.2 ug/ml, +/-
0.8, means, for example. The measurement is the best estimate of what
is there, not some random number or maybe 1/2 the LOQ. There is NO
NEED for the lab to attempt to pre-empt the thought processes
of others on such points by censoring low data. You can quantify the
SD all the way down to and including the blank. Censoring low data
limits the ability of the lab to be useful to those who want to use
its results intelligently in the 21st century. We now can do much
better.
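[A rough sketch of the idea Roger describes: measure replicates at several concentrations down to the blank, fit the SD as a polynomial of concentration, and weight each measurement by its Fisher information, 1/SD^2, instead of censoring below an LLOQ. The replicate data, the quadratic form, and the least-squares fit below are illustrative assumptions, not the actual LAPK software - db]

```python
import numpy as np

# Hypothetical replicate summary: SD of replicate measurements at each
# concentration level, from the blank up to the top of the range.
conc = np.array([0.0, 0.5, 1.0, 5.0, 10.0, 50.0])   # ug/mL
sd   = np.array([0.08, 0.09, 0.10, 0.18, 0.30, 1.20])

# Fit SD(C) = a0 + a1*C + a2*C^2 by ordinary least squares
A = np.vstack([np.ones_like(conc), conc, conc**2]).T
coef, *_ = np.linalg.lstsq(A, sd, rcond=None)

def weight(c):
    """Fisher-information-style weight 1/SD^2 for a measurement at conc c."""
    s = coef[0] + coef[1] * c + coef[2] * c**2
    return 1.0 / s**2

# Even a very low measurement gets a finite, quantified credibility,
# rather than being discarded as "BLQ":
print("SD at blank:", coef[0], " weight at 0.2 ug/mL:", weight(0.2))
```

The point of the sketch is that a low measurement is not thrown away; it simply carries less weight in the fit, in proportion to its quantified error.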
Also, click around on the teaching topics. The whole idea is
to be able to develop maximally precise drug dosage regimens for
patient care with potentially toxic drugs, in developing the initial
regimen, in monitoring and updating the model of the drug in each
individual patient as we learn from the serum concentrations and other
responses, for example, and how we then do the most precise adjustment
of the regimen to hit our targets most precisely once again. This
article describes the tools to do the job properly, and the software
in which they are used. Also, you can look at a couple of oldies, with
older and less precise mathematical methods and software, in
Jelliffe RW: Explicit Determination of Laboratory Assay Error
Patterns - A Useful Aid in Therapeutic Drug Monitoring (TDM). Check
Sample Series: Drug Monitoring and Toxicology, American Society of
Clinical Pathologists Continuing Education Program, Chicago, Il, 10
(4) : pp.1-6, 1990.
Jelliffe RW, Schumitzky A, Van Guilder M, Liu M, Hu L, Maire P, Gomis
P, Barbaut X, and Tahani B: Individualizing Drug Dosage Regimens:
Roles of Population Pharmacokinetic and Dynamic Models, Bayesian
Fitting, and Adaptive Control. Therapeutic Drug Monitoring, 15:
380-393, 1993.
The idea has been around a long time.
Very best regards,
Roger Jelliffe
Roger W. Jelliffe, M.D. Professor of Medicine,
Division of Geriatric Medicine,
Laboratory of Applied Pharmacokinetics,
USC Keck School of Medicine
2250 Alcazar St, Los Angeles CA 90033, USA
email= jelliffe.aaa.usc.edu
Our web site= http://www.lapk.org
Back to the Top
Dear David
It is better to treat it as BLQ only, rather than assigning some value.
I suggest referring to the WinNonlin user guide if any further
information is required.
Regards
Rajasekhar
Back to the Top
The following message was posted to: PharmPK
The problem is that a typical LC method for an unknown will not have
that data. It will simply be 0.2 ug/mL. It is a single measure. Ligand
binding assays will have a 0.2 ug/mL +/- XX based on two or three
replicates and might be more amenable. In either case, unfortunately or
fortunately, regulations covering reporting and use of data from
clinical or nonclinical bioanalytical work rely on the LLOQ, ULOQ, and
measures such as %Bias and %CV for assay acceptance and reporting.
Ed F. O'Connor,PhD
78 Marbern Drive
Suffield, CT 06078-1533
email: efoconnor.aaa.cox.net
Back to the Top
Thanks to all who replied to my question about LLOQ, and I apologize
for bringing up this old sore again without fully searching the
archive. I appreciate the continuing debate about whether it is
correct or old-fashioned to censor data below LLOQ. However, I think
that one or two of the respondents in the archive caught the
appropriate note of pragmatism when they drew attention to what is
likely to happen, rightly or wrongly, when confronted with a
regulatory auditor, if an appropriately validated LLOQ has not been
applied to the bioanalytical data before PK data analysis. The debate
on this issue may continue to rage in the PK community, and there's
probably no single right answer that is not dependent on the use to
which you want to put the analysis. However, I'll sign off and try to
avoid perpetuating the discussion further in these pages for the time
being.
Thanks again.
David Pierce
Back to the Top
Dear David:
Please do not apologize for bringing up the issue. It surely
needs to be discussed in all its forms. The important thing is to
educate the regulatory auditor, and not to be passive. Science is what
counts, not passive submission to an obsolete approach. The use to
which the analysis is put is what is important. In the past, such
data was never fitted using any mathematical approach. However,
another issue is that of censoring low data of PCR assays, etc. Having
a viral load of <50 copies, for example, is NOT enough. This is war,
and you want to kill ALL the bad guys. You really want to get the
measurement down to zero if possible, without any artificial debate as
to whether or not it is "still present". Everyone can easily
understand what a result of 4 +/- 15 means, for example.
Censoring low data inserts very significant bias into assay results,
and corrupts our knowledge of what is going on. Let's keep up the
debate, and get labs to be more useful than they have been in the past.
Go
to www.lapk.org. Click on new PK/PD approaches. Click on
Pharmacometric tools......, and you will see the issue in its fuller
context.
Very best regards,
Roger Jelliffe
Roger W. Jelliffe, M.D. Professor of Medicine,
Division of Geriatric Medicine,
Laboratory of Applied Pharmacokinetics,
USC Keck School of Medicine
2250 Alcazar St, Los Angeles CA 90033, USA
email= jelliffe.aaa.usc.edu
Our web site= http://www.lapk.org
Back to the Top
I agree. Perhaps we should get an online meeting together, hash out
what the regulators would go for, then do some comparisons. See what the
impact on assays would be (cost, time, numbers). Try a few validations
using retrospective data and go from there.
Edward F. O'Connor, PhD
78 Marbern Drive
Suffield, CT 06078
efoconnor.-a-.cox.net
Back to the Top
The following message was posted to: PharmPK
Dear Roger and other colleagues,
How relevant is compound potency or efficacy (vs. LLOQ) to this
discussion? Let us assume two scenarios: LLOQ is 1 ng/mL, but EC50 is
1) 20 ng/mL and 2) 2 ng/mL. And how relevant is the shape of the
conc-time curve? E.g., rapid decline (from 1000 ng/mL to BLOQ of
1 ng/mL) vs. slow decline (from 10 ng/mL to BLOQ of 1 ng/mL).
Rostam
Back to the Top
Dear Ed:
That sounds super! How do we do this? Do you have any
software to do a videoconference? Or, just call me and let's map out a
campaign.
All the best,
Roger
[It is an interesting idea. The advantage of this and similar mailing
lists is that you don't need to get everyone together at once.
However, if there is a topic such as this that might interest a number
of people willing to communicate at the same time a text chat, audio
or video conference might be useful. I've seen demos of the Apple
iChat program that will allow 3-4 people video conferencing but
haven't used it past two at a time. I think Skype now provides video
conferencing but I don't know the limitations. I imagine there are other
free options and of course a number of commercial options.
Two questions:
1) How many people might be interested in participating in this
discussion?
2) Anyone with experience of video, audio or text options? Note, text
options would be easier to archive.
I'd be happy to keep track of answers to both questions if there is an
interest. Email me at david.at.boomer.org and I'll compile a list of
answers to both questions - db]
Back to the Top
The following message was posted to: PharmPK
Firstly my apologies for coming in on this discussion a little late, so
I hope I am not repeating what has already been said. My two cents worth
is as follows:
I have recently (again!) been involved in the debate concerning
reporting units for drug assays for TDM. The answer here seems to be
that the units used are effectively determined by the purpose to which
the result is applied, and I suspect that this is also the answer to the
issue of including/excluding low concentration data points for PK
studies.
Given that data points below the LLOQ are often useful from a PK
analysis perspective, there would seem no reason to exclude such data.
On the other hand, regulators are only comfortable with objective
answers, and the LLOQ, widely accepted for drug assays for patient care,
provides the only universal objective answer currently available.
Perhaps those of us who are from a lab background and those from a pop
PK background can put together a joint set of principles to be applied
in determining inclusion/exclusion criteria for PK studies. These could
then be applied on a case by case basis and replace the current
contentious LLOQ approach.
Such a set of criteria may take into account such factors as EC50, LLOD
and the impact of assay variability at low concentrations on the PK
parameters to be determined. (Current modelling programs would allow
estimates of the impact of precision and accuracy, based on data
generated according to set criteria - if such data do not already
exist!)
I have no doubt that preparation of such a list would generate robust
discussion, and that the outcome would not receive universal acceptance,
but at least it would move us towards an improved situation compared
with that which currently prevails.
Regards,
Ross Norris,
Associate Professor. PhD, MAppSc, BAppSc.
Australian Centre for Paediatric Pharmacokinetics & Therapeutic Advisory
Service, Mater Pharmacy Services, Raymond Terrace, South Brisbane, Q
4101, Australia.
Back to the Top
The following message was posted to: PharmPK
The other point, just to round out the discussion, is that the LLOQ is
required for drug development and filing for IND and NDA. It has been a
while since I was in clinical chemistry supporting TDM, but the
requirements for reporting may be more relaxed under CLIA, which would
cover analytical work applied to actual patient care. Do you know if
this is the case, or are the requirements more LOD for CLIA and LLOQ
for FDA?
Ed F. O'Connor,PhD
78 Marbern Drive
Suffield, CT 06078-1533
email: efoconnor.-a-.cox.net
Back to the Top
Dear Ed;
Thanks for your note. I do not know, but I do know that if
there are obsolete regulations in drug development, we need to get
them changed. How do we do this? Any suggestions?
Roger
Back to the Top
Dear Rostam:
I don't understand your question. As I see it, the issue is
independent of the potency of a drug. The issue is in getting the best
model based on the data and its measurement errors. Once again, there
is no valid reason for setting an LLOQ or LOD. The credibility of the
data at each point is the issue. This is what determines the shape of
the 2 curves you are discussing.
Very best regards,
Roger
Roger W. Jelliffe, M.D. Professor of Medicine,
Division of Geriatric Medicine,
Laboratory of Applied Pharmacokinetics,
USC Keck School of Medicine
2250 Alcazar St, Los Angeles CA 90033, USA
email= jelliffe.at.usc.edu
Our web site= http://www.lapk.org
Back to the Top
The following message was posted to: PharmPK
There are several things needed to explore adoption of Fisher
Information for acceptance, and it has to be done carefully. The recent
issue of the Bland-Altman approach is an illustration of this. While
almost thoroughly denigrating correlation analysis, they ignored the
fact that most correlations, when performed with analysis of residuals
and tests of significance, yield the same conclusion as their test.
1) Deciding on curve and QC levels in a validation effort to demonstrate
comparability or improvement between use of Fisher and the use of the
currently accepted LLOQ.
2) In most cases, the LLOQ is in fact set up by the sponsor for the
validation and may be modified somewhat by the laboratory in conjunction
with the sponsor. Regulatory issues, time and finances restrict the
insertion of additional points below the LLOQ. An exception is in
ligand-based assays, where additional points may be used to "anchor" the
regression.
3) The work then would first need to be done in a development mode, then
transferred to a validation mode, where it would be tested to
demonstrate and characterize performance of standards and QC in the
range of LLOQ to ULOQ (current acceptance) and extending, in the case of
Fisher acceptance, from LLOQ to lower concentrations.
4) Some party would need to underwrite this effort, since most
pharmaceutical companies and biotechs focus on satisfying timelines,
budgets and regulatory requirements, and this effort would be outside
their immediate interest. It is possible that an instrument (LC-MS/MS
and ligand binding-MDS?), reagent or software company might underwrite
this effort, and this avenue needs to be explored.
Ed F. O'Connor,PhD
78 Marbern Drive
Suffield, CT 06078-1533
email: efoconnor.-at-.cox.net
Copyright 1995-2011 David W. A. Bourne (david@boomer.org)