SUMMARY REPORT

Peer Review of the "Technical Support Document for the Assessment of Detection and Quantitation Concepts"

Contract No. 68-C-98-189
Versar Work Assignment No. 3-49

Prepared for:

U.S. Environmental Protection Agency
Office of Water
Office of Science and Technology
Health and Ecological Criteria Division
401 M Street, S.W.
Washington, D.C. 20460

Prepared by:

Versar, Inc.
6850 Versar Center
Springfield, Virginia 22151

September 2002
TABLE OF CONTENTS

I.    INTRODUCTION ................................................. 1
II.   BACKGROUND ................................................... 2
III.  CHARGE TO THE PEER REVIEWERS ................................. 3
IV.   GENERAL COMMENTS ............................................. 4
V.    RESPONSE TO CHARGE ........................................... 7
VI.   SPECIFIC COMMENTS ............................................ 39
VII.  MISCELLANEOUS COMMENTS ....................................... 43
LIST OF FIGURES

FIGURE 1. Cause and Effect Chart for Uncertainty and Detection in Analytical Measurements ....... 9
FIGURE 2. Contribution to the Measurement Uncertainty for Dioxin/Furan Analysis in Food and Feed ....... 10
FIGURE 3. Coefficient of Variance Versus Concentration for Feed Samples at Parts-per-Trillion Levels Using an EPA Method 1613b Equivalent Procedure ....... 17
APPENDIX A: Technical Support Document for the Assessment of Detection and Quantitation Concepts
APPENDIX B: Charge to Peer Reviewers
APPENDIX C: Dr. W. Marcus Cooke Curriculum Vitae
APPENDIX D: Dr. Walter W. Piegorsch Curriculum Vitae
APPENDIX E: Dr. David M. Rocke Curriculum Vitae
APPENDIX F: Dr. A. Dallas Wait Curriculum Vitae
APPENDIX G: Dr. W. Marcus Cooke Comments
APPENDIX H: Dr. Walter W. Piegorsch Comments
APPENDIX I: Dr. David M. Rocke Comments
APPENDIX J: Dr. A. Dallas Wait Comments
I. INTRODUCTION

The purpose of this peer review was to evaluate the scientific credibility of the document entitled "Technical Support Document for the Assessment of Detection and Quantitation Concepts." The document and associated references are provided in Appendix A. The peer review was performed to obtain an independent evaluation by expert authorities on analytical chemistry and statistics to determine whether the Agency's "Technical Support Document for the Assessment of Detection and Quantitation Concepts" is sound.

In January 1993, responding to recommendations in the report "Safeguarding the Future: Credible Science, Credible Decisions," Administrator William Reilly issued an Agency-wide policy for peer review. Administrator Carol Browner confirmed and reissued the policy on June 7, 1994, and instituted an Agency-wide implementation program. The principle underlying the Peer Review Policy is that all major scientific and technical work products should be peer reviewed.

Peer review is a process for enhancing a scientific or technical work document so that the decision or position taken by the Agency, based on the technical document, has a sound, credible basis. The goal of the Agency's Peer Review Policy is to ensure that scientific and technical work products receive appropriate levels of critical scrutiny from scientific and technical experts as part of the overall decision-making process. Generally, this technical review precedes the customary, more broadly based public review of the total decision.

The "Technical Support Document for the Assessment of Detection and Quantitation Concepts" was reviewed by a panel of four peer reviewers. The charge to the peer reviewers is provided in Appendix B. These panelists were selected because of their expertise in the fields of statistics and analytical chemistry and their absence of conflict of interest. The peer review panel did not include any experts who directly or indirectly contributed to the development of EPA's method detection limit (MDL) or minimum level (ML). The four panelists selected to perform the review were Dr. W. Marcus Cooke, Dr. Walter W. Piegorsch, Dr. David M. Rocke, and Dr. A. Dallas Wait. Curricula vitae for the four panelists are provided in Appendices C through F. Peer review comments received from each of the four panelists are provided in Appendices G through J.

The comments and recommendations from the four reviewers have been combined and organized as follows:

- General comments;
- Response to charge;
- Specific comments by page number, referenced by commenter;
- Miscellaneous comments.
II. BACKGROUND

A peer review of the "Technical Support Document for the Assessment of Detection and Quantitation Concepts" was conducted according to Agency policy. On June 8, 1999, EPA promulgated Method 1631B: Mercury in Water by Oxidation, Purge and Trap, and Cold Vapor Atomic Fluorescence Spectrometry (64 FR 30417). The method was developed specifically to measure concentrations of mercury at ambient water quality criteria levels, and includes a method detection limit (MDL) of 0.2 ng/L (ppt), approximately 400 times lower than previously approved methods.

Shortly after promulgation, EPA was notified of a legal challenge to the method. The basis of the challenge included several specific aspects of Method 1631 itself, as well as the procedures used to establish the specific MDL and ML published in the method. EPA has applied those procedures to most of the environmental measurement methods published by the Office of Water. The MDL procedure has been widely adopted across EPA.

On October 19, 2000, EPA entered into a settlement agreement with the Alliance of Automobile Manufacturers, Inc., the Chemical Manufacturers Association, the Utility Water Act Group, and the American Forest and Paper Association. The settlement agreement requires EPA to reassess EPA's method detection limit (MDL; 40 CFR 136, Appendix B) and minimum level of quantitation (ML) procedures.
III. CHARGE TO THE PEER REVIEWERS

The charge for the peer reviewers was provided by the EPA Work Assignment Manager (WAM) and included introduction and background information related to the document, along with the charge questions. The charge to the peer reviewers is provided in Appendix B. The reviewers were asked to respond to the following charge questions:

(1) In Chapter 2, EPA recognizes and is willing to accept other detection and quantitation concepts, and has attempted to identify concepts that have been widely used or are widely known. Are there other concepts and procedures that EPA should evaluate? If so, please provide supporting rationale and citations.

(2) In Chapter 3, has EPA adequately identified and characterized the issues that need to be considered when evaluating detection and quantitation limit concepts in the context of implementation under the Clean Water Act? If not, please identify additional issues and provide a rationale for each addition. Are there any issues discussed that are not critical and can be deleted? If so, please identify those issues and provide a rationale for each deletion.

(3) Do the evaluation criteria in Chapter 4 adequately reflect the discussion of issues identified in Chapter 3? If not, please explain. Do you believe EPA should eliminate any of the six evaluation criteria or add other criteria? If yes, please identify the criteria to be added or eliminated and explain your rationale.

(4) Is the assessment in Chapter 5 of the TSD valid? Are the detection/quantitation concepts presented in that chapter conceptually and operationally sound? Identify positive and negative features and justifications for your conclusions.

(5) Do you agree with the conclusions presented in Chapter 6? If not, please explain.

(6) Prior to proposal of revised detection and quantitation concepts, should EPA evaluate other available data sets? Bearing in mind that, in order to effectively assess various concepts, data sets must reflect measurements made below the detection limit, in the range of detection and quantitation limits, and in the normal measurement range of the method, are there any available data sets that you recommend EPA consider? If so, please identify them and explain why they are appropriate.

(7) Has EPA dealt with the interlaboratory versus intralaboratory issues appropriately and, if not, what recommendations would you make for dealing with the issues more appropriately?

(8) Can you recommend any improvements to the detection and quantitation procedures described in the TSD?
IV. GENERAL COMMENTS

Dr. Cooke's Comments

The U.S. Environmental Protection Agency (EPA) developed Method 1631B for determination of mercury (Hg) "in the range of 0.5 – 100 ng/L" (1). EPA Method 1631B has a broad range of applications. "The Method is based on a contractor-developed method (Reference 1) and on peer-reviewed, published procedures for the determination of mercury in aqueous samples, ranging from sea water to sewage effluents (2–6)."

The method detection limit (MDL), using procedures described in 40 CFR 136, Appendix B, was stated in Method 1631B to be 0.2 ng/L when no interferences are present. The minimum level of quantitation (ML) was stated as having been established at 0.5 ng/L. Method 1631B states that MDLs as low as 0.05 ng/L can be achieved for low-Hg samples by "using a larger sample volume, a lower BrCl level (0.2%), and extra caution in sample handling". EPA has published a number of guidance documents and training materials to assist in clean sampling and sample handling with Method 1631.

Method 1631B further states that detection limits and minimum levels of quantitation usually are "dependent on the level of interferences rather than instrumental limitations".

Although Method 1631B is cited as a "performance-based method", equivalency must be met. In fact, Method 1631B states that any modification of the Method, beyond those expressly permitted, "shall be considered as a major modification subject to application and approval of alternate test procedures under 40 CFR 136.4 and 136.5". Thus the detailed methodology, sampling, sample handling, and quality control techniques described in Method 1631B will be rigidly followed by any group regulated under the National Toxics Rule, the Great Lakes Water Quality Initiative, or National Pollutant Discharge Elimination System (NPDES) permitting under the Clean Water Act. As such, it is important to provide technically accurate, efficient, and laboratory-friendly guidance for any group attempting to generate defensible data using Method 1631B or any of the EPA Office of Water approved methods used in the aforementioned regulations.

Dr. Cooke prepared a document in response to the "Charge to Reviewers", working under a subcontract to Versar Incorporated to provide a peer review for Method 1631B (7) as described in the "Technical Support Document for the Assessment of Detection and Quantitation Concepts". The "Charge to Reviewers" does not cite Method 1631, Revision C, which is dated March 2001 in the pertinent method document and modifies test method sections 12.4.2 and 9.4.3.3 to clarify use and reporting of field blanks (8). This document refers only to a review of EPA Method 1631B, not modifications made through EPA Method 1631C.
References Cited

1. "Method 1631, Revision B: Mercury in Water by Oxidation, Purge and Trap, and Cold Vapor Atomic Fluorescence Spectrometry", United States Environmental Protection Agency, Office of Water, EPA-821-R-99-005, May 1999. (cf. Guidelines Establishing Test Procedures for the Analysis of Pollutants; Measurement of Mercury in Water (Method 1631, Revision B); Final Rule at 40 CFR part 136, published in the Federal Register (64 FR 30417; June 8, 1999).)

2. Bloom, Nicolas, Draft "Total Mercury in Aqueous Media", Frontier Geosciences, Inc., September 7, 1994.

3. Fitzgerald, W.F.; Gill, G.A. "Sub-Nanogram Determination of Mercury by Two-Stage Gold Amalgamation and Gas Phase Detection Applied to Atmospheric Analysis," Anal. Chem. 1979, 15, 1714.

4. Bloom, N.S.; Crecelius, E.A. "Determination of Mercury in Sea Water at Subnanogram per Liter Levels," Mar. Chem. 1983, 14, 49.

5. Gill, G.A.; Fitzgerald, W.F. "Mercury Sampling of Open Ocean Waters at the Picogram Level," Deep-Sea Res. 1985, 32, 287.

6. Bloom, N.S.; Fitzgerald, W.F. "Determination of Volatile Mercury Species at the Picogram Level by Low-Temperature Gas Chromatography with Cold-Vapor Atomic Fluorescence Detection," Anal. Chim. Acta 1988, 208, 151.

7. Work Assignment 3-49, "Peer Assessment of Detection and Quantitation Concepts," Versar Contract 68-C-98-189 to the U.S. EPA, August–September 2002.

8. "Method 1631, Revision C: Mercury in Water by Oxidation, Purge and Trap, and Cold Vapor Atomic Fluorescence Spectrometry", United States Environmental Protection Agency, Office of Water, EPA-821-R-01-024, March 2001.

Summary Statement

EPA has done an exemplary job of communicating with the regulated community through "outreach" programs associated with Method 1631. Several supporting documents, such as Guidance for Implementation and Use of EPA Method 1631 for the Determination of Low-Level Mercury (40 CFR part 136) (EPA 821-R-01-023, March 2001) and Method 1669 ("Sampling Ambient Water for Trace Metals at EPA Water Quality Criteria Levels"), are examples of valuable supporting tools EPA provides.

Through public meetings, training, and documents, EPA staff help practitioners master the challenges of sampling, clean containers, field and laboratory blanks, handling ultra-trace metals samples, and conducting these difficult tests. EPA is commended on the way it has publicly communicated technical issues associated with complex new methods like Method 1631.
Dr. Piegorsch's Comments

The Technical Support Document (TSD) is well-organized and intelligently thought out. It has strong scientific merit and establishes a good baseline from which further discussion and debate may continue on the important issue of detection limits and quantification of contaminants in the nation's water supply.

Dr. Rocke's Comments

Dr. Rocke had no general comments.

Dr. Wait's Comments

The crux of the Alliance of Automobile Manufacturers, Inc. et al. v. Carol Browner settlement agreement involves the age-old battle between theoretical science and practical science, with both sides waving the flag of sound science. The Legislative Branch has recently joined the fray by acknowledging the importance of data quality in an amendment attached to a law enacted by the 106th Congress [PL 106-554]. The law, known as the "Data Quality Act" or the "Information Quality Law," mandates that the Office of Management and Budget (OMB) issue guidance to Federal agencies for "ensuring and maximizing the quality, objectivity, utility, and integrity of information (including statistical information) disseminated by federal agencies." In turn, OMB has mandated that Federal agencies such as EPA implement data quality guidelines by October 1, 2002. I believe the construct of EPA's Technical Support Document (TSD) is consistent with the spirit of this law, as it should be. In addition, EPA should be applauded for invoking the Daubert factors (testing and validation, peer review, rate of error, and general acceptance in the scientific community), a ruling about which I have previously written opinions (see Brilis, Worthington, and Wait, "Quality Science in the Court Room: US EPA Data Quality and Peer Review Policies and Procedures Compared to the Daubert Factors" [Environ. Forensics 1:197-203], and Wait, "Environmental Forensic Chemistry and Sound Science in the Courtroom" [Fordham Environ. Law Journal 12:293-327]). The use of the Daubert approach is defensible and should give the resultant consensus document long-term standing. Overall, I believe the TSD effort is a rigorous, open, and honest attempt by EPA to resolve a technically and operationally difficult matter in a manner fair to all sides.
V. RESPONSE TO CHARGE QUESTIONS

1. In Chapter 2, EPA recognizes and is willing to accept other detection and quantitation concepts, and has attempted to identify concepts that have been widely used or are widely known. Are there other concepts and procedures that EPA should evaluate? If so, please provide supporting rationale and citations.

Dr. Cooke's Comments

In recent years, budgetary constraints have limited EPA's ability to perform the extensive validation studies conducted in the 1980s on earlier methods like the MCAWW and "600 Series" methods. While monies were constrained, legal and technical drivers moved ahead unabated:

- Improved method measurement techniques (e.g., atomic fluorescence for Hg)
- New laws and regulations requiring more analytes and lower levels (e.g., the Biosolids Rule)
- Advances in basic knowledge, especially risk-related effects (e.g., the EPA Dioxin Reassessment)

Alternate Approaches Could Include a Body of Method Verification at Low Concentration Detection Performed in Europe

EPA could consider data reliability and detection approaches that have been developed by the European Union (EU). For a long time, EPA methods have been considered the "gold standard" in much of the world. In recent years, the EU has been using and extending basic EPA methods, especially in the area of operational quality control.

One example is EPA Method 1613b, a method used globally to measure dioxins and furans using high-resolution mass spectrometry (HRMS) in many matrices, not just the Clean Water Act samples for which the method was developed. On July 1, 2002, the EU implemented comprehensive regulation of human food and animal feed using the operational equivalent of Method 1613b. Vegetable foodstuffs, as an example, must be tested at levels below 0.3 pg/g (parts per trillion) Toxic Equivalence Quotient (TEQ).

Chapters 2 and 3 address the issue of international and risk-based regulations successfully operating below the method-defined MDL (Chapter 3, Page 3-5). Method 1613b lists compound MLs at 1-5 pptr in solid samples, which corresponds to detection an order of magnitude higher than the official food/feed dioxin limits imposed in Europe.
When U.S. TEQ reporting protocols were applied in Europe, Method 1613b could not be used due to quality control issues, not detection issues. Since non-detected analytes are reported as zero values in the U.S., many American laboratories report erroneous results. All non-detects are reported as zero, so no correction for detection is made to Method 1613b data for environmental reporting. The U.S. Food and Drug Administration (FDA) does report non-detects using ½ the MDL in dioxin/furan computations that involve TEQ for regulatory purposes.

EU regulators applied an Upper Bound reporting limit, under which all non-detects are reported at the EPA Method Detection Limit (MDL) for each analyte. This forces laboratories to achieve levels available with modern instrumentation; otherwise the Upper Bound reporting level is above the regulatory compliance level and the data (or foodstuffs) are rejected.
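The three non-detect conventions described above (U.S. zero-substitution, the FDA's ½ MDL, and the EU Upper Bound) can be sketched as follows. This is an illustrative sketch only; the congener names, TEF weights, and MDL values are hypothetical placeholders, not regulatory values.

```python
def teq(results, tefs, mdls, convention):
    """TEF-weighted sum of congener concentrations (a TEQ, in pg/g).

    `results` holds measured concentrations, with None for non-detects.
    convention: 'zero' (U.S. environmental reporting), 'half_mdl' (FDA),
    or 'upper_bound' (EU: each non-detect is set to its MDL).
    """
    substitute = {
        "zero": lambda mdl: 0.0,
        "half_mdl": lambda mdl: 0.5 * mdl,
        "upper_bound": lambda mdl: mdl,
    }[convention]
    total = 0.0
    for analyte, conc in results.items():
        value = substitute(mdls[analyte]) if conc is None else conc
        total += tefs[analyte] * value
    return total

# Hypothetical two-congener example: one detect, one non-detect (pg/g).
results = {"2378-TCDD": 0.10, "12378-PeCDD": None}
tefs = {"2378-TCDD": 1.0, "12378-PeCDD": 1.0}
mdls = {"2378-TCDD": 0.02, "12378-PeCDD": 0.04}

print(round(teq(results, tefs, mdls, "zero"), 3))         # 0.1
print(round(teq(results, tefs, mdls, "half_mdl"), 3))     # 0.12
print(round(teq(results, tefs, mdls, "upper_bound"), 3))  # 0.14
```

The same sample thus reports a lower TEQ under zero-substitution than under the Upper Bound convention, which is why laboratories with high MDLs can fail an EU compliance level they would pass under U.S. reporting.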

Also, many European Union (EU) procedures have trueness criteria. Trueness is accuracy determined by percent (%) recovery of an accepted reference material. Trueness is a valuable improvement to EPA methods and will be discussed later. In order to incorporate trueness into an EPA method validation study, an appropriate reference material would need to be developed ahead of time and included in validation studies.
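As a minimal illustration of such a trueness criterion, percent recovery against a certified reference value can be computed and checked against an acceptance window. The 70-130% band and the reference values below are hypothetical, not criteria drawn from the EU procedures or EPA methods.

```python
def percent_recovery(measured, certified):
    """Trueness expressed as % recovery of the certified reference value."""
    return 100.0 * measured / certified

def passes_trueness(measured, certified, low=70.0, high=130.0):
    """True if recovery falls inside an (illustrative) acceptance window."""
    return low <= percent_recovery(measured, certified) <= high

# Hypothetical mercury reference material certified at 5.0 ng/L.
print(round(percent_recovery(4.6, 5.0), 1))  # 92.0
print(passes_trueness(4.6, 5.0))             # True
print(passes_trueness(2.9, 5.0))             # False (58% recovery)
```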

Eppe and De Pauw have detailed the elements of uncertainty that affect detection limits and uncertainty in ultra-trace dioxin/furan measurements (9-11). Statistical evaluation of detection and data reliability for these data is based on ISO 17025 requirements. ISO 17025 includes laboratory-specific uncertainty estimation as a part of data validation. Analytical chemists have to demonstrate the quality of their measurements by associating the evaluation of uncertainty with their results.

This data treatment is based on a Eurachem Guide, which provides guidelines to evaluate uncertainty in analytical measurement (12).

Also, recent VAM protocols give additional tools for uncertainty evaluation from validation data. The ISO/VAM process to evaluate uncertainty is based on four stages:

- First: Specify the measurand.
- Second: Identify uncertainty sources.
- Third: Quantify uncertainty components.
- Fourth: Calculate combined or total uncertainty.
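The fourth stage above, combining the quantified components into a total, is commonly done in quadrature (root-sum-of-squares of independent standard uncertainties), with an expanded uncertainty reported at coverage factor k = 2 for approximately 95% confidence, as described in the EURACHEM/CITAC Guide. The component values in this sketch are illustrative only.

```python
import math

def combined_uncertainty(components):
    """Root-sum-of-squares of independent standard uncertainty components."""
    return math.sqrt(sum(u * u for u in components.values()))

# Hypothetical relative standard uncertainties for a dioxin measurement.
components = {"recovery": 0.08, "calibration": 0.05, "repeatability": 0.06}

u_c = combined_uncertainty(components)  # combined standard uncertainty
U = 2 * u_c                             # expanded uncertainty, k = 2
print(round(u_c, 4), round(U, 4))       # 0.1118 0.2236
```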

Figure 1 shows a cause and effect diagram presented by Eppe et al. to show components of detection and uncertainty in analytical measurements. The strength of this treatment is a tested system to rigorously measure individual elements of data uncertainty and detection.

Eppe et al. (9) were able to sum individual analytical parameters and quantify principal sources of uncertainty for ultra-trace measurement of dioxins/furans in food products, as shown in Figure 2.
Figure 1. Cause and Effect Chart for Uncertainty and Detection in Analytical Measurements, from Eppe et al. (9).
Figure 2. Contribution to the Measurement Uncertainty for Dioxin/Furan Analysis in Food and Feed, from Eppe et al. (9).
EPA may want to consider recent advances in the statistical treatment of analytical method data that have evolved in Europe, for three reasons:

- The EU is conducting the second largest trace chemical analytical program in the world. EU testing will exceed $1B in the first three years. (EPA is the record holder with the Contract Laboratory Program, $1.5B+.)
- The EU has applied an EPA 1600-series method equivalent (1613b) to a regulatory program that screens all food and animal feed used, produced, or imported into Europe.
- The subject EU program has developed practical solutions to applying modern ultra-trace measurements (and statistical verification) to a legally based, widely applied testing program.

References Cited

9. Eppe, G.; De Pauw, E., "ISO 17025 Requirements: How to Evaluate Uncertainty for Dioxin Analysis in Food and Feed from Validation Data?", Proceedings of the 22nd International Symposium on Halogenated Environmental Organic Pollutants and POPs, Barcelona, Spain, August 12-18, 2002, Vol. 59, pp. 403-406, 2002.

10. Eppe, G.; De Pauw, E., "Are Target Dioxin Levels in Animal Feedingstuffs Achievable for Laboratories in Terms of Analytical Requirements? Results of an Interlaboratory Study," Proceedings of the 22nd International Symposium on Halogenated Environmental Organic Pollutants and POPs, Barcelona, Spain, August 12-18, 2002, Vol. 59, pp. 407-410, 2002.

11. Eppe, G.; De Pauw, E., Organohalogen Compounds, 2002, submitted.

12. Quantifying Uncertainty in Analytical Chemistry, EURACHEM/CITAC Guide, 2000.

13. Barwick, V.J.; Ellison, S.L.R., "Development and Harmonisation of Measurement Uncertainty Principles. Protocol for Uncertainty Evaluation from Validation Data", VAM Project 3.2.1.

Dr. Piegorsch's Comments

I remain comfortable with the broad-based concepts discussed in the TSD. As mentioned previously (in my reply to Question 8), however, the EPA should build into its mercury detection strategy (and elsewhere as they see fit) a review and critique of how the use of composite sampling would improve MDL and/or ML determination(s).
Dr. Rocke's Comments

The list of detection and quantitation concepts is sufficiently complete for this analysis.

Dr. Wait's Comments

EPA's discussion about the historical development of detection limit and quantitation concepts is consistent with my recollection of events over the past three decades. The thoroughness of the on-line search for relevant documents provided in the Reference Section and Appendix A of the TSD was impressive. A review of my files on MDLs found nearly every paper to be listed in Appendix A. However, on-line searches often don't capture information contained in text and reference books (e.g., Budde's 2001 book entitled "Analytical Mass Spectrometry"). Has this area of information been adequately addressed by EPA?

EPA appears to have closely examined the detection and quantitation approaches used by other professional organizations. Has EPA rigorously examined how these concepts are perceived and implemented by other federal and state agencies (e.g., USGS, NRC, FDA, DOD, DOE)? For example, Chapter 19 of the recently published draft document entitled "Multi-Agency Radiological Laboratory Analytical Protocols Manual" (MARLAP) discusses detection and quantitation issues. It would be useful for EPA to tabulate the concepts used by federal and state agencies in this section of the TSD.

Another source of information may be case law. Has EPA examined whether there is a legal record detailing EPA (and others', e.g., NEIC, DOJ) opinions on these matters? If so, their opinions and those of the Court should be acknowledged.

Technically, EPA should justify why 7 replicates were chosen to determine MDLs rather than 6, 8, or some other number (refer to Section 2.2.1). For example, although Canada has closely mimicked EPA's MDL approach, it uses 8 replicates rather than 7 ("Ontario Ministry of Environment Estimation of Analytical Method Detection Limits (MDL): Analytical Method Detection Limits Protocol for Municipal and Industrial Strategy for Abatement (MISA) Program" [ISBN 0-7729-4117-3]).
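For context, the single-laboratory MDL computation at issue multiplies the standard deviation of n replicate spiked measurements by the one-sided 99th-percentile Student's t value for n-1 degrees of freedom, so the choice of 7 versus 8 replicates changes only the degrees of freedom and the t multiplier. The sketch below uses illustrative replicate values, not data from the TSD.

```python
import statistics

# One-sided 99% Student's t values for small degrees of freedom.
T_99 = {6: 3.143, 7: 2.998}  # df = n - 1 for n = 7 (EPA) and n = 8 (Ontario)

def mdl(replicates):
    """MDL = t(n-1, 0.99) * s, from n replicate low-level measurements."""
    n = len(replicates)
    return T_99[n - 1] * statistics.stdev(replicates)

# Hypothetical mercury replicates near the spike level, in ng/L.
seven = [0.21, 0.18, 0.23, 0.20, 0.19, 0.22, 0.17]
print(round(mdl(seven), 3))  # 0.068
```

With 8 replicates the extra degree of freedom lowers the t multiplier (2.998 versus 3.143), so for the same observed standard deviation the computed MDL is slightly smaller.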
2. In Chapter 3, has EPA adequately identified and characterized the issues that need to be considered when evaluating detection and quantitation limit concepts in the context of implementation under the Clean Water Act? If not, please identify additional issues and provide a rationale for each addition. Are there any issues discussed that are not critical and can be deleted? If so, please identify those issues and provide a rationale for each deletion.

Dr. Cooke's Comments

In Chapter 3, the TSD has done a good job of identifying technical issues that affect data quality and the legal-regulatory constraints under which the Agency is operating (specifically the CWA and NTTA).

Chapter 3 states six (6) specific issues EPA is charged with evaluating as part of a court-directed settlement:

1. Statistical model selection criteria,
2. Parameter estimation,
3. Statistical tolerance and prediction,
4. Challenge study design criteria, including measurement levels,
5. Interlaboratory effects,
6. Probability design.

The TSD lists twenty (20) technical/regulatory issues that should be considered in implementing this effort. These issues can be grouped into four (4) categories: A. Method Performance Criteria; B. Laboratory Performance and Method Flexibility; C. Regulatory Constraints; and D. Quality Control.

Table I. 20 Technical/Regulatory Topics Addressed in the TSD

A. Method Performance Criteria
1. Ease of Use
2. Background
3. Instrument Non-Response
4. Lower Limit of Measurement
5. Matrix Effects
6. Outliers
7. Sources of Variance

B. Laboratory Performance and Method Flexibility
8. Descriptive vs. Prescriptive Use of Lower Limits of Measurement
9. Laboratory Performance Verification
10. Laboratory-Specific Applications
11. Non-Regulatory Applications
C. Regulatory Constraints
12. Cost
13. False Positives/Negatives
14. Method Development
15. National vs. Local Standards of Measurement
16. NPDES Uses
17. Use of Pairs of Procedures
18. Voluntary Consensus Body (VCB) Procedures

D. Quality Control
19. Censoring Data
20. Degradation of Method Performance Over Time
Chapter 3 addresses most of the twenty (20) technical elements. The TSD also addresses elements of the six (6) directed issues. Actual implementation of these technical and statistical issues would require a careful study that either controls or evaluates the effect of each directed issue (or technical issue) of concern. EPA may want to consider some additional issues which may have a significant effect on the reliability of data produced at ultra-trace levels, whether to determine the initial presence of an analyte like mercury, or to reliably apply regulations at a discharge limit.

General Comment on Quality Control and the Use of Reference Materials

The technical issues in Chapter 3 concentrate on method and regulatory issues and give less attention to quality control. Quality control should be considered in greater depth.

As mentioned earlier, Method 1631B identifies interferences, rather than instrumental limitations, as having the greatest negative effect on detection limits and minimum quantification levels. The TSD discusses use of real-world matrices in determining detection or quantitation limits at low levels (cf. page 3-4). Operational laboratory performance can be addressed by use of appropriate reference materials that demonstrate the ability to handle interferences and low-level detection as an operational quality control procedure.

Calibration required in Method 1631B could be enhanced by use of a reference material which contains a "real world" matrix, and also mercury forms known to exist in natural samples. Method 1631B states: "The Method is based on a contractor-developed method (Reference 1) and on peer-reviewed, published procedures for the determination of mercury in aqueous samples, ranging from seawater to sewage effluent (References 2–5)." As such, Method 1631B is designed for samples ranging from reagent water to saline samples, and samples with high dissolved matter contents.

Method 1631B further defines mercury forms amenable to this technique: "Total mercury – all BrCl-oxidizable mercury forms and species found in an unfiltered aqueous solution. This includes, but is not limited to, Hg(II), Hg(0), strongly organo-complexed Hg(II) compounds, adsorbed particulate Hg, and several tested covalently bound organo-mercurials (e.g., CH3HgCl, (CH3)2Hg, and C6H5HgOOCCH3)."
Elemental mercury in nature often converts to cinnabar or meta-cinnabar, forms of mercuric sulfide.
These
are
very
stable,
innocuous
forms
of
mercury.
Ambient
samples
can
also
contain
organomercurials
that
have
elevated
human
toxicity.

EPA
might
consider
a
demonstration
study
to
show
how
"
safe"
and
"
unsafe"
mercury
forms
are
oxidized
by
BrCl,
and
are
subsequently
measured
by
this
method,
especially
at
low
concentrations
near
the
limit
of
detection.

An
appropriate
reference
standard
could
be
developed
that
incorporates
the
designated
sample
types
(e.g., sewage sludge, brackish and saline waters),
challenge
concentration
ranges,
and
mercury
forms
described
in
Method
1631B.

For example, a
reference
standard
based
on
saline
sewage
effluent,
spiked
with
known
amounts
of
cinnabar,
elemental
mercury,
and
mercury
salts
could
be
developed
to
challenge
laboratory
performance
over
the
full
range
of
intended
applications.
(Organomercurials may be problematic in such a standard due to safety and secondary-calibration limitations.)
Such
a
standard
could
be
used
to
measure
trueness,
and
to
eliminate
data
from
laboratories
that
cannot procedurally
handle
complex
environmental
samples.

Comment
on
Technical
Issues
­
A.
Method
Performance
Criteria
Ease
of
Use
­
The complex theoretical treatments defined in this TSD, and the additional supporting analyses they would require in regulatory applications of Method 1631B, may impose significant costs to demonstrate detection and data reliability.
The
TSD
does
not
address
cost
to
users,
but
consideration
of
ease­
of­
use
and
cost
should
be
included
in
any
final
revisions
arising
from
this
process.

Background,
Matrix
Effects,
Sources
of
Variance
­
The
TSD
gives
considerable
discussion
to
the
problems
that
arise
from
background,
matrix
effects
and
sources
of
variance.
Other
topics
like
instrument
maintenance,
reliability
and
time
stability
of
calibration
standards,
anion
solubility
effects,
and
related
topics
are
also
important
to
implementation
of
the
method.
EPA
has
done
a
good
job
addressing
these
issues,
both
in
the
TSD
and
in
Method
1631B.

Instrument
Non­
Response,
Lower
Limit
of
Measurement,
Outliers
­
The
TSD
spends
considerable
time
addressing
models
to
define
limit
of
detection.
Also, instrument non-response is discussed in detail.

Good
quality
control
would
include
control
charts
that
identify
statistically
significant
loss
of
response
at
the
MDL
or
alternate
minimum
detection
level.
This
discussion
does
point
out
the
operational
difficulty
in
applying
a
method­
defined
MDL
to
single­
laboratory
determinations
of
few
samples.
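As an illustration of the control-chart idea raised above, a minimal sketch follows (all responses and the 3-sigma rule are hypothetical, not taken from Method 1631B or the TSD):

```python
import statistics

def control_limits(baseline, k=3.0):
    """Return (center, lower, upper) Shewhart-style limits
    from baseline replicate responses at the MDL spike level."""
    center = statistics.mean(baseline)
    s = statistics.stdev(baseline)
    return center, center - k * s, center + k * s

def out_of_control(value, limits):
    """True if a daily check-sample response falls outside the limits."""
    _, lower, upper = limits
    return value < lower or value > upper

# Baseline responses (hypothetical instrument counts at the MDL spike level)
baseline = [102.0, 98.5, 101.2, 99.8, 100.4, 97.9, 100.9]
limits = control_limits(baseline)
print(out_of_control(95.0, limits))  # the low response is flagged: True
```

A chart maintained this way over time would give the statistically significant loss-of-response signal the text describes.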
Data
in
the
TSD
and
referenced
publications
cite
the
loss
of
precision
for
ultra
trace
determinations
near
the
limit
of
detection.
Eppe
et
al.
(
10)
plotted
this
effect
as
shown
in
Figure
3.

The subject of outliers was given limited attention in Chapter 3. Outlier treatment is a statistically valid area of data analysis. Cochran's test and the single/double Grubbs' tests are useful in evaluating interlaboratory data sets to identify outliers and stragglers.
Other
classical
outlier
tests
could
also
be
evaluated
in
examining
the
data
sets
used
in
the
TSD.
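A minimal sketch of the single Grubbs' test mentioned above (the data and the tabulated critical value are illustrative; Cochran's and the double Grubbs' tests would follow the same pattern):

```python
import statistics

def grubbs_statistic(data):
    """Grubbs' G: the maximum absolute deviation from the mean, in units of s."""
    mean = statistics.mean(data)
    s = statistics.stdev(data)
    return max(abs(x - mean) for x in data) / s

def is_outlier(data, critical_value):
    """Compare G to a tabulated two-sided critical value for the given n and alpha."""
    return grubbs_statistic(data) > critical_value

# Seven replicate results with one suspect high value (hypothetical ng/L)
data = [0.21, 0.19, 0.22, 0.20, 0.18, 0.21, 0.45]
# Tabulated two-sided critical value for n = 7, alpha = 0.05 is about 2.02
print(round(grubbs_statistic(data), 2))  # 2.24, above the 2.02 critical value
```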

Comment
on
Technical
Issues
­
B.
Laboratory
Performance
and
Method
Flexibility
Descriptive
vs.
Prescriptive
Use
of
Lower
Limits
of
Measurement
­
EPA
typically
walks
a
thin
line
in
defining
descriptive
versus
prescriptive
procedures.
Regulatory
requirements
built
into
EPA
"
Final
Rules"
are
very
difficult
to
change
and
cause
a
high
level
of
legal
liability
to
laboratories
and
data
users.
It
is
important
for
EPA
to
build
as
much
flexibility
as
possible
into
CWA
methods
in
order
to
prevent
"
locking"
unreasonable
or
unsound
procedures
into
Final
Rule
methods.

Laboratory
Performance
Verification,
Laboratory­
Specific
Applications,
Non­
Regulatory
Applications
­
Laboratory performance is built into Method 1631B; however, quality control procedures could be strengthened to ensure adequate day-to-day demonstration of laboratory performance.
This
issue
is
discussed
several
times
in
this
review.

Non­
Regulatory
Applications
require
individual
treatment.
Uses such as risk assessment, screening, product or process monitoring, and CERCLA surveying are all valid applications of EPA numbered methods, but all require
specific
method
modifications
that
are
often
beyond
the
scope
of
a
method
defined
for
a
specific
matrix
or
regulatory
application
(
e.
g.,
Method
1631B
for
CWA
compliance).
In
practice
most
1600
series
methods
are
used
for
many
applications
beyond
their
original
intention.
Typically
the
user
must
make
and
defend
EPA
method
modifications
applied
to
new
analytes
or
new
matrices.

Comment
on
Technical
Issues
­
C.
Regulatory
Constraints
Method
Development,
False
Positives/
Negatives,
Cost
­
Chapter 3, and other parts of the TSD, devote considerable effort to reviewing method development and the topic of false positives and false negatives.
Obviously
the
risk
of
a
false
negative
must
be
thoroughly
addressed
in
any
health­
protective
regulatory
environmental
method.
Likewise
false
positives
cause
unnecessary
disruption
to
the
regulated
community.
EPA
is
correct
in
giving
this
topic
significant
consideration.

Cost
was
not
addressed
adequately
in
the
TSD;
however,
finalization
of
this
review
process
and
proposed
method
modifications
would
need
to
be
identified
before
cost
could
be
estimated.
Method
modifications
could
lead
to
very
expensive
monitoring
programs
for
the
regulated
community
and
responsible
regulators.
This
topic
should
be
fully
addressed
in
summary
reviews
of
this
TSD.
Figure 3. Coefficient of Variance Versus Concentration for Feed Samples at Parts-per-Trillion Levels Using an EPA Method 1613b Equivalent Procedure.
National
vs.
Local
Standards
of
Measurement,
and
NPDES
Uses
­
By law, local restrictions must be at least as stringent as Federal rules.
This
process
is
addressed
in
current
law.
Also
NPDES
permitting
is
well
established
in
the
United
States.
I
am
not
aware
of
any
unusual
legal
or
procedural
difficulties
that
arise
from
a
review
of
this
TSD.

Use
of
Pairs
of
Procedures
­
EPA
has
stated
in
the
TSD
that
one
primary
procedure
is
needed
for
clarity
and
to
avoid
confusion
among
stakeholders.
If
alternate
procedures
are
needed,
the
EPA
Clean
Air
Act
system
of
reference
and
equivalent
methods
has
worked
well,
and
could
be
a
model
for
EPA
to
follow
under
the
Clean
Water
Act.

Voluntary
Consensus
Body
(
VCB)
Procedures
­
EPA
has
strived
to
include
VCB
methods
in
the
TSD
and
has
conducted an extensive review and discussion of candidate VCB procedures. This author suggests only including international VCBs (especially European NGOs)
due
to
extensive
recent
investigations
of
detection
and
quantification
issues
on
similar
species.

Comment
on
Technical
Issues
­
D.
Quality
Control
Censoring
Data,
Degradation
of
Method
Performance
Over
Time
­
Method
flexibility
is
discussed
in
the
TSD
and
considers
time­
dependent
modifications
to
rigid
methods.
EPA
is
gaining
experience
in
this
area
and
newer
methods
do
address
this
concern.
EPA
has
developed
an
evolving
method
development
process,
which
has
been
shown
to
be
responsive
to
this
issue.

Dr.
Piegorsch's
Comments
In
Ch.
3
of
the
TSD,
many
important
issues
are
listed
for
evaluating
detection
and
quantification
limit
concepts.
I
applaud
the
EPA's
desire
to
consider
alternative
(
quantitative)
perspectives.
In
this
vein,
and
accepting
the
TSD's
interpretation
of
the
MDL
as
a
"
general
purpose
version
of
Currie's
critical
values
[
LC]"
(
p.
2­
3),
I am concerned that the operational definition taken from pp. 5-2/5-3, MDL = t0.99(df) · S, does not correspond to a confidence statement that I can interpret.
(
See
my
response
to
Question
#
4).
This
should
be
replaced,
although
I
agree
that
a
number
of
statistical
quantities
could
be
used;
this
is
where
the
"
fray"
seems
to
be
most
boisterous.
(
By
the
way,
the
TSD,
and
I,
should
be
more
careful
in
the
use
of
statistical
terminology.
We
both
refer
often
to
confidence
"
intervals,"
when
in
fact
the
quantity
of
interest
is
a
confidence
limit
­
or
tolerance
limit,
etc.
­
on
some
underlying
parametric
quantity.)
I
see
no
reason
to
change
the
core
of
my
previous
recommendations,
however.
If
we
accept
the
TSD's
argument
on
p.
3­
25
that
the
practical
value
of
tolerance
limits
is
limited,
then
the
MDL
should
be
viewed
as
a
prediction
limit.
And
if
so,
it
must
contain
an
additional
term
as
per
Gibbons
(
1994,
p.
98):

(Eqn. 1)  MDL = t0.99(df) · S · √(1 + 1/n)
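A numerical sketch comparing the plain t0.99(df) · S form with the Gibbons-style prediction-limit form t0.99(df) · S · √(1 + 1/n) (the replicate values are hypothetical; the t-quantile for df = 6 is taken from standard tables):

```python
import math
import statistics

# One-sided t-quantile t_0.99(df) for df = n - 1 = 6, from standard tables
T_99_DF6 = 3.143

def mdl_classic(replicates, t99):
    """EPA-style MDL: t_0.99(df) * S over n replicate spiked measurements."""
    return t99 * statistics.stdev(replicates)

def mdl_prediction_limit(replicates, t99):
    """Gibbons-style prediction-limit form: t_0.99(df) * S * sqrt(1 + 1/n)."""
    n = len(replicates)
    return t99 * statistics.stdev(replicates) * math.sqrt(1.0 + 1.0 / n)

spikes = [0.52, 0.44, 0.49, 0.55, 0.47, 0.50, 0.46]  # hypothetical ng/L
# The prediction-limit form is always larger, by the factor sqrt(1 + 1/n)
print(mdl_classic(spikes, T_99_DF6) < mdl_prediction_limit(spikes, T_99_DF6))
```

For n = 7 the additional term inflates the limit by about 7 percent, which is the practical consequence of the correction Dr. Piegorsch proposes.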
The
single
most
problematic
issue
when
developing
a
detection
limit
is
correction
for
false
negatives.
I
took
from
the
TSD
(
in
§
3.3.6)
an
implicit
emphasis
on
LC­
type
values
such
as
the
MDL
[
when
correctly
calculated,
as
in
(
1)],
as
motivated
by
an
underlying
sort
of
practical/
environmental
conservatism
that
essentially
removes
false
negatives
from
the
estimator's
development.
I
am
willing
to
accept
this
interpretation.
I
suspect
the
fray
will
continue,
however,
since
there
seems
to
be
a
fair
amount
of
confusion
on
the
issue
in
the
analytical
chemistry
literature.
The
bottom
line
from
my
reading
of
the
TSD
is
that,
in
effect,
we
are
calculating
an
LC,
but
using
terminology
that
makes
some
readers
think
it's
an
LD.
I
can
accept
the
argument
that
false
negative
errors
are
not
the
critical
issue
here,
and
hence
that
the
approach
is
reasonable
(
once
correct
calculations
are
undertaken).
But,
the
Agency
should
put
forth
an
effort
to
overcome
this
confusion
in
terminology.
(
I
expect
they
will
ask
me
how,
and
in
reply
I'd
suggest
emphasizing
that
an
LC
calculation
is
a
form
of
decision
limit,
not
a
detection
limit.
But
here
I
suspect
many
users
will
still
confuse
the
terms,
or
reverse
their
meaning,
or
not
see
the
difference,
or
who
knows
what
else?
I
don't
know
how
winnable
this
battle
is...)

One
caveat:
although
I
think
the
prediction
limit
argument
is
acceptable,
if
the
use
of
tolerance
limits
rather
than
prediction
limits
is
in
fact
desired,
then
Gibbons'
(
1994,
p.
99)
presentation
or
an
equivalent
approach
should
be
used
instead
to
correct
the
MDL
calculation.

Dr.
Rocke's
Comments
The
issues
are
complete
as
laid
out
in
Chapter
3.

Dr.
Wait's
Comments
Matrix
effects
are
an
extremely
critical
element
to
be
considered
when
generating
MDLs.

As
EPA
notes,
since
each
environmental
sample
is
unique,
it
would
be
impossible
to
conduct
a
MDL
study
on
each.
The best means of dealing with this reality is to employ, on a project-by-project basis, a graded approach to verifying MDLs.
The
EPA
DQO
process
is
an
efficient
mechanism
for
addressing
the
variability
of
MDLs
between
different
matrices.
As a corollary, "EPA believes that reference matrices should be used to establish method detection and quantitation limits ..." (p. 3-4).
Has
EPA
considered
establishing
a
repository
of
"
typical"
matrices
where
low
background
concentrations
of
contaminants are thoroughly characterized, similar to NIST SRMs?
If
laboratories
had
the
option
of
evaluating
MDLs
using
matrices
similar
to
samples
they
were
studying
(
e.
g.,
POTW
wastewater,
salt
water,
river
sediment,
pond
sediment,
clay),
this
would
give
labs
an
option
in
demonstrating
their
analytical
capabilities
in
a
fashion
comparable
to
other
labs.
Their
use
would
not
preclude
the
basic
need
for
determining
MDLs
using
reagent
water,
nor
matrix
specific
MDLs.
Again,
the
use
of
these
low
level
matrices
would
be
determined
during
the
DQO
process.

Section
3.2.4
discusses,
in
part,
the
option
of
using
performance
standards
over
prescriptive
standards,
which
would
allow
laboratories
and
others
the
freedom
to
use
a
variety
of
different
approaches
to
establish
limits.
Although this sounds agreeable in theory, operationally it would be a nightmare, and comparability, a QA tenet, would be jeopardized.
I'm
not
in
favor
of
this
approach.
Overall,
the
analytical
chemistry,
CWA
regulatory
issues,
and
statistical
issues
presented
in
this
Section
of
the
TSD
are
comprehensive.
The issues of integrated error have recently become more appreciated by analytical chemists.
In
Section
3.3.1,
the
discussion
on
sources
of
variability
could
be
enhanced
to
address
the
impact
of
variability
at
the
MDL
and
how
this
variability
impacts
data
use.
Refer
to
the
error
concepts
recently
discussed
by
EPA's
Deanna
Crumbling
and
a soon-to-be-released paper
by
Dr.
John
Maney
in
the
October
1
issue
of
Environ.
Sci.
&
Tech.

Although
informative,
Section
3.1.4,
which
discusses
measurement
quality
over
the
life
of
a
method,
could
probably
be
deleted
without
hurting
the
integrity
of
the
Chapter.
Dr.
Cooke's
Comments
Six (6) criteria are addressed in Chapter 4:

Criterion 1. Scientific Validity
Criterion 2. Demonstrated Method Performance
Criterion 3. Practical, Affordable Single-Laboratory Method Procedures
Criterion 4. Assure 99% Detection Confidence in an Experienced Laboratory
Criterion 5. Assure a Reliable Quantification Limit in an Experienced Laboratory
Criterion 6. Procedures are Responsive to the Clean Water Act
Scientific
validity
is
defined
two
ways:
legal
reliability
and
scientific
practice.
Criterion
1
is
defined
by
U.
S.
Supreme
Court
decisions
defining
expert
testimony.
Scientific
validity
is
based
on
publication
in
the
open
literature,
competent
peer
review,
and
general
acceptance
in
the
scientific
community.

The
legal
basis
for
Criterion
1
as
stated
in
the
TSD
is
as
follows:

1. Procedure which can be, and has been, tested.
2. Publication and peer review.
3. Known or estimable error rate.
4. Standards to control operation.
5. Widespread acceptance in the scientific community.

The
primary
procedures
evaluated
by
EPA
for
detection
and
quantification
appear
to
meet
most
of
these
conditions,
e.g., MDL/ML, ASTM IDE, ACS LOD, and the IUPAC/ISO Detection Limit.
One
open
question
concerns
condition
4,
"
Standards".
The
TSD
interprets
this
condition
to
mean
well­
documented
methodology.
U.
S.
Constitutional
law
intended
metrology,
the
legal
recognition
of
reference
measures,
as
a
Federal
responsibility.
If
the
court's
intent
was
to
include
legal
measures
(
metrology)
as
part
of
expert
testimonial
evidence,
then
the
need
for
a
defined
reference
material,
or
EPA
audit
standard,
is
implied
in
Criterion
1
and
should
be
considered.
This
reviewer
is
not
competent
to
answer
this
legal
question.
All
the
other
elements
of
Criterion
1
seemed
to
be
addressed
in
the
subject
TSD.

Criterion
2
appears
to
be
met
under
EPA
Method
1631B
and
other
EPA­
cited
methods
used
as
examples
in
the
TSD.
Measurement
of
variability
and
defined
method
expectations
may
require
a
special
study
that
addresses
all
candidate
alternate
procedures
and
parameters
that
interested
stake
holders
deem
significant.
3.
Do
the
evaluation
criteria
in
Chapter
4
adequately
reflect
the
discussion
of
issues
identified
in
Chapter
3?
If
not,
please
explain.
Do
you
believe
EPA
should
eliminate
any
of
the
six
evaluation
criteria
or
add
other
criteria?
If
yes,
please
identify
the
criteria
to
be
added
or
eliminated
and
explain
your
rationale.
Criterion
3
addresses
performance
of
a
procedure
used
by
a
single
laboratory
that
is
practical
and
affordable.
This
criterion
is
important
because
it
isolates
theoretical
estimators,
and
large
demonstration
studies
(
interlaboratory
and
intralaboratory
comparisons),
from
the
fundamental
application
of
any
EPA
method
for
single
or
small
numbers
of
determinations.
The most common situation is a laboratory that performs many types of analysis but must run EPA Method 1631B only on a periodic basis, a situation in which reproducibility is poor.

Criterion 3 should judge method ruggedness and appropriate quality control to make a method reliable as well as "laboratory friendly." Criterion 3 also addresses cost, which is very important, but cost may need to be a final estimator after other parameters are settled.

Criteria 4 and 5 address the primary subject matter of the TSD: detectability (Criterion 4) and quantifiability (Criterion 5).
As
such
they
are
significant
and
should
be
maintained.

Criterion 6 addresses conditions in the method that meet Federal limits and allow for more stringent application by local regulatory bodies. This criterion is essential and cannot be changed.

These
six
criteria
should
provide
a
vigorous
review
of
the
conditions
set
out
in
the
TSD.
This
reviewer
feels
that
Criterion
3
should
be
strengthened
both
in
performance
discussions
and
proposed
method
modifications.
Single laboratories, working independently, "start from scratch" each time they perform the method. Quality control should be sufficient to ensure reliability in single, isolated determinations of small sample sets, as well as in large commercial laboratories performing many tests.

Dr.
Piegorsch's
Comments
The
evaluation
criteria
in
Ch.
4
do
seem
reasonable
at
first
reading.
I
do
not
think
any
of
them
should
be
eliminated,
and
I
do
not
have
any
concrete
suggestions
for
addition.
In
passing,
however,
I
should
note
that
as
I
continued
through
the
chapter,
I found it perchance-less-than-coincidental that the (revised) MDL and ML concepts seemed to satisfy the criteria so readily, and that most of the other concepts were found wanting.
(
A
cynical
reader
might
view
this
as
a
contrivance
that
elevates
the
MDL
and
ML
at
the
expense
of
the
other
methods,
and
perhaps
the
EPA
may
wish
to
proceed
with
caution
in
this
area.)

Dr.
Rocke's
Comments
The
evaluation
criteria
in
Chapter
4
are
adequate.
However,
I
believe
some
changes
should
be
made
to
criteria
4
and
5
as
outlined
below
in
the
comment
section.
Criterion
4
should
be
edited
to
reflect
its
essential
equivalence
to
an
implementation
of
Currie's
critical
level.
Criterion
5
should
be
completely
changed
to
reflect
the
fact
that
almost
all
implementation
of
limits
of
quantitation
have
nothing
to
do
with
whether
the
measurements
are
actually
quantitative.

Dr.
Wait's
Comments
All
of
the
criteria
used
by
EPA
are
pertinent
to
the
evaluation
of
viable
detection
and
quantitation
limit
methods.
Again,
the
recognition
of
the
role
of
the
Daubert
approach
(
Criterion
1)
is
particularly
important.
Criterion
3
is
obvious
and
necessary
(
practical
and
affordable
procedure
that
a
single
lab
can
perform).
The
recent
problems
with
lab
fraud,
as
enunciated
by
EPA's
Inspector
General,
Nikki
Tinsley,
in
an
open
letter
to
the
laboratory
community
(
September
5,
2001;
www.
epa.
gov/
oigearth/
eroom.
htm)
make
the
use
of
practical
and
efficient
methods
of
key
importance.

The
explanations
for
each
criterion
are
reasoned
and
persuasive.
I
would
not
remove
any
criteria
from
the
evaluation
process.
No
other
evaluation
criteria
are
apparent.

Dr.
Cooke's
Comments
The
assessments
given
in
Chapter
5
address
the
six
evaluation
criteria.
As
stated
before,
the
challenge
techniques
seem
to
have
a
common
limitation,
procedural
verification.
Either
the
technique
does
not
have
a
defined
procedure
to
determine
statistically
rigorous
measures
of
performance,
or
those
measures
are
not
available
to
adequately
compare
with
EPA's
MDL
and
ML.

MDL
and
ML
have
stood
the
test
of
time
and
provide
a
proven
methodology
which
meets
defined
evaluation
criteria
stated
in
the
TSD.

The
Chapter
5
assessment
appears
valid
based
on
stated
criteria.
Detection
using
MDL,
in
my
opinion,
is
valid.
Quantification
concepts
are
subject
to
a
higher
degree
of
scientific
challenge
and
interpretation.

Evaluation
criteria
stated
in
Chapters
3
and
4
do
not
address
adequate
measures
to
estimate
increased
variability
near
the
limit
of
detection.
Nor
do
they
establish
rigorous
criteria
for
data
acceptance.
In
practical
laboratory
operations
techniques
like
control
charts,
maintained
over
time,
would
provide
reliable
measures
of
variability
during
actual
laboratory
operation.

The
review
process
might
be
strengthened
if
EPA
were
to
suggest
experiments
to
evaluate
alternate
detection­
quantitation
procedures.

Operational
procedures
(
control
charts,
reference
standards,
audit
standards)
would
provide
additional
confidence
in
method
performance
at
ultra
trace
levels
in
"
real
world"
samples.

Dr.
Piegorsch's
Comments
4.
Is
the
assessment
in
Chapter
5
of
the
TSD
valid?
Are
the
detection/
quantitation
concepts
presented in that chapter conceptually and operationally sound?
Identify
positive
and
negative
features
and
justifications
for
your
conclusions.
The
assessment
in
Ch.
5
should
be
revisited
with
the
goal
of
including
the
issues
I
note
in
Question
#
2.
In
particular,
if
equation
1
(
equation
1
is
found
in
Piegorsch's
comments
for
Question
#
2)
or
some
other
new
limit
calculation
is
adopted
then
clearly
it
too
should
be
placed
under
a
similar
evaluation.

As
for
specific
positive
and
negative
features:

• The definition of S² given on p. 5-2 is in error. A correct expression is:

(Eqn. 2)  S² = [ Σ Xᵢ² − (Σ Xᵢ)² / n ] / (n − 1)
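A quick check (with hypothetical data) that the computational form of S², [ Σ Xᵢ² − (Σ Xᵢ)²/n ] / (n − 1), agrees with the definitional average-squared-deviation form:

```python
def variance_definitional(xs):
    """S^2 as the average squared deviation from the mean (n - 1 divisor)."""
    n = len(xs)
    mean = sum(xs) / n
    return sum((x - mean) ** 2 for x in xs) / (n - 1)

def variance_computational(xs):
    """S^2 via the computational formula: [sum(x^2) - (sum x)^2 / n] / (n - 1)."""
    n = len(xs)
    return (sum(x * x for x in xs) - sum(xs) ** 2 / n) / (n - 1)

xs = [1.2, 0.9, 1.1, 1.4, 1.0]  # hypothetical replicate results
print(abs(variance_definitional(xs) - variance_computational(xs)) < 1e-12)
```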
• As noted above, on p. 5-3 (lines 2 and 13) the suggestion that MDL represents a 95% confidence interval is spurious.
I
do
not
see
how
in
its
given
form
it
corresponds
to
an
appropriate
form
of
interval
estimator
(
and,
as
also
mentioned
above,
it's
a
limit,
not
an
interval).
Technically,
a
confidence
statement
provides
a
limit
or
interval
on
some
parameter,
say
q,
or
parametrically­
related
quantity.
I do not see what quantity t0.99(df) · S or 2.681 · Spooled is intended to bound.
This
issue
also
pertains
to
the
discussion
on
Condition
3
on
p.
5­
4.
(
Although,
the
issue
raised
there
about
"
uncertainty
in
the
estimates"
is
a
valid
argument.)
These
concerns
lead
me
to
suggest
the
revision
to
the
prediction
limit
construction
in
equation
(
1).

•
On
the
positive
side,
it
is
good
to
mention
(
p.
5­
4,
middle)
that
the
MDL
procedure
is
not
adjusted
for
outliers,
since
this
sort
of
subtlety
could
escape
the
casual
reader.

•
(
p.
5­
8,
top)
I
agree
that
the
IDE
procedure
as
outlined
is
so
complex
as
to
make
simple
determination
of
error
rates
associated
with
it
untenable.
This
point
is
worth
emphasizing.

•
I
liked
the
description
of
the
IUPAC/
ISO
detection
limit
(
starting
on
p.
5­
14).
Similarly,
I
thought
the
introduction
to
quantitative
assessment
of
the
ML
(
p.
5­
17)
was
concisely
presented.

Dr.
Rocke's
Comments
The
method
assessments
in
Chapter
5
are
sound
(
subject
to
comments
below
on
particulars
of
the
MDL).

Dr.
Wait's
Comments
The
thorough
evaluation
process
used
by
EPA
is
excellent!
A
comprehensive
and
open
discussion
was
performed
for
all
5
detection
limit
concepts
and
4
quantitation
limit
concepts.
These
discussions
fairly
debate
the
pros
and
cons
of
each
concept.

In
Section
5.1.1.2.1,
EPA
astutely
notes
that
many
people
complain
that
MDLs
can
vary
depending
on
spike
levels
used,
based
on
the
mistaken
assumption
that
spike
levels
may
be
arbitrarily
selected.
I
have
witnessed
this
same
complaint
numerous
times.
EPA
also
properly
notes
that
Step
1
of
the
MDL
procedure
specifies
a
number
of
criteria
that
must
be
met
in
selecting
spike
levels.
Apparently
many
chemists
just
don't
get
it.
It would be advantageous for EPA to elaborate on Step 1, possibly with examples, to make the requirement clearer.
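One way to make the Step 1 spike-level requirement concrete is a simple acceptance check; the 1x to 5x rule of thumb below is an assumption for illustration only, not the wording of the MDL procedure:

```python
def spike_level_acceptable(spike, estimated_mdl, max_ratio=5.0):
    """Check a candidate spike level against an estimated MDL.

    Assumed rule of thumb (not quoted from the procedure): the spike
    should sit above the estimated MDL but within a small multiple of
    it, so the computed MDL is not driven by an arbitrary spike choice.
    """
    return estimated_mdl < spike <= max_ratio * estimated_mdl

print(spike_level_acceptable(2.0, 1.0))   # within 5x of the estimate: True
print(spike_level_acceptable(50.0, 1.0))  # far above the estimate: False
```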
Dr.
Cooke's
Comments
I
agree
with
EPA's
primary
conclusions
as
stated
in
Chapter
6,
based
on
the
conditions
laid
out
in
Chapters
3,
4
and
5.
Furthermore, EPA has documented that the Method Detection Limit (MDL) is a
sound
estimator
of
initial
signal
response
in
a
broad
range
of
analytical
methods.
MDL
has
stood
the
"
test
of
time"
and
I
could
not
find
a
convincing
statistical
argument
to
replace
MDL.
However, alternate methods do demonstrate potential improvements to MDL implementation (e.g., criteria for initial spike determination and selection). ML and other candidate procedures for quantitation limits show significant variability.

An
overall
conclusion
from
reading
the
TSD
is
that
EPA
has
made
a
strong
case
for
maintaining
MDL
and
ML
as
reference
procedures.
Alternate
procedures
could
be
accepted
if
formal
acceptance
criteria
were
developed
and
agreed
by
all
parties.
Then
side­
by­
side
testing
would
be
needed
to
evaluate
strengths
of
candidate
procedures,
and
adherence
to
acceptance
criteria.

There
are
no
strong
arguments
in
the
TSD
that
would
cause
a
level
of
concern
needed
to
suspend
EPA
regulatory
programs
or
methodology
pending
additional
review.
However, there is evidence for additional study of variability near the method detection limit.
Figure
3
above,
and
comparison
data
for
mercury
in
Appendix
B
to
the
TSD
document
illustrate
this
point.
Two
equivalent
validation
programs
were
plotted
in
Appendix
B,
EPA
and
AAMA.
AAMA
data
for
mercury
by
ICP/
AES
(
Method
200.7)
are
shown
in
Appendix
B,
Plot
B.
1.2,
Page
8
of
13.
EPA
data
for
mercury
by
ICP/
AES
(
Method
200.7)
are
shown
in
Appendix
B,
Plot
B.
1.1,
Page
8
of
13.
Both
plots
show
response
versus
concentration
for
known
standards.
EPA
data
are
plotted
on
a
logarithmic
scale
which
tends
to
spread
observed
values.
All
three
plots
show
that
low
level
samples
are
subject
to
higher
relative
variance
and
should
be
treated
differently
from
data
above
an
agreed
quantitation
(
or
quantification)
limit.
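The higher relative variance at low levels can be illustrated with a short sketch (the replicates are hypothetical; the absolute noise is the same in both sets, so the percent RSD grows as the concentration approaches the detection limit):

```python
import statistics

def coefficient_of_variation(values):
    """Percent relative standard deviation of replicate measurements."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical replicates with identical absolute noise at two levels
low = [0.8, 1.2, 0.9, 1.1, 1.0]           # near the detection limit
high = [99.8, 100.2, 99.9, 100.1, 100.0]  # well above it

# Low-level CV is about 15.8%; high-level CV is about 0.16%
print(coefficient_of_variation(low) > coefficient_of_variation(high))
```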
5.
Do
you
agree
with
the
conclusions
presented
in
Chapter
6?
If
not,
please
explain.
Dr.
Piegorsch's
Comments
I
am
hesitant
to
admit
formal
acceptance
of
the
conclusions
presented
in
Ch.
6
until
the
issues
raised
in
item
#
2,
above,
are
brought
into
further
consideration.

Dr.
Rocke's
Comments
The
conclusions
in
Chapter
5
are
generally
reasonable.
With
a
slight
alteration
to
the
specifications
on
the
spike
concentration
(
see
below),
the
EPA
MDL
as
now
given
is
a
reasonable,
practical
implementation
of
a
limit
of
detection
concept
and
method.
None
of
the
other
methods
is
an
improvement
on
this
overall.
With
respect
to
the
limit
of
quantitation
concept,
the
EPA
ML
is
as
good
as
any
of
the
others
given;
however,
all
are
flawed
by
the
assumption
that
there
is
some
level
higher
than
the
critical
level
needed
before
quantitative
assessments
can
be
made.
This
is
not
supported
in
this
document,
nor
anywhere
else
I
have
seen,
except
as
an
almost
unexamined
assumption.
The
entire
concept
of
a
quantitation
level
higher
than
the
critical
level
should
be
immediately
discarded.

Dr.
Wait's
Comments
The
discussions
and
findings
provided
in
Chapter
6
are
consistent
with
the
approach,
analysis
and
results
presented
throughout
the
TSD.
Most
of
the
assessments
provided
by
EPA
are
reasonable
and
defensible.
With
regards
to
alternative
MDL
and
ML
procedures
for
stakeholders
operating
under
CWA
programs,
what
options
is
EPA
considering
and
how
does
this
stand
up
from
a
comparability
standpoint
between
stakeholders?
Can
you
give
an
example?
I
realize
that
there
is
a
discussion
of
this
issue
in
Section
4.6
of
the
TSD,
but
I'm
having
a
difficult
time
understanding
what
differences
from
EPA's
MDL
procedures
presented
in
Appendix
D
will
actually
be
acceptable.
This
new
flexibility
may
lead
to
more
litigation.

Regarding
improvements
to
this
Section,
a
better
correlation
between
the
findings
in
Table
6­
1
and
the
associated
text
would
be
useful.
Within
Table
6­
1,
it
would
also
be
useful
to
reference
where
in
the
TSD
many of the statistics were derived.
Also,
the
revised
MDL
procedures
presented
in
Appendix
D
should
be
mentioned.
Dr.
Cooke's
Comments
I
am
not
aware
of
any
specific
data
sets
that
could
elucidate
the
various
approaches
and
challenges
listed
in
the
TSD.
Even
if
such
databases
exist,
it
would
be
very
difficult
to
make
the
appropriate
computations
and
solicit
adequate
reviews
from
interested
parties
given
the
limitations
of
the
six
evaluation
criteria.

EPA
has
identified
several
candidate
detection­
quantification
models
that
challenge
the
basic
MDL
and
ML
measures
used
in
many
EPA
water­
based
analytical
procedures.
Most
of
these
systems
have
not
had
method
validation
performed
with
the
rigor
EPA
requires
for
legally
defensible
data.
For example, in Chapter 5 ("Assessment") several of the candidate alternate procedures, including ACS (LOD), IUPAC/ISO (CRV), and IUPAC/ISO (MDV), fail due to the "absence of a procedure" for determining the value of interest.
ASTM
(
IDE)
is
rejected
for
a
number
of
reasons.

Most
of
these
procedures
are
rejected
because
they
have
not
been
tested
extensively
in
the
manner
that
EPA
challenges
its
internal
procedures
before
publication
for
regulatory
applications.
Candidate
alternate
procedures
were
drafted
by
Non­
Governmental
Organizations
(
NGOs)
as
generally
applicable
without
consideration
for
the
legal
constraints
placed
on
EPA.
EPA
procedures
are
formed
around
[
1]
legally­
defined
analyte
lists
(
e.
g.,
the
Priority
Pollutant
List),
producing
limited
numbers
of
analytes,
and
[
2]
"
bright­
line"
legal
limits
which
define
compliance
vs.
violation.
NGOs
usually
do
not
create
method
criteria
based
on
these
legal
constraints.
This
disconnect,
seen
between
EPA
and
candidate
alternate
procedures,
is
to
be
expected.
EPA
has
handled
this
problem
in
other
media
(
e.
g.,
the
Clean
Air
Act)
by
establishing
one
or
more
EPA
reference
procedures,
then
establishing
minimum
criteria
for
equivalency.
This
could
be
done
with
candidate
alternate
procedures
if
they
contain
statistically sound principles that allow equivalent performance.

To
completely
test
the
six
criteria
stated
in
the
TSD,
a
tailored
validation
study
would
need
to
be
designed
and
performed.
6.
Prior
to
proposal
of
revised
detection
and
quantitation
concepts,
should
EPA
evaluate
other
available
data
sets?
Bearing
in
mind
that
in
order
to
effectively
assess
various
concepts,
data
sets
must
reflect
measurements
made
below
the
detection
limit,
in
the
range
of
detection
and
quantitation
limits,
and
in
the
normal
measurement
range
of
the
method,
are
there
any
available
data
sets
that
you
recommend
EPA
consider?
If
so,
please
identify
them
and
explain
why
they
are
appropriate.
Dr.
Piegorsch's
Comments
Since
I
do
not
have
at
my
disposal
any
new
data
sets,
nor
am
I
working
with
anyone
currently
who
does,
I
cannot
give
the
EPA
any
new
sources
of
data.
However,
I
would
encourage
the
EPA
to
expand
its
search
and
consider
as
many
additional
data
sets
as
can
be
acquired
in
a
reasonable
period
of
time.

Dr.
Rocke's
Comments
There
is
no
need
to
examine
additional
data
sets.

Dr.
Wait's
Comments
During the 1980s, numerous intralaboratory method studies were conducted by EPA's ORD group in Cincinnati, some of which may have looked at detection limits.
Has
EPA
examined
any
of
their
historical
work
for
pertinent
MDL
information?
Also, as I recall, George Stanko of Shell presented a fairly large study challenging EPA's detection limits for volatile organics in water at EPA's annual Analytical Symposium in Norfolk, Virginia.

Has
EPA
petitioned
large
trade
associations,
such
as
the
American
Petroleum
Institute
(
API),
about
detection
and
quantitation
studies
they
may
have
sponsored?

Personally,
I
am
not
aware
of
any
detection
and
quantitation
limit
data
sets
which
may
be
of
value
to
EPA.
Dr.
Cooke's
Comments
Intralaboratory
Issues
­

EPA
has
done
an
adequate
job
showing
performance
of
EPA
methods,
especially
defining
detection
(
e.
g.,
MDL).
EPA
has
presented
extensive
data
on
interlaboratory
studies
that
demonstrate
method
performance
for
a
number
of
EPA
regulatory
procedures.

Need
for
a
tailored
demonstration
study
­

To
fully
evaluate
alternate
approaches
a
cooperative
study
should
be
performed
that
is
designed
with
input
from
all
settlement
participants,
and
interested
outside
laboratory
professionals.
That
study
could
include
spike
levels,
blank
and
zero
determination,
intralaboratory
variability,
ruggedness
testing,
"
pairs"
determinations,
use
of
"
real
world"
samples,
evaluation
of
outlier
criteria,
sufficient
replicates
to
challenge
statistical
models,
and
reproducibility
versus
repeatability
(
e.
g.,
single
unbroken
series
of
determinations,
versus
series
performed
on
different
dates
after
set
up
and
calibration).

Since
the
subject
legal
settlement
specifically
addressed
mercury
using
EPA
Method
1631B,
the
focus
of
any
collaborative
study
to
answer
questions
raised
in
the
TSD
and
legal
challenges
should
include
this
specific
method.
In
a
joint
validation
study,
it
would
be
useful
if
EPA
incorporated
uniform
procedures
to
be
followed
for
any
alternate
procedures
that
supplant
EPA
numbered
methods.

Better
method
equivalency
and
method
flexibility
­

Simple
equivalency
procedures
are
routinely
specified
by
several
EPA
Offices.
For
air
determinations
EPA
provides
a
generic
method
protocol
to
demonstrate
method
performance.
This
is
used
to
show
that
an
alternate
procedure
is
suitable
for
reporting
accurate
data
for
regulatory
purposes.
For
stack
methods,
four
(
4)
concurrent
determinations
in
the
same
source
are
required.
EPA
could
use
this
type
of
process
to
set
and
demonstrate
simplified
equivalency
criteria
for
existing
EPA
water
and
waste
water
methods.

Such
guidance
(
method
equivalency
and
flexibility)
will
become
more
critical
as
detection
limits
are
driven
lower,
additional
analytes
are
required,
and
more
complex
matrices
are
added
to
areas
of
regulatory
concern.
7.
Has
EPA
dealt
with
the
interlaboratory versus intralaboratory issues appropriately and, if not, what recommendations would you make for dealing with the issues more appropriately?
Interlaboratory
Issues
­

Need
for
an
approved
reference
and
audit
standard
­

Interlaboratory
performance
is
highly
variable
using
modern
ultra­
trace
methods
like
EPA
Method
1631.
When
small
batches
of
samples
arrive
at
most
environmental
laboratories,
they
are
scheduled
in
series
with
other
methods
and
different
analytes
than
mercury.
These
samples
must
be
checked
in,
records
verified,
proper
storage
and
chain­
of­
custody
implemented.
At
that
point
the
appropriate
equipment
must
be
started
and
calibrated.
This process is the worst case: intermittent analyses where all the causes of variability and sensitivity loss are maximized.
This
means
quality
control
on
every
batch
of
samples
becomes
critically
important.
EPA
validation
studies,
which
demonstrate
the
optimum
method
performance,
are
useful
guidance;
however,
one-time
optimum
performance
does
not
reflect
batch­
to­
batch
data
quality
in
actual
operation.
In
the
real
world
non­
optimum
operation
is
the
rule,
not
the
exception.
This
problem
is
exacerbated
with
ultra­
trace
methods.

Better
method
equivalency
and
method
flexibility
­

Method
1631B
is
an
example
of
a
prescriptive
method
where
form
exceeds
function.
The
criteria
for
any
change
to
the
method
are
so
restrictive
that
no
operating
laboratory
will
make
any
change
unless
major
financial
support
is
provided
to
conduct
the
EPA
required
proofs.
In
reality,
under
Method
1631B
a
laboratory
has
no
option
to
even
perform
common
sense
changes
that
could
lessen
cost
and
improve
efficiency.
Also
in
Method
1631B
a
lone
supplier
is
identified,
and
the
caveat
"
or
equivalent"
is
added.
This
causes
a
monopoly
where
cost
of
supplies
and
equipment
can
drive
up
method
cost.

Example
1.
Black
anodized
aluminum
optical
block
­

"
6.3.2.5
Black
anodized
aluminum
optical
block
C
holds
fluorescence
cell,
PMT,
and
light
source
at
perpendicular
angles,
and
provides
collimation
of
incident
and
fluorescent
beams
(
Frontier
Geosciences
Inc.,
Seattle,
WA,
or
equivalent)."

Example
2.
Cold
vapor
generator
­

"
6.4.4
Cold
vapor
generator
(
bubbler)
C
200­
mL
borosilicate
glass
(
15
cm
high
x
5.0
cm
diameter)
with
standard
taper
24/
40
neck,
fitted
with
a
sparging
stopper
having
a
coarse
glass
frit
that
extends
to
within
0.2
cm
of
the
bubbler
bottom
(
Frontier
Geosciences,
Inc.
or
equivalent)."
Example
3.
Gold­
coated
sand
traps
­

"
6.5.2
Gold­
coated
sand
traps
C
10­
cm
long
x
6.5­
mm
OD
x
4­
mm
ID
quartz
tubing.
The
tube
is
filled
with
3.4
cm
of
gold­
coated
45/
60
mesh
quartz
sand
(
Frontier
Geosciences
Inc.,
Seattle,
WA,
or
equivalent)."

EPA
could
list
alternate
suppliers
and
approved
alternate
devices.
Or
even
better,
include
specifications
for
alternate
devices
and
exclude
listing
specific
suppliers.
For
example
gold­
coated
quartz
wool
traps
have
been
used
for
mercury
amalgamation
for
many
years.
Quartz wool has much better airflow performance than sand.
A
gold
coated
quartz
wool
trap
would
have
much
less
pressure
drop
than
the
prescribed
sand
trap.
To
use
such
a
device
would
require
an
onerous
amount
of
study
and
documentation
that
would
deter
any
private
laboratory.

Making any method modification is onerous ­

Method
1631
is
replete
with
sections
that
block
any
method
changes,
even
the
most
obvious
or
simple.
EPA
Method
1631
makes
modifying
the
method
a
major
endeavor.
This
is
overkill
for
the
stated
purpose,
limiting
changes.
In reality, a method written in this extraordinarily confining way may spawn legal battles over minutiae in laboratory reports that show (or challenge) the smallest elements of method performance.
Sections
9.1.2.2
through
9.1.2.4
speak
for
themselves.

Two
examples
from
Method
1631
are
given
below.

Example
1.
Alternate
determination
methods
restricted,
Section
9.1.2
­

"
9.1.2
If
an
analytical
technique
other
than
the
CVAFS
technique
specified
in
this
Method
is
used,
that
technique
must
have
a
specificity
for
mercury
equal
to
or
better
than
the
specificity
of
the
technique
in
this
Method."

Example
2.
Onerous
requirements
for
even
the
smallest
modification,
Section
9.1.2.2
through
9.1.2.4­

"
9.1.2.2
The
laboratory
is
required
to
maintain
records
of
modifications
made
to
this
Method.
These
records
include
the
following,
at
a
minimum:

9.1.2.2.1
The
names,
titles,
addresses,
and
telephone
numbers
of
the
analyst(
s)
who
performed
the
analyses
and
modification,
and
the
quality
control
officer
who
witnessed
and
will
verify
the
analyses
and
modification
9.1.2.2.2
A
narrative
stating
the
reason(
s)
for
the
modification(
s)

9.1.2.2.3
Results
from
all
quality
control
(
QC)
tests
comparing
the
modified
method
to
this
Method,
including
the
following:
(
a)
Calibration
(
Section
10)
(
b)
Initial
precision
and
recovery
(
Section
9.2)
(
c)
Analysis
of
blanks
(
Section
9.4)
(
d)
Matrix
spike/
matrix
spike
duplicate
analysis
(
Section
9.3)
(
e)
Ongoing
precision
and
recovery
(
Section
9.5)
(
f)
Quality
control
sample
(
Section
9.6)
(
g)
Method
detection
limit
(
Section
9.2.1)

9.1.2.2.4
Data
that
will
allow
an
independent
reviewer
to
validate
each
determination
by
tracking
the
instrument
output
to
the
final
result.
These
data
are
to
include
the
following:

(
a)
Sample
numbers
and
other
identifiers
(
b)
Processing
dates
(
c)
Analysis
dates
(
d)
Analysis
sequence/
run
chronology
(
e)
Sample
weight
or
volume
(
f)
Copies
of
logbooks,
chart
recorder,
or
other
raw
data
output
(
g)
Calculations
linking
raw
data
to
the
results
reported"

In summary, new EPA methods like Method 1631 may be too restrictive, especially with simple configuration issues, like specifying a specific gold sand trap by one producer while the desorber used with that trap is very simply stated. A home-made desorber can be used and will work fine. Why not simplify other devices the same way?

"
Section
6.5.3
Heating
of
gold­
coated
sand
traps
C
To
desorb
Hg
collected
on
a
trap,
heat
for
3.0
min
to
450
B
500
EC
(
a
barely
visible
red
glow
when
the
room
is
darkened)
with
a
coil
consisting
of
75
cm
of
24­
gauge
Nichrome
wire
at
a
potential
of
10­
14
vac.
Potential
is
applied
and
finely
adjusted
with
an
autotransformer."

Dr.
Piegorsch's
Comments
The
issues
of
intralaboratory
and
interlaboratory
variation
are
quite
important,
and
I
applaud
the
TSD
for
its
consideration
of
them.
While
reasonably
addressed,
I
would
encourage the EPA to undertake, commission, or actively support
a
formal
interlaboratory
study,
building
on
the
success
of
the
Method
1638
Interlaboratory
Validation
Study.
The
recognition
that
multiple components of variation can exist
in
calculating
LC
(
or
any
other
form
of
detection/
decision
limit),
is
an
important
one,
and
such
calculations
must
be
based
on
appropriate
variance
components
for
the
model
under
study
(
Gibbons,
1995).
A
large,
carefully­
conducted
interlaboratory
study
would
make
a
major
contribution
towards
understanding
and
quantifying
these
components
for
use
in
future
detection
limit
calculations.

Dr.
Rocke's
Comments
The
EPA's
position
on
interlaboratory
vs.
intralaboratory
variability
is
reasonable.
See
Dr.
Rocke's
comments
in
the
Specific
Comments
section.
Dr.
Wait's
Comments
The
use
of
interlaboratory
measurements
is
important
for
a
general
understanding
of
the
laboratory
community's
capabilities,
but
is
not
as
relevant
to
the
issues
that
EPA
must
consider
in
support
of
a
permittee's
CWA
requirements.
Intralaboratory
measurements
are
more
practical.
EPA's
approach
between
inter­
and
intra­
studies
is
balanced
and
reasonable.
Dr.
Cooke's
Comments
The
most
important
improvements
to
detection
and
quantification
are
quality
control
procedures
that
ensure
accurate
data
for
specific,
individual
sets
of
samples.

Improvements
will
come
when
EPA
adds
defined
quality
control
procedures
to
show
that
laboratories
are
actually
performing
these
ultra
trace
methods
at
the
low
levels
needed.

Four
items
will
be
discussed:

[
1]
Audit
Standards,
[
2]
Analysis
Series,
[
3]
MSD
at
two
levels,
[
4]
Quality
adherence
and
audit
tools
1.
Audit
Standards
Method
1631
addressed
reference
materials
in
Section
9.6.

"
Section
9.6
Quality
control
sample
(
QCS)
C
The
laboratory
must
obtain
a
QCS
from
a
source
different
from
the
Hg
used
to
produce
the
standards
used
routinely
in
this
Method
(
Sections
7.7
B
7.10).
The
QCS
should
be
analyzed
as
an
independent
check
of
system
performance."

The
current
Method
1631
Quality
Control
Sample
is
a
calibration
standard
with
the
specific
purpose
of
using
dual
calibrants
from
separate
sources.
Method
1631
also
specifies
calibrants
traceable
to
NIST
standards
(
e.
g.,
NIST
3133,
a
standard
mercury
solution).
This
is
adequate
sourcing
for
calibration
solutions, but not
for
method
performance.

Much
more
is
required
to
show
intra­
laboratory
method
performance.
An
audit
sample
is
needed
of
known
composition
and
concentration.
EPA
should
provide
an
audit
standard
along
with
storage
guidance,
stability
information
and
validation
data.
This
should
be
a
standard
that
regulators
or
laboratory
managers
can
dilute
to
known
concentrations
for
blind
performance
audits.
This
audit
material
should
be
made
from
multiple
mercury
forms
to
show
that
a
performing
laboratory
can
handle
complex
matrices.

The
EPA
Las
Vegas
Laboratory
has
been
supplying
audit
samples,
including
aqueous
standards,
for
mercury,
for
many
years.
Cinnabar
is
commonly
found
in
the
environment,
and
is
considered
"
safe"
to
humans
and
the
aquatic
environment.
The
audit
material
for
this
program
should
include
other
mercury
forms
that
reflect
all
species
listed
in
the
method
(
i.
e.,
"(
Hg(
II),
Hg(
0),
strongly
organo­
complexed
Hg(
II)
compounds,
adsorbed
particulate
Hg,
and
several
tested covalently bound organo-mercurials (e.g., CH3HgCl, (CH3)2Hg, and C6H5HgOOCCH3)").

8. Can you recommend any improvements to the detection and quantitation procedures described in the TSD?

2.
Analytical
Series
Methods
like
Method
1631
should
define
a
calibration
series
for
actual
reported
measurements.
The
EPA
Contract
Laboratory
Program
(
CLP)
showed
the
need
for
this
approach.
The
CLP
program
used
Laboratory
Control
Samples
(
LCS)
to
show
calibration
during
analytical
measurements.
With
modern
instrumental
autosamplers, series
operation
is
simple
to
perform.
A
series
for
ultra­
trace
mercury
measurements
could
be
the
following:

Calibration Standard (Medium Concentration) -> Calibration Standard (Low Concentration) -> Sample 1 -> Sample 2 -> ... -> Sample X -> Sample Y -> Calibration Standard (Low Concentration) -> Calibration Standard (Medium Concentration)

The
initial
two
calibration
analyses
(
Medium
and
Low
Concentration
Standards)
would
cause
the
analysis
to
be
stopped
if
unacceptable
performance
is
observed.
The
first
analysis
after
native
samples
are
processed,
the
Low
Concentration
Standard,
is
important
because
it
shows
that
the
analytical
system
did
not
degrade
during
a
series
of
reported
analyses.
The
higher
concentration
standards
should
not
be
measured
first
because
higher
analyte
levels
can
passivate
active
sites
and
mask
method
failure
at
or
near
the
method
detection
limit.
Actual
and
measured
calibration
standard
values
should
be
reported.

This
LCS
series
should
be
reported
as
a
measurable
quality
performance
criterion.
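The bracketing logic described above is simple to automate. The sketch below (Python; the sequence labels, helper names, and the 80-120% recovery window are illustrative assumptions, not Method 1631 requirements) builds such a series and checks the bracketing standards:

```python
def build_series(samples):
    """Bracket the reported samples with medium- and low-level calibration standards."""
    opening = ["CAL-MED", "CAL-LOW"]  # run first; stop the series if these fail
    closing = ["CAL-LOW", "CAL-MED"]  # low standard first, to expose any degradation
    return opening + list(samples) + closing

def standards_pass(recoveries, low=0.80, high=1.20):
    """True when every bracketing standard's measured/nominal ratio is in the window."""
    return all(low <= r <= high for r in recoveries)

series = build_series(["Sample 1", "Sample 2", "Sample X", "Sample Y"])
```

Reporting the measured bracketing-standard values alongside the samples, as recommended above, then reduces to logging the recovery ratios with each batch.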

3.
Matrix
Spike
Duplicate
(
MSD)

Other
regulatory
bodies
have
abandoned
the
MSD
criteria.
A
duplicate
analysis
(
one
degree
of
freedom)
provides
little
or
no
useful
information.
This
is
a
time-wasting, cost-escalating
step
in
many
EPA
numbered
methods.

Some
European
authorities
have
dropped
the
traditional
EPA
method
spike
duplicate
(
MSD).
They
specify
two
different
spike
levels
instead,
one
near
the
MDL
and
a
second
spiked
matrix
sample
above
the
MDL,
at a concentration in the linear response range.
Method
1631
extensively
defines
low
and
intermediate
spiking
levels,
so
this
approach
could
easily
be
used
with
Method
1631.

A
stronger
quality
control
approach
would
be
to
analyze
a
reference
sample
(
previously
discussed)
from
a
recognized
metrology
authority
(
e.
g.,
NIST,
BCR),
or
through
the
EPA
contract
program
for
reference
standards,
along
with
a
matrix
spike
near
the
MDL.
This
would
be
performed
with
every
batch
of
20
samples
or
once
during
the
month
in
which
less
than
20
samples
are
analyzed.
This approach would cover three quality control objectives:
matrix
effects,
low
level
detection
in
the
matrix
tested,
and
performance
with
a
reference
material.

At
a
minimum,
matrix
spike
analyses
should
be
performed
on
each
group
of
samples
that
represent
either
a
different
matrix,
or
a
separate
sample
batch.
From
a
regulatory
perspective,
it
would
be
useful
if
EPA
defined
what
constitutes
a
discrete
matrix,
including
that
definition
in
the
method
itself.
This
would
prevent
a
large
batch
of
samples
from
different
sources
being
analyzed
together
and
only
one
matrix
tested.

4.
Quality
adherence
and
audit
tools
Since
staff
training
and
experience
vary,
and
Methods
like
1631
are
performed
at
irregular
intervals,
audit
and
adherence
tools
would
help
EPA
gain
consistency
in
method
performance.
Three
tools
are
suggested:
[
1]
Flow
Charts,
[
2]
Audit
Checklists,
[
3]
Control
Charts.

1.
Flow
Charts
Modern
methods
like
Method
1631
are
very
complex
due
to
the
very
low
measurement
levels
attained.
Prescriptive
steps
for
Method
1631
include:
Method
detection
limit
demonstration
(
Section
9.2.1),
Initial
precision
and
recovery
(
IPR)
(
Section
9.2.2),
Matrix
spike
(
MS)
and
matrix
spike
duplicate
(
MSD)
(
Section
9.3),
Ongoing
precision
and
recovery
(
OPR)
(
Section
9.5).

A
set
of
simple
flow
charts
could
be
developed
to
visually
show
the
order
of
method
steps.
This
would
be
a
valuable
training
aid,
which
would
also
help
analysts
set
up
and
perform
Method
1631
on
irregular
schedules.

2.
Audit
Checklists
Audit
checklists
are
valuable
tools
for
everyone
performing
these
complex
methods
or
verifying
laboratory
performance.
EPA
could
develop
audit
checklists
(
e.
g.,
necessary
procedures,
quality
control
data,
etc.)
as
part
of
the
method
development
process.
This
would
provide
a
uniform
document
for
checking
method
adherence.

3.
Control
Charts
Control
charts
are
useful
for
environmental
laboratories
routinely
conducting
trace
analytical
procedures.
EPA
could
establish
criteria
for
control
charts.
This
would
allow
laboratories
to
flag
method
failure
by
measuring
intra­
laboratory
error
bands
for
acceptable
performance
over
time.
EPA
has
extensive
experience
with
control
charts,
and
this
would
be
a
simple
addition
to
Method
1631.
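As a sketch of how such control-chart criteria might look, the fragment below (Python; the plus-or-minus three-sigma band and the example recovery values are assumptions for illustration) derives control limits from historical QC results and flags excursions:

```python
import statistics

def control_limits(history, k=3.0):
    """Center line and +/- k-sigma limits from historical QC recoveries (%)."""
    center = statistics.fmean(history)
    sigma = statistics.stdev(history)
    return center - k * sigma, center, center + k * sigma

def out_of_control(history, new_points, k=3.0):
    """Return the new QC results that fall outside the control band."""
    lo, _, hi = control_limits(history, k)
    return [x for x in new_points if not (lo <= x <= hi)]

history = [98.0, 101.0, 99.5, 100.5, 100.0, 99.0, 101.5, 100.5]  # past recoveries, %
flagged = out_of_control(history, [100.2, 94.0, 106.5])  # -> [94.0, 106.5]
```

A laboratory would rebuild the band periodically from its own recent history, which is exactly the intra-laboratory error band over time suggested above.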
Dr.
Piegorsch's
Comments
Aside
from
the
comments
given
above,
I
have
no
further
improvements
to
suggest.
I
do
have
one
general
question,
however:
has
the
EPA
studied
the
use
of
composite
sampling
methodology
(
primarily
from
a
statistical
perspective)
for
application
to
MDL
or
ML
determination?
I
do
not
profess
to
be
an
expert
in
composite
sampling,
and
perhaps
I
missed
this
in
the
TSD,
but
as
I
understand
it
composite
sampling
is
intended
for
chemometric
and
environmental
monitoring
scenarios
where
chemical
analytes
or
biochemical
(
and
also
pharmaceutical)
metabolites
are
assessed
for
levels
of
occurrence.
Apparently,
it
is
particularly
useful
when
studying
whether
a
chemical
has
exceeded
or
dropped
below
some
critical
threshold.
Other
uses
include
compliance
monitoring
for
environmental
standards
(
Barnett
and
Bown,
2002),
classification
of
samples
as
to
their
levels/
status
of
some
environmental
contaminant
(
Johnson
and
Patil,
2001).
This background seems similar to the detection limit problem under consideration here, and this methodology may prove useful.
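For intuition on why composite sampling saves effort at low occurrence rates, the classic Dorfman (1943) two-stage calculation can be sketched as follows (Python; the prevalence and group size chosen are illustrative assumptions):

```python
def expected_tests_per_specimen(p, k):
    """Dorfman two-stage compositing: pool k specimens, test the pool once,
    and retest each member individually only when the pool is positive.
    Expected tests per specimen = 1/k + 1 - (1 - p)**k."""
    return 1.0 / k + 1.0 - (1.0 - p) ** k

# At 1% prevalence with pools of 10, only about 0.2 tests per specimen are
# needed, versus 1.0 for one-at-a-time testing.
rate = expected_tests_per_specimen(p=0.01, k=10)
```

The savings vanish as prevalence rises, which is why the approach suits threshold-exceedance screening rather than routine quantitation.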

Dr.
Rocke's
Comments
See
Dr.
Rocke's
comments
on
the
MDL
in
the
Specific
Comments
section.

Dr.
Wait's
Comments
The
MDL
and
ML
concepts
evaluated
in
Sections
5.1.1
and
5.2.1,
respectively,
are
shown
in
this
evaluation
to
be
technically
sound
and
practical.
The
revised
MDL
procedure
provided
in
Appendix
D
is
streamlined
and
more
intelligible
than
the
previous
version,
although
a
reexamination
of
Step 1
to
aid
chemists
in
the
spiking
level
requirement
may
be
warranted.

EPA's
literature
search
was
extensive,
regardless
of
my
suggestions
to
examine
some
other
sources.
The
detection
and
quantitation
concepts
I'm
aware
of
have
already
been
adequately
"
fleshed"
out
by
EPA.
VI.
SPECIFIC
COMMENTS
Dr.
Cooke's
Comments
There
were
no
specific
comments
from
Dr.
Cooke.

Dr.
Piegorsch's
Comments
There
were
no
specific
comments
from
Dr.
Piegorsch.

Dr.
Rocke's
Comments
Section
1.3.2
Grouped
analysis
by
concentration
leads
to
anomalous
results.
If
all
the
samples
at
a
given
concentration
are
analyzed
in
sequence,
then
the
next
concentration,
and
so
on,
the
values
at
a
given
concentration
will
be
closer
together
than
would
be
the
case
if
they
were
analyzed
at
different
times,
or
interspersed
with
other
concentrations.
We
have
seen
this
phenomenon
many
times
in
such
data.
This
means
that
the
variability
of
the
replicates
around
the
mean
of
the
replicates
is
an
underestimate
of
the
actual
variability
at
that
concentration.
This
problem
should
be
fixed
by
using
a
proper
randomized­
order
design,
but
can
be
mitigated
by
always
looking
at
variability
around
the
calibration
line,
rather
than
around
the
mean
of
the
replicates
(
cf.
3.3.8.2).

Section
2.2.1
The
MDL
had
a
number
of
problems
that
needed
repair,
some
of
which
were
fixed
in
the
rewording
on
page
5­
4.
The
basic
concept
of
Glaser
et
al.
(
1981)
that
the
"
MDL
is
considered
operationally
meaningful
only
when
the
method
is
truly
in
detection
mode,
i.
e.,
[
the]
analyte
must
be
present."
is
problematic.
For
methods
under
which
a
signal
is
generated
from
blanks,
this
is
not
at
all
necessary.
For
cases
in
which
the
blank
does
not
generate
a
signal
due
to
instrumental
limitations
(
such
as
inability
to
find
the
peak
to
integrate),
one
must
generate
the
MDL
using
positive
concentrations.
Otherwise,
blank
samples
are
fine.
See
further
comments
below
on
the
MDL.

Section
2.2.2
The
ML
as
originally
defined
may
very
well
be
below
the
MDL.
After
all,
the
concentration
at
which
the
MDL
is
measured
must
generate
peaks
that
can
be
measured.
Any
definition
that
relates
the
ML
or
related
concept
to
either
a
multiple
of
the
standard
deviation
at
zero,
or
to
a
desired
CV
is
fundamentally
flawed.
If
the
instrument
can
be
read,
and
the
spectra
can
be
recognized,
then
the
ML
is
exceeded,
regardless
of
the
other
issues.
I
don't
think
it
is
too
much
to
say
that
any
level
at
which
the
instrument
can
be
read,
and
at
which
there
is
a
reliably
estimated
standard
deviation
is
a
level
at
which
quantitation
is
possible.
No
arbitrary
standard
regarding
multiples
of
the
standard
deviation
at
zero
or
a
desired
CV
is
appropriate
for
any
purpose
in
analytical
chemistry
or
the
regulation
of
toxic
substances.
This
includes
the
PQL,
the
AML
and
other
related
methods.
None
of
them
generate
a
useful
number.
Regulatory
Levels
Obviously,
levels
of
a
toxic
substance
can
not
easily
be
regulated
below
the
level
at
which
there
is
instrumental
response
(
i.
e.,
a
signal
is
generated).
All
environmental
measurements
should
be
reported
as
measured,
and
should
only
be
reported
as
non­
detects
if
the
instrumental
response
itself
fails.
If
a
value
is
generated
by
the
instrument,
it
should
be
reported,
with
an
indication
of
what
the
estimated
standard
deviation
is,
and
whether
the
measurement
shows
the
concentration
to
be
non­
zero
(
that
is,
whether
the
signal
is
above
the
critical
level).
See
3.3.2.
For
substances
in
which
the
toxic
level
is
well
below
the
critical
level,
then
the
compliance
threshold
should
be
at
the
critical
level
(
in
one
of
its
implementations
such
as
the
revised
MDL).

Interlaboratory
Variability
If
a
laboratory
computes
its
critical
level
using
a
procedure
such
as
the
MDL,
it
makes
no
sense
to
expand
this
to
account
for
interlaboratory
variability.
Whether
other
labs
can
or
cannot
detect
the
substance
with
a
signal
at
the
MDL
of
the
given
laboratory
is
irrelevant.
It
may
be
different
if
the
goal
is
precisely
to
determine
the
quantity
of
the
analyte
in
a
standard
sample.
In
this
case,
interlaboratory
variability
may
be
appropriately
considered.
It
should
not
be
considered
in
detection
decisions
unless
it
can
be
shown
that
such
decisions
in
an
individual
laboratory
are
biased
and
may
over­
or
under­
estimate
the
true
critical
level
(
detection
threshold)
in
that
laboratory.
For
the
specific
purpose
of
determining
whether
a
given
sample
exceeds
the
safe
level,
a
general
interlaboratory
study
is
not
of
much
use,
since
it
may
be
influenced
by
the
performance
of
laboratories
at
levels
far
removed
from
the
point
at
issue.
If
the
safe
level
is
below
the
critical
level,
then
use
of
the
critical
level
is
appropriate
as
an
action
threshold.
If
the
safe
level
is
above
the
critical
level,
then
interlaboratory
variation
should
only
be
taken
into
account
if
it
can
be
shown
that
the
number
of
false
positives
when
the
analyte
is
present
at
the
safe
level
is
not
well
controlled
using
the
usual
intralaboratory
calibration
methods.

Prediction
and
Tolerance
Intervals
Tolerance
intervals
are
inappropriate
for
environmental
monitoring.
The
main
issues
here
are
1)
is
the
true
concentration
greater
than
some
specified
safe
or
action
level,
with
sufficient
confidence,
and
2)
what
interval
of
possible
concentrations
is
consistent
with
one
or
a
series
of
measurements,
with
a
specified
degree
of
confidence.
Both
are
statements
about
a
given
sample
or
series
of
samples,
and
not
about
the
hypothetical
variability
of
future
estimates.
Suppose
that
one
has
a
sample
of
10
observations
with
mean
concentration
of
1ppb
and
standard
deviation
of
0.5ppb.
Then
the
estimated
99%
critical
level
is
(
2.326)(
0.5)
=
1.2ppb.
One
may
choose
to
use
a
t­
score
instead
of
a
normal
score
so
that
the
chance
that
a
future
observation
will
exceed
this
level
is
in
fact
99%.
In
this
case,
the
critical
level
estimate
would
be
(
3.250)(
0.5)
=
1.6ppb.
This
does
actually
correspond
to
a
prediction
interval
for
future
observations
from
a
zero
concentration
sample.

If
one
asked
instead
for
a
95%
confidence
interval
for
the
.99
percentage
point
of
the
true
distribution
of
measurements
(
assuming
normality)
when
the
true
quantity
is
zero,
this
can
be
calculated
approximately
using
a
chi­
squared
distribution
and
covers
the
interval
(
0.9ppb,
2.4ppb).
It
does
not,
however,
make
sense
to
use
2.4ppb
as
a
threshold,
since
the
chance
of
a
future
observation
exceeding
2.4ppb
when
the
true
mean
concentration
is
0
is
about
.0005,
far
smaller
than
the
intended
false­
positive
limit
of
.01.

4.4
Criterion
4
This
criterion
appears
from
the
description
and
the
discussion
to
be
a
mix
of
the
Currie
concepts
of
critical
level
and
minimum
detectable
value.
What
should
appear
here
is
the
critical level
equivalent.
Here
is
a
suggested
re­
wording:

The
detection
level
concept
should
identify
the
signal
or
estimated
concentration
at
which
there
is
99%
confidence
that
the
substance
is
actually
present
when
the
analytical
method
is
performed
by
experienced
staff
in
a
well­
operated
laboratory.

4.5
Criterion
5
This
concept
has
no
operational
meaning
as
written.
The
only
real
criterion
for
a
QL
is
that
the
instrument
should
generate
a
recognizable
signal.
Here
is
a
suggested
re­
wording:

The
quantitation
limit
concept
should
identify
a
concentration
at
which
the
instrument
yields
a
measurable
signal
at
least
99%
of
the
time,
and
which
is
also
no
smaller
than
the
detection
level.
This
will
often
be
the
same
as
the
detection
level.

Evaluation
of
Detection
Limit
Concepts:
The
MDL
This
method,
as
re­
described
in
Condition
3
on
the
top
of
page
5­
4,
and
as
further
specified
on
page
5­
2,
is
a
reasonable
implementation
of
the
critical
level
concept
for
situations
in
which
the
instrument
may
not
yield
reliable
data
for
blanks.
It
should
not
increase
the
critical
level
much
over
what
would
be
obtained
using
true
blanks
if
that
were
possible.
However,
the
use
of
as
much
as
five
times
the
critical
level
for
the
spike
concentrations
could
be
problematic.
The
inflation
of
the
MDL
by
using
a
spike
at
the
critical
level
is
only
25%
for
a
method
with
a
high-level
CV
of
20%
(
this
and
other
calculations
here
are
done
with
the
Rocke
and
Lorenzato
1995
variance
function
assuming
a
sample
size
of
7).
A
spike
concentration
of
3
times
the
critical
level
inflates
the
MDL
to
a
value
140%
higher,
which
even
there
may
be
tolerable.
Use
of
a
value
5
times
the
critical
level
gives
an
inflation
of
over
280%.
Thus
if
the
true
critical
level
is
1ppb,
then
the
use
of
1,
3,
and
5
times
the
critical
level
for
spike
concentrations
in
determining
the
MDL
gives
likely
values
of
1.2ppb,
2.4ppb,
and
3.8ppb,
respectively.
These
numbers
were
determined
as
follows:
Let V(y) = a² + b²µ². Then the expected MDL if blanks were used is approximately ta, where t is the appropriate t-statistic. If spikes at kta are used, then the variance at that level of µ is a² + (ktab)², and the approximate estimated MDL will be t times the square root of this quantity, so that the ratio of the MDL at spike level µ = kta to the MDL with blanks is √[1 + (ktb)²].
Thus,
I
would
recommend
that
the
procedure
be
altered
to
use
concentrations
that
are
no
more
than
3
times
the
detection
limit,
and
perhaps
to
permit
concentrations
lower
than
the
critical
level,
including
possibly
blanks.
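The inflation figures above follow from the ratio √[1 + (ktb)²]. A quick check (Python; t = 3.707 and b = 0.20 are assumed values that approximately reproduce the percentages given in the text, not parameters stated there):

```python
import math

def mdl_inflation(k, t=3.707, b=0.20):
    """Ratio of the MDL from spikes at k times the critical level to the
    blank-based MDL, under the variance function V(y) = a^2 + b^2*mu^2."""
    return math.sqrt(1.0 + (k * t * b) ** 2)

# For a true critical level of 1 ppb, spikes at 1x, 3x, and 5x give MDLs of
# about 1.2, 2.4, and 3.8 ppb, matching the values quoted in the text.
mdls = [round(mdl_inflation(k), 1) for k in (1, 3, 5)]
```

Under these assumptions the inflation grows nearly linearly in k once ktb exceeds 1, which is why spiking far above the critical level is the step that does the damage.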

Other
than
that,
the
MDL
procedure,
with
its
new
definition,
is
quite
a
reasonable
choice
for
a
detection
limit
concept.
Dr.
Wait's
Comments
Although
the
TSD
is
necessarily
long
and
dense
with
information,
it
is
well
written
and
flows
logically.
I
would
not
make
any
structural
changes
to
the
document.
Since
the
TSD
addresses
fundamental
quality
assurance
issues,
I'm
surprised
that
there
is
no
acknowledgement
or
reference
to
EPA's
Quality
System.
EPA
may
want
to
reexamine
the
TSD
and
update
as
appropriate
to
remain
consistent
with
Agency
directives.

A
listing
of
acronyms
would
be
useful.

Typos,
grammar,
etc.:

Chapter
1
A
detailed
citation
reference
to
the
lawsuit
and
settlement
agreement
should
be
provided
so
that
the
reader
can
actually
research
the
suit.

Page
2-7,
text
line
14
"
most
newer"?

Chapter
3
If
the
1997
EPA
Method
1625
study
has
previously
been
published,
it
should
be
referenced.

Page
3­
7,
text
line
15
"
give"
should
be
"
given".

Section
3.2.1.4
First
sentence
incomplete.

Section
3.2.1.4
A
number
of
studies
have
been
mentioned
in
the
first
paragraph.
These
should
be
referenced.

Section
3.3.1
When
discussing
errors,
should
add
systematic
errors
and
blunders.

Page
3­
14,
text
line
11(
first
line
of
Section
3.3.1)
Should
add
"
analytical"
before
"
measurement".
Globally,
the
term
measurement
includes
all
sample
collection
and
analysis
activities.

Page
5­
3,
text
line
8
">"
3.05
should
be
"<"
3.05
(
Sentence
beginning
after
"
Step
4").

Title
of
5.2.1
"
L"
in
limit
should
be
capitalized.

Page
6­
1,
text
line
28
"
be"
should
be
inserted
between
"
it"
and
"
would".

Reference
Section
Youden
reference
needs
date
(
1975?)

Appendix
C,
page
C­
11
PCB
1216
wrong.
Should
be
PCB
1016?
VII. MISCELLANEOUS COMMENTS

Dr. Cooke's Comments

There were no miscellaneous comments from Dr. Cooke.

Dr. Piegorsch's Comments

Additional sources of input on the topic of broad-based concepts discussed in the TSD include the following:

Boswell, M.T., Burnham, K.P., and Patil, G.P. (1988). Role and use of composite sampling and capture-recapture sampling in ecological studies. In Handbook of Statistics Volume 6: Sampling, Krishnaiah, P.R. and Rao, C.R. (eds.), 469-488. Amsterdam: North-Holland/Elsevier.

Boswell, M.T., Gore, S.D., Lovison, G., and Patil, G.P. (1996). Annotated bibliography of composite sampling, Part A: 1936-92. Environmental and Ecological Statistics 3, 1-50.

Correll, R.L. (2001). The use of composite sampling in contaminated sites: a case study. Environmental and Ecological Statistics 8, 185-200.

Dorfman, R. (1943). The detection of defective members of large populations. Annals of Mathematical Statistics 14, 436-440.

Edland, S.D., and van Belle, G. (1994). Decreased sampling costs and improved accuracy with composite sampling. In Environmental Statistics, Assessment, and Forecasting, Cothern, C.R. and Ross, N.P. (eds.), 29-55. Boca Raton, FL: Lewis Publishers.

Elder, R.S., Thompson, W.O., and Myers, R.H. (1980). Properties of composite sampling procedures. Technometrics 22, 179-186.

Gore, S.D., Patil, G.P., Sinha, A.K., and Taillie, C. (1993). Certain multivariate considerations in ranked set sampling and composite sampling designs. In Multivariate Environmental Statistics, Patil, G.P. and Rao, C.R. (eds.), 121-148. Amsterdam: North-Holland.

Kosmelj, K., Cedilnik, A., and Kalan, P. (2001). Comparison of a two-stage sampling design and its composite sample alternative: An application to soil studies. Environmental and Ecological Statistics 8, 109-119.

Kotz, S., Johnson, N.L., and Read, C.B. (1982). Bulk sampling. In Encyclopedia of Statistical Sciences, 1, Kotz, S., Johnson, N.L. and Read, C.B. (eds.), 324-325. New York: John Wiley & Sons.

Lancaster, V.A., and Keller-McNulty, S. (1998). A review of composite sampling methods. Journal of the American Statistical Association 93, 1216-1230.
Lovison, G., Gore, S.D., and Patil, G.P. (1994). Design and analysis of composite sampling procedures: A review. In Handbook of Statistics Volume 12: Environmental Statistics, Patil, G.P. and Rao, C.R. (eds.), 103-166. New York: North-Holland/Elsevier.

Nussbaum, B.D., and Gilbert, R.O. (2001). Editorial: Special issue on composite sampling. Environmental and Ecological Statistics 8, 89-90.

Patil, G.P., Gore, S.D., and Sinha, A.K. (1994). Environmental chemistry, statistical modeling, and observational economy. In Environmental Statistics, Assessment, and Forecasting, Cothern, C.R. and Ross, N.P. (eds.), 57-97. Boca Raton, FL: Lewis Publishers.

Here are some other sources from the literature (besides those already listed on TSD pp. R-1/R-2) that have commented in various ways on the issue of detection/decision limits in environmental applications. The van der Voet (2002) entry expands upon this list somewhat. I'm sure some of them will already be familiar to the Agency.

Adams, M.J. (1992). Errors and detection limits. In Methods of Environmental Data Analysis, Hewitt, C.N. (ed.), 181-212. Amsterdam: Elsevier Applied Science.

Clark, M.J.R., and Whitfield, P.H. (1994). Conflicting perspectives about detection limits and about the censoring of environmental data. Water Resources Bulletin 30, 1063-1079.

Cressie, N. (1994). Spatial chemostatistics. In Environmental Statistics, Assessment, and Forecasting, Cothern, C.R. and Ross, N.P. (eds.), 131-146. Boca Raton, FL: Lewis Publishers.

Currie, L.A. (1988). Detection in Analytical Chemistry: Importance, Theory, and Practice. New York: American Chemical Society.

Currie, L.A. (1996). Foundations and future of detection and quantification limits. Proceedings of the American Statistical Association, Section on Statistics and the Environment, 1-8.

El-Shaarawi, A.H., and Naderi, A. (1991). Statistical inference from multiply censored environmental data. Environmental Monitoring and Assessment 17, 339-347.

Gibbons, R.D. (1994). Statistical Methods for Groundwater Monitoring. New York: John Wiley & Sons.

Gibbons, R.D. (1995). Some statistical and conceptual issues in the detection of low-level environmental pollutants (with discussion). Environmental and Ecological Statistics 2, 125-167.

Helsel, D.R. (1990). Less than obvious: Statistical treatment of data below the detection limit. Environmental Science & Technology 24, 1766-1774.

Lambert, D., Peterson, B., and Terpenning, I. (1991). Nondetects, detection limits, and the probability of detection. Journal of the American Statistical Association 86, 266-277.
Maynard, A.W. (1990). Environmental tests: Are they valid? Chemical Technology 20, 151-155.

McBean, E.A., and Rovers, F.A. (1998). Statistical Procedures for Analysis of Environmental Monitoring Data & Risk Assessment. Upper Saddle River, NJ: Prentice Hall PTR.

Millard, S.P., and Neerchal, N.K. (2001). Environmental Statistics with S-PLUS. Boca Raton, FL: Chapman & Hall/CRC.

Nagaraj, N.K., and Brunenmeister, S.L. (1994). A new approach for accommodation of below detection limit data in trend analysis of water quality. In Environmental Statistics, Assessment, and Forecasting, Cothern, C.R. and Ross, N.P. (eds.), 113-127. Boca Raton, FL: Lewis Publishers.

Slyman, D.J., de Peyster, A., and Donohoe, R.R. (1994). Hypothesis testing with values below detection limit in environmental studies. Environmental Science & Technology 28, 898-902.

van der Voet, H. (2002). Detection limits. In Encyclopedia of Environmetrics, 1, El-Shaarawi, A.H. and Piegorsch, W.W. (eds.), 504-515. Chichester: John Wiley & Sons.

Dr. Wait's Comments

New information or data that could potentially improve the quality of the document:

Shumway et al., "Statistical Approaches to Estimating Mean Water Quality Concentration with Detection Limits," Environ. Sci. Technol. 36: 3345-3353 (2002).

Yu et al., "Detection Limit of Isotope Dilution Mass Spectrometry," Analytical Chem. 74: 3887-3891.

A number of papers that may be of interest for this exercise were published in the proceedings of the 224th American Chemical Society (ACS) National Meeting, Division of Environmental Chemistry, held in Boston in August 2002: Currie, "Detection and Quantification Limits: Basic Concepts, International Harmonization, and Outstanding Issues"; Wade et al., "Method Detection Limits: Application to Organic Environmental Chemistry Data"; Rosecrance, "Recommended Guidelines for Generating Detection, Quantitation and Reporting Limits"; and Burrows, "Instrument Calibration in Environmental Analysis Issues and Proposals for Improvement".
APPENDIX A
Plan for the Assessment of Detection and Quantitation Limits Under Section 304(h) of the Clean Water Act

APPENDIX B
Charge to Peer Reviewers

APPENDIX C
Dr. W. Marcus Cooke Curriculum Vitae

APPENDIX D
Dr. Walter W. Piegorsch Curriculum Vitae

APPENDIX E
Dr. David M. Rocke Curriculum Vitae

APPENDIX F
Dr. A. Dallas Wait Curriculum Vitae

APPENDIX G
W. Marcus Cooke Comments

APPENDIX H
Walter W. Piegorsch Comments

APPENDIX I
Dr. David M. Rocke Comments

APPENDIX J
Dr. A. Dallas Wait Comments