Response to Comments on the Proposed Laboratory

Quality System Requirements Revision 3.0 (LQSR 3.0) for the National
Lead Laboratory Accreditation Program

Prepared by the Office of Prevention, Pesticides, and Toxic Substances 

Office of Pollution Prevention and Toxics 

National Program Chemicals Division 

Program Assessment and Outreach Branch 

U.S. Environmental Protection Agency 

Washington, D.C.  20460 



Acronyms used in this document

 

AA 		Atomic Absorption 

A2LA 		American Association for Laboratory Accreditation 

AB 		Accrediting Body 

AIHA 		American Industrial Hygiene Association 

APLAC 	Asia Pacific Laboratory Accreditation Cooperation

ASTM 	American Society for Testing and Materials 

CCV 		Continuing Calibration Verification 

CDC 		Centers for Disease Control 

ELPAT 	Environmental Lead Proficiency Analytical Testing 

EPA 		Environmental Protection Agency 

FSMO 		Field Sampling and Measurement Organizations 

ICV 		Independent Calibration Verification 

ICS 		Interference Check Standard 

INELA 	Institute for National Environmental Laboratory Accreditation 

ISO 		International Organization for Standardization 

ISO-IEC 	International Organization for Standardization/International Electrotechnical Commission

LCS 		Laboratory Control Sample 

MDL 		Method Detection Limit 

MOU 		Memorandum of Understanding 

QMSM 	Quality Management System Manual 

NACLA 	National Cooperation for Laboratory Accreditation

NELAC 	National Environmental Laboratory Accreditation Conference

NLLAP 	National Lead Laboratory Accreditation Program 

OPPT 		Office of Pollution Prevention and Toxics 

PCS 		Performance Characteristic Sheet 

PE 		Performance Evaluation 

PT 		Proficiency Testing 

QA 		Quality Assurance 

QC 		Quality Control 

SI 		Système International d'Unités (International System of Units) 

SOP 		Standard Operating Procedure 

TSCA 		Toxic Substances Control Act 

VIM		International Vocabulary of Basic and General Terms in Metrology

 

Background Information and Introduction

 

In the FY92 appropriations bill, Congress identified EPA as the federal
agency responsible for establishing an accreditation program for
laboratories participating in the analysis of lead in paint, soil and
dust samples as part of a national residential lead-based paint
abatement and control program.  In response to this federal mandate, the
Office of Pollution Prevention and Toxics (OPPT) established the
National Lead Laboratory Accreditation Program (NLLAP).  The NLLAP
recognizes laboratories that have demonstrated the ability to
accurately analyze for lead in paint, dust and soil samples. 

 

There are two basic components to the NLLAP.  The first component is a
laboratory proficiency testing program (the Environmental Lead
Proficiency Analytical Testing (ELPAT) Program) administered by the
American Industrial Hygiene Association (AIHA) in conjunction with
EPA’s NLLAP.  AIHA sends out ELPAT proficiency testing samples on a
quarterly basis (four test rounds per year).  AIHA compiles the test
results for each test round and evaluates each laboratory’s performance
on a statistical basis. 

 

The second component of the NLLAP is a systems audit to be conducted by
a laboratory accrediting organization recognized by EPA.  EPA currently
recognizes the American Association for Laboratory Accreditation (A2LA)
and AIHA as accrediting organizations through a memorandum of
understanding.  Once a laboratory successfully meets the requirements of
the ELPAT Program and passes an NLLAP system audit, the laboratory is
recognized by EPA under the NLLAP. 

 

In 1993, EPA issued its first version of the Laboratory Quality System
Requirements (LQSR), which outlined minimum requirements for NLLAP
recognized laboratories.  The LQSR was revised in 1996 and is now being
revised to include criteria for the Field Sampling and Measurement
Organizations (FSMOs).  FSMOs use analytical techniques that combine
sampling and analysis of lead in a single step. 

 

An organization requesting NLLAP recognition shall be a laboratory
capable of performing sampling and/or lead testing.  A laboratory shall
have distinct staffing, instrumentation, and sampling and test methods,
as appropriate, and depending upon the type of laboratory may have
physical facilities and may use field test kits.  An organization must
meet the requirements listed in the third version of the LQSR, also
referred to as LQSR 3.0, to attain recognition under the NLLAP as a
lead-testing laboratory. 

 

Organization of Response to Comments 

 

The LQSR 3.0 was introduced to the stakeholders on August 25, 2005.  In
October 2005 EPA received comments from eleven commenters.  These
comments were reviewed and summarized. Based on the comments and further
agency staff analysis, EPA has revised the Laboratory Quality System
Requirements under version LQSR 3.0. 

 

To facilitate review, this document is organized in the order of the
incoming comments.  When similar comments were raised by different
stakeholders, the comments were combined and responded to as one
comment, where possible.  The names and/or affiliations of the
commenters are provided in the list below.



List of Commenters

American Industrial Hygiene Association 			AIHA

American Association for Laboratory Accreditation 		A2LA

Alliance for Healthy Homes					AHH

BTS Laboratories 						BTS

City of Cleveland Health Department 			CCHD

Connecticut Department of Public Health			CDPH

EHS								EHS		

Fiberquant Analytical Services 				FAS

National Center for Healthy Housing 			NCHH

Professional Service Industries, Inc. 				PSI

Research Triangle Institute 					RTI

U.S. Department of Housing and Urban Development	HUD

 

EPA’s Response to Comments

 

 

Comment #1 

Professional Service Industries, Inc. (PSI)

 

The program needs to ensure that the NLLAP approved accrediting
organizations, the American Association for Laboratory Accreditation
(A2LA) and the American Industrial Hygiene Association (AIHA), will be
capable of accrediting large numbers of organizations on a timely basis
when the regulations take effect. 

 

EPA’s response: 

 

EPA agrees with this comment.  The Laboratory Quality System
Requirements (LQSR 3.0) will become effective eighteen months after
publication.  EPA is working closely with the two existing accrediting
organizations, A2LA and the AIHA, to ensure their programs will support
the emerging need for accreditation of new organizations in a timely
manner. 

Comment # 2

PSI

EPA should consider using the Institute for National Environmental
Laboratory Accreditation (INELA) accreditation model for the FSMO. 

 

EPA Response: 

 

Both the EPA LQSR 3.0 and the TNI standards for FSMOs are based on
ISO/IEC 17025:2005 and are therefore practically identical.  As such,
the TNI model was taken into consideration in drafting LQSR 3.0.

Please note that the Institute for National Environmental Laboratory
Accreditation (INELA) is now known as The NELAC Institute, abbreviated
TNI. 

Comment #3 

PSI

 

EPA should make it clear that no additional requirements by the
Accrediting Bodies (ABs) are allowed above and beyond those included in
the final LQSR. 

 

EPA’s Response:

 

EPA disagrees.  The ABs may impose some additional requirements on
accredited laboratories beyond the LQSR 3.0 requirements.  EPA wants
to ensure that EPA’s requirements are consistent with the requirements
that ABs must adhere to under international laboratory accreditation
standards (e.g., ISO/IEC 17025 and ISO/IEC 17011).  Giving ABs
flexibility in adopting LQSR 3.0 allows the analytical results to be
accepted domestically and internationally.

Also, restricting the ABs’ flexibility in adopting LQSR 3.0 may prevent
ABs from maintaining other mutual recognition arrangements (e.g., NACLA,
APLAC).  At the same time, Section 1 of the NLLAP MOU with currently
recognized ABs states: “EPA/OPPT is to be notified in writing within
30 days after a decision has been made to implement major changes in
organizational policies or management of the accreditation organization,
which could affect the NLLAP”.  Therefore, when an AB decides to add or
change requirements, EPA must be notified and must concur before such
requirements are imposed on the accredited laboratories.  EPA may, as
necessary, publish Federal Register Notices to request comments from
affected entities on any major changes or requirements proposed by ABs,
and to announce the effective date of any such change. 

 

Comment # 4 

 PSI

The LQSR 3.0 is too prescriptive. 

 

EPA’s response: 

 

EPA disagrees that the LQSR 3.0 is too prescriptive.  The LQSR 3.0 was
developed to be consistent with the Agency’s Performance Based
Measurement System, which allows for the following:

Increased emphasis on the specification of flexible requirements for
measurements and the development of processes;

Development of validation processes that assure that measurements
meet quality requirements;

Increased collaboration with stakeholders to develop validation
processes for new measurement technologies; and

Rapid assessment of new and modified technologies, methods, and
procedures.

Comment # 5 

 PSI

Will the FSMO be accredited for on-site measurement only, or for both
on-site measurement and sampling? 

 

EPA’s response: 

 

Either accreditation situation could occur.  It is the Agency’s
intention that an FSMO could be accredited for on-site measurement only,
or for both on-site measurement and sampling.  Any accreditation would
be based on the technology utilized by the FSMO and its proficiency with
that technology.  For example, an organization that uses an XRF for
paint analysis would be accredited for both on-site measurement and
sampling, because sampling and measurement cannot be separated. 

Comment # 6 

PSI

 

All LQSR requirements, specifically those in the “Technical
Requirements” section (5.0), need to be carefully evaluated to ensure
that all are applicable to FSMO.  Many of the requirements included are
very appropriate for labs but some may be difficult to implement for an
FSMO due to the nature of the work performed, and/or will not add value
to the environmental data produced. 

 

EPA’s response: 

 

It is EPA’s intent that LQSR 3.0 be applicable to any lead testing
technology.  Certain requirements will not apply to every technique, and
LQSR 3.0 is intended to allow for that flexibility while still ensuring
adequate quality of laboratory measurements to accomplish the Agency’s
mission of protecting public health. 

 

Comment # 7 

PSI, BTS Laboratories (BTS), National Center for Healthy Housing (NCHH),
Research Triangle Institute (RTI), EHS, Alliance for Healthy Homes
(AHH), American Industrial Hygiene Association (AIHA) 

 

EPA should revise the requirements for one-person labs to assure quality
of test results produced by NLLAP accredited organizations. 

EPA’s response: 

 

In an effort to ensure quality measurements and quality control, the
Agency believes that an independent review of the data must be required.
 A change was made in Section 5.4.6 Data Reduction and Review Process to
state: 

 

“In the case of a one-person laboratory, the review process shall be
contracted out to an independent person or firm that is competent and
has the experience and training necessary to conduct the review.” 

 

“The review process shall be documented and signed by the reviewer,
and shall be retained on file with a copy of the final report for a
minimum of five years.”

 

Comment # 8  

PSI

 

Consider replacing the term “Management System” with “Quality
Management System.” 

 

EPA’s response: 

 

EPA has accepted this comment and made the change. 

Comment # 9  

PSI, BTS 

The required laboratory document retention period is too long.


 

EPA’s response: 

 

The LQSR 2 document retention period requirement was 10 years.  EPA then
proposed in the August 2006 LQSR 3.0 draft to change the document
retention period to three years.  However, commenters suggested that EPA
make the retention period consistent with the NELAC requirement of 5
years.  EPA agreed and set the record retention period at 5 years. 

 

Comment # 10 

PSI, BTS 

Non-applicable requirements for FSMOs should be removed from the
document. 

 

EPA’s response: 

Throughout the LQSR 3.0 there are statements that indicate if, where, or
when requirements are applicable to certain FSMO analyses.  Certain
requirements will not apply to every procedure and may therefore be
treated as not applicable. 

 Comment # 11 

PSI, NCHH, Fiberquant Analytical Services (FAS), RTI, EHS, American
Association for Laboratory Accreditation (A2LA), Connecticut Department
of Public Health (CDPH)

Revise the instrument calibration frequency and acceptance criteria. 
The acceptance criteria for CCV and ICS should be narrowed to +/- 10%.

  

EPA’s response: 

 

It is the Agency’s intention to allow flexibility within the Quality
Management System of every organization, appropriate to the specific
testing technology used by that organization.  EPA intentionally removed
all technology-specific requirements from the LQSR 3.0 and proposed that
instrument QC requirements be based on performance and lead regulatory
needs rather than on analytical technique.  Acceptance criteria shall be
determined, documented and used.  Each laboratory shall establish its
own performance criteria for QC samples (uncertainty of measurement),
which shall be no wider than those stated in LQSR 3.0.  For example, see
Section 5.9 – ASSURING THE QUALITY OF TEST RESULTS:

 

“Laboratory system process control and system performance monitoring
shall be accomplished using statistical process control (charts or data
base) for monitoring the laboratory’s performance with QC sample
analysis results.  The statistical process control method used shall
specify warning and action limits for acceptance or rejection of the QC
data and shall be used to monitor performance trends within the quality
management system over time.  In the absence of a statistically
sufficient data base to determine the necessary frequency for QC samples
and/or action limits for acceptance or rejection of QC data, the
laboratory shall use the frequencies and criteria for QC samples
stated in Tables 3 and 4.” 
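For illustration only, the statistical process control described in the quoted passage can be sketched in a few lines.  The warning limits at ±2 standard deviations and action limits at ±3 standard deviations below are a common Shewhart control chart convention assumed for this example; LQSR 3.0 itself does not mandate specific multipliers.

```python
# Illustrative sketch of statistical process control for QC sample results.
# Assumption: warning limits at mean +/- 2 sigma and action limits at
# mean +/- 3 sigma, a common Shewhart-chart convention (not an LQSR rule).
from statistics import mean, stdev

def control_limits(historical_recoveries):
    """Derive warning and action limits from historical QC percent recoveries."""
    m = mean(historical_recoveries)
    s = stdev(historical_recoveries)
    return {
        "center": m,
        "warning": (m - 2 * s, m + 2 * s),
        "action": (m - 3 * s, m + 3 * s),
    }

def evaluate(result, limits):
    """Classify a new QC result against the chart limits."""
    lo, hi = limits["action"]
    if not lo <= result <= hi:
        return "action"      # reject or investigate the batch
    lo, hi = limits["warning"]
    if not lo <= result <= hi:
        return "warning"     # watch for a developing trend
    return "in-control"

# Hypothetical historical percent recoveries for a Laboratory Control Sample.
limits = control_limits([98.0, 101.5, 99.2, 100.8, 97.5, 102.0, 100.1, 99.6])
print(evaluate(100.3, limits))
```

A laboratory would accumulate such QC results over time and recompute the limits as the statistical base grows, which is the trend-monitoring behavior the quoted requirement describes.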



Table 3.  Summary of QC Sample Performance Requirements for an Instrument which Produces a Numerical Result

QC SAMPLE	FREQUENCY	ACCEPTANCE LIMITS

Laboratory Control Sample	One per 20 samples or batch (min. frequency 5%)	Within ±20% of known value

Matrix Spike Sample	One per 20 samples or batch (min. frequency 5%)	Within ±25% of calculated value

Duplicate Sample	One per 20 samples or batch (min. frequency 5%)	Within ±25% Relative Percent Difference (RPD)

Method Blank	One per 20 samples or batch (min. frequency 5%)	Absolute value not more than 50% of the lowest regulatory limit for the sample matrix analyzed or minimum level of concern

In the absence of sufficient data for statistical determination of
adequate QC limits and frequency, the types of QC samples, minimum
frequencies and the required minimum acceptance limits shown in this
table shall be met, as appropriate.
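The acceptance limits in Table 3 reduce to simple arithmetic.  The sketch below is illustrative only; the function names are hypothetical, and only the percentage limits come from the table.

```python
# Illustrative arithmetic behind the Table 3 minimum acceptance limits.
# Function names are hypothetical; the percentages come from the table.

def lcs_acceptable(measured, known):
    """Laboratory Control Sample: within +/- 20% of the known value."""
    return abs(measured - known) <= 0.20 * known

def matrix_spike_acceptable(measured, calculated):
    """Matrix Spike Sample: within +/- 25% of the calculated value."""
    return abs(measured - calculated) <= 0.25 * calculated

def duplicate_acceptable(result1, result2):
    """Duplicate Sample: Relative Percent Difference (RPD) within 25%."""
    rpd = abs(result1 - result2) / ((result1 + result2) / 2) * 100
    return rpd <= 25.0

def method_blank_acceptable(blank_value, lowest_regulatory_limit):
    """Method Blank: absolute value no more than 50% of the lowest limit."""
    return abs(blank_value) <= 0.50 * lowest_regulatory_limit

print(lcs_acceptable(118.0, 100.0))  # an 18% high LCS still passes
```

Note that these are the required minimums; under LQSR 3.0 a laboratory's own documented criteria may be tighter but never looser.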



Table 4.  Summary of QC Sample Performance Requirements for an Instrument (or equivalent) which Produces Pass-Fail Results

QC SAMPLE	FREQUENCY	ACCEPTANCE LIMITS

Laboratory Control Sample Positive (LCS-P) (sample lead level no more than 20% above the applicable regulatory limit; omit for positive screen technologies)	One per 20 samples or batch (min. frequency 5%)	Positive

Laboratory Control Sample Negative (LCS-N) (sample lead level no less than 20% below the applicable regulatory limit; omit for negative screen technologies)	One per 20 samples or batch (min. frequency 5%)	Negative

Duplicate Laboratory Control Sample (LCS-P or LCS-N)	One per 20 samples or batch (min. frequency 5%)	Positive or Negative, depending on the choice of lead level and the capability of the technology

Method Blank	One per 20 samples or batch (min. frequency 5%)	Negative



In the absence of sufficient data for statistical determination of
adequate QC limits and frequency, the types of QC samples, minimum
frequencies and the required minimum acceptance limits shown in this
table shall be met, as appropriate.

Comment # 12 

PSI

 

The LQSR should separate “collected” sample analysis requirements
from in-situ analysis requirements. 

 

EPA’s response: 

 

EPA has established separate requirements for techniques that produce
numeric results from the requirements for pass/fail in-situ
technologies. Section 5.4.1 of the LQSR 3.0 states:

“The laboratory shall have demonstrated that the test and/or sampling
methods used are suited for the intended use.”  Section 5.4.1c says
“New, alternative or modified analytical methods and/or new testing
technologies may be used by a laboratory if they have been validated by
the laboratory or a third party, and shown to meet the minimum
performance requirements stated in 5.4.1(b).  The method validation must
be documented by the laboratory or qualified third party.  In the case
where validation is done by a party other than the laboratory, the
laboratory shall confirm its competency utilizing the method as
described above in Section 5.4.1(a).”  

Whereas the sampling requirements for procedures that produce numeric
results are provided in the published references indicated in Section
5.4.1(b), the in-situ sampling protocols have yet to be published.  The
in-situ sampling requirements for new, alternative or modified
analytical methods and/or new testing technologies are usually provided
by the manufacturers, and therefore the two section quotations above
apply.

 

Comment # 13 

BTS

 

The turnaround time for reporting Proficiency Testing (PT) results
should be shortened. 

 

EPA’s response: 

 

EPA is working with AIHA (current ELPAT provider) to shorten the
turnaround time for PT samples to three weeks. 

 

Comment # 14 

BTS, CDPH

 

FSMOs should be held to the same high standards of quality as fixed-site
laboratories. 

 

EPA’s response: 

 

EPA agrees with this comment.  EPA’s intention is for the LQSR 3.0 to
encourage the entry of new and innovative techniques for lead-based
paint testing activities and to provide the most cost-effective approach
for consumers of lead-based paint services.  LQSR 3.0 is generic, and
its requirements may be applied to a variety of analytical technologies.
 An operation that performs analytical testing of paint chip (film),
dust, and/or soil samples for lead under NLLAP, regardless of the number
of personnel or the extent of the scope of testing activities, is
required to meet the LQSR 3.0 requirements. 

 

Comment # 15 

BTS, RTI, EHS, AHH, AIHA, City of Cleveland Health Department (CCHD)

 

The educational level required for the Technical Manager and other
laboratory staff analyzing samples for lead should not be changed. 

EPA’s response: 

 

EPA is working closely with NLLAP recognized ABs to develop specific
educational/training requirements for a specific technology to make sure
that on-site evaluators have sufficient guidance to determine the
appropriate level of training and experience for people analyzing
samples for lead. The current requirements in LQSR 3.0 are consistent
with the ISO/IEC 17025 (May 2005). 

Comment # 16  

NCHH, PSI, RTI, EHS

Explain what positive screen technologies and negative screen
technologies mean.  These terms have not been explained before – but I
think they address a concern about technologies that only tell you
whether you meet the regulatory threshold.  More information, including
a definition, is needed since these terms are used often in the
following sections. 

 

Also, 20% is too wide.  The positive screen technology should be able to
say with 95% confidence that the sample is below the appropriate
regulatory threshold. 

 

It makes sense to accommodate the possibility of a negative screen
technology too but the positive screen technology is the most important
– assuming I have understood the terms correctly.  Does positive mean
that the lead levels are cleaner than the threshold? 

The commenter provided a chart to compare the performance of the six
lead dust analysis technologies evaluated in EPA’s Environmental
Technology Verification (ETV) program.  All of the NLLAP labs conducting
the analysis had more than 50% false negative results on samples that
were near the appropriate regulatory threshold.  A false negative means
that the client would be told that the area is OK for children when the
actual level exceeded the EPA standard.  According to the most recent
ETV report (available at
http://www.epa.gov/etv/verifications/vcenter1-22.html), the NLLAP lab
would only have a 50% chance of getting a “pass” if a window sill
sample was actually at 280 μg/ft2, and this lab had a 91% to 98% average
percent recovery on the samples.  A lab could be accredited with a 75%
average percent recovery. 

 

The commenter noted that these false negative results suggest the
following potential changes: 
 

μg/ft2 on a floor result may actually be a failure except for the
limitations in the lab. 

2. Narrow the accuracy range from 20% to 10% to reduce the number of
false negatives and ensure more consistent performance by all labs. 

Require that lab reports include the plus/minus limits in the report to
the client. 


EPA’s response: 

 

Using the chart provided by the commenter and EPA’s Environmental
Technology Verification program report, some fixed-site laboratory
measurements might be perceived as false positive (fp) or false negative
(fn) results.  The primary objective of the verification test was to
evaluate XRF technology.  For the purpose of results
correlation/comparison, analysis of the dust wipes by an accredited
laboratory was used.  In this report, fp and fn results were defined as:

“A fp result is one in which the technology detects lead in the sample
above a clearance level when the sample actually contains lead below the
clearance level (Keith et al., 1996).  A fn result is one in which the
technology indicates that lead concentrations are less than the
clearance level when the sample actually contains lead above the
clearance level.”  See
http://www.epa.gov/etv/verifications/vcenter1-22.html.

Every definitive measurement includes uncertainty associated with the
produced result.  Preparation of the dust wipe sample for ELPAT, or by
any other entity, also includes a certain level of uncertainty, so the
concentration for each CRM sample is expressed as a number +/-
uncertainty.  NLLAP specifies a maximum level of measurement uncertainty
(+/- 20% of the known value for the Laboratory Control Sample (LCS)) to
include any errors that could occur during preparation of the LCS plus
uncertainty associated with the materials used for LCS preparation.

Several measurements were outside of NLLAP specifications, but there may
be numerous explanations for such occurrences.  The results in the cited
report indicate that such occurrences were less than 5% of the total
measurements performed.  Also, concentrations for the samples used in
the study were expressed not as absolute values but as “estimated”
values.  Some of those samples were not very well characterized during
their production.

LQSR 3.0 Section 5.10.2(p) indicates that test reports must include a
statement on the estimated uncertainty of the measurement.  Every
quantitative measurement test report must include the following
statement: result +/- uncertainty level.  The uncertainty level varies
for each laboratory, analytical technique, preparation type, etc., since
the measurement uncertainty cannot be estimated during each individual
measurement.
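As a minimal sketch of the required “result +/- uncertainty level” statement, the snippet below formats one report line.  The analyte name, concentration, and 12% relative uncertainty are hypothetical values chosen for illustration, not figures from the LQSR or the ETV report.

```python
# Illustrative formatting of a quantitative test result as
# "result +/- uncertainty". All numeric values here are hypothetical.
def report_line(analyte, result, rel_uncertainty, unit):
    """Format a result with its expanded uncertainty (result * relative u)."""
    u = result * rel_uncertainty
    return f"{analyte}: {result:.1f} +/- {u:.1f} {unit}"

print(report_line("Lead (dust wipe)", 285.0, 0.12, "ug/ft2"))
```

The relative uncertainty itself would come from the laboratory's own validation data for the technique and preparation type, as the response above notes.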

Tables 3 and 4 of LQSR 3.0 contain the required minimum performance
criteria for accuracy determination.  EPA believes that +/- 20% for the
accuracy range is a reasonable starting point, based on the EPA QA/R-5
document, “EPA Requirements for Quality Assurance Project Plans,” and
a quality system that meets the American National Standards Institute
(ANSI)/American Society for Quality Control (ASQC) E4-1994 standard,
that will allow new analytical techniques to enter the field of lead
testing.  However, in order to be accredited, each new laboratory will
have to establish its own performance criteria, which cannot be greater
than 20% and will improve over time.  In order to establish such
criteria for a new technique or a new instrument, one must have a
starting point and a number of measurements to perform statistical
analysis (usually 20 measurements). 
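The approach described above, deriving laboratory-specific criteria from roughly 20 initial measurements with the LQSR 3.0 ±20% limit as a ceiling, might be sketched as follows.  The mean ± 3 standard deviations construction and the sample recovery values are assumptions for illustration, not LQSR requirements.

```python
# Illustrative derivation of a laboratory-specific LCS acceptance window
# from ~20 initial percent-recovery measurements. The mean +/- 3 sigma
# construction is an assumption; the 80-120% ceiling reflects the LQSR 3.0
# +/- 20% limit.
from statistics import mean, stdev

def lcs_criteria(recoveries):
    """Return a (lower, upper) recovery window, never wider than 80-120%."""
    if len(recoveries) < 20:
        raise ValueError("need about 20 measurements for statistical analysis")
    m, s = mean(recoveries), stdev(recoveries)
    lower = max(m - 3 * s, 80.0)   # cannot be looser than the LQSR ceiling
    upper = min(m + 3 * s, 120.0)
    return lower, upper

# Hypothetical initial study of 20 percent recoveries on known-value samples.
recoveries = [99.0, 101.0, 98.5, 100.4, 97.8, 102.3, 99.9, 100.6,
              98.2, 101.7, 99.4, 100.1, 97.9, 102.0, 99.7, 100.9,
              98.8, 101.2, 99.1, 100.5]
lower, upper = lcs_criteria(recoveries)
print(f"acceptance window: {lower:.1f}% to {upper:.1f}% recovery")
```

A precise laboratory would thus end up with a window well inside the ±20% ceiling, consistent with the expectation that criteria improve over time.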

 

Comment #17 

NCHH

LQSR 3.0 needs to be modified to allow for the “inconclusive” or the
“pass only” option. 

 

In the case of clearance testing for lead dust, a client getting an
“inconclusive” result could either: 

 

1. Treat the result as a failure, re-clean the room, and retest, hoping
for a “pass” result the second time, or 

 

2. Direct the lab to send the sample to a stationary lab that may be
able to make the determination with more precision. (This option only is
possible if the initial analysis does not affect the sample.) 

 

EPA’s response: 

 

As specified in Table 4, Summary of QC Sample Performance Requirements
for an Instrument (or equivalent) which Produces Pass-Fail Results, a
pass QC result is appropriate only for a positive screen technology, and
a fail QC result is appropriate only for a negative screen technology.

Section 5.10.2(p) of the LQSR 3.0 states that test reports must include,
where applicable, a statement on the estimated uncertainty of the
measurement, based on ISO/IEC 17025:2005(E) Section 5.10 language. Every
quantitative measurement test report must include the following
statement: result +/- uncertainty level.  It is the responsibility of
the decision maker to accept the result, order a retest, send the sample
to a fixed laboratory, or re-clean the property.  Also, text on
“inconclusive results” was added in Section 5.10. 

Comment #18 

NCHH

 

The confidence-based approach used in LQSR 3.0 is a major departure from
current practices. 

 

EPA’s response: 

 

EPA disagrees that this is a major departure from current laboratory
practices.  With the current changes to the LQSR, EPA addresses the
appropriate requirements for measurement uncertainty based on good
laboratory practices and ISO/IEC 17025:2005(E).  Also see LQSR 3.0
Section 4.4.6.1 Estimate of Uncertainty of Measurement. 

The current good laboratory practices are described in the “Good
Laboratory Practice Standards,” which have been updated and renamed
“Consolidation of Good Laboratory Practice Standards,” last updated
on September 24, 2007. (See www.epa.gov/fedrgstr/epa-pest) 

 

Comment #19 

NCHH

 

Labs that use technologies that do not change the composition of a
proficiency testing sample should be allowed to share the ELPAT samples
with other similar labs. This approach will dramatically reduce the
costs for the labs without compromising quality. 

 

EPA’s response: 

 

EPA disagrees because sharing ELPAT samples among different laboratories
may jeopardize the integrity of the reported results as well as of the
ELPAT samples themselves.  However, EPA is in the process of developing
Proficiency Testing samples for intact paint that would prevent shared
use from compromising the integrity of the samples and the reporting of
their measurements. 

 

Comment #20 

NCHH, RTI, EHS

  

Labs using field technologies should be given the option of bringing
their equipment to a central location for inspection by the accrediting
organization and for proficiency testing. Since there is no evidence
that location impacts the ability of the equipment to analyze the
sample, this approach will dramatically reduce the costs for the labs
without compromising quality.  

 

EPA’s response: 

 

EPA agrees with this comment.  The Agency is working closely with NLLAP
recognized ABs to develop cost-effective options to accommodate
on-site/off-site evaluation options of NLLAP recognized organizations. 

Comment #21 

NCHH

 

A quality assurance program needs to accommodate changes in personnel.
Any time the operator of an XRF changes, the new person should – after
appropriate training – be required to send at least 25% of the samples
to another NLLAP-recognized laboratory until the person can demonstrate
that 95% or more of the samples yield comparable results to the other
lab. Any clearances done during this interim would be contingent on
laboratory confirmation.

 

EPA’s response:

 

LQSR 3.0 will include appropriate requirements (e.g., training,
proficiency) for setting up a Quality Management System to ensure the
quality of test results with or without high staff turnover. 

 

Comment #22 

NCHH

EPA’s language for training was reasonable and appropriate. Given the
variety of technologies involved, it should be left to the accrediting
organization to set more specific standards.

EPA’s response:

 

EPA agrees with this comment. 

 

Comment #23 

NCHH

EPA should not require NLLAP approval for the in-situ analysis. This
would dramatically limit access to XRFs by local health departments and
community action programs that do not use the equipment often enough to
warrant the costs and hassles of NLLAP accreditation. 

 

EPA’s response:	

LQSR 3.0 is based on the requirements of ISO/IEC 17025:2005(E) and is
intended to apply the ISO/IEC general requirements.  EPA NLLAP
accreditation (approval) of all sample analysis devices is designed to
generate quality results that protect public health and to ensure the
competency and proficiency of FSMOs.  

Comment #24 

NCHH

What does “identifiable part of an organization” mean?  If a risk
assessor uses an XRF for dust analysis in the field, it will be
difficult to consider that an identifiable part.  Would some
acknowledgement that lab analysis is listed as a distinct service be
sufficient? 

 

EPA’s response:	

A laboratory or FSMO is the “identifiable part of an organization.” 
As such, an FSMO using an XRF is considered “identifiable.”  The
text in LQSR 3.0 has been revised to state: 

 

“1.0 SCOPE”

 

“The requirements described within this document must be met for an
organization to attain recognition under the NLLAP as a lead testing
laboratory, hereafter referred to in this document as a laboratory.  An
organization requesting recognition under this program shall possess a
laboratory operation capable of performing sampling and lead testing.  A
laboratory shall have distinct staffing, instrumentation, sampling and
test methods, as appropriate, and depending upon the type of laboratory
have physical facilities and may use field test kits.  Laboratory
accreditation under this program shall be based on its meeting the
requirements listed in this document.”

 

Comment #25 

NCHH

  

The LQSR needs to acknowledge the possibility of a one-person program.
The use of the word “staff” implies that a contractor won’t work.
Replace it with the phrase “staff or contractor.” 

EPA’s response:

 

EPA agrees with this comment.  LQSR 3.0, Section 4.1 now states: 

 

“The laboratory shall also appoint (or contract out) one member of the
staff to act as quality manager.”

Comment #26  

NCHH

  

In Section 4.1 it states “The laboratory shall also appoint one member
of the staff to act as quality manager.” Is this person the technical
manager?  The next sentences imply that but it is not clear. 

 

EPA’s response: 

 

No, the technical manager and the quality manager may not be the same
person.  As a result, the requirement for the technical manager and the
quality manager to be full-time employees of the laboratory has been
deleted in the LQSR 3.0. The text in Section 4.1 was clarified and now
states: 

 

“All NLLAP recognized laboratories shall identify a responsible
laboratory official who is authorized to release test reports on behalf
of the laboratory.  In a laboratory with one person, that person will be
the responsible official for the release of the test report.  That
person may either serve as the technical manager or the quality manager,
but cannot serve as both.  In the case of a laboratory with only one
employee, one of these positions shall be contracted out.  See Sections
4.14, 5.4.6 and 5.10.3 below for requirements on one-person firms
regarding internal audits and the release of test reports.” 

 

Comment #27 

NCHH

  

The LQSR states: “The technical manager shall ensure that mobile and
FSMO personnel have the capability to communicate with their supervisor
or the technical manager while on site at a field job location.”  What
if there is no supervisor or other personnel? 

 

EPA’s response: 

 

To become accredited under NLLAP, a laboratory must have a supervisor to
address, among other things, technical issues.  Note that the sentence
“The technical manager shall ensure that mobile and FSMO personnel
have the capability to communicate with their supervisor or the
technical manager while on site at a field job location” is removed
from LQSR 3.0.  

Comment #28 

NCHH

Section 4.3.2 of the LQSR says that Quality System documents generated
by the laboratory shall be clearly identified. What are quality system
documents? 

 

EPA’s response: 

 

Quality system documents are all documents generated by the laboratory
that form part of the Quality Management System such as regulations,
standards, other normative documents, test and calibration methods, as
well as drawings, software, specifications, instructions, and manuals as
indicated in ISO/IEC 17025:2005(E).

Comment #29  

NCHH

  

What are laboratory documents?  In Section 4.3.4 in the LQSR, it states
that laboratory documents are required to be updated every two years,
but Section 4.2.1 requires that the QMSM be reviewed annually? 

 

EPA’s response: 

 

Laboratory documents are documents that form part of the Quality
Management System.  However, as indicated in LQSR 3.0, section 4.3.4,
“This process shall address when and how the laboratory’s documents
are reviewed, identify the sign off authority, and state that the
laboratory documents are reviewed at least annually and/or revised as
needed.”

 The Quality Management System Manual must be reviewed annually.

 

Comment #30  

NCHH

  

Why should EPA care whether the price for requests, tenders or contracts
has been agreed to? 

 

EPA’s response: 

 

Neither EPA nor the NLLAP’s LQSR 3.0 is concerned with the specific
pricing, tenders, or contracts other than to ensure that the quality of
data generated and provided by laboratories meets the following:

The requirements including the methods to be used are adequately
defined, documented, and understood;

The laboratory has the capability and resources to meet the
requirements; and

The appropriate test method is selected and capable of meeting the
customer’s requirements.

To ensure this, LQSR 3.0, in the Quality Management System section
(section 4.4), states that procedures shall be established by the
laboratory for the review of requests, tenders, or contracts.

 

The text in Section 4.4 REVIEW OF REQUEST, TENDER OR CONTRACT was
revised to state: 

“Differences between the customer’s request and the contract offered
to the customer shall be resolved based on the QMSM policies and
procedures before any work commences.  The contract shall be acceptable
to both the laboratory and customer.  Changes made at the contract
review shall be recorded and maintained.  Contract review shall also
cover work to be subcontracted by the laboratory.  If the contract
requires amendment after work has begun, the same contract review
process shall be repeated and amendments communicated to all necessary
personnel.  The customer shall be informed of any deviations from the
contract.”

 

Comment #31  

NCHH

  

Section 4.5 of the LQSR states that “an NLLAP recognized laboratory
shall only subcontract when necessary and with approval from the
customer, with another NLLAP recognized laboratory.”  It is awkward
for field labs to subcontract work with another NLLAP recognized
laboratory since they may want to routinely run their samples through a
stationary lab for approval or comparisons.  

EPA’s response: 

 

Section 4.5 in the LQSR, which addresses subcontracting of tests, was
clarified to state: 

 

“An NLLAP recognized laboratory shall only subcontract work when
necessary and with the approval of the customer.  Work shall only be
performed by another NLLAP recognized laboratory.”

 

Comment #32  

NCHH

  

Do spiked samples or wipes have expiration dates? 

 

EPA’s response: 

 

Yes.  The laboratory must ensure that an expiration date is assigned to
spiked samples and wipes.

 

Comment #33  

NCHH

  

Section 4.8 of the LQSR states that “any complaint about the quality
of reported results may be referred to the accrediting authority if such
complaints cannot be resolved directly with the customer.”  The
statement should require the lab to specifically identify how to contact
the accrediting authority.  

EPA’s response: 

Section 4.8 in LQSR 3.0 states:

 

“The laboratory shall have documented policy and procedures for the
resolution of complaints received from customers or other parties about
the laboratory's activities or results.  The policy shall include the
notice that “Any complaint about the quality of reported results may
be referred to the accrediting body if such complaints cannot be
resolved directly with the customer.”  See the NLLAP web page at
www.epa.gov/lead/nllap.htm for the contact information of the
recognized accrediting bodies.  A record shall be maintained of all
complaints and of the actions taken by the laboratory.  Where a
complaint, or any other circumstance, raises doubt concerning the
laboratory's compliance with its own policies and procedures, the
requirements of this document, the quality management system and/or the
quality of the laboratory's analyses, the laboratory shall ensure that
those areas of activity are promptly audited.”

Note the complete sentence, “The policy shall include the notice that
“Any complaint about the quality of reported results may be referred
to the accrediting body if such complaints cannot be resolved directly
with the customer.””  While the LQSR 3.0 does not specifically
identify how to contact the accrediting body (AB), the AB contact
information is public and can be found at
www.epa.gov/lead/nllap.htm, as stated in LQSR 3.0, section 4.8.  The
EPA believes this is sufficient notification for laboratories and their
customers.

Comment #34  

NCHH

  

Section 4.13(a) in the LQSR states that “All records (including those
pertaining to calibration and test equipment, certificates, and reports)
shall be safely stored, held secure and in confidence to the
customer.” What does “in confidence to the customer” mean? 

 

EPA’s response: 

 

The phrase “in confidence to the customer” means that all
information of a laboratory’s customer will be treated as
confidential.  The text in section 4.13 was clarified and now states: 

 

“a.  All records (including those pertaining to calibration and test
equipment, certificates, and reports) shall be safely stored, held
secure and considered confidential.”

 

Comment #35 

NCHH

  

Section 4.13(e) requires a signature for the record. Is an electronic
signature an option? 

 

EPA’s response: 

 

Yes. An electronic signature is acceptable if defined as acceptable by
the Quality Management System. 

Comment #36  

NCHH

  

Section 4.13.1(b) states that records must include sample storage and
tracking, including shipping receipts, but they may not be relevant for
field labs. 

 

EPA’s response: 

 

EPA agrees with the comment, and Section 4.13.1(b) in the LQSR was
revised to state: 

 

“Sample storage and tracking including shipping receipts, where
applicable;”

 

Comment #37  

NCHH

  

Standard and reagent origin, receipt, preparation, and use are not
relevant to some field lab technologies.  Labs that use XRFs can save
the sample.  Should they be required to do so for three years?  I
believe that non-destructive technologies offer opportunities for
flexibility that need to be considered and built into the document. 

 

EPA’s response: 

 

Standards are very important for calibration and calibration
verification, and are routinely used for quality assurance and control
by users of field technologies, including portable XRFs.  Users of
non-destructive technologies such as XRF may decide to save the sample
for any reasonable period of time that is specified in their Quality
Management System and use these samples as Quality Control Samples. 

 

Comment #38  

NCHH

  

The commenter quotes the LQSR that “At least annually, the laboratory
shall, in accordance with a predetermined schedule and procedure,
conduct internal audits of its activities to ensure that operations
comply with the requirements of this document and the established
management system.”  Then they note that “one requirement was every
two years.”

EPA’s response: 

There is no biennial (every two year) requirement for the laboratory
quality management system or for internal audits.  The frequency for
conducting internal audits is addressed in LQSR 3.0, section 4.14, and
is stated as follows:

“At least annually, the laboratory shall, in accordance with a
predetermined schedule and procedure, conduct internal audits of its
activities to ensure that operations comply with the requirements of
this document and the established quality management system.  The
quality manager shall be responsible for the planning and implementation
of such audits.  Such audits shall be carried out by trained and
qualified staff that are, whenever possible, independent of the
activities to be audited.  All relevant information pertaining to the
audit activity shall be recorded.  In a one-person organization, the
annual audits may be conducted by the one member of the laboratory if
that person follows independent third-party guidance for conducting a
quality system audit, such as ISO 19011, Guidelines for quality and/or
environmental management systems auditing or similar third party
guidance published by organizations such as ASTM (American Society for
Testing and Materials) or ANSI (American National Standards Institute)
and creates and maintains a written audit report of the audit findings
in its record system.  Alternatively, the one-person organization may
choose to contract out the internal audit to an independent person or
firm that is competent and has the experience and training to conduct an
audit of a quality system.

When the audit finds doubt on the correctness or validity of the
laboratory test results, timely corrective action shall be taken by the
laboratory.  If the audit indicates that laboratory results may have
been affected, customers shall be notified and informed of the
corrective action taken and the amended results.”

Comment #39  

NCHH

  

The headings for section 5.1 should match the titles of the
subsections.

 

EPA’s response: 

 

EPA agrees with this comment and the headings for the section 5.1 have
been changed to match the titles for the subsections. 

 

Comment #40 

NCHH

  

Section 5.2 of the LQSR indicates that the laboratory must use
permanently employed staff or those under contract. If the staff is not
employed or under contract, then what are they?  Volunteers?  If
something other is meant, then you need to explain.   

 

EPA’s response: 

 

In response to this comment, the text in section 5.2 was clarified to
state: 

 

“The NLLAP recognized laboratory shall use permanently employed staff
or those under contract to the laboratory.  When additional support or
technical personnel (such as temporary employees) are needed, the
laboratory shall ensure that such personnel are competent, supervised
and work within the quality management system.”

Comment #41 

NCHH

  

New technicians need to demonstrate competency before being released to
perform sample analysis independently and without further supervision. 
For XRFs, the person should be required to send a certain number of
samples to a fixed lab for result confirmation. 

EPA’s response: 

 

EPA agrees with this comment.  LQSR 3.0, Section 5.2.1.1, Minimum
Qualification and Training for Analysts and Technicians provides the
requirements for analyst and technician training.  As stated in section
5.2.1.1.3:  

“The analyst/technician trainee shall complete a minimum of four
independent test runs of sample preparation (when applicable) and/or
instrumental analysis for each matrix.” 

 

Comment #42 

NCHH

  

It is important for the technician to know how the sample is taken and
the variables involved.  Not many private lab technicians have taken the
lead inspector/risk assessor courses where the sampling collection
methods are taught.  

EPA’s response: 

 

EPA agrees that it is important for the technicians to know how and
where the sample is taken and the variables involved in selecting a
sampling location (when applicable).  NLLAP requires that technicians
who collect samples as part of testing for lead must take samples in
accordance with the regulations in 40 CFR Part 745 - Lead-Based Paint
Activities.

 

Comment #43 

NCHH, EHS

  

μg/ft2 could pass clearance testing for floors.  A 299 μg/ft2 sill
sample could pass.  Also, labs should be required to report their
standard deviations so consumers are alerted to the potential accuracy
issues at the lab.  

 

EPA’s response: 

 

See Comment #16 on uncertainty of measurements and acceptable criteria. 
Based on comments received, EPA believes that the frequency of ELPAT
rounds should remain as they currently are - quarterly.  EPA believes
that this provides adequate assurance of laboratory competence. 

 

Comment #44 

NCHH

  

Are the accrediting organizations being asked to oversee and monitor the
sample selection process? 

 

EPA’s response: 

 

No.  LQSR 3.0 doesn’t ask the accrediting organizations to oversee and
monitor the sample selection process.  However, an accrediting
organization will have to make sure that the employee of the laboratory
responsible for sample location selection is certified by EPA or an
authorized state or tribal program as a risk assessor/inspector/sampling
technician, pursuant to Section 402 of the Toxic Substances Control Act
(TSCA) and its implementing regulations. 

 

Comment #45 

NCHH

  

The LQSR states that “All mobile or FSMO technicians shall be
evaluated by a competent supervisor for their first two NLLAP-related
job sites.”  Does it have to be a supervisor or can a contractor do
it? 

 

EPA’s response: 

 

A competent contractor or consultant may serve as the supervisor.  The
key requirement is that the person be competent.

 

Comment #46 

NCHH

  

The LQSR states in Section 5.3.1.1 that “Sample preparation and
analysis is not to proceed until surface contamination is below the
specified maximum allowable concentration of 50 percent of the lowest
regulatory limit for dust wipe samples.” It appears that 50% of blanks
could have levels over 40 μg/ft2.

EPA’s response: 

 

This is not a blank assessment requirement.  It is a requirement for
working area contamination monitoring.  Sample preparation and analysis
cannot be performed until surface contamination is below the specified
maximum allowable concentration of 50 percent of the lowest regulatory
limit for dust wipe samples (currently it would be 20 μg/ft2). 

Comment #47 

NCHH

  

Section 5.3.1.1 in the LQSR states “For FSMOs, appropriate
contamination control blank samples shall be run in order to monitor
potential lead contamination.”  Define appropriate contamination
control for FSMO. 

 

EPA’s response: 

LQSR 3.0 recognizes the “Note under ISO/IEC 17025:2005(E) Section
5.5.6” as a guideline for FSMO contamination control requirement: The
ISO/IEC Note states that “Additional procedures may be necessary when
measuring equipment is used outside the permanent laboratory (applicable to
FSMOs) for tests, calibration or sampling (and contamination).”  An
appropriate contamination control blank might be a sample blank with
physical and chemical properties that mimic the sample to be measured
with no lead content or lead level below the detection limits of the
FSMO’s testing device.

Comment #48 

NCHH

  

Provide example methods under consideration for analytical testing by
using the current regulatory limit. 

 

EPA’s response: 

 

To provide maximum flexibility in quickly adapting to any future changes
in the lead regulations, it is simpler to state the quantitation limit
as a percentage of the current regulatory limit (e.g., 20 μg/ft2 is
50% of the current regulatory concentration limit for lead in dust on
floors). 

 

Comment #49  

 NCHH

The test that allows a lab to state with 95% confidence that the lead
level of the sample is below the appropriate regulatory threshold is
more important (than above the regulatory threshold).  In other words,
it is more important for a lab to guard against false negatives than
against false positives, so that the regulation errs on the side of
being health protective rather than the other way around.

EPA’s response: 

 

The manufacturer or vendor of a pass-fail technology can choose how its
technology will be used, and how results are to be reported.  If the
technology is designated as two-sided vis-à-vis the standards, then
two-sided control limits apply.  The manufacturer may choose to designate
the technology as one-sided vis-à-vis the standards, and the control
limit charts have been written to allow this possibility.  Also, see EPA
Response to comment #17. 

 

Comment #50

NCHH

  

Standard Operating Procedures (SOPs) could be standardized for each
technology. 

 

EPA’s response: 

 

EPA agrees with this comment.  When resources permit, SOPs can be
developed. 

 

Comment #51

NCHH

  

Accuracy and precision should be included in the report to the client. 

 

EPA’s response: 

 

LQSR 3.0 section 5.10.2(n) Test Reports requires a statement on the
estimated uncertainty of the measurement. 

 

Comment #52

NCHH

  

It is essential to allow a lab to be able to release reports on
clearance tests without having someone else review the results. 
Otherwise the major benefit of field labs will be lost.  Let regular
checks of the reports by a second person be sufficient.  The second
person often falls into the same traps as the first person if every
report is reviewed. Many reports from fixed labs contain errors.  A
monthly check should suffice.  For a new person, the checks need to be
in place for the first few.

 

EPA’s response: 

 

EPA agrees with comments recommending an independent review of each test
report; see the reply to comment #7.

 

The text of section 5.4.6 was revised to state: 

“5.4.6 DATA REDUCTION AND REVIEW PROCESS”

  

“The data reduction and review process shall be conducted by a
qualified person and include, but not necessarily be limited to:
comparison of quality control data against established acceptance
limits, computation verification, transcription of data, and adherence
to the procedures established in the laboratory SOPs.  Where
appropriate, computations shall be verified and transcription of data
double checked.  Qualified persons can be technicians, analysts, the
quality manager, or technical manager, or the responsible person
described previously in Section 4.0. 

 

In the case of a one person laboratory, the review process shall be
contracted out to an independent person or firm that is competent and
has the experience and training necessary to conduct the review. 

 

 The review process shall be documented and signed by the reviewer, and
shall be retained on file with a copy of the final report for a minimum
of five years.”

 

Comment #53 

NCHH

  

In the case of non-destructive technologies, the easiest method is to
test the spiked sample repeatedly and to send some of the samples to a
NLLAP lab for comparison.  The document needs to capture the flexibility
provided by non-destructive technologies. 

 

EPA’s response: 

 

EPA has modified the document to include this flexibility.  See EPA’s
Response to Comments, #s 11 and 17 on measurement uncertainty. For a
response to comments on changes to test report requirements see EPA’s
Response to Comments, #51. 

 

Comment #54 

NCHH

  

Since the Independent Calibration Verification (ICV) standard must be at
a lead concentration in the range of lead levels of concern or action
levels such as regulatory limits, should ICV be tested at 40 μg/ft2 and
250 μg/ft2? 

 

EPA’s response: 

 

Correct.  The ICV standard shall be at a lead concentration of interest
(e.g., the regulatory limits).

 

Comment #55

NCHH

  

Why not design in the possibility of non-destructive technologies that
can be checked subsequently by a stationary lab? 

 

EPA’s response: 

 

EPA allows that a measurement performed by a non-destructive technique
may be confirmed by a stationary laboratory. 

 

Comment #56 

NCHH

  

The standard for lead dust is written in English units and SI units. 

 

EPA’s response: 

Yes, the standard for lead dust is written in both English (properly
referenced as “British”) units and SI units. 

 

Comment #57 

NCHH

  

For how long should samples be saved after the analysis? 

 

EPA’s response: 

 

The laboratory must save the samples after the analysis for as long as
is specified by the laboratory SOP. 

Comment #58 

NCHH

  

I think that the opportunity for interlaboratory comparisons for XRFs is
a feature that needs to be a routine part of quality control and can
possibly serve as a substitute for other items. 

 

EPA’s response: 

 

Yes, interlaboratory comparisons and proficiency testing programs are a
major part of ensuring the quality of the test results. Proficiency
testing programs and interlaboratory comparisons are two of the key
components for validating a developed method such as XRF to confirm that
the method is fit for the intended use. (ISO/IEC 17025:2005(E) 5.4.4.2)

Comment #59 

NCHH

  

It is hard to figure how a dust wipe could be split.  How do labs that
destroy the sample do it?  If they sample only the digested liquid, that
is not a true split sample and will miss a crucial part of the sample
analysis process. 

 

EPA’s response: 

 

EPA agrees with this comment.  It is impossible to split dust wipe
samples.  Section 5.9.1.2 suggests: 

 

For analyses where there is not a sufficient amount of field sample for
splitting or the analytical technology does not allow for split samples,
the laboratory shall use alternative QC procedures in an effort to
monitor the laboratory's precision of analysis.  One of these options is
the analysis of duplicate laboratory control samples, prepared with the
appropriate matrix material, for each batch in order to monitor
laboratory performance. 

 

EPA also agrees that an acceptable alternative QC procedure for
non-destructive technologies might be a repeat test. 

 

Comment #60 

FAS

 

The Technical Manager should have more qualifications than the Quality
Manager, namely the background to design the technical aspects of the
management system from scratch, if need be. 

 

EPA’s response: 

 

EPA agrees with this comment and made the change.  Section 4.1.1.1 was
changed to state: 

 

The individual who functions as the technical manager (however named) of
the laboratory shall have appropriate education for the measurement
technologies used in the laboratory, training, and experience, or
combination thereof, to 1) be able to design and implement the
management system, and 2) enable that individual to identify the
occurrence of departures from the implemented quality management system
or test procedures and to initiate actions to prevent or minimize such
departures. 

 

Comment #61 

FAS

 

A method can’t be proficient or competent, but it can be valid and
accurate 

 

EPA’s response: 

 

EPA agrees with this comment and made the change in the text in section
5.4.1a to state: 

 

Methods shall not be used for sample analysis until the laboratory has
confirmed and documented proficiency in using these methods.  Competency
in using test methods shall be demonstrated over the lead concentration
and sample mass ranges for each matrix stated by the method, as
appropriate. 

 

Comment #62 

FAS, A2LA

 

It is worthwhile giving a reference here (LQSR 3.0 Section 5.4.4
“Method Performance Characteristics”), so that labs have a starting
point for finding how to do MDLs. 

EPA’s response: 

 

EPA agrees that providing a reference is worthwhile, and a reference to
40 CFR Part 136, Appendix B was added.  Section 5.4.4.1, Method
Detection Limits, states: Method detection limits (MDLs) shall be
established, statistically verified, and monitored, as needed, for each
method and matrix of concern.  For methods with stated MDLs,
demonstration of ability to achieve such MDLs is required and shall be
documented.  MDLs shall be determined using a documented SOP that is
based on procedures published or recognized nationally (e.g., 40 CFR
Part 136, Appendix B).
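The Appendix B procedure cited above can be sketched numerically: analyze at least seven replicates of a low-level spiked sample, then multiply their standard deviation by the one-tailed Student's t value at the 99% confidence level. The replicate values and the helper function below are hypothetical illustrations, not part of LQSR 3.0.

```python
# Sketch of the 40 CFR Part 136, Appendix B MDL calculation:
# MDL = t(n-1, 0.99) * s, where s is the standard deviation of n
# replicate analyses of a low-level spiked sample (n >= 7).
from statistics import stdev

# One-tailed Student's t values at the 99% confidence level, keyed by
# degrees of freedom (n - 1).
T_99 = {6: 3.143, 7: 2.998, 8: 2.896, 9: 2.821}

def method_detection_limit(replicates):
    n = len(replicates)
    if n < 7:
        raise ValueError("Appendix B requires at least 7 replicates")
    return T_99[n - 1] * stdev(replicates)

# Hypothetical replicate lead results (ug per wipe) near the expected MDL.
reps = [1.9, 2.2, 2.1, 1.8, 2.4, 2.0, 2.2]
mdl = method_detection_limit(reps)
print(round(mdl, 2))
```

A laboratory SOP would also specify how the spiking level is chosen and how the MDL is re-verified over time; those details are outside this sketch.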

Comment #63 

FAS

 

The term “reporting limit” is no longer used and is not in the
glossary so should be dropped here in favor of quantitation limit. 

 

EPA’s response: 

 

EPA agrees with this comment.  EPA has replaced “reporting limit”
with “quantitation limit” throughout the text of the document. 

 

Comment #64 

FAS 

EPA should note in the document that Atomic Absorption (AA) and similar
technologies must be used in the correct part of the response curve (for
AA the range of linear response).  

EPA’s response: 

 

EPA accepts the comment that “Atomic Absorption (AA) and similar
technologies must be used in the correct part of the response curve (for
AA the range of linear response).”  The following text was added to
section 5.5.2: 

 

f. When linear fit is used, the extent of the linear range shall be
verified (if possible) and the calibration standards shall be limited to
that range. 

 

Comment #65 

FAS

10% was the maximum acceptable range for Continuing Calibration
Verification (CCV) in the old LQSR for number-producing technology. 

EPA’s response: 

 

It is the Agency’s intention to allow for flexibility within the
Quality Management System of every organization that is appropriate to a
specific testing technology that is used by that particular
organization.  EPA intentionally removed all technology-specific
requirements from the LQSR 3.0 and proposed that instrument QC
requirements are based on performance and lead regulatory needs, rather
than analytical technique.  Acceptance criteria shall be determined,
documented and used.  Each laboratory shall establish its own
performance criteria for QC samples (uncertainty of measurement), which
shall not be greater than 20% as stated in LQSR 3.0.
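As an illustration of such a performance-based criterion, a CCV recovery check against a laboratory-set limit might look like the sketch below. The function names and sample values are hypothetical; the ±20% figure is the LQSR 3.0 upper bound, and a laboratory may set a tighter limit in its own quality management system.

```python
# Sketch of a CCV (Continuing Calibration Verification) recovery check.
# The default acceptance window (+/-20%) is the LQSR 3.0 upper bound,
# not a recommended setting.

def ccv_recovery(measured, true_value):
    """Percent recovery of the CCV standard."""
    return 100.0 * measured / true_value

def ccv_acceptable(measured, true_value, max_deviation_pct=20.0):
    """True if the CCV recovery is within the laboratory's limit."""
    return abs(ccv_recovery(measured, true_value) - 100.0) <= max_deviation_pct

# Hypothetical CCV standard: true value 40.0 ug/ft2.
print(ccv_acceptable(43.1, 40.0))   # recovery 107.75%, within +/-20%
print(ccv_acceptable(25.0, 40.0))   # recovery 62.5%, outside +/-20%
```

A failing CCV would trigger the corrective action defined in the laboratory's SOP, such as recalibration and reanalysis of the affected samples.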

Comment #66 

 FAS

Addition of specifications from the Performance Characteristic Sheet
(PCS) to frequency and acceptance criteria for CCV gives EPA the option
to specify for each new technology the frequency of QC. 

 

EPA’s response: 

 

EPA agrees with this comment and made the change.  Text was added to
Tables 1 and 2, as suggested, to specify the required frequency for
CCV.  CCV frequency requirements shall be:

 

At the beginning and end of a sample run, as well as every 12 hours, or
according to instrument manufacturer's recommendations, or according to
instrument Performance Characteristic Sheet (PCS), or at a predetermined
SOP frequency whichever is most frequent. 

 

Comment #67 

FAS

 

Consider adding to section 5.10.2 “Test Reports” the following:

 

g. identification of the validated analytical method used and of any
significant modifications made to the method;  

 

The phrase “of any significant modifications made to the method” is
needed because the name of the starting method alone is meaningless if
the lab has significantly altered the method. 

 

EPA’s response: 

 

Modifications to the methods are covered in the following subsection h.,
stating: 

 

h. any deviations from, additions to, or exclusions from the analytical
method, and any other information relevant to a specific analytical
method, such as environmental conditions including the use of relevant
data qualifiers; 

Comment #68 

FAS

 

It was the quantitation limit for which minimum limits were set back in
5.4.4.2, so it should be the same limit shown on the report. 

EPA’s response: 

 

The same standard for setting minimum limits would also be used for
reporting those limits.  The LQSR 3.0 section 5.4.4.2 states: the
quantitation limit shall be “less than” (“<”) a value at least 2
times but no greater than 10 times the method detection limit as
determined in Section 5.4.4.1.

EPA agrees with this comment and made the change.  Suggested changes
were made throughout the text of the document. 

 

Comment #69 

FAS

 

Add an additional subsection in the requirements section for the test
reports significant figure statement. 

EPA’s response: 

 

EPA agrees with this comment and made the change.  The following text
was added to state: 

 

q. significant figures reported for each value shall correctly represent
the estimated uncertainty of the reported measurement. 
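One common way to satisfy a requirement like subsection q. is to round the expanded uncertainty to one significant figure and then round the reported value to the same decimal place. The one-significant-figure convention is a widespread metrology practice, not a rule stated in LQSR 3.0, and the values below are hypothetical.

```python
# Sketch: report a value so its significant figures reflect the
# estimated uncertainty.  Convention (assumed, not from LQSR 3.0):
# round the uncertainty to one significant figure, then round the
# value to that same decimal place.
import math

def report(value, uncertainty):
    # Decimal position of the leading digit of the uncertainty.
    exponent = math.floor(math.log10(abs(uncertainty)))
    u_rounded = round(uncertainty, -exponent)
    v_rounded = round(value, -exponent)
    return f"{v_rounded:g} +/- {u_rounded:g}"

# Hypothetical dust-wipe result: 38.27 ug/ft2, expanded uncertainty 4.6.
print(report(38.27, 4.6))   # reported as "38 +/- 5"
```

Reporting the raw instrument readout (e.g., 38.27) alongside an uncertainty of several μg/ft2 would imply precision the measurement does not have, which is what the added subsection guards against.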

 

Comment #70 

FAS

Do not mention linear range in the glossary as long as using the linear
range where it is applicable was added to 5.5.2. 

EPA’s response: 

 

The term “linear range” had been used to help describe
“independent calibration verification,” where it confused the
definition.  EPA agrees with this comment and made the change.  The
text in the glossary was changed to state: 

 

Independent calibration verification (ICV): A standard solution (or set
of solutions) used to verify calibration standard levels.  The
concentration of analyte should be near mid-range of the calibration
curve which is made from a stock solution having a different
manufacturer or manufacturer lot identification than the calibration
standards.  The ICV must be matrix matched to acid content present in
sample digestates.  The ICV should be measured after calibration and
before measuring any sample digestates. 

 

Comment #71 

RTI

 

Require two or three Proficiency Testing (PT) samples to be analyzed by
FSMOs every month or two but make reporting simple. 

EPA’s response: 

 

EPA disagrees with the commenter that a one or two month test round
frequency is needed.  EPA is developing an inexpensive PT sample that
will be required to be analyzed by FSMOs with the same frequency
(quarterly) of test rounds as for fixed-site laboratories. 

 

Comment #72 

RTI

 

Require FSMOs to submit examples of real-world sample QC results with
their PT results quarterly or every six months. 

 

EPA’s response: 

 

EPA disagrees that this should be required.  Real-world QC sample
results are checked during on-site visits by the accrediting
organizations and are different from the evaluation of PT sample
results. 

Comment #73

EHS

 

EHS supports EPA’s idea of changing ELPAT frequency to semiannual. 

 

EPA’s response: 

 

It is the Agency’s intention to allow for flexibility while ensuring
adequate quality of lab measurements to accomplish the Agency’s
mission of protecting public health.  EPA believes that the frequency of
ELPAT rounds should remain quarterly; one round per quarter provides
adequate assurance of a laboratory’s competence. 

 

Comment #74 

EHS

 

The measurement sites and minimum number of samples collected should be
defined. 

 

EPA’s response: 

 

Sampling methods are addressed in section 5.7 of the LQSR 3.0 Draft,
which requires that the sampling be performed in accordance with the
requirements of 40 CFR Part 745 - Lead-Based Paint Activities. 

 

Comment #75 

A2LA

 

Regarding section 5.4.1 a. on demonstrating competence with a method,
EPA might note that ISO TS 21748 provides objective criteria for
demonstrating competence with a method.  These criteria relate to the
validation data for that method, if the method was validated with a
study that complies with ISO 5725 (or AOAC HCV or ASTM 691). 

 

EPA’s response: 

 

EPA agrees with this comment and made the change.  ASTM 691 is listed as
a guidance document that should be helpful in dealing with method
validation and performance demonstration.  See section 5.4.2 of LQSR
3.0. 

 

Comment #76 

A2LA

 

Section 5.4.1 c. on non-numeric methods refers, it is assumed, to
measurement procedures where there is an underlying quantitative signal
that is compared with a fixed cut point.  The situation is covered in
the revised A2LA LSAC policy on measurement uncertainty (but not the
previous version).  It is good that EPA has specifically stated that the
signal plus its expanded uncertainty must lie below the cut point, to
remove any ambiguity about this issue.  This will of course require that
laboratories calculate their uncertainty for these measurement
procedures, which is consistent with A2LA policy. 

 

EPA’s response: 

 

EPA agrees with this comment. 
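The decision rule discussed above for non-numeric (pass/fail) methods can be sketched in a few lines; the function name, threshold, and numeric values below are hypothetical illustrations, not part of the LQSR text.

```python
# Sketch of the stated rule for pass/fail methods: a result may be
# reported as below the cut point only when the underlying signal PLUS
# its expanded uncertainty lies below the cut point.  Values are
# hypothetical and unitless.
def below_limit(signal: float, expanded_u: float, cut_point: float) -> bool:
    """True only if the entire uncertainty interval lies below the limit."""
    return signal + expanded_u < cut_point

print(below_limit(0.80, 0.10, 1.0))  # 0.90 < 1.0 -> True
print(below_limit(0.95, 0.10, 1.0))  # 1.05 >= 1.0 -> False
```

Note that this rule forces laboratories to calculate an expanded uncertainty for these procedures, consistent with the A2LA policy the commenter cites.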

 

Comment #77 

A2LA

 

Section 5.4.2 might include the ISO method validation/use documents
cited above (ISO/IEC 5725 series, ISO TS 21748), and the ISO/IEC 11843
series on Capability of Detection.  The CLSI series of evaluation
protocols are also relevant (EP5, EP9, EP17, EP6), although perhaps they
are more for clinical use.  However CLSI EP17-A (Limits of Detection and
Quantitation) uses an example dataset from CDC that relates to detection
of blood lead levels.  The titles are available at www.clsi.org. 

EPA’s response: 

 

EPA agrees with this comment and it is addressed in response to comment
# 75. 

Comment #78 

A2LA

 

Unless section 5.4.4.1 and 5.4.4.3 on MDL and MQL have been updated
recently, EPA allows procedures to calculate MDL and MQL that I think
are invalid, and inconsistent with rigorous international standards. 
The procedures in ISO 11843-1 and in CLSI EP17-A should be required. 

EPA’s response: 

 

EPA agrees with this comment and made the change.  A reference to ISO
11843 has been added to state: 

 

Method detection limits (MDLs) shall be established, statistically
verified and monitored, as needed for each method and matrix of concern.
 For methods with stated MDLs, demonstration of ability to achieve such
MDLs is required and shall be documented.  MDLs shall be determined
using a documented SOP that is based on procedures published or
recognized by nationally or internationally acknowledged technical
authorities (e.g., ISO 11843 Capability of detection -- Parts 1, 2, 3
and 4 at: www.iso.org; 40 CFR Part 136 Appendix B). 

 

Also, see response to comment #62. 
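As a minimal sketch of the classic replicate-based procedure in 40 CFR Part 136 Appendix B (one of the authorities cited above), the MDL is estimated as the one-sided 99% Student's t value times the standard deviation of low-level spiked replicates. The replicate data below are hypothetical.

```python
# Hedged sketch of the 40 CFR Part 136 Appendix B MDL estimate:
# MDL = t(n-1, 0.99) * s for n low-level spiked replicates.
# For the customary n = 7, the one-sided 99% t value (6 d.f.) is 3.143.
import statistics

replicates = [0.52, 0.48, 0.55, 0.47, 0.51, 0.53, 0.49]  # ug/sample, hypothetical
t_99 = 3.143                                             # one-sided, 6 d.f.

s = statistics.stdev(replicates)   # sample standard deviation
mdl = t_99 * s                     # comes out near 0.09 for these data
print(f"s = {s:.4f}, MDL = {mdl:.3f}")
```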

 

Comment #79 

A2LA

 

Regarding section 5.4.4.3 on Accuracy and Precision, the appropriate
international harmonized term (International Vocabulary of Basic and
General Terms in Metrology - VIM) would replace “Accuracy” with
“Trueness” in this section.  The concept of “accuracy” includes
both trueness and precision. 

Appendix I, Glossary.  “Accuracy” is defined appropriately, and
therefore is used inappropriately in section 5.4.4.3.  The appropriate
VIM definition might be preferred however, or the glossary should define
“systematic error”.  The VIM definition for “Trueness” would
also be appropriate. 

EPA’s response: 

 

EPA agrees with this comment.  LQSR 3.0 defines: 

 

Accuracy: The degree of agreement between an observed value and an
accepted reference value.  Accuracy includes a combination of random
error (precision) and systematic error (bias) components which are due
to sampling and analytical operations; a data quality indicator.  See
Precision and Bias. 

 

Bias (trueness):  The systematic error manifested as a consistent
positive or negative deviation from the known true value. 

 

Precision (repeatability): The degree to which a set of observations or
measurements of the same property, usually obtained under similar
conditions, conform to themselves; a data quality indicator.  Precision
is usually expressed as standard deviation, variance, or range, in
either absolute or relative terms. 

 

The definitions are consistent with those in VIM. 

Also, in the text of section 5.4.4.3, “accuracy” is replaced with
“bias”, as appropriate, to state: 

 

Bias and precision shall be determined for each analytical method.  The
method evaluation results shall be documented and kept on file. 
Acceptable minimum method performance criteria for bias and precision
can be found in Tables 3 and 4 of this document. 
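The bias and precision determinations described above can be sketched numerically; the reference value and replicate results below are hypothetical, and the acceptance limits would come from Tables 3 and 4.

```python
# Minimal sketch matching the glossary definitions: bias (trueness) as
# the mean deviation from an accepted reference value, precision
# (repeatability) as the relative standard deviation.  Data hypothetical.
import statistics

true_value = 100.0                              # accepted reference value, ug/g
results = [97.0, 102.0, 99.0, 101.0, 98.0, 106.0]

mean_result = statistics.mean(results)
bias_pct = 100.0 * (mean_result - true_value) / true_value
rsd_pct = 100.0 * statistics.stdev(results) / mean_result

print(f"bias {bias_pct:+.1f} %, precision (RSD) {rsd_pct:.1f} %")
```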

 

Comment #80 

A2LA

 

Regarding section 5.4.5 on subsampling, should the laboratory have an
estimate of the variability due to subsampling, and shall that estimate
be included in the estimate of measurement uncertainty?  This should be
explicitly stated to resolve ambiguity. 

 

EPA’s response: 

 

In the situation when the subsampling method affects reported lead
results, the statement on the estimated measurement uncertainty shall
include uncertainty contributed by subsampling.  The laboratory may
choose to use an appropriate statistical evaluation of the Laboratory
Control Sample (LCS) results to estimate measurement uncertainty when
“subsampling” of the LCSs is performed in a similar fashion as a
subsampling of the real samples. 
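One common way to carry out the statistical evaluation of LCS results mentioned above is to pool the LCS recoveries (prepared with the same subsampling step as real samples) and report an expanded uncertainty with a coverage factor of k = 2 (approximately 95% confidence). The recovery data below are hypothetical.

```python
# Hedged sketch: estimate measurement uncertainty from pooled LCS
# recoveries (percent).  Because the LCSs were subsampled like real
# samples, the subsampling contribution is captured in the spread.
import statistics

lcs_recoveries = [98.2, 101.5, 96.8, 103.1, 99.4, 100.7, 97.9, 102.2]

mean_rec = statistics.mean(lcs_recoveries)
s = statistics.stdev(lcs_recoveries)   # sample standard deviation
expanded_u = 2 * s                     # coverage factor k = 2 (~95 %)

print(f"mean recovery {mean_rec:.2f} %, expanded uncertainty +/- {expanded_u:.1f} %")
```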

 

Comment #81 

A2LA

 

Sections 5.5.2.2-5.5.2.4 on Calibration Verification.  I cannot comment
on the verification requirements for measurement procedures that produce
a pass/fail result (verification values within +/- 20% of regulatory
limit), except that I agree a specification is needed.  Perhaps the
language should state “as close as possible to the regulatory limit,
but not farther than 20%.”  (Similar to the language in 5.9.1.1.)
However, I can state that for quantitative procedures, the use of the
correlation coefficient to judge the suitability of a calibration is not
recommended.  This number can be very large (> 0.99) while the
calibration still has unacceptable error at key decision points, or a
grossly nonlinear response.  Laboratories should have objective criteria for error at/near
regulatory limit(s), and/or use an appropriate protocol to determine the
linearity of response, per CLSI EP6-A. 

 

EPA’s response: 

 

See response to comment #11 on acceptable frequency and limits for QC
samples. 

Also, revisions have been made to the text to clarify the use of
correlation coefficient to judge the suitability of linear calibration,
in section 5.5.2 subsection f.  It states: 

 

f. When linear fit is used, the extent of the linear range shall be
verified (if possible) and the calibration standards shall be limited to
that range. 
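The commenter's caution about the correlation coefficient can be demonstrated with a small numeric sketch; the calibration data below are hypothetical, constructed with a mild quadratic drift.

```python
# Sketch: a curved response can still give r > 0.99, yet the linear fit
# fails badly at the lowest standard -- the correlation coefficient
# alone cannot validate calibration linearity.  Data hypothetical.
conc = list(range(1, 11))                   # standard concentrations
resp = [10 * x + x**2 for x in conc]        # slightly curved response

n = len(conc)
mx = sum(conc) / n
my = sum(resp) / n
sxy = sum((x - mx) * (y - my) for x, y in zip(conc, resp))
sxx = sum((x - mx) ** 2 for x in conc)
syy = sum((y - my) ** 2 for y in resp)

r = sxy / (sxx * syy) ** 0.5                # Pearson correlation, ~0.993
slope = sxy / sxx                           # ordinary least squares
intercept = my - slope * mx

pred_low = slope * conc[0] + intercept      # fitted value at lowest standard
print(round(r, 4), pred_low, resp[0])       # high r, yet fit predicts a
                                            # negative value where the true
                                            # response is 11
```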

 

Comment #82 

A2LA

 

Regarding section 5.9.1.2 on Precision (repeatability) determination,
the defined procedure will estimate the repeatability of a measurement
procedure, if data are appropriately pooled across batches.  However the
section seems to have an objective of also measuring precision across
batches, which would be an intermediate measure of precision – perhaps
called “within laboratory precision” or “among batch precision”.
 These are not measures of repeatability. 

 

EPA’s response: 

 

EPA agrees with this comment.  Section 5.9.1.2 describes procedures
appropriate for estimation of precision of a method, and was renamed:
“5.9.1.2  Precision Determination” 

 

Comment #83 

A2LA, AHH

 

Update 5.2.1.1.3 to be consistent with NELAC and with general method
requirements: 

 

Option 1 

 

Documentation of continued proficiency by at least one of the following
per year: 

 

a. Successful performance of a blind test sample (single blind to the
analyst) and the laboratory must determine the acceptable results prior
to analysis. 

b. At least four consecutive laboratory control samples with acceptable
performance from four different batches 

c. Analysis of authentic samples with test results statistically
indistinguishable from another qualified analyst. 

Option 2 

 

OR IF EPA requires more detail, this is another option for 5.2.1.1.3: 

 

The analyst/technician trainee shall complete a minimum of four
independent batches of sample preparation (when applicable) and/or
instrumental analysis for each matrix.  A batch includes the QC as
defined in Table 3 or 4.  For sample preparation training, the
recoveries of the associated reference materials or proficiency training
samples for each run must be within the requirements of Table 3.  For
instrumental analysis training, the recoveries of the associated
reference materials or proficiency training samples for each run must be
within either Table 3 or 4 as applicable. 

 

EPA’s response: 

 

EPA agrees with this comment and a change was made as suggested in
option 2.  Section 5.2.1.1.3 now states: 

 

  The analyst/technician trainee shall complete a minimum of four
independent test runs of sample preparation (when applicable) and/or
instrumental analysis for each matrix.  An independent run is defined as
the analysis of at least five samples of known lead content, one of
which is a certified reference material or proficiency testing material;
runs shall be separated by a period of time sufficient to evaluate the
performance of any previous independent run.  For sample preparation training, the
recoveries of the associated reference materials or proficiency training
samples for each run must be within the requirements of Table 3.  For
instrumental analysis training, the recoveries of the associated
reference materials or proficiency training samples for each run must be
within either Table 3 or 4 as applicable. 

 

For some analytical testing technologies it may not be possible to
separate the sample preparation techniques from instrumental analyses. 
In such cases, the training requirements shall be based upon the minimum
requirements stated for both analysts and technicians. 

 

The reference material/proficiency test samples utilized shall: 1) be
similar to matrices the analyst /technician will encounter during
routine lead sample analysis, and 2) cover the sample mass/concentration
range for which the analytical SOP has been validated. 

 

Comment #84 

A2LA

 

Sections 5.9.1 and 5.9.2 seem to require the same trending for matrix
spikes as for LCSs.  Since each MS is a unique matrix, it is suggested
to exempt these from control charting and keep the method based
acceptance limits.  Require control charting for LCS only and not for
the method blank and the MS/duplicates.  Control charts are already
performed for LCSs for numerous programs, but most methods and NELAC
allow using fixed acceptance limits for the blanks, MS, and duplicates. 
Since the MS and duplicate matrices are ever changing and not from the
same source, these charts do not show laboratory performance.  The LCS
does show laboratory performance and is the best assessment of the
laboratory on a daily basis. 

 

EPA’s response: 

 

To avoid ambiguity in section 5.9.1, Quality Control Procedures, the
suggested change was made, and the statement about the statistical
evaluation of the QC data was rewritten to state: 

 

The laboratory shall have QC procedures for monitoring the validity of
lead tests undertaken.  The QC procedures shall be stated in the quality
documents such as QM and/or in each method SOP addressing, as
appropriate: 

 

• Duplicate or "Side-by-Side" Field Sample Analyses

• Spiked and Blank Sample Analyses

• Blind Samples

• Split/Spiked Field Sample Analyses 

• Control Charts or Equivalent 

• Calibration Standards 

• Laboratory Control Samples 

• Internal Standards 

The resulting QC data, where appropriate, shall be recorded in such a
way that trends are detectable and statistical techniques shall be
applied to the reviewing of the results. 
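One common statistical technique for making trends detectable, as required above, is a Shewhart-style control chart with limits at the mean plus or minus three standard deviations of historical LCS recoveries. The history and new result below are hypothetical.

```python
# Hedged sketch: control-chart limits from an LCS recovery history
# (percent), flagging any new result outside the limits.
import statistics

history = [99.0, 101.0, 98.5, 100.5, 99.5, 100.0, 101.5, 98.0]

center = statistics.mean(history)
s = statistics.stdev(history)
ucl, lcl = center + 3 * s, center - 3 * s   # upper/lower control limits

new_result = 95.0
in_control = lcl <= new_result <= ucl       # out of control for these data
print(f"limits [{lcl:.1f}, {ucl:.1f}] -> in control: {in_control}")
```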

 

Comment #85 

A2LA

 

In reference to 5.3.1.2,  for stationary laboratories, it is unclear why
‘baths’ and wipes are needed since method blanks are specified.  All
methods and the LQSR require blanks.  Blanks should take care of
assessing laboratory added contamination.  I have never seen positive
wipes where the method blanks were acceptable.  Thus, it seems that the
wipes are redundant.  We really want to know what might contaminate the
sample.  As to acid baths, I audit 20+ labs a year and no one uses these
anymore; some labs acid rinse the container, but none seem to have acid
baths any longer.  Perhaps it should be stated that the glassware used
for the method blanks should be processed through any acid baths used to
assess contamination. 

 

EPA’s response: 

 

 EPA agrees with this comment.  The text of section 5.3.1.2 Labware
Cleaning was changed to state: 

 

Cleaning procedures for labware shall be specified by the laboratory in
a written SOP.  The procedure shall include, where applicable, a
specified frequency for monitoring of lead concentrations in cleaning
baths, the monitoring of glassware contamination during the analysis of
reagent or other blanks, and periodic monitoring of disposable labware
contamination by analysis of reagent or other blanks.  To assess
possible contamination, glassware used for the method blanks should be
processed through acid baths used by the laboratory for labware
cleaning. 

 

Comment #86 

AHH

 

Improvement is needed to ensure that the proficiency testing is
sufficiently protective.  Reducing the frequency of testing would be
less protective and have the effect of compromising public health, since
consumers could be exposed to a lab that is very poor performing if it
experiences diminution of its capacity for as long as two years. 

 

Instead, consideration should be given to increasing the number of
quarterly samples to 10 or 8 per matrix to increase the minimum accuracy
rate to 90% or 87.5% respectively. 

 

EPA’s response: 

 

EPA believes that quarterly proficiency testing with 4 samples per
matrix is sufficient.  It is the Agency’s intention to allow for
flexibility and the saving of resources, but at the same time ensure the
adequate quality of lab measurements to accomplish the Agency’s
mission of protecting public health, and it considers the present
frequency of testing to be appropriate. 

 

Comment #87  

AHH

 

Alternatively, the need to reduce the labs’ costs for the testing
program may deserve greater scrutiny.  We would urge EPA to verify
whether stationary labs can afford an increase in the cost of testing
more samples, taking into account the potential added marketing value of
meeting a more protective standard.  Perhaps this could be accomplished
by improving the cost-effectiveness of the compilation of samples, such
as by excluding the cost of any administrative or extraneous functions
from the cost of developing and sending samples. 

 

EPA’s response: 

 

EPA does not believe an increase in the number of samples is warranted. 
EPA believes that the current number of samples tested at this frequency
of testing is sufficient. 

Comment #88

AHH

We urge EPA to use its statutory authority under section 405 of the
Toxic Substances Control Act to “establish protocols, criteria, and
minimum performance standards” that are specific to field
methodologies such as portable analytic systems, and expand the current
NLLAP to accredit or certify entities and persons using these
methodologies.  By creating a second element, component, or sub-program
of NLLAP, EPA has a tremendous opportunity to tailor initial review,
monitoring, and other systems to address the circumstances of
non-stationary facilities as well as to ensure that the vendors that
operate the sub-program require relevant credentials and other
standards. 

 

EPA’s response: 

 

EPA believes that requirements for the users of the portable analytical
systems were efficiently and effectively incorporated under NLLAP and
there is no need for the creation of a separate accreditation system. 

 

Comment #89

AHH

 

Qualifying and auditing users of field technologies should be built into
the oversight, investigation, and enforcement functions of state and EPA
lead-based paint activities and sampling technician certification
programs.  These programs are already set up to oversee individuals
performing other sensitive functions in the field without other means of
supervision or second-party verification of results. Keeping these
functions within states will leverage existing regulators’ familiarity
with professionals and avoid the inefficiencies of long-distance travel
when field contact is necessary. 

 

EPA’s response: 

 

EPA will update the training requirements accordingly. 

Comment #90 

AHH

 

In updates to accredited risk assessor and inspector training program
courses and refresher courses, EPA and states should add information on
the use of XRF analytic systems and other field technologies for
checking dust for lead, especially if and where training and
certification in XRF use become optional elements of the program. 

 

EPA’s response: 

 

This is being incorporated into the EPA certification program.

 

Comment #91 

AHH

 

Proficiency testing for persons operating field measurement technologies
should be as frequent as is required for stationary labs. Given the
nature of the field technologies and the inherent lack of second-party
verification of results, persons operating field measurement
technologies could be held to a 100% performance standard under whatever
proficiency testing schedule is followed.  Alternatively, those with a
75% performance record could be required to send samples to a stationary
lab. 

 

EPA’s response: 

 

EPA agrees with this comment.  EPA plans to keep proficiency testing
requirements the same for all NLLAP accredited organizations regardless
of the number of personnel or the extent of the scope of testing
activities.  The performance requirement was changed per comment #83.  The
text in 5.2.1.1.3 was revised to state: 

 

The analyst/technician trainee shall complete a minimum of four
independent test runs of sample preparation (when applicable) and/or
instrumental analysis for each matrix.  An independent run is defined as
the analysis of at least five samples of known lead content, one of
which is a certified reference material or proficiency testing material;
runs shall be separated by a period of time sufficient to evaluate the
performance of any previous independent run.  For sample preparation training, the
recoveries of the associated reference materials or proficiency training
samples for each run must be within the requirements of Table 3.  For
instrumental analysis training, the recoveries of the associated
reference materials or proficiency training samples for each run must be
within either Table 3 or 4 as applicable. 

 

For some analytical testing technologies it may not be possible to
separate the sample preparation techniques from instrumental analyses. 
In such cases, the training requirements shall be based upon the minimum
requirements stated for both analysts and technicians. 

 

The reference material/proficiency test samples utilized shall: 1) be
similar to matrices the analyst /technician will encounter during
routine lead sample analysis, and 2) cover the sample mass/concentration
range for which the analytical SOP has been validated. 

 

Comment #92 

AHH

The educational requirements for stationary lab personnel should not be
applied to persons operating field measurement technologies.  Instead
qualifications should cover relevant preparation such as applicable
lead-based paint activities and sampling technician certification;
knowledge of how to operate the equipment; and continued demonstration
of proficiency.  The lead-based paint activities programs may be better
positioned to establish and verify appropriate qualifications for
certified professionals using these technologies than a national
accreditation vendor. 

 

EPA’s response: 

 

EPA is working closely with the Accrediting Bodies to develop specific
education/training requirements for a specific technology to make sure
that on-site evaluators have sufficient guidance to determine the
appropriate level of training and experience for people analyzing
samples for lead.  Also, sampling technician certification is beyond
NLLAP’s authority. 

 

Comment #93

AIHA

 

AIHA strongly discourages EPA from revising the entire LQSR (or
developing a new section of the LQSR for FSMOs) in a way that would
allow one-person labs, as outlined in Section 4.1, because doing so
would threaten the integrity of laboratories’ quality management and
data quality systems.  Certain program functions, such as internal
assessment or data review prior to reporting (sections 4.14, 5.4.6 and
5.10.3), could be conducted by the same person, since EPA is only
suggesting, not requiring, that these functions be performed by an
outside contractor.  AIHA believes at least one additional on-site
employee should be part of a lead laboratory to ensure an independent,
unbiased data and program review. 

 

EPA’s response: 

 

EPA disagrees with the commenter regarding one person laboratories and
the threat to the integrity of their lab quality management and data
quality systems.  To ensure quality measurements and quality control,
EPA does, however, agree that it is necessary and does require an
independent review of the data from FSMOs.  Accordingly, the following
change was made in section 5.4.6 DATA REDUCTION AND REVIEW PROCESS: 

 

“In the case of a one person laboratory, the review process shall be
contracted out to an independent person or firm that is competent and
has the experience and training necessary to conduct the review. 

 

The review process shall be documented and signed by the reviewer, and
shall be retained on file with a copy of the final report for a minimum
of five years.”  This section is in accordance with ISO/IEC 17025,
section 5.2.3.

 

Comment #94 

AIHA

If the XRF operator is self-employed, how would the accreditation body
(AB) issue a certificate to that individual?  Would the certificate
include the person’s name and address?  If yes, then this implies that
the AB is accrediting an individual.  AIHA believes that XRF operators
should be required to be associated with and/or employed by a company or
laboratory (with at least two individuals) so that AIHA would not be
accrediting self-employed individuals. 

 

EPA’s response: 

 

As stated in section 1.0, SCOPE 

 

“The requirements described within this document must be met for an
organization to attain recognition under the NLLAP as a lead testing
laboratory, hereafter referred to in this document as a laboratory.  An
organization requesting recognition under this program shall possess a
laboratory operation capable of performing sampling and lead testing.  A
laboratory shall have distinct staffing, instrumentation, sampling and
test methods, as appropriate, and depending upon the type of laboratory may
have physical facilities and may use field test kits.  Laboratory
accreditation under this program shall be based on its meeting the
requirements listed in this document.”

 

Therefore, the Accrediting Bodies will accredit an organization with a
company name and address on the certificate, not an individual. 

 

Comment #95 

AIHA

 

AIHA believes that the language on laboratory personnel qualifications
in the current LQSR is appropriate and is concerned that the EPA’s
proposed language for a laboratory’s Technical Manager and Quality
Manager positions is too vague (for any lead laboratory including
FSMOs). AIHA recommends that any revision to the LQSR include specific
qualifications related to education and experience that personnel should
possess to conduct these analyses since the term “as appropriate”
can be easily misinterpreted. 

 

EPA’s response: 

 

EPA is working closely with NLLAP recognized Accrediting Bodies (ABs) to
develop specific education/training requirements for a specific
technology to make sure that on-site evaluators have sufficient guidance
to determine the appropriate level of training and experience for people
analyzing samples for lead. 

 

Comment #96 

AIHA

ABs should be allowed to retain the flexibility to maintain additional
requirements for the laboratories participating in its lead laboratory
accreditation program.  

 

EPA’s response: 

 

This comment is addressed in response to comment # 3.

 

Comment #97  

AIHA

 

EPA should establish a Federal Advisory Task Force composed of
approximately eight stakeholders, including 2 representatives each from
AIHA, A2LA, EPA and the laboratory community. The Task Force would be
charged with examining the host of complex issues surrounding the
implementation of a LQSR for FSMOs to address issues that accreditation
bodies would face including conformance with ISO/IEC 17011 requirements.


 

EPA’s response: 

Though the participants are not in a formal “task force,” EPA has
been and is working closely with the current recognized ABs to develop
specific educational/training requirements for a specific technology to
make sure that on-site evaluators have sufficient guidance to determine
the appropriate level of training and experience for people analyzing
samples for lead. The current requirements in LQSR 3.0 are consistent
with the ISO/IEC 17025 (May 2005).   If others in the laboratory
community would like to participate, they would be welcome.

Comment #98 

CCHD

The lab as an entity must have an acceptable SOP manual.  It should
therefore include corrective actions to be taken if the XRF operates
erratically, will not calibrate, etc.  If such a situation arises, the
SOP will direct the operator with corrective measures such as farming
the samples out or using a different XRF unit. 

EPA’s response: 

 

EPA agrees with this comment.   LQSR 3.0, section 4.2.3 directs that the
Quality Management System Manual include procedures for corrective
action.  Sections 4.11.1 - 4.11.5 elaborate on the requirements.

“4.11.1	General

	A policy and procedure for corrective action shall be established and
implemented for use by appropriate laboratory authorities when
nonconforming work is identified.  Required changes to the operational
procedures resulting from corrective action investigations shall be
properly documented and implemented.  When reported results have been
affected by nonconforming tests, all affected customers shall be
contacted and informed of the corrective action taken and the amended
results (see 5.10.4).

4.11.2	Cause Analysis

	Corrective action procedures shall begin with an investigation to
determine the root cause(s) of the problem.

4.11.3	Corrective Actions

	Once the root causes have been identified, the laboratory shall
identify possible corrective actions.  The action(s) most likely to
eliminate the problem and prevent its recurrence shall be selected and
implemented.

4.11.4	Monitoring the Corrective Actions

	Once corrective action(s) have been implemented, the laboratory shall
monitor the results to make sure the actions taken have been effective
at addressing the problem(s) that caused the nonconformance.

4.11.5	Special Audits

	If there is doubt that the laboratory is complying with its own
policies and procedures or those of relevant standards, then the
laboratory shall ensure that the appropriate areas of activity are
audited.”

Comment #99  

CCHD

 

The educational requirements should be a high school diploma to operate
the XRF device plus the mandatory XRF training already in place.  It
would be reasonable to slightly expand the XRF training class for dust
wipe methodology.  XRF operators should also have a reasonable amount of
practice testing with standards for accuracy and precision as outlines
in the SOP manual.  The operator must be at least 80% competent in this
accuracy/precision training and be under supervision of a well trained
operator. 

 

EPA’s response: 

 

EPA agrees with the comments on education requirements for XRF
operators.  

Comment #100  

CCHD

 

Proposed QC criteria listed in LQSR 3.0 Draft are wider than the
criteria listed in existing LQSR 2.  The instrumentation in use today is
fully capable of meeting the existing criteria, and we see no need to
relax the criteria. 

 

EPA’s response: 

 It is the Agency’s intention to allow for flexibility within the
Quality Management System of every organization that is appropriate to a
specific testing technology that is used by that particular
organization.  EPA intentionally removed all technology-specific
requirements from the LQSR 3.0 and proposed that instrument QC
requirements are based on performance and lead regulatory needs, rather
than analytical technique.  Acceptance criteria shall be determined,
documented and used.  Each laboratory shall establish its own
performance criteria for QC samples (uncertainty of measurement), which
shall not be greater than those stated in LQSR 3.0.  For example, see Section 5.9.
 Section 5.9 and the accompanying tables can be found in response to
comment # 11. 

Comment #101 

CCHD

 

What steps does EPA intend to require of manufacturers of a given new
technology to document any limitations of their technology and ensure
these are understood by the user? 

EPA’s response: 

LQSR 3.0, section 5.4 provides that new, alternative or modified
analytical methods and/or new testing technologies may be used by a
laboratory if they have been validated by the laboratory or a third
party and shown to meet the minimum performance requirements stated in
5.4.1(b).  

Where sample preparation and analysis methods are not specified by
regulatory programs, the laboratory shall, whenever possible, use
validated procedures published by federal agencies such as EPA, HUD or
NIOSH, state agencies, or nationally or internationally recognized
consensus standards organizations.   For example, EPA document
747-R-95-008, Methodology for XRF Performance Characteristic Sheets is
such a document, and can be found on the web at
http://www.epa.gov/lead/pubs/leadtpbf.htm#Identifying.

Comment #102

CCHD

Will manufacturers be required to perform a series of standardized
challenge tests (e.g. tests to limitations or performance) and publish
the results? 

EPA’s response: 

No, EPA does not require the manufacturers of new testing technologies
to document their limitations or performance.  The laboratories or a
third party are responsible for validating the new technology to ensure
it meets the minimum performance requirements stated in 5.4.1(b).

Comment #103

CCHD

How do the states and end users know that the technology is valid? 

EPA’s response: 

The technology is valid if sample test results are confirmed by a third
party laboratory.  The laboratory shall confirm its competency utilizing
the method as described in Section 5.4.1(a).

“Methods shall not be used for sample analysis until the laboratory
has confirmed and documented its proficiency in using these methods. 
Competency in using test methods shall be demonstrated over the lead
concentration and sample mass ranges for each matrix stated by the
method, as appropriate.”

Comment #104

CCHD

Do states need to require validation studies, when this could be more
efficiently conducted on a national level? 

 

EPA’s response: 

Each state may decide its own validation requirements if it chooses to
impose any.  As indicated in the response to comment 108A, where
sample preparation and analysis methods are not specified by regulatory
programs, the laboratory shall, whenever possible, use validated
procedures published by federal agencies such as EPA, HUD or NIOSH,
state agencies, or nationally or internationally recognized consensus
standards organizations.  For example, EPA document 747-R-95-008,
Methodology for XRF Performance Characteristic Sheets, is such a
document and can be found on the web at
http://www.epa.gov/lead/pubs/leadtpbf.htm#Identifying.

Comment #105 

CCHD

 

The Connecticut DPH recommends that the limit for the positive control,
for instruments that do not give a numeric reading, be changed so that
the concentration is no more than 10% from the regulatory limit.  This
is a more conservative approach to ensure that samples near, but above,
the limit yield the correct response. 

 

EPA’s response: 

 

Traditional fixed-site laboratories in NLLAP are allowed a range of plus
or minus 20% for their LCS samples; hence, 20% was adopted in the case
referred to in the comment.  In the interest of equitability for all the
technologies that might be considered under NLLAP, the limit will remain
at 20%.  As with other NLLAP criteria, a laboratory or FSMO may choose
to adopt a limit within the NLLAP boundary of 20%; however, in order to
be accredited, each laboratory will have to establish its own
performance criteria that cannot be greater than 20% and that will
improve over time.  In order to establish such criteria for a new
technique or a new instrument, one must have a starting point and a
sufficient set of measurements to perform a statistical analysis
(usually 20 measurements).  Also see the response to comment #16. 
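The approach described above, in which a laboratory derives its own control limits from an initial set of about 20 measurements while staying within the NLLAP boundary of plus or minus 20%, can be sketched as follows.  This is an illustrative example only, not an NLLAP-prescribed procedure; the function name, the mean-plus-or-minus-three-standard-deviations rule, and the sample recovery values are assumptions for demonstration.

```python
# Illustrative sketch (not an NLLAP-prescribed procedure): deriving
# laboratory-specific LCS acceptance limits from an initial set of
# ~20 recovery measurements, truncated at the NLLAP boundary of
# plus or minus 20%. Names and data are hypothetical.
from statistics import mean, stdev

def lcs_control_limits(recoveries_pct, nllap_bound=20.0):
    """Return (lower, upper) acceptance limits in percent recovery.

    recoveries_pct: LCS recoveries, e.g. 98.5 means 98.5% of true value.
    Limits here are mean +/- 3 standard deviations, truncated so they
    never fall outside 100 +/- nllap_bound percent.
    """
    m = mean(recoveries_pct)
    s = stdev(recoveries_pct)
    lower = max(m - 3 * s, 100.0 - nllap_bound)
    upper = min(m + 3 * s, 100.0 + nllap_bound)
    return lower, upper

# Hypothetical starting point: 20 initial LCS recoveries for a new instrument
recoveries = [97.2, 101.5, 99.8, 103.1, 96.4, 100.2, 98.7, 102.3,
              99.1, 97.9, 100.8, 101.9, 98.2, 99.5, 102.7, 96.8,
              100.1, 98.9, 101.2, 99.6]
low, high = lcs_control_limits(recoveries)
print(f"acceptance window: {low:.1f}% to {high:.1f}% recovery")
```

As the laboratory accumulates more measurements, the same calculation can be repeated on the growing data set so that the limits tighten over time, consistent with the requirement that performance criteria improve.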

 

Comment #106 

CCHD

 

The Connecticut DPH feels that wherever possible, the requirements for
large and small operations be kept the same. We do not feel it is fair
to have different credential requirements based upon the size of the
firm. 

 

EPA’s response: 

 

EPA agrees with this comment.  This issue is also addressed in the
response to comment #14.  It is the intent of the LQSR 3.0 to apply to a
variety of analytical operations that perform sampling and/or
quantitative and/or qualitative analytical testing of paint chip (film),
dust, and/or soil samples for lead, regardless of the number of
personnel or the extent of the scope of testing activities. 

 

Comment #107 

NCHH

 

If you believe that there would be merits for the National Center for
Healthy Housing entering into an MOU with EPA so the Center can become
an accrediting organization like AIHA and A2LA, please let us know. 

 

EPA’s response: 

The requirements for becoming an NLLAP accrediting organization are
numerous, specific, and very specialized.  As such, meeting them
requires abilities and an infrastructure that could be expected to limit
the pool of applicants.  

EPA welcomes any organization to apply as an NLLAP accrediting
organization if it believes it can meet the requirements.  EPA would
then evaluate the application and decide accordingly.  

Any organization seeking EPA’s recognition as an NLLAP accreditation
body will be evaluated by EPA.  The criteria include: 1) compliance with
ISO/IEC Guide 58, Calibration and Testing Laboratory Accreditation
Systems - General Requirements for Operation and Recognition; 2)
compliance with specific requirements set forth in the NLLAP Model
Memorandum of Understanding; and 3) compliance with specific
requirements set forth in “Laboratory Quality System Requirements,”
an appendix of the MOU.  For more information visit
http://www.epa.gov/lead/pubs/nllap.htm 

Comment #108 

U.S. Department of Housing and Urban Development (HUD) 

HUD suggests changing the definition of "field sampling and
measurement organization" to require that the organization isolate the
testing in a way that minimizes contamination of samples. 

 

EPA’s response: 

 

EPA does not believe it is appropriate to address the minimization of
contamination in the definition of FSMOs.  Instead, the requirement
that the organization isolate testing so as to minimize contamination
of samples is addressed in Section 5.3, which states: 

 

For laboratories operating portable testing technologies, sample
collection and field testing shall be conducted so as to minimize the
chance of cross-contamination.  Site access shall be controlled to the
extent possible while sampling and testing are taking place.  Activities
at the site shall be conducted to minimize the possibility of creating a
hazard during or after testing, or contaminating the site from sampling
and testing. 

 

Contamination control by an FSMO is also addressed in Section 5.3.1.1,
Laboratory Dust Wipe Checks, which states: 

 

For FSMOs, appropriate contamination control blank samples shall be run
in order to monitor potential lead contamination as outlined in the
QMSM. 

 

Comment #109  

HUD

 

(1) What about an FSMO that may be operating during non-standard working
hours? (2) What is the performance standard for "easily accessible"? 
Will this vary by, for example, type of laboratory, time of day, day of
week, etc.? 

 

EPA’s response: 

 

Section 4.1.1.2 provides assurance that the technical manager is
accessible for technical consultation at all times.  The proposed
statement in LQSR 3.0 ensures adequate supervision of all laboratory
technical personnel regardless of their work schedule. 

 

Comment #110  

HUD

 

What about gaining and using feedback from clients' field sampling
quality assurance activities? 

 

EPA’s response: 

 


Feedback from customers is an important part of the laboratory
management system and is addressed in Section 4.7, SERVICE TO THE
CUSTOMER, which states: 

 

The NLLAP recognized laboratory shall cooperate with the customer.  This
may include providing the customer or customer representative reasonable
access to areas of the laboratory to witness work performed for that
customer or providing test or sample items needed by that customer for
purposes of verification.  The laboratory should also maintain contact
with the customer throughout the period of work and inform the customer
of any delays and/or deviations in performance of the tests. 
Laboratories are also encouraged to seek feedback from the customer for
improvement of the quality system. 

 

In addition to this section, HUD’s suggested text was added to Section
5.9, ASSURING THE QUALITY OF TEST RESULTS, to state: 

 

The laboratory quality control (QC) program shall include the continual
evaluation of its performance: determinations of bias and precision for
each matrix analyzed; and participation in the ELPAT Program for each
matrix analyzed.  Also laboratories are encouraged to participate in
interlaboratory comparisons and/or other proficiency testing programs. 

 

Because the client may also be providing duplicates, spikes and/or
blanks as part of their own sampling protocol, the laboratory should
seek client feedback, whenever possible, on the laboratory’s
performance after reports are provided to the client; any information
obtained should be included in the laboratory QC performance evaluation.


 

Comment #111 

HUD

 

Provide a definition for the term “sample”. 

 

EPA’s response: 

 

The following definition of “sample” was included in the Glossary: 

 

Sample: An aliquot taken from a painted surface (paint), surface (dust),
or soil in accordance with 40 CFR 745 that was physically removed from
its matrix for analysis by a laboratory. 

 


