Summary of Comments and Responses on 

Performance Specification 16 for 

Predictive Emission Monitoring Systems

Prepared September 2008

Proposed in the Federal Register on

August 8, 2005




TABLE OF CONTENTS

1.0  Summary

2.0  List of Commenters

3.0  Public Comments and Responses

3.1  Comments on Performance Specification 16

		Supportive Comments
		Non-supportive Comments
		General Comments
		Normal Load vs. Mid-level Load
		PS-16 and Market-based Programs
		PS-16 and the Draft PS on the EPA Website
		Relative Accuracy Based on DQO
		Alternative Limit for Low Emitters and PEMS RA Stringency vs. CEMS
		Testing Loads and Number of Relative Accuracy Runs
		Number of Runs for Market-Based PEMS
		Statistical Analyses
		Periodic PEMS Assessments
		QA of Associated Parameters
		Use of Portable Analyzers for the RAA
		Ongoing QA Tests Table in Section 9.1
		Number of RATA Runs
		Three Operating Conditions Unclear
		Definition of and Examples of PEMS in Section 3.6
		Applicability of PS-16
		Potential Overlap of PS-16 and PS-17
		Number of RM Tests Where Emissions are Steady Across Loads
		PEMS Difference with RM Due to Response Times
		PEMS Use in Showing Exceedances
		PEMS Use in Startup and Shutdown
		PEMS for O&M Monitoring
		PEMS with Less Than 3 Parameters
		Part-time PEMS Use
		Flexibility for Unique Operation Scenarios
		Sensor Evaluation System
		Graphic Representation of PEMS Relationships
		Sensor Envelopes
		Input Calibrations
		Detailing Neural Network PEMS Calculations
		Data Substitution
		PEMS Design
		Specific Comments by Section

3.2  Comments on Correction of Subpart J (Petroleum Refineries) Coke Burn-off Equation

3.3  Comments on Subpart BB (Kraft Pulp Mills) Amendment

3.4  Comment on PS-2 Amendment

3.5  Comments on PS-11 Revisions

		General
		Sample Volume Audits
		Daily Drift Checks

1.0  SUMMARY

	On August 8, 2005, the U.S. Environmental Protection Agency (EPA)
proposed Performance Specification (PS) 16 for predictive emission
monitoring systems (PEMS).  These systems represent a new and innovative
way of monitoring pollutant emissions without the traditional hardware
analyzers.  Predictive emission monitoring systems can be used as
alternatives to continuous emission monitoring systems (CEMS).  Unlike
CEMS, PEMS determine source emissions without measuring them directly. 
These systems predict emissions using process parameters and known
relationships between these parameters and emission concentrations. 
They operate on principles ranging from simple combustion relationships
to more complex computer models that are trained using historical
pollutant data and neural networks technology.  

	The Agency has allowed PEMS use in a number of recently-promulgated
rules, and they are being considered by a number of facilities regulated
by State and Local agencies.  They are currently allowed as monitoring
alternatives under certain conditions under the New Source Performance
Standards, as well as under the Clean Air Markets rules.  Several PEMS
have been approved for nitrogen oxides (NOx) measurement at industrial,
commercial, and institutional steam-generating units (Subpart Db of Part
60).  Other facilities likely to consider PEMS include gas turbines,
internal combustion engines, and other combustion processes where
process parameters (primarily NOx and CO) have a known relationship with
air emissions.  Performance Specification 16 sets the performance
requirements for PEMS to meet in order to be considered acceptable for
use. 

2.0  LIST OF COMMENTERS



Item Number in OAR-2003-0074	Commenter and Affiliation

0007	Thomas Gasloli, Michigan Dept. of Environmental Quality

0008	Brian Swanson, CMC Solutions, L.L.C.

0009	Linda Farrington, Eli Lilly and Company

0010	Ted Cromwell, American Chemistry Council

0011	Luis A. Comas, Sunoco Environmental Services

0012	Norbert Dee, National Petrochemical & Refiners Association

0013	Greg Gouzev, Solar Turbines, Inc.

0014	Erik G. Milito, American Petroleum Institute

0015	Donald R. Schregardus, U.S. Navy

0016	Tracy P. Stanton, Environmental Systems Corporation

0017	Tim Hunt, American Forest and Paper Association

0018	Lauren Freeman, Hunton & Williams, LLP

0019	Lisa S. Beal, Interstate Natural Gas Association of America

0020	M. Lopes, Unaffiliated Commenter

0021	John E. Pinkerton, National Council for Air and Stream Improvements, Inc.

0022	Robert D. Bessette, Council of Industrial Boiler Owners

0023	Scott Evans, Clean Air Engineering

0024	Darrell Bayless, Hoosier Energy Rural Electric Cooperative, Inc.

0025	Valerie Ughetta, Alliance of Automobile Manufacturers

0026	Don Mark Anthony, Alyeska Pipeline Service Company

0028	Paul Reinermann, Pavilion Technologies

0029	Paul Reinermann, Pavilion Technologies

0030	Paul Reinermann, Pavilion Technologies

0031	Paul Reinermann, Pavilion Technologies

0032	Paul Reinermann, Pavilion Technologies

0033	Brian Swanson, CMC Solutions

0034	Daren K. Zigich, Broin and Associates

0035	Dan Despen, Interpoll Laboratories, Inc.

0036	Edward D. Daniels, Lockheed Martin Aeronautics Company

0037	Hon. Lamar Smith, U.S. Congress

0038	Joseph J. Macak, Mostardi Platt Environmental

3.0  PUBLIC COMMENTS AND RESPONSES

3.1  Comments on Performance Specification 16

Supportive Comments

	1.  Comment:  In our view, PS-16 provides excellent performance
criteria and guidance for compliance certification.  Rigorous scientific
methods give both the original equipment manufacturer and the customer a
clear path for deployment of PEMS in a stable regulatory environment. 
We also support the comments submitted by Commenters 0008, 0033, and
0019.  (0013)

	Response:  No response is necessary.

	2.  Comment:  EPA’s openness to PEMS is very welcome.  There are
many situations where PEMS could be used to provide accurate emission
estimates for trading purposes or to demonstrate a reasonable assurance
of compliance with emission limits.  There are even situations where a
PEMS may outperform a CEMS, such as units that operate very
infrequently or remote sources where regular CEMS maintenance may prove
difficult.  (0018)

	Response:  No response is necessary. 

	3.  Comment:  We support EPA in its efforts to procedurally enhance the
viability of PEMS as a reliable emission monitoring tool for estimating
emissions.  (0024)

	Response:  No response is necessary.

	4.  Comment:  We have used the draft PS document entitled “Guidance
for Certification of Predictive Emissions Monitoring Systems used for
Compliance Determination” in the past for certification under 40 CFR
Part 60 and Subpart E of 40 CFR 75.  The proposed PS-16 is a tremendous
improvement over the draft version, and EPA should be commended for the
work completed to move this performance specification forward.  The
tighter performance criteria and the statistical tests required for
certification are more consistent with the requirements of 40 CFR Part
75, Subpart E.  The rigorous quality assurance and detailed
certification testing procedures are valuable for the PEMS supplier and
the regulatory agency alike, both of whom seek a common set of objective
criteria to gain approval to use PEMS for compliance determination under
40 CFR Part 60.  (0008)

	Response:  No response necessary.

	5.  Comment:  Our experiences have indicated that CEMS technology
requires significant capital investment, including initial costs as well
as on-going replacement costs since CEMS analyzers have a relatively
short life expectancy of 3-7 years.  This, along with the high
maintenance costs associated with these systems, provides incentive for
companies to consider newer, more cost-effective technologies such as
PEMS.  Thus we appreciate the agency’s efforts to recognize PEMS as
acceptable monitoring alternatives.  (0009)

	Response:  No response is necessary.

Non-supportive Comments

	6.  Comment:  We are concerned that the proposed amendments to PS-16
will greatly increase the costs of emissions monitoring without any real
benefits to accuracy or environmental performance and may render the use
of PEMS technically and economically infeasible.  We are also concerned
that the proposed changes greatly limit the flexibility afforded
operators of smaller Db boilers to use PEMS instead of CEMS.  (0034,
0035)

	Response:  Performance Specification 16 should not significantly
increase the costs of emission monitoring.  In the data on past PEMS
performance compiled in the docket, 85 percent of the NOx PEMS,
91 percent of the CO PEMS, and 100 percent of the oxygen PEMS (though
the data pool included only 3 oxygen PEMS) were able to meet the
proposed PS-16 accuracy requirement when RATA tested.  Seventy-nine
percent of the NOx PEMS (the predominant PEMS application) met an RA of
7.5 percent or better.  These data show that PEMS can be very capable
monitoring tools with superior performance.  The data also show that
the technology in general is capable of meeting performance limits
substantially tighter than the 20 percent relative accuracy limit set
for CEMS several decades ago.

	7.  Comment:  In general, we feel that the proposal is overly
burdensome, restrictive, and costly without providing any significant
increase in environmental protection over that provided by the
guidelines currently in effect.  (0036)

	Response:  See the response to Comment 6.

General Comments

	8.  Comment:  We concur that a PEMS, when properly installed and
maintained, is an accurate compliance monitoring tool that is more cost
effective than a CEMS with on-line gas analyzers.  Because we currently
use a PEMS and may pursue development of additional PEMS in the future,
we are very interested in ensuring that the new PS does not become so
burdensome that this alternative monitoring method loses its appeal
relative to more conventional monitoring methods such as CEMS or
periodic emission testing.  (0026)

	Response:  We do not believe the final PS-16 is burdensome or that it
discourages the consideration of PEMS as alternatives to CEMS.

	9.  Comment:  We have designed a statistical hybrid PEMS to meet the
requirements of this performance specification.  This includes the use
of the statistical hybrid PEMS for compliance reporting in market-based
emissions trading programs.  Our PEMS has met the requirements of 40 CFR
Part 75, Subpart E and is in use today as a complete compliance
monitoring and reporting system for market-based emissions trading
programs at sites in the United States.  Please review docket OAR-
2005-0099 (07/13/2005) “Receipt of Request for Initial Certification
of Predictive Emission Monitoring Systems” for additional details. 
This statistical hybrid PEMS does perform well during startups,
shutdowns, and malfunctions when properly trained with historical data
that includes startups, shutdowns, and transitional operating states. 
The statistical hybrid PEMS is provided with this data by using
certified CEMS, or other temporary or mobile monitoring equipment
quality assured using EPA reference methods, to collect historical
emissions training data.  This testing is done during normal operation
and, when conducted over the broadest range of operating and ambient
conditions, provides the resolution necessary to meet all of the
requirements of 40 CFR Part 75, Subpart E, which establishes accuracy
across the full range of normal unit operations including startups,
shutdowns, and transitional states.  (0008, 0033)

	Response:  No response necessary.

“Normal” Load vs. “Mid-level” Load

	10.  Comment:  In Section 8.2, the load levels are listed as low,
normal, and high. To avoid confusion, the "normal" load should probably
be changed to "mid".  Many sources operate normally at high or low
levels.  Combustion turbines typically run in the high operating level. 
As it is written, “normal” seems to refer to the middle portion or
“mid” of the operating range.  We recommend that EPA replace the
“normal” operating level with the “mid” operating level.  The
three operating levels, low, mid, and high, could then be defined in
terms of percentage of the range of operation.  A determination of the
“normal” load or operating level would then be made after performing
a historical load analysis or using sound engineering judgment, based on
knowledge of the unit and operating experience with the industrial
process.  This also establishes continuity with 40 CFR Part 75, Appendix
B.  (0007, 0016, 0024)

	Response:  We agree with your suggestion.  “Normal” has been
changed to “mid.”  

PS-16 and Market Based Programs

	11.  Comment:  We recommend that PS-16 not include PEMS for market
trading programs since PS-16 is being proposed under the 40 CFR Part 60,
Appendix B requirements.  Market trading PEMS are currently approved by
the Clean Air Markets Division (CAMD) as an alternative under 40 CFR
Part 75, Subpart E.  Note that market trading CEMS requirements are
addressed wholly under Part 75 and, in fact, a Part 75 certified CEMS
is deemed to meet the requirements of Part 60.  EPA should adopt the
same regulatory structure for PEMS for consistency.

	We have PEMS approved by CAMD and believe that PS-16 has been
unnecessarily influenced by the CAMD PEMS requirements established under
Part 75, Subpart E.  The Subpart E requirements are too costly as only a
few PEMS (less than 10) have been approved under these requirements
whereas over 100 PEMS have been approved for use under 40 CFR Part 60
utilizing the PEMS requirements in the example performance
specifications published by EPA in January 1996.   

	We recommend that any PEMS requirements for market-based programs be
proposed and promulgated under 40 CFR Part 75.  If EPA does this, then
PEMS can be used under the Acid Rain Program as well as other Market
Based Programs.  (0028) 

	Response:  We agree with the commenter.  In the final rule, we have
removed the provisions for market-based PEMS that were in the proposal. 
Performance Specification 16 now only addresses compliance-based PEMS. 
Market-based PEMS should follow the requirements listed in Subpart E of
40 CFR 75 as applicable.  Other market-based programs may incorporate
PS-16 as they see fit.

	12.  Comment:  Although PS-16 refers to PEMS used in market-based
programs, there are major discrepancies between the initial
certification and periodic quality assurance test requirements provided
in the proposed PS-16 and those provided in Appendix E of 40 CFR 75,
which is the regulation that applies to market-based programs.  I
request that, before PS-16 is finalized, the PEMS requirements provided
in Appendix E of 40 CFR 75 be reviewed and incorporated into the final
PS-16.  (0020)

	Response:  See the response to Comment 11.

	13.  Comment:  We believe the Agency should make clear that PS-16 does
not apply to established alternative or predictive monitoring systems
under 40 CFR 75, for example, those approved under Subpart E or Appendix
E, the optional NOx emissions estimation protocol for gas- and oil-fired
peaking units.  We believe that these provisions are well established
and used by many sources; making changes to the requirements could have
a significant impact on facilities that currently use those provisions. 
(0022)

	Response:  Performance Specification 16 now notes that it does not
apply to Part 75 PEMS.  We also discuss in the final rule preamble that
application of PS-16 is not retroactive to PEMS that have already been
approved as alternative monitoring systems.

	14.  Comment:  We request clarification from the Agency on the intended
applicability of PS-16.  In reading the preamble, the proposed PS-16
seems to apply to PEMS used for compliance with monitoring requirements
under NSPS and MACT standards at 40 CFR 60, 61, and 63.  However, in
Section 8.2.3, the proposal language implies that this specification may
apply to PEMS used in market-based trading programs, such as those for
which monitoring is done under 40 CFR 75.  Is it the Agency’s
intention to apply this performance specification to monitoring done in
accordance with 40 CFR 75?  (0022)

	Response:  We have clarified PS-16 to note that it does not apply to
Part 75 alternative monitoring applications.  In the final rule, we have
removed the provisions for market-based PEMS that were in the proposal. 


	15.  Comment:  We recommend EPA establish a compliance window for
facilities currently operating data acquisition and handling
system-emission monitoring and reporting (DAHS EMR) under Appendices D
and E of 40 CFR 75 to estimate emissions.  We suggest PS-16 allow
existing DAHS EMR owners/operators 180 days after the final rule
effective date to achieve compliance with PS-16.  The compliance window
would allow facilities to select, order, and install additional equipment
beyond current regulatory requirements.  Finally, we recommend no
additional relative accuracy (RA) tests be required for sources which
have completed RA tests in the last five years.  (0024)

	Response:  Performance Specification 16 does not apply to 40 CFR 75
facilities nor does it apply retroactively to PEMS that have already
been approved and certified.  It is a specification that applies to
future PEMS use.  Current PEMS that have been approved for use under the
NSPS are not required to reassess their performance upon promulgation of
PS-16.

PS-16 and the Draft Performance Specifications on the EPA Website

	16.  Comment:  We recommend that EPA abandon the proposed PS-16 PEMS
requirements and instead propose the example PEMS requirements that were
published in January 1996 by the Emission Measurement Center (two
documents: “Example Specifications and Test Procedures for Predictive
Emission Monitoring Systems” and “Alternative Monitoring Protocol
Predictive Emission Monitoring System To Determine NOx and CO Emissions
From An Industrial Furnace”).   

	From October 1994 through December 1995, EPA studied PEMS and published
example PEMS requirements and PEMS protocols in 1996.  These PEMS
requirements have been the de facto PEMS requirements throughout the
United States since their publication.  A departure from the 1996 PEMS
requirements will result in the demise of PEMS because of increased cost
for initial certification and ongoing maintenance. 

	The January 1996 PEMS requirements have been approved for numerous PEMS
installations by EPA’s Office of Air Quality Planning and Standards
and Regional Offices for determining compliance with operation and
maintenance standards as well as emissions standards under various
subparts of 40 CFR 60.  EPA’s applicability determination index contains
a large number of PEMS determinations that effectively state that EPA
will allow the PEMS provided they meet the January 1996 PEMS
requirements.  Each determination spells out the requirements or
references a PEMS plan reflective of the January 1996 PEMS requirements,
and those PEMS plans were approved by EPA.

	To state that PEMS is a new technology in the preamble to the August 8,
2005 proposal is misleading.  PEMS have been used for almost 12 years to
document compliance for a boiler subject to 40 CFR Part 60, Subpart Db
and EPA published PEMS requirements in January 1996.  The precedent of
these PEMS requirements and the research that went into them resulted in
good, fair PEMS requirements that ensure accurate, precise, and reliable
PEMS.  This precedent should continue to be followed, and the final
PS-16 requirements should reflect the January 1996 PEMS requirements, as
these have been the de facto PEMS requirements for over 9 years.

	If the EPA would limit the scope of PS-16 to PEMS allowed under 40 CFR
Part 60 (either directly by the Part 60 subpart or as an alternative to
CEMS under §60.13(i)) and adhere strictly to the PEMS requirements
published in January 1996, then EPA would be encouraging the use of
innovative technologies such as PEMS.  (0028)

	Response:  The “Example Specifications and Test Procedures for
Predictive Emission Monitoring Systems” was only a guidance document
to give PEMS users and regulators a general idea of what could be
expected of PEMS in light of the limited performance data available at
that time.  The requirements of that document were primarily based on
the existing requirements in PS-2 for CEMS and not on extensive
research.  The document was offered on the EMC website until the Agency
could develop and finalize PS-16.  Since then, we have acquired
historical RATA data from a number of PEMS, and our understanding of
their capabilities has increased.  These data are presented in the
docket and give a better indication of PEMS performance than what is
reflected in the guidance document.  They confirm that the performance
levels set in PS-16 are achievable by the vast majority of PEMS in the
data pool and are more reflective of the technology’s capabilities. 
The requirements in PS-16 should not result in the demise of PEMS due to
increased cost for initial certification and ongoing maintenance.  See
the response to Comment 19 where we offer alternative performance
criteria for low-emitting sources.

	17.  Comment:  While we agree with the Agency’s goal of providing a
uniform PS for PEMS, we are concerned that the proposed rule could lead
to increased costs and unnecessary audits for the manufacturing industry.
The proposed PS-16 rule departs from the proven method of oversight and
relies on requirements that could result in negative impacts on the
manufacturing industry and the PEMS market.  A more viable,
cost-effective, and reliable alternative to the proposed rule might be
to adopt the 1996 requirements as precedent.  We ask that the EPA
carefully consider the impact of the proposed rule and provide
justification for any final rule.  (0037)

	Response:  See the response to Comment 16.

Relative Accuracy Based on Data Quality Objectives (DQO)

	18.  Comment:  EPA distinguishes different applications for PEMS used
for continual compliance standards and market trading programs.  We
agree with these two distinct applications but disagree that the same
acceptance criteria should be applied to both.  We believe that the EPA
should use the DQO process to determine the acceptance criterion for a
PEMS certification.  Over the past few years, EPA has spent considerable
time developing and implementing the DQO process.  The DQO process is a
quality assurance procedure that ensures data submitted to EPA are of
adequate quality to meet regulatory requirements.  We agree with the
general approach EPA has taken with the DQO process.  Certainly data
collected from CEM systems and PEM systems used in compliance decisions
are applicable to the DQO process.  However, the acceptance criteria
proposed in PS-16 are not consistent with this approach.

	The disconnect begins with Step 1 of the DQO process:  stating the
problem. Getting this step right is extremely important to correctly
interpret the resulting data.  In EPA’s own words: 

	It is critical to carefully develop an accurate conceptual model of the
environmental problem in this step of the DQO process, as this model
will serve as the basis for all subsequent inputs and decisions.  Errors
in the development of the conceptual model will be perpetuated
throughout the other steps of the DQO process and are likely to result
in developing a sampling and analysis plan that may not achieve the data
required to address the relevant issues.  (From EPA/600/R-96/055)

	A continuous compliance system (whether a CEMS or a PEMS) provides data
to determine whether a source is in compliance with its emission limit. 
On the other hand, a market trading system provides data to determine
how many emission credits have been used by a source.  These problems
pose fundamentally different questions in Step 2 of the process.  In the
case of a market trading system, the question is:  How much of compound
X is the source emitting?  In the case of continuous compliance systems
the question is:  Is the source below its emission limit?

	In the first case, a regulator needs fairly high confidence of the
actual quantity of emissions from that source.  However, in the second
case, the regulator needs fairly high confidence that the source is
“below the line.”  EPA’s approach in the proposed PS-16 always
assigns the confidence to the quantity.  While this approach is
reasonable for market trading systems, it may result in unnecessarily
stringent requirements for sources not engaged in emission trading.

	As an example, assume that a source has an emission limit of 5 lb/hr of
NOx.  It does not participate in any emission trading program.  Past
testing shows that the plant has a maximum NOx emission rate of 2 lb/hr.
The plant tests a PEMS and finds that it has a relative accuracy of
± 25 percent.  This means that at its maximum emission rate, the PEMS
may (worst case) report emissions of 2.0 lb/hr while actual emissions
may be as high as 2.5 lb/hr.  However, even with this degree of
uncertainty in the PEMS output, the confidence of compliance is greater
than 99 percent, i.e., the data show that the source is below its
emission limit with greater than 99 percent confidence.  This PEMS
successfully answers the compliance question “Is the source below its
emission limit?”  Yet under EPA’s proposal such a system could not be
used as a PEMS.

	By forcing PEMS used for continuous compliance purposes to conform to
an arbitrary 10 percent RA requirement, EPA eliminates many PEMS
strategies that are cost-effective while providing very high assurance
of compliance. 

	We propose that EPA modify the proposal to allow continuous compliance
PEMS that can demonstrate that their output provides at least a 95
percent confidence that the source is below its emission limit.  The
simplest way to achieve this is to replace the static 10 percent RA
requirement with a requirement that the PEMS output its data at the
upper 95 percent confidence interval.  This value is already known from
the RA test, and calculations can follow those already established in
Section 12.3 of Performance Specification 11.

	The advantage to this approach is that it preserves the ability of the
regulatory authority to have a high degree of confidence that a source
is meeting its compliance requirements while, at the same time,
providing the source with much more flexibility as to the type of PEMS
it can use.  (0023)

	Response:  We have dropped the proposed application of Performance
Specification 16 to market-based PEMS.  The requirements for
market-based PEMS are discussed in Subpart E of Part 75 or in other
provisions that have been developed for this purpose.  We agree that for
compliance PEMS, the primary purpose of the data is to indicate whether
a source is in compliance.  However, we do not discount the need for
accuracy in measurements that are substantially below the emission
limit.  Data have shown that current PEMS technologies are capable of
producing accurate determinations over a broad range of measurements. 
Given this capability, we do not see a need to relax data quality
requirements, which would allow the use of inferior systems, simply
because the general DQO for showing compliance has been met.  The use of
compliance data for secondary purposes would also be compromised by this
approach of tailoring the prescribed accuracy.  Enforcement discretion
may be used in cases where performance limits are marginally exceeded
yet compliance is reasonably assured due to the margin of compliance.

	Requiring an upper 95 percent confidence level for PEMS data in place
of the 10 percent RA requirement is one way of determining acceptability
that has been used in PS-11.  However, it was used in PS-11 because of
the difficulty of PM CEMS meeting the RA criterion with data from
different load conditions.  Data from PEMS in current use confirm that
PEMS can achieve the accuracy needed to meet the proposed limits under
different load conditions.  In addition, the calculations for
determining the 95 percent confidence interval are more rigorous and
more subject to error than the traditional RA calculations.
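
For illustration, the traditional relative accuracy statistic referenced in this response can be sketched as follows.  This is a minimal sketch only: the nine-run data are hypothetical, the t-value table covers only a few run counts, and the governing calculation is the one set out in the performance specification itself.

```python
# Sketch of the traditional relative accuracy (RA) calculation used in
# PS-2-style RATAs (illustrative only; the performance specification text
# governs the actual calculation and rounding conventions).
# RA = (|mean difference| + confidence coefficient) / mean RM value * 100
from statistics import mean, stdev

# Two-sided 95 percent t-values (t at 0.025) indexed by degrees of freedom
T_VALUES = {8: 2.306, 9: 2.262, 10: 2.228, 11: 2.201}

def relative_accuracy(pems, rm):
    """pems, rm: paired run averages from n RATA runs (typically n = 9)."""
    n = len(pems)
    d = [p - r for p, r in zip(pems, rm)]           # run-by-run differences
    d_bar = mean(d)                                 # arithmetic mean difference
    cc = T_VALUES[n - 1] * stdev(d) / n ** 0.5      # confidence coefficient
    return (abs(d_bar) + abs(cc)) / mean(rm) * 100  # RA as a percentage

# Hypothetical nine-run NOx RATA data (ppmv)
rm_runs   = [52.0, 50.5, 51.2, 49.8, 50.0, 51.5, 52.3, 50.9, 51.0]
pems_runs = [51.0, 50.0, 52.0, 49.0, 50.5, 51.0, 51.5, 50.0, 50.8]
print(f"RA = {relative_accuracy(pems_runs, rm_runs):.1f}%")
```

Because the RA statistic adds a confidence coefficient to the mean bias and then normalizes by the mean reference method value, it grows rapidly as the reference level falls, which is the low-emitter difficulty discussed in the comments below.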

Alternative Limits for Low Emitters and PEMS RA Stringency vs. CEMS

	19.  Comment:  The relative accuracy test audit (RATA) performance
criterion under the existing draft website PS and under PS-2 for NOx and
SO2 CEMS is 20 percent.  This is a criterion many source owners are
familiar with, and it has served as a standard for relative accuracy as
measured by an RM testing team and as witnessed and administered by
local regulatory agencies.  Performance Specification 2 was revised to
include an alternate standard for low emitters.  The relative accuracy
test requirements for 40 CFR 75 CEMS are 10 percent for semiannual RATA
testing and 7.5 percent for annual RATA testing.  There is an
alternative standard for 40 CFR 75 PEMS and CEMS for low emitters that
is an absolute difference of 0.020 lbs/mm BTU for semiannual RATA
testing and 0.15 lbs/mm BTU for annual RATA testing.

	Proposed Performance Specification 16 sets the RA standard for PEMS at
10 percent.  Alternative standards are also provided; however, they
apply only when the unit is operating either below 25 percent of the
emission standard (where a 20 percent criterion is used) or below 10
percent of the emission standard (where a 2 ppm difference is used).

	Although some PEMS can achieve RAs better than 10 percent, this can be
a problem for low emitters.  The reason is that the RM tests are, at
best, accurate to 5 percent of the actual measurement at higher levels
(i.e., above 100 ppm), and at the lower end of the instrument range
(i.e., < 10 ppm for NOx) accurate only to within a ppm, or about 10
percent.  The alternate criteria as put forth in the proposed PS will
work if they are applied to the low emitters.  Unfortunately this will
not be the case in most instances due to the way permits are written
today.

 	Many NSPS permits are written to set limits just above the actual
emission level of the source.  This is done routinely by States, and
many low-emitters have permit limits that are just above the lowest
base-load emission level put forth in design documentation by the
manufacturer prior to construction.  Thus, these sources, even though
they are low-emitters, generally run in the 75-95 percent of emission
standard range.  Any alternative standard that is based on a percentage
of the emission standard (such as 10 or 25 percent of the emission
standard) will not apply to these sources.  

	Other performance specifications provide an alternative accuracy
requirement for low emitters by qualifying based on absolute emission
levels (either a rate or concentration).  This includes the RATA
criteria alternatives under 40 CFR 75 such as the allowance for NOx RATA
difference of less than 0.15 lbs/mm BTU when emissions are below 0.2
lbs/mm BTU or the alternative for CO relative accuracy when levels are
below 200 ppmv as detailed in 40 CFR Part 60, Appendix B, PS-4A.

	The PEMS alternatives set at 25 and 10 percent of the applicable
standard provide no method for low emitters to validate PEMS in the low
ppm range.  This would also make certification difficult for most PEMS
at low-emitting units.  A unit emitting 5 ppmv NOx with a permit set at
9 ppmv NOx would have to be within 0.5 ppmv of the RM, which is an
unrealistic standard since the RM itself does not provide that level of
accuracy.

	We see no justification for PEMS RA requirements to be more stringent
than those for CEMS.  A PEMS will have to undergo more testing and
quality assurance than a CEMS (e.g., three-load RATA tests), but there
is no reason a PEMS should have to certify against an RM to a level
twice as accurate as a CEMS used for the same compliance documentation
and reporting purposes.

	We suggest that at a minimum, the alternative accuracy requirement be
changed to be based on an absolute emissions level instead of a
percentage of the emission standard.  We also suggest that the
performance specification take into account and detail all the possible
alternative criteria based on the various pollutant standards such as
SO2, CO, NOx in ppmv and ppmv corrected to a certain oxygen level, SO2,
CO, and NOx in lbs/mmBTU since these standards are used in permits that
are issued by the States.

	For example, the criteria could be less than 10 percent RA for CEMS and
PEMS when emissions are greater than 100 ppm or 0.2 lbs/mmBTU.  The
alternative criteria could be a relative accuracy limit of 20 percent of
the RM when emissions are under 100 ppm (0.2 lbs/mmBTU) and greater than
10 ppm or (0.05 lbs/mmBTU) for any pollutant parameter such as SO2, 
NOx, and CO.  For emissions below 10 ppm, we suggest the proposed lower
alternative of less than 2 ppm absolute difference be applied for SO2
and NOx and 5 ppm absolute difference for CO.  (0008, 0013, 0016, 0033)
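
	The tiered structure suggested above can be expressed as a simple
selection rule.  The sketch below is illustrative only: the thresholds
and pollutant groupings are taken from the commenter's proposal, not
from adopted rule text, and the lb/mmBtu equivalents are omitted for
brevity.

```python
# Illustrative sketch of the tiered criteria suggested in the comment above.
# Thresholds and pollutant groupings come from that paragraph, not from
# adopted rule text; the lb/mmBtu equivalents are omitted for brevity.

def applicable_criterion(mean_rm_ppm, pollutant):
    """Return the suggested RA criterion for a given mean RM level in ppm."""
    if mean_rm_ppm > 100:
        return "RA within 10 percent of the reference method mean"
    if mean_rm_ppm > 10:
        return "RA within 20 percent of the reference method mean"
    if pollutant == "CO":                      # below 10 ppm: absolute criteria
        return "mean difference within 5 ppm (absolute)"
    return "mean difference within 2 ppm (absolute)"   # SO2 and NOx

print(applicable_criterion(5, "NOx"))   # the low-emitting NOx unit case
```

Under this rule, the 5 ppmv NOx unit in the example above would be
judged against a 2 ppm absolute difference rather than 10 percent of its
9 ppmv permit limit.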

	Response:  We think the commenter’s suggestion for alternative PEMS
criteria for low emitters is a good idea.  We have added the suggested
alternative criteria for concentrations between 10 and 100 ppm and below
10 ppm.  

	The 20 percent relative accuracy criterion was set for CEMS in the
1970’s and reflects the capabilities of systems at that time. 
State-of-the-art CEMS are capable of much better performance as is shown
by their success under the tighter Part 75 rules.  The data (added to
the docket) that we have evaluated on PEMS currently in use show that
the overwhelming majority are capable of meeting the 10 percent RA
criterion on a regular basis.  We believe that the expectations of
emission data quality should parallel the improvements in technological
capabilities.

	20.  Comment:  The accuracy requirements for the RATA are too limited
and restricted.  They do not reflect the current options for RATA
accuracy allowed for CEMS under Part 60.  Future PEMS should be allowed
the same options and should not have to meet a more strict level of
accuracy requirement than a comparable CEMS.   We recommend the
following RATA requirements: 

Pollutant: NOx less than 0.20 lb/mmBtu (or equivalent)
	Relative Accuracy: Within 0.020 lb/mmBtu of the average test method value
	Basis: Acceptable under the Parts 60 and 75 CEMS RATA requirements; reflects NOx for low-emitting NOx units
	Comments: Original RATA requirements under Part 60 did not anticipate low-emitting NOx units, and Part 75 has incorporated appropriate requirements

Pollutant: NOx (general), O2, SO2, CO, CO2
	Relative Accuracy: 20 percent of the average test method value
	Basis: Identical to Part 60, Appendix B, PS 2, 3, and 4 for CEMS
	Comments: Compliance CEMS are allowed to meet this criterion

Pollutant: NOx (general), SO2, CO
	Relative Accuracy: 10 percent of the applicable standard
	Basis: Identical to Part 60, Appendix B, PS 2, 3, and 4 for CEMS
	Comments: Compliance CEMS are allowed to meet this criterion

Pollutant: O2 and CO2
	Relative Accuracy: Within 1 percent of the average test method value
	Basis: Identical to Part 60, Appendix B, PS 3 for CEMS
	Comments: Compliance CEMS are allowed to meet this criterion

Pollutant: CO
	Relative Accuracy: Within 5.0 ppm of the average test method value
	Basis: Equivalent to the PS 4 requirement
	Comments: Compliance CEMS are allowed to meet this criterion for CO CEMS on refinery units under 40 CFR Part 60

(0028)

	Response:  See the response to Comment 19.  We do not anticipate
diluent applications for PEMS; therefore alternative criteria for low
concentration diluent systems have not been added.  

	21.  Comment:  The performance criteria for the RATA and quarterly RAA
for PEMS are more rigorous than the analogous requirements for CEMS
(e.g., see PS-2 for NOx CEMS).  The EPA should revise the acceptability
criteria in PS-16 to the analogous criteria from PS-2. 

	Performance Specification 16 serves as a PEMS analogue to existing
specifications for CEMS.  The test requirements for PEMS and CEMS should
be similar.  The added stringency for PEMS is unwarranted.  In fact,
since the quarterly cylinder gas audit (CGA) option for CEMS is not
available to PEMS and an RAA is required, PEMS criteria are already more
rigorous than the analogous CEMS criteria.  In this case (i.e., CGA
versus RAA), the additional rigor is unavoidable and the result of
inherent differences in the two systems.  However, when identifying
identical criteria for the same test requirement (e.g., percent relative
accuracy), the PS-16 criteria should be revised to reflect the same
criteria as for CEMS.  In addition, relaxed accuracy criteria should be
allowed when the compliance margin is sufficiently large. 

	The criteria for RATAs and RAAs also need to be clearly defined.  Based
on similar requirements in PS-2 and the QA procedures in Appendix F, the
RA for the annual RATA is a statistically weighted computation that
considers the confidence interval for the data set.  For the quarterly
RAAs, the “accuracy” is determined by a direct comparison of the
“measured value” for the PEMS to the reference value (i.e., portable
analyzer or reference method).  The requirements are not apparent in PS
16, and the existing analogues from Part 60 Appendices B and F should be
used as the basis for the computation, as well as for performance
requirements. 
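
	The distinction the commenter draws can be illustrated numerically.
The sketch below contrasts the two computations: the statistically
weighted relative accuracy used for a RATA, following the PS-2
equations with a confidence coefficient at the 2.5 percent error level,
and the direct comparison of averages used for a quarterly RAA.  The
run data are invented for illustration, and the t-value shown applies
to a 9-run test (8 degrees of freedom).

```python
# Minimal sketch contrasting the PS-2 statistically weighted relative
# accuracy (RATA) with the direct-comparison accuracy used for an RAA.
# Run data are invented; t_value = 2.306 is for 9 runs (df = 8).
import math

def relative_accuracy(pems, rm, t_value=2.306):
    n = len(pems)
    d = [p - r for p, r in zip(pems, rm)]        # run-by-run differences
    d_bar = sum(d) / n                           # arithmetic mean difference
    s_d = math.sqrt(sum((x - d_bar) ** 2 for x in d) / (n - 1))
    cc = t_value * s_d / math.sqrt(n)            # 2.5% confidence coefficient
    return (abs(d_bar) + abs(cc)) / (sum(rm) / n) * 100.0

def raa_accuracy(pems_mean, ref_mean):
    # Quarterly RAA: direct percent difference of averages, no CC term.
    return abs(pems_mean - ref_mean) / ref_mean * 100.0

pems = [7.4, 7.6, 7.5, 7.3, 7.6, 7.5, 7.4, 7.5, 7.6]
rm   = [7.1, 7.8, 7.4, 7.0, 7.9, 7.6, 7.2, 7.4, 7.7]
print(round(relative_accuracy(pems, rm), 2), "percent RA")
```

Because the RATA result carries the confidence-coefficient term, it is
always at least as large as the direct comparison of the same averages,
which is the statistical weighting the commenter describes.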

	For example, consider the accuracy requirements from PS-16 versus PS-2.
For a NOx RATA, the performance criteria include: 

	• PS-2:  RA based on RM <20% 

	• PS-16:  RA based on RM <10%  

	• PS-2:  Emissions < 50% of std, RA based on standard <10% 

	• PS-16:  Emissions < 25% of std, RA based on standard <20% 

	• PS-16:  Emissions < 10% of std, 2 ppm absolute accuracy  

For the quarterly RAA 

	• PS-2:  RA = 15% of RM or 7.5% of std 

	• PS-16:  RA = 10% of RM 

	Other than the allowances under PS-16 for audits with emissions less
than 25 percent of the standard, the performance criteria for PEMS are
more stringent than for an analogous CEMS. The EPA does not discuss
these differences or identify why more stringent requirements are
warranted for PEMS.  With the use of PEMS requiring rigorous ongoing
audits and quality assurance measures to ensure performance, including
the more rigorous requirement for a quarterly RAA due to the
inapplicability of a CGA, more stringent performance criteria are not
warranted under PS-16.  The PS-16 proposal should be revised to include
annual RATA criteria and quarterly RAA criteria that are consistent with
the analogous CEMS specifications from Appendices B and F in Part 60. 

	In addition, the proposal should revise criteria related to relaxed
performance requirements when emissions are much less than the emissions
standard (i.e., an ample compliance margin exists).  As noted above,
this is considered in the PS-16 proposal, but a threshold that requires
NOx to be less than 25 percent of the standard is not functional in
practice.  More reasonable criteria should be identified.  For example,
if PS-16 is revised to include the 20 percent RA requirement (as
discussed above), a provision related to compliance margin should be
included to indicate that a 30 percent RA is acceptable for equipment
with emissions less than 70 percent of the standard.  Similar
performance steps could also be identified as the compliance margin
increases.  Such criteria will ensure that the PEMS is providing a
proper indication of compliance, while accounting for the actual
emission performance of the affected unit and the associated compliance
margin. 

	The recordkeeping requirement in Section 6.2 of the PS-16 proposal
implies that 100 percent data capture is necessary to report a valid
one-hour average.  This is more stringent than CEMS-based requirements. 
Performance Specification 16 should be revised to include data recovery
requirements analogous to existing requirements for CEMS.  (0019)

	Response:  See the response to Comment 19 concerning relaxed RATA
allowances when emissions are low and why RATA stringency in PS-16
differs from the PSs for CEMS.  Other scenarios concerning acceptance of
test data that are outside the performance criteria of the PS even
though compliance is clearly demonstrated are more appropriately left to
enforcement discretion.  The recordkeeping requirements in Section 6.2
that discuss valid hour data requirements have been dropped.  Such
requirements are already addressed in §60.13 of 40 CFR part 60.

	22.  Comment:  The EPA is proposing more stringent RATA accuracy
requirements for NOx than are required under the EMC guidelines or than
are required for CEMS.  A PEMS should not be required to meet a higher
level of RA than a CEMS.  The EPA should not make the proposed changes
and should retain the requirement that RATA testing must meet 20 percent
of the average test method value for NOx and oxygen.  (0034)

	Response:  See the response to Comment 19.

	23.  Comment:  The EPA has proposed allowing a RA of 10 percent of the
RM results when the RM is above 25 percent of the standard and 20
percent of the standard when RM results are below 25 percent of the
standard.  The proposed RA is inconsistent with PS-2 for CEMS which
allows 20 percent RA for RM emissions greater than 50 percent of the
standard.  Newer equipment with low emission characteristics are
typically never less than 50 percent of the standard.  We suggest that
EPA maintain 50 percent as the cutoff and use 20 percent of the RM above
50 percent and 20 percent of the standard less than 50.  This is
consistent with the RM requirements for CEMS.  The RM requirements for a
PEMS should not be more stringent than those for CEMS.  (0026)

	Response:  See the response to Comment 19.

	24.  Comment:  40 CFR 75 includes alternative specifications for RATAs
conducted on units with low emissions.  For a unit where the NOx
concentrations are less than 250 ppm, an acceptable RATA result is
obtained if the analyzer mean value is within ± 15 ppm of the reference
mean.  Also, if NOx emissions are less than 0.200 lb/mmBtu, an
acceptable result is obtained if the analyzer mean value is within ±
0.020 lb/mmBtu of the reference mean.

	We propose that the PEMS RA alternative specifications be set to
emissions rates relative to the reference mean as in Part 75, not
relative to the source’s emissions standards.  For example, the PS for
NOx could be within 10 percent RA,  ± 15 ppm or 0.020 lb/mmBtu
difference relative to the reference mean, depending on the basis of the
accuracy determination, whichever is less restrictive. This alternate
specification would be similar to that under Part 75 and will provide a
way for low-emitters to verify their PEMS.  (0016)

	Response:  The Part 75 alternative specifications for RATAs are set
relative to  the reference mean because testing under that program does
not involve emission standards.  The Part 75 alternative specifications
for RATAs are not suitable for compliance-based testing with low-level
emission standards.  We have, however, added alternative specifications
that are more reasonable for standards-based testing as noted in the
response to Comment 19.

Testing Loads and Number of Relative Accuracy Runs 

	25.  Comment:  After the initial certification, it may be advisable to
require 3-load RATAs for the annual RATAs, instead of at only one load. 
(0007)

	Response:  We do not believe a 3-load RATA is necessary for the annual
RATA.  We believe an evaluation of the PEMS under mid-range conditions
provides a sufficient periodic indication of performance.

Number of Runs for Market-Based PEMS

	26.  Comment:  The EPA is contemplating 27-run RATA tests for PEMS used
on market-based emissions trading programs.  We have suggested that the
statistical analysis of these RATA test runs will yield inconsistent
results when applied to PEMS that have no response time.  Additional
base-load testing at a given load will not significantly contribute to
the assessment of PEMS accuracy for the hour average intervals to be
reported on.  The additional RATA test runs are not necessary if the
statistical tests are not applied.  Multiple load RATA testing can be
accomplished using the standard 9 RATA run protocol.  In the States of
Michigan and Ohio, we have certified PEMS using a 12 RATA run protocol
(four runs at each of three load conditions) from which 9 RATA runs were
used (throwing out at most one run per each of the load conditions). 
This RATA procedure is carried out for each fuel firing condition
(i.e., fuel type, oil vs. natural gas) for initial certification.  The
protocol was developed with support from state and local regulatory
agencies.  (0008, 0033)

	Response:  We have not retained in the final rule the provisions for
market-based PEMS that were proposed.  The provisions for evaluating
market-based PEMS will continue to be Subpart E of Part 75 or other
applicable regulatory specifications.

	27.  Comment:  Proposed PS-16 defines “compliance and market trading
PEMS” and distinguishes these from a second category of PEMS, “O&M
PEMS.”  Does EPA anticipate adding market-trading programs to Part 60?
 We request clarification to determine which sources fall into this
category and must perform the extended, 27-run, 3-level (9 runs at each
level) comparison test against the RM.  (0016)

	Response:  We do not anticipate adding market-trading programs to Part
60.  We have dropped the provisions for market-based PEMS in the final
PS-16.  Predictive systems that are used to show continual compliance
with an emission standard (and usually have minimum data requirements
and are subject to Appendix F) are required to perform the extended
27-run, 3-level comparison test.

	28.  Comment:  The RATA testing at three load conditions (min. 27 runs)
is not recommended for several reasons.  First, it would require three
days of testing that would add to the cost of the ongoing maintenance. 
Second, testing at only three load conditions would allow a source to
pick their three best loads, and then perform the tests at those load
conditions, thus virtually ensuring that it would pass.  Third, at any
given load conditions, units such as gas fired boilers do not vary much
and nine runs at a given load will basically give you nine consistent
readings.  In lieu of 27 test runs, it is highly recommended that EPA
allow a source to test 12-runs at six load conditions covering the
operating range of the unit. The source would require that at least nine
(of the twelve) runs be used for the RATA calculation, and that at least
one run at each load condition must be used in the RATA calculation.
This would allow testing to be completed in one day, and evaluate the
full operating range of the unit being evaluated.  

	One of the load conditions must be the full load condition for the unit
being tested.  Full load would be defined at ≥ 95 percent of maximum
load.  If the unit was unable to test at full load, then the unit would
be restricted to operate no greater that the maximum load tested for the
annual RATA.  The source would, of course, be allowed to retest at a
later date to increase the full load rating of the unit.  (0038)

	Response:  A PEMS must be evaluated over 3 key parameter operating
levels.  This key parameter may or may not be process load.  We believe
that 9 runs at each of  3 key parameter operating levels are needed to
generate sufficient data for the required statistical tests.  The source
does not have the freedom to choose any 3 levels, but must test in the
low-, mid-, and high-level ranges.  We understand that this 27-run test
will require more field time than a typical CEMS certification. 
However, such an evaluation is needed to properly validate a compliance
monitoring system that is based on specialized relationships that are
not readily understood. 

	29.  Comment:  The definition for compliance and market trading PEMS is
not provided in the performance specification.  What is the NSPS
standard doing adding additional requirements for performance for PEMS
to be used under another program?  Are not all NSPS PEMS going to be
used for continual compliance?  (0008 or 0033)

	Response:  We agree.  See the response to Comment 19.

Statistical Analyses

	30.  Comment:  We have three major issues with the proposed PS-16.  The
first is the relative accuracy test performance criteria.  The second is
the statistical tests required for units that report under emissions
trading or continual compliance programs.  The third is the relative
accuracy audit (RAA) requirement.  

	These issues are somewhat related and generally involve procedures or
criteria that are unrealistic or provide no value.  The relative
accuracy requirements are in some cases too severe and would prevent
most CEMS from certifying using standard RM testing in addition to
preventing all but the most sophisticated PEMS from passing
certification.  The statistical tests have no bearing on the evaluation
of the performance of the PEMS.  The RAA uses a methodology that is
inherently less accurate than the PEMS it would be auditing.  

	We are a leading supplier of PEMS, and we have focused on PEMS quality
assurance requirements in compliance reporting of emissions and
recommend additional procedures and methods for quality control of PEMS
data.  These additional tests and ongoing tasks provide the regulatory
agencies with an increased level of comfort with a system that is
software-based as opposed to real stack gas measurements.  The
additional quality control we are proposing is a daily zero and span
calibration of the PEMS and quarterly linearity error tests.  Together
with the annual relative accuracy test audit and the daily sensor
evaluation system, these built-in quality controls will make PEMS
emissions data as accurate, reliable, and timely as, or better than,
any CEMS data produced today.  (0008, 0033)

	Response:  We do not believe the relative accuracy requirements are too
severe and would prevent most CEMS or PEMS from certifying using
standard RM testing.  The CEMS used under the EPA’s Acid Rain program
have met the relative accuracy requirements proposed in PS-16 and have
functioned well for years.  We disagree that the required statistical
tests have no bearing on evaluating the performance of PEMS.  The
statistical tests are the standard tools for determining whether PEMS
are biased, correlated, or significantly different from standard RMs. 
Daily zero and span checks of the PEMS calibration are a task that most
PEMS are not capable of performing.

	31.  Comment:  Regarding the statistical tests, the F-test and
correlation tests are specified in Subpart E of Part 75 for
applicability under the market-based emission trading program for NOx
and SO2.  These tests are based on 720 operating hours of data and
compared with a certified instrumental RM.  The minute-level data that
are averaged in a RATA test and then used in these statistical tests do
not reflect on the reportable averaging period (one hour) for NSPS
sources.  The RATA test looks at minute level data (21 minute run
averages) to assess accuracy of the CEMS or PEMS.  For the statistical
analysis, correlation, and F distribution or variance, it is not
relevant to evaluate minute averages since hour averages are the
shortest period reported on.  The minute-level data could very well show
correlation problems even with a CEMS due to response time differences
between two analyzers, and such problems will certainly show up in a
PEMS vs. RM evaluation since most PEMS do not have response time
issues.  

	For example, at base-load on a gas-fired turbine, the carbon monoxide
(CO) and NOx emission rates are steady and low (for example we see
levels of 0-1 ppm for CO and 7-8 ppm for NOx for a DLN unit).  The RM
data will show some variations from run to run that do not reflect
variations in emissions but in sampling and analysis issues or analyzer
drift.  The PEMS will show a steady emission rate reflective of the
process in a steady state mode.  The correlation analysis of the RATA
test run data may or may not meet the 0.8 requirement depending on the
quality of the RM data only and the presence of sampling and analysis
issues that cause slight variations in the RM data.  The PEMS can
produce a steady value of 7.5 ppm, but the reference method data show
variations among the runs from 7 to 8 ppm.  The PEMS can be very
accurate (i.e., relative accuracy of less than 7.5 percent) in this
case, and the resulting correlation analysis shows a failure.  Analysis
of normal CEMS data shows a similar result: the RATA test data can fail
the statistical analysis, even when the relative accuracy is good, due
to random ‘noise’ and sampling/analysis issues.  We would recommend
analysis
of the hourly average data over a greater time period (more than the
several hours of RATA testing) to properly assess PEMS performance with
the correlation test, F test, and t tests.  (0008 or 0033,  0013)

	Response:  The statistical tests under PS-16 do not compare PEMS and RM
data on 1-minute averages, but on run averages.  A minimum 21-minute run
average for the PEMS is compared to the same time interval for the RM.  We
believe 21-minute averages (which in most cases are based on the
averaging of 1 minute data points) compared to similar results from a RM
is a valid means of evaluating PEMS performance.  System performance
using the statistical tests does not need to be tied to data reportable
averaging times.  Extending the RA run time to 1 hour would double the
time required for the certification test.  We do not believe the RM or
CEMS output will be subject to inherent analytical errors that will not
also be present in a PEMS.  The PEMS inputs will also experience a
measure of random error based on the measurement capabilities of the
sensors.

	32.  Comment:  We suggest the statistical analyses be required of all
PEMS and CEMS or not at all.  (0008 or 0033)

	Response:  Statistical analyses are required of PEMS because of the
indirect nature of the measurement system.  The collective effects of
multiple sensor calibrations and the stability of model
interrelationships at different concentrations make an initial
establishment of correlation and variance necessary.  The underlying
principles of detection employed by CEMS are well understood and based
on the detection of a specific compound.  Predictive systems, on the
other hand, may be developed using different operating  principles and
will involve predictive relationships that are not readily understood by
the source or testers.  

	33.  Comment:  The inherent variability associated with RM test data
and the changing loads (and emission levels) required of the 3-load RATA
under proposed PS-16 renders the correlation coefficient as
insignificant.  (0008 or 0033)

	Response:  We disagree.  Under static emission conditions, the
correlation test may be difficult to pass, but not where parameter
conditions (such as load) are being changed.

	34.  Comment:  If a correlation or other statistical analysis is
required by regulators, we suggest the analysis be of data that is
averaged in reportable time periods (i.e., hour or daily averages).  We
also suggest that any statistical analysis of PEMS data be conducted
over a longer period than is represented in a typical RATA test but
should be similar to the procedure used in 40 CFR Part 75, Subpart E.  

	This could require some side by side PEMS and CEMS data for a period of
a week or a month or more.  The resulting data set (either 100 hours or
more of paired CEMS and PEMS data or 30 days for example of daily
average data) could be readily analyzed using the correlation test,
F-test, t-test or any other statistical test that would have meaning to
the evaluation of the long-term performance of the PEMS.  If the
additional long term evaluation is performed, RAAs may be waived for
such qualifying robust PEMS models.   (0008, 0033)

	Response:  Relative accuracy tests lasting 100 hours or 30 days would
render PEMS use as a monitoring tool economically impractical.  The
PEMS technology has been shown to work favorably in past applications. 
We do not want to handicap its use by requiring expensive, long-term
certification tests that we don’t believe are necessary.

	35.  Comment:  Our members use CEMS at many of their facilities and are
therefore interested in the Agency’s consideration of alternative
techniques such as PEMS.  Further, proposed PS-16 is dependent on
mathematical statistics.  The proposed statistical criteria such as
agreement between the PEMS and the RM at 95% confidence interval (CI)
may be difficult to achieve in practice. Therefore, we support the
Agency’s position that the use of a PEMS system in any application be
voluntary.  (0010)

	Response:  The majority of PEMS are used as voluntary alternatives to
CEMS.  Performance Specification 16 is not intended to be retroactive to
PEMS that are already in use. 

	36.  Comment:  We recommend that PS-16 only require RATA requirements
for compliance PEMS as was done in the January 1996 example PEMS
requirements.  The other statistical tests are representative of market
trading PEMS requirements established in 40 CFR 75 and add nothing to
the accuracy or precision of the PEMS – just more onerous
requirements.  The additional RATA requirements and statistical tests
will cost over $10,000 more annually in comparison to the current RATA
requirements.  (0028) 

	Response:  The final rule only covers PEMS that are used for operation
and maintenance and for determining compliance.  The provisions for
market-based PEMS have been dropped from PS-16.  We disagree that the
statistical tests add nothing to the accuracy and precision of the PEMS.

	37.  Comment:  We recommend that the waiver for the r-correlation test
be made permanent if the data are determined to be either
auto-correlated or the signal-to-noise ratio of the data is less than 4.
A low signal-to-noise ratio exists when most of the paired observations
of PEMS and test method CEMS values are within the noise level of the
analyzer.  The noise level of an analyzer is ± 2.5 percent of the range
of the test method CEMS values. 
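
	As a rough illustration of the screen described above, the sketch
below counts how many paired PEMS/test-method differences fall within
the ± 2.5 percent noise band.  This is an assumed reading of the
comment, not rule text: "range" is interpreted here as the span of the
test method analyzer, and the data are invented.

```python
# Illustrative screen for the low signal-to-noise waiver condition described
# above.  "Range" is read as the span of the test method analyzer (an
# assumption, not rule text); the run data are invented.

def low_signal_to_noise(pems, rm, analyzer_range):
    noise = 0.025 * analyzer_range      # +/- 2.5 percent of the analyzer range
    within = sum(1 for p, r in zip(pems, rm) if abs(p - r) <= noise)
    return within > len(rm) / 2         # "most" paired differences in noise

pems = [7.5] * 9                        # steady PEMS prediction
rm = [7.1, 7.8, 7.4, 7.0, 7.9, 7.6, 7.2, 7.4, 7.7]   # RM scatter only
print(low_signal_to_noise(pems, rm, analyzer_range=20))   # prints True here
```

With a 0-20 ppm analyzer, every paired difference in this example lies
within the 0.5 ppm noise band, so the screen would flag the data as
low signal-to-noise and the waiver condition would apply.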

	To conduct the initial certification test for a PEMS, the key operating
parameter affecting the pollutant is varied as significantly as allowed
within operational and environmental constraints with the goal to vary
the parameter as was done during the data gathering to create the PEMS. 
If the pollutant does not vary significantly in comparison to the
reference method data then the r correlation test will never be
appropriate for the data. 

	The proposed requirement to perform additional subsequent tests will be
fruitless. Forcing the operation of the plant into non-representative
operating modes merely to pass the r-correlation test may result in
unsafe and unstable operations.  Therefore, if the PEMS data are
determined to be auto-correlated or the signal-to-noise ratio is
determined to be too low, then the r-correlation test should be
permanently waived. 

	We have enclosed an example PEMS r-correlation waiver test report
submitted to the Texas Commission on Environmental Quality (edited to
not reveal the customer name).  Texas has an r-correlation waiver test
requirement similar to the proposed PS-16 requirement.  We urge EPA to
be more proactive in the preplanning of the initial PEMS certification
tests to ensure that the key operating parameter affecting NOx will be
moved to the limits encountered during the data gathering phase to
create the PEMS.  If this key parameter is moved during the initial
certification test and the r-correlation test is not passed, then an
analysis to determine if the data is auto-correlated or has a low signal
to noise ratio should be conducted.  If either condition exists, then
the r-correlation test should be permanently waived with no retest.  If
not, then the PEMS has failed the r-correlation test and corrective
action should be required.  (0028) 

	Response:  We agree that comparison data that are auto-correlated or
have a low signal-to-noise ratio may not pass the correlation test.  We
are therefore permanently waiving the r-correlation test where either
condition exists during the initial certification test.  We do not
believe the key operating parameter affecting the pollutant
concentration should be required to be moved to the limits experienced
during the data gathering phase to create the PEMS.  Such limits may
push the unit beyond its emissions bounds or into non-representative
operating modes.  Testing at the low-, mid-, and high-levels of the test
parameter should suffice.

	38.  Comment:  There are a number of problems with the statistical
tests for alternative methods in Subpart E.  Many of these problems were
discussed in the material provided by EPRI’s Utility PEMS Advisory
Committee to EPA’s Acid Rain Division in 1997. Some of these problems
were also recently highlighted in the PEMS evaluation conducted by The
Cadmus Group for the Clean Air Markets Division (CAMD).  The flaws make
the Subpart E statistical tests unreliable under a number of situations.


	The Subpart E statistics ask the wrong question. They are designed to
evaluate whether the alternative is equivalent to a CEMS, not whether it
would be able to satisfy the same performance criteria or the underlying
accuracy requirements. The statistics may fail for a variety of reasons
including potential cases where an alternative might be more accurate
than a CEMS so equivalence would not be apparent. The subpart statistics
can also fail a system that would otherwise be able to meet all the Part
75 QA requirements. 

	Subpart E is a poor template for PS-16.  The multi-load relative
accuracy test approach and the periodic quality assurance (QA) and
sensor analysis are preferable to the F-test and correlation statistics
because they are more analogous to the requirements for CEMS.  Like the
CEMS QA requirements in Part 60 and 75, RATAs and other periodic QA are
intended to demonstrate whether the PEMS exhibits sufficient accuracy,
not some nebulous notion of equivalency. 

	The proposal makes some concessions to the statistical problems by
creating a threshold to address the potential impact of low RM
variability on the F-test results. The correlation analysis may also be
waived (temporarily) by the Administrator if the emissions cannot be
sufficiently varied.  While these types of accommodations are reasonable
and supported, a more fundamental question is whether the remnants of
Subpart E tests in the PS-16 proposal are necessary. 

	Relative accuracy results include an error component reflecting the
mean difference but also a variability component represented in the
confidence coefficient.  The confidence coefficient is a function of the
standard deviation, which is the square root of the variance of the
difference between the RM and PEMS measurements.  The variance of the
difference, if the measurements are independent, is the sum of the RM
and PEMS variance.  Thus, the confidence coefficient essentially serves
the same purpose as the F-test.  If the variance of the PEMS is too
great, the confidence coefficient will mushroom and the relative
accuracy criterion will not be met. 

	Conducting multi-load relative accuracy tests also satisfies the heart
of the correlation analysis.  If the PEMS does not correlate with
emissions, the PEMS is unlikely to pass the RA test at each load level. 
Conceivably, there could be a case where there is little variation in
the emissions and the PEMS just happens to match the RM value.  But, when
there is little variation in the emissions, the correlation analysis
would be meaningless anyway.  (0018)

	Response:  See the response to Comment 37.  Performance Specification
16 does not try to show a PEMS’s equivalence to a CEMS.  It simply
evaluates a PEMS’s performance relative to that of a RM, which serves
as the standard.  This is the normal procedure for evaluating the
accuracy of non-market-based monitors.

	39.  Comment:   In Section 12.3.2, the F-test is conducted to compare
the variances of two populations that are normally distributed.  Most
environmental data are not normally distributed.  The variance ratio
test is severely and adversely affected by sampling non-normal
populations.  Thus, it must be employed with caution and
reservation.   Either a more suitable standard test should be proposed
or this section should be dropped.  (References:  Markowski, C.A. and
E.P. Markowski (1990) Conditions for the effectiveness of a preliminary
test of variance.  American Statistician, 44, 322-326; Zar, Jerrold H.
(1996) Biostatistical Analysis, Third Edition.  Upper Saddle River, NJ:
Prentice Hall).  (0021, 0017)

	Response:  The commenter did not provide sufficient information to show
that environmental data are not normally distributed or that the F-test
is not appropriate for evaluating the variances of RMs and PEMS. 
Therefore, the F-test remains a requirement.
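	For reference, the variance-ratio test at issue in this comment can be sketched as follows (hypothetical run data, for illustration only; the critical value is the tabulated F(0.95; 8, 8) point):

```python
import statistics

# Hypothetical RM and PEMS run data (ppm); for illustration only.
rm   = [50.2, 49.8, 51.0, 48.9, 50.5, 49.7, 50.1, 50.8, 49.4]
pems = [51.0, 48.5, 52.3, 47.6, 51.8, 48.9, 50.9, 52.0, 48.1]

var_rm = statistics.variance(rm)
var_pems = statistics.variance(pems)

# Place the larger variance in the numerator so that F >= 1.
f_stat = max(var_rm, var_pems) / min(var_rm, var_pems)
f_crit = 3.44        # tabulated F(0.95; 8, 8) critical value

print(f"F = {f_stat:.2f} vs. critical {f_crit:.2f}: "
      f"{'pass' if f_stat <= f_crit else 'fail'}")
```

Here the PEMS variance is several times the RM variance, so the test fails; that outcome is exactly the condition the F-test is intended to flag.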

Periodic PEMS Assessments 

	40.  Comment:  Following the development of a robust statistical hybrid
PEMS model and its certification under the applicable performance
specification, ongoing quality assurance will be required to ensure that
the PEMS compliance reporting data are valid, much as is done for a
CEMS.  In fact, we propose to utilize the same procedures as are used
with a CEMS including daily calibration drift testing and quarterly
linearity error tests (only without calibration gas cylinders).  The
initial testing for PEMS resiliency to input drift and failures is
repeated daily and following any model retraining activity.  Statistical
analysis of the model and definition of the model training envelope is
also conducted following any model retraining activity.  Care is taken
to pick out inputs that are reliable and offer data that is not subject
to drift or failure.  If selection of the inputs used in the model is
restricted to inputs that can be included in a QA program and whose
drift is measurable and small, then the PEMS model can be demonstrated
to be resilient to input failure and drift such that accurate compliance
data can be generated and recorded during most normal unit operations
including startups, shutdowns, and transitional states.  We propose the
following QA procedures:

	The calibration error test is conducted by first sending the PEMS a
non-operating or low-pollutant level process vector representing the
average values of all process inputs recorded during a non-operating or
low-pollutant level period.  The PEMS will provide predictions that
should reflect either zero emissions or low-pollutant levels that can be
evaluated against a zero emission or low-pollutant level as measured by
reference method testing.  Subsequently, a process vector representing a
base-load (span) or high-pollutant level is presented to the analyzer.  
This process condition is documented during previous annual reference
method testing.   This RM test data is used to determine a base-load
(span) or high-pollutant level that should be equivalent to the
predicted emission for this high-pollutant level process condition.

	The calibration error test can be conducted every day as is done for a
CEMS or it can be conducted every hour or even every minute if required.
 This PEMS calibration error test is a valid test of the entire system
if the inputs are properly simulated upstream of all PEMS processing
such that the entire system is evaluated for normal functionality and
response.  If the historical training data set is corrupted, has been
biased, or has lost its connectivity with the PEMS analyzer, the
resulting calibration record (zero and span) would not be consistent
with the values provided during RM testing.  This would cause a
calibration failure and an alarm would be raised that would be resolved
by PEMS support staff onsite or the PEMS supplier would be contacted to
resolve the issue.
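	The zero/span check the commenter describes could be sketched as follows (the emission model, process vectors, reference values, and 5 percent limit are all hypothetical stand-ins, not requirements of PS-16):

```python
# Sketch of the proposed PEMS calibration error check: replay stored
# zero- and span-level process vectors through the model and compare the
# predictions with RM values recorded for those same conditions.
# The model, vectors, and alarm limit below are hypothetical.

def predict(process_vector):
    # Stand-in for the PEMS emission model (e.g., a trained regression).
    weights = [0.4, 0.015, 0.02]
    return sum(w * x for w, x in zip(weights, process_vector))

# level: (stored process vector, RM value documented for that condition)
checks = {
    "zero": ([0.0, 10.0, 5.0], 0.25),
    "span": ([100.0, 500.0, 300.0], 53.0),
}
span_ref = checks["span"][1]

for level, (vector, rm_value) in checks.items():
    error_pct = abs(predict(vector) - rm_value) / span_ref * 100.0
    status = "OK" if error_pct <= 5.0 else "ALARM"
    print(f"{level}: predicted {predict(vector):.2f}, "
          f"error {error_pct:.1f}% of span -> {status}")
```

A failed check would raise an alarm for PEMS support staff or the PEMS supplier, as the comment describes.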

	Similarly, a linearity error test can be performed on any PEMS model. 
The annual three-load RA tests can be used to determine three process
calibration records representing the average measured value for each
process parameter during the RA tests.  These three process vectors
designated low-, mid-, and high-level audit points can be presented to
the PEMS analyzer for predictions.  The resulting predictions can be
evaluated against the average RM test value obtained during that
year’s RATA.  The deviation can be recorded and submitted on a
quarterly basis for the linearity error test.  This test is valid for a
PEMS just as with a CEMS, in that it validates a low-, mid-, and
high-level predicted emission level against a RM or standard.  The test
ensures proper functionality of the entire PEMS across the emission
range of the combustion unit.   (0008, 0033)
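	The three-point linearity check proposed here can be sketched in the same spirit (the model, audit vectors, RM averages, and reporting mechanics are hypothetical, used only to show the structure of the test):

```python
# Sketch of the proposed three-point linearity check: replay the low-,
# mid-, and high-level process vectors averaged from the annual RATA and
# compare predictions against the average RM value at each level.
# All numbers and the model below are hypothetical.

def pems_predict(vector):
    # Stand-in for the PEMS emission model.
    return 0.5 * vector[0] + 0.1 * vector[1]

# level: (averaged process vector from the RATA runs, average RM value)
audit_points = {
    "low":  ([20.0, 50.0], 15.2),
    "mid":  ([60.0, 150.0], 44.8),
    "high": ([100.0, 250.0], 75.5),
}

deviations = {}
for level, (vector, rm_avg) in audit_points.items():
    deviations[level] = abs(pems_predict(vector) - rm_avg) / rm_avg * 100.0
    print(f"{level}: deviation {deviations[level]:.1f}%")
```

The recorded deviations would then be submitted quarterly, as the comment proposes.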

	Response:  Performance Specification 16 allows PEMS users the
flexibility to use a variety of checks such as the calibration error and
linearity checks you described for their sensor evaluation system.  The
noted calibration/linearity checks would not appear to detect faulty
sensors.

	41.  Comment:  We propose that the QA requirements and data reporting
for PEMS be similar to those already established for CEMS.  This would
simplify reporting for sources that utilize PEMS for continual
compliance or market-trading purposes.  This would also simplify data
validation routines for both sources and EPA.  Adding daily calibration
tests and quarterly linearity tests would allow PEMS to be maintained
similar to CEMS which allows for some consistency in ongoing quality
assurance.  (0016)

	Response:  Predictive systems do not lend themselves to daily
calibration tests and quarterly linearity tests in the same way CEMS do.
Therefore, alternative daily checks and quarterly RAAs are prescribed. 
Performance Specification 16 does not apply to market-based PEMS.

	42.  Comment:  The issue of the relative accuracy audit (RAA) boils
down to this: can the PEMS model be valid year to year without
additional verification on a quarterly or monthly basis from gas
analyzers?  In the previous regulatory regime (40 CFR Part 60), sources
that were not required to continuously monitor emissions were required
to do periodic RM testing on an annual or longer basis.  The assumption
here is that if a source was in compliance from year to year, then it
will demonstrate its ability to be in compliance in the periods
intervening.  Continuous monitoring (as is done with both a CEMS and a
PEMS) under the existing regulations bridges the gap between annual
certification testing.  

	For quality assuring a CEMS, the RATA is conducted on an annual basis
for 40 CFR Part 60.  The daily calibration error tests and quarterly
linearity error tests validate the normal function of the CEMS in the
intervening periods between RATA tests.  Likewise with a PEMS, the
calibration error and linearity error tests can be used to evaluate the
normal function of the entire PEMS and can be relied upon to validate
the PEMS performance in the intervening periods.

	If a robust statistical hybrid PEMS model has been developed over all
ambient and operating conditions that can be expected for the unit and
it passes the requirements of Subpart E or PS-16, then it is a fair
assumption to conclude that the PEMS model will have demonstrated
performance similar to a CEMS and will be valid year to year without
periodic gas analyzers being mobilized to the site.  Furthermore, PEMS
daily calibration error testing and quarterly linearity error testing as
described above can be utilized to validate data collected in the
intervening periods between RATA tests as with CEMS.

	We recommend that annual unit maintenance be conducted just prior to
annual PEMS performance testing.  This allows the RM testing team to
pick up any problems with the unit coming out of maintenance during
annual PEMS RATA testing.  The annual RATA tests will be capable of
picking up any fundamental changes in the process and its emission
profile that occur over long periods of time or due to improper or
inadequate unit maintenance.

	We envision PEMS to be the most desirable solution for remote
installations and smaller sites that do not have instrumentation staff. 
In some cases, such as on gas pipelines, these sites are not manned at
all.  Mobilization of specialized staff to such installations in each of
the non-RATA quarters of the year would be an unwarranted expense given the
fundamental issues noted above.  If the portable analyzer (the least
accurate method – compared to gas analyzers used in reference method
testing or certified CEMS or PEMS or U.S. EPA Protocol 1 gas cylinders)
is used to validate a PEMS that is at least twice as accurate as the
auditing method, then it undermines the intent of the regulation and QA
program.

	Although we see some value in RAA tests conducted during the first year
of operation, we do not agree that quarterly RAA tests are valuable for
all sites and PEMS installations all the time.  We question the use of
portable analyzers to audit qualified PEMS since the level of accuracy
of the auditing tool is less than the established accuracy of the PEMS. 
PEMS certified under 40 CFR Part 75, Subpart E would by definition be
considered equivalent to CEMS, but under this PS, these PEMS would have
to be audited three quarters a year by a portable analyzer that has not
been certified and is inherently less accurate to meet the requirements
under 40 CFR Part 60.

	One compromise to address this issue is to require the quarterly RAA
tests following initial certification for one year or three quarters. 
If the model performs over the entire year and is certified the
following year (the same model without any tuning or additional data
added to the historical training dataset), the RAA requirements could be
waived.  The RAA requirement could also be waived for models that have
been certified under 40 CFR Part 75, Subpart E and perform daily calibration
error and quarterly linearity error tests.  This waiver may apply to
low-emitter, smaller combustion units, or all units with robust PEMS
models as defined by additional performance requirements (such as
passing 2 consecutive RATAs with three consecutive passing RAAs).  Any
failed RATA attempts or any PEMS retraining that triggers
recertification would necessitate a year (three quarters in a row) of
RAAs.  Again, if following a failed certification or major process
modification, the model is reworked and deployed, initial certification
is achieved, three successful RAAs are completed, and another full RATA
test is completed validating the robust PEMS model, the RAA requirement
would again be waived prospectively.   This waiver may be fundamentally
important to the viability of the technology for compliance monitoring
at remote sites.

	We propose that EPA issue QA requirements that are similar between CEMS
and PEMS for 40 CFR Parts 60 and 75.  We propose that the bulk of the
reporting, including electronic records, be similar between CEMS and
PEMS.  This will simplify reporting by sources, emissions trading
programs, and owner or EPA data validation routines.  The Subpart E
petition, as with the PS-16 certification, would be handled differently
for PEMS in that the additional statistical analyses would be performed
on the paired PEMS and RM data presented in the RATA test report.  The
requirement of PEMS calibration error, response time, and linearity
error tests could be maintained similar to CEMS with similar function,
purpose, and result.

	For ongoing QA, if PEMS were required to perform daily zero and span
calibration error tests along with quarterly linearity error tests as
with CEMS, then the response to a failed quality control task could be
consistent.  The missing data and data substitution procedures would be
consistent between CEMS and PEMS.  This would allow EPA and the States
to respond to monitoring system failures in a consistent manner also. 
Regulatory agencies would be evaluating monitor downtime and excess
emissions against a quality control regime that is consistent whether a
CEMS or PEMS is used.

	Again, our proposed compromise is to allow a fully developed and robust
PEMS model to stand on its own in such a manner (with daily calibration
error and quarterly linearity error tests) that eliminates the
requirements for mobilization of portable testing equipment between RATA
tests.  A fully developed and robust model could be validated by design
and test data up front or it could be confirmed through initial
certification followed by three quarters of successful RAAs and a final
RATA test.  This validation process would allow remote sites to deploy
PEMS efficiently and cost-effectively.  (0008, 0033)

	Response:  The provisions for market trading PEMS have been dropped
from PS-16.  Performance Specification 16 will only apply to O&M and
compliance PEMS.  Predictive systems are not required to perform the
calibration error, response time, or linearity error tests that CEMS
must perform.  A recent study has shown that portable analyzers are capable
of producing results that are comparable to those of CEMS and Method 7E
for gas-fired combustion sources.  Your recommended annual maintenance
just prior to the RATA would cause the RATA to give a false indication
of PEMS performance over the previous year.  Part 75 allows CEMS
with superior performance (a RATA of less than 7.5 percent) to reduce
semiannual RATAs to a single annual RATA.  

	In the final rule, we have granted relief for PEMS after the first year
of successful performance has been demonstrated.  During the second and
subsequent years of operation, a single RAA test performed at a
half-year interval from the yearly RATA may be conducted in place of
the quarterly RAAs.  This option may be continued as long as the
RAA and yearly RATA are passed.  If either test is failed,
the PEMS must revert to the quarterly RAA requirement until a year of
successful RAAs and RATA are demonstrated.  
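	The relief schedule described in this response reduces to a simple decision rule, sketched below (the function and argument names are illustrative, not from the rule text):

```python
# Sketch of the RAA frequency rule described above: quarterly RAAs during
# the first year; thereafter a single RAA at a half-year interval from
# the yearly RATA, for as long as the RAA and RATA are passed.
# Names here are illustrative only.

def raa_frequency(years_of_successful_operation: int,
                  recent_qa_passed: bool) -> str:
    if years_of_successful_operation < 1:
        return "quarterly"       # first year: quarterly RAAs required
    if recent_qa_passed:
        return "semiannual"      # one RAA midway between yearly RATAs
    # A failed RAA or RATA reverts the PEMS to quarterly RAAs until a
    # year of successful RAAs and a RATA are demonstrated again.
    return "quarterly"

print(raa_frequency(0, True))    # first year
print(raa_frequency(2, True))    # relief after successful performance
print(raa_frequency(2, False))   # reverts after a failed test
```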

	43.  Comment:  We recommend that EPA eliminate the requirement for
quarterly RAAs for compliance PEMS and instead include a requirement for
quarterly electronic data accuracy assessment tests of the PEMS as
specified in the January 1996 PEMS requirements published by the EPA. 
In 1995, during the evaluation of PEMS by the EPA’s Emissions
Measurement Center, it was determined that some sort of quarterly PEMS
QA requirement was necessary and determined that quarterly electronic
data accuracy assessment tests be required.  This quarterly electronic
data accuracy assessment test, also known in the industry as the PEMS
integrity audit, consists of challenging the emissions model of the PEMS
and the sensor evaluation system models with known inputs and ensuring
that the PEMS outputs are accurate to within 5 percent.  (0024,
0028) 

	Response:  An electronic data accuracy assessment is a good check of
the emission model integrity but does not evaluate the input sensors’
integrity.  As such, it is useful for evaluating one portion of the PEMS
but not the other.  A RAA evaluates the entire PEMS for integrity.

	44.  Comment:  The January 1996 example PEMS requirements ensure
accurate and reliable PEMS without the need for quarterly RAA for
compliance PEMS.  In the evaluation of PEMS to create the proposed
PS-16, the EPA reviewed PEMS RATA results submitted by a PEMS
manufacturer.  In no case did a PEMS fail a RATA, either the initial or
a subsequent (semiannual or annual) RATA.  In fact, PEMS
generally become better with age because of the ability to tune the PEMS
to enhance its accuracy.  On average, for over 132 NOx PEMS, the RA was
6.4 percent for the initial RA test and 3.2 percent for the ongoing
RATAs.  Some PEMS (over 100) have been subjected to four or more
subsequent RATAs, and greater than 95 percent exhibit a RA of less than
7.5 percent.  Clearly, a RAA is not necessary to ensure the accuracy of
a PEMS over a long period of time and the January 1996 example PEMS
requirements suffice.  (0028) 

	Response:  In the data you note, no PEMS failed a RATA when the
performance level is set at 20 percent.  This suggests that the 20 percent
RA criterion is too lenient and not representative of current PEMS
capabilities.  For the same data pool, 85 percent of the PEMS tests met
a 10 percent criterion, and 79 percent met a 7.5 percent criterion.  We
disagree that PEMS routinely operate for extended periods of time
without needing additional QA checks.  Sensor deterioration and process
changes, among other factors, contribute to the uncertainty of PEMS data
generated over time.

	45.  Comment:  As a producer of PEMS, we plan to deploy PEMS models
developed and tested over the widest possible operating envelope. 
Building models that are robust to climatic and process changes is a
part of commercial success.  While we see value in performing quarterly
audits during the first year of PEMS operation, quarterly accuracy
audits for subsequent years add no value.  If a CEMS is valid for a year
with one RATA test, it is logical to assume that with fewer physical
components PEMS should be considered valid for one year as well.  (0013)

	Response:  Most CEMS are required to perform quarterly checks as well
as yearly RATAs.  See the response to Comment 41.

	46.  Comment:  Section 9.4 requires annual audits of all O&M PEMS as
well as of continual compliance and market trading PEMS.  The annual
RATA involves nine runs at the normal operating level, which is as
stringent as the initial certification test for O&M PEMS.  To make the
use of O&M PEMS more attractive, the annual RATA for O&M PEMS should be
less burdensome than the initial certification.  We recommend that EPA
consider reducing the RATA for O&M PEMS to three runs at the normal
operating level.  We believe this, if adopted, would facilitate the use
of PEMS by reducing the burden of the proposed PS- 16.  (0015)

	Response:  Traditional RATAs consist of 9 runs, and this is the current
requirement for CEMS.  We do not think the costs or labor associated
with 3-run RATA (1.5 hrs) would be that much different from a 9-run (4.5
hrs) RATA considering the overall cost of the test.

 

	47.  Comment:  The cost of the RAA is underestimated by EPA and results
in a significant increase in costs for PEMS.  The manpower to perform
the RAA requires that a contractor with the necessary equipment be hired
to perform the work and generate the report.  In addition, the PEMS
provider is also required to be present in case the PEMS does not meet
the RAA requirements and adjustments are necessary. The estimated cost
to perform this work fully by the contractor (preparation, calibration
gases, transportation to the site, performance of the work, and
generation of the report) and to have the PEMS vendor available for
support is $10,000 quarterly or $30,000 annually. 

	This extra annual cost (above and beyond the requirements of the
January 1996 example PEMS requirements) will result in the demise of
PEMS as PEMS will not be cost effective compared to CEMS.  Note the
differentiation between compliance CEMS and market-trading CEMS. 
Market-trading CEMS are more costly to maintain than compliance CEMS. 
(0028) 

TESTING CONTRACTOR

	Item	No. of Days	Price Per Day	Calib. Gas Costs
	Pretest Preparation	1	$1,000	
	Calibration Gases	NA		$1,000
	Portable Instrument Usage	1	$500	
	Transportation	1	$1,000	
	Onsite Work	1	$1,000	
	Documentation	1	$1,500	

PEMS VENDOR

	Pretest Preparation	1	$1,500	
	Onsite Work	1	$1,500	
	Documentation	1	$1,500	

	Subtotals		$9,000	$1,000
	Total			$10,000

	Response:  Quarterly audits are typical requirements for monitoring
equipment.  This does not place an unfair burden on PEMS beyond typical
CEMS requirements.  Performance Specification 16 no longer covers
market-trading PEMS.

	48.  Comment:  The proposed PS-16 calls for quarterly RAAs.  We
estimate that the costs to implement such audits would be in excess of
$250,000 per year for our operations.  This requirement could drive us
to switch to new CEMS – a cost we estimate at well over $400,000. 
To date, we have confirmed the accuracy of PEMS through annual RATAs. 
These RATAs confirmed an average RA for NOx below 7.5 percent (the last
four RATAs in 2005 have averaged 1.93 percent).  Further, the 5 plant
average NOx emission rate has been 0.062 lb/MM Btu (or roughly 60
percent of the NSPS limit of 0.1 lb/MM Btu).  Thus, the RATAs conducted
so far show good accuracy for the PEMS and compliance with applicable
limits by a substantial margin.  The proposed requirements for quarterly
RAAs therefore simply add substantial additional annual costs and, in
our view, would do little to increase the accuracy of the PEMS and
likely would have no impact in ensuring compliance with applicable
limits.  Accordingly, we recommend that EPA eliminate the proposal for
quarterly RAAs.  (0034, 0035)

	Response:  See the response to Comment 47.  There are cases where PEMS
have failed their first quarterly audit after being certified.  We
believe the quarterly audit requirements are justified.

	49.  Comment:  Quarterly RAA are not needed for PEMS if the January
1996 example PEMS requirement concept of semi-annual or annual RATAs is
adopted into PS- 16.  In the January 1996 PEMS requirements, the
following ongoing PEMS QA requirement was specified in the PEMS
protocols: “Conduct semiannual RA tests of the PEMS. Annual RA tests
may be conducted if the most recent RA test result is less than or equal
to 7.5 percent.”  This requirement, combined with the fact that the
unit has to be operated at three different emission levels, sufficiently
challenges the PEMS performance and rewards good PEMS accuracy.  (0028)

	Response:  See the response to Comment 42.  Do we need to reward
superior RAs with semiannual audits? 

	50.  Comment:  We are concerned with the proposed requirement that
every PEMS system undergo a RATA test every year.  This proposed
requirement does not adequately address the issue of quality assurance
testing for identical units at the same location.  The current practice
in permits issued by the Texas Commission on Environmental Quality
(TCEQ) allows for RATA testing of a statistical cross-section of
identical units at the same location every year on a rotating basis so
that each individual unit is tested every second or third year.  This
type of testing assures that the model is valid and accurate while
reducing the burden, expense, and disruption caused by RATA testing of
multiple identical units.  We urge EPA to adopt a similar approach in
the performance specification.  (0036)

	Response:  The testing of identical units at the same location is best
addressed on a case-by-case basis by petitioning the Administrator. 
Relaxing the RATA to every two or three years would substantially
lengthen the period in which a poorly-performing PEMS would go
undetected.

 

	51.  Comment:  We are concerned with the proposed requirement to
perform quarterly RAAs on each PEMS system for several reasons.  One
concern is in regards to the cost and burden of testing multiple units
rather than testing a statistical cross-section of identical units at
the same location.  

	 It should also be recognized that the existence of a requirement for
quarterly assessments for CEMS systems does not provide a useful
precedent for application of such a requirement to PEMS.  The sensors
used in CEMS systems have much greater reliability issues including
frequent degradation over time compared to the operating condition
monitors used for operational control and relied upon by a PEMS system. 
Unless EPA has conducted sufficient controlled studies showing
degradation of PEMS models (particularly degradation in one direction)
to an extent that is significant for purposes of emissions policy, a
requirement for quarterly RAAs is unjustified and creates an unnecessary
and costly burden on operators.   We urge EPA to remove the requirement
for quarterly RAA testing.  (0036)

	Response:  The commenter did not provide evidence that CEMS sensors
were less reliable than PEMS sensors.  The quarterly RAAs have served
well in the past as periodic checks of CEMS performance and, in the
absence of data showing PEMS components operate differently, we cannot
see why PEMS should be exempt from these tests.

QA of Associated Parameters

	52.  Comment:  For emission sources that utilize flue gas recirculation
for NOx control, many times the damper position is "fixed" versus
"variable."  For fixed damper positions, the position of the damper must
be noted in the PEMS report to ensure that it is not moved during
subsequent maintenance activities. For variable damper positions, the
damper position must be proven to be accurate (i.e., calibrated) and
used in the PEMS QA process.  For instance, even if the variable damper
position is not used in the PEMS calculation, it must be evaluated as
part of the daily QA to ensure that the unit is operating within the
operating range established during testing.  In other words, if the
damper position for a load condition was expected to be in the 38-42
percent range, then the QA evaluation must document that the damper is
still in that range even though it might not be used in the calculation
itself.  (0038)

	Response:  The commenter makes a valid point.  Damper position and its
effect on predicted emissions must be addressed in the PEMS design and
monitored for the appropriate operational envelope as are the sensor
inputs.  Any process component that affects the regulated pollutant
concentration or emission rate must be accounted for during normal PEMS
operation. 

Use of Portable Analyzers for the RAA

	53.  Comment:  Portable analyzers are a useful tool for evaluating
process emissions where CEMS or PEMS are not installed.  The performance
of the portable analyzer is fundamentally dependent on the operator and
state of the equipment.   In order to be accurate, portable analyzers
must be supplied with certified calibration gases.  Typical portable
analyzers do not provide the accuracy required to be reference
standards.  Sample train and extraction point location issues cause
problems for the portable equipment.

	A portable analyzer mobilized to a site can be expected, at best, if
operated properly and calibrated with standards, to be accurate to within
20 percent (see OAR-2003-0074-0006.pdf docket item 6, Report of Portable
Analyzers).  This level of accuracy is not sufficient for us to build
models that meet the requirement of this performance specification or 40
CFR Part 75, Subpart E.  Reference method testing would be required or
the utilization of existing compliance CEMS for the collection of
historical emission training data.  In fact, the only analyzer in the
study referenced that achieved accuracy better than 10 percent was the
more sophisticated analyzer using the RM methodology.  The utility of a
portable analyzer that has almost twice the inaccuracy of the PEMS for
auditing is questionable.

	The single point test at base-load with a portable analyzer that shows
a significant deviation from a fully developed statistical hybrid PEMS
would be doubted.  The analyzer would be recalibrated and tweaked to
meet the 20 percent requirement.  The PEMS is expected to be at least twice
as accurate as this and typically delivers such accuracy under all
normal operating conditions.  The PEMS supplier would not be called out
to correct a sophisticated PEMS model on the basis of a failed portable
analyzer test.  Perhaps a reference method testing team would be
deployed for tuning.  Any PEMS could easily be biased to pass an RAA at
a single load point.

  	Responses to a failed RAA due to a deviation detected by a portable
analyzer would not be constructive.  The operator of the portable
analyzer would in all likelihood not be capable of adjusting the PEMS
model, tuning the combustion unit, or analyzing the source of the
emission problem.  A reference method test team might be brought in to
validate the portable reading, the PEMS supplier might be brought in to
adjust the model, or the combustion unit manufacturer could be called in
for combustion unit tuning.  Most likely though, the portable analyzer
will be adjusted and the RAA would be repeated until it passed.  In
essence the RAA for a validated robust statistical hybrid PEMS model
would be valueless.  It would only add significant cost that would not
add any quality assurance to the compliance monitoring program.  (0008,
0033)

	Response:  We have not seen data that supports the idea that PEMS are
inherently accurate such that their performance is guaranteed over long
periods of time.  The performance of PEMS, like CEMS, depends on a
number of criteria which can be variable.  The noted problems associated
with portable analyzers, such as their proper calibration and operation
and their tweaking to match the PEMS, are problems also associated with
reference methods.  The summary and findings of the noted report on
portable analyzers (OAR-2003-0074-0006) stated that “The portable
analyzers produced results that were comparable to those of the CEMS and
Method 7E for the two natural gas-fired combustion sources and low
concentrations tested.”  Portable analyzers are offered as a cheaper
testing option to add flexibility to the RAAs.  However, RMs may also be
used in place of portable analyzers.  A RAA for a validated robust
statistical hybrid PEMS model would not be valueless but would confirm
that such a PEMS is operating properly.

	54.  Comment:   Typical portable analyzers do not offer the 10 percent
accuracy required by PS-16, thus making the validation impossible. 
Instead, calibration error and linearity error tests could be employed
to define the accuracy level of the PEMS, similar to CEMS.  Quarterly
visits to remote sites outside of PEMS introduction or major maintenance
add unnecessary cost to both the OEM and the customer.  (0013)

	Response:  See the response to Comments 40 and 50.  Predictive emission
monitoring systems do not lend themselves easily to calibration error
and linearity tests where calibration gases are used.  For a PEMS,
reference methods would have to be brought on-site.

	55.  Comment:  The proposed performance specification requires a
quarterly audit of the PEMS against a portable analyzer for three
30-minute test periods. EPA’s recent field demonstration evaluated
portable gas analyzers as an alternative way to assess the operating
status of the PEMS.  While portable gas analyzers provide a less labor
and cost intensive method for obtaining independent measurements for
comparison to the PEMS data (as compared to standard tests using EPA
methods), the field test confirms that they do not have the accuracy to
be used as an alternative to reference standards. As demonstrated in the
document “Evaluation of Portable Analyzers for Use in Quality Assuring
Predictive Emission Monitoring Systems for NOx,” portable analyzers
that are properly operated and calibrated will be accurate to within 20
percent.  The accuracy of the PEMS is required to be within 10 percent. 
The value of using a portable analyzer that has approximately twice the
inaccuracy of the PEMS it is assessing is debatable. While portable analyzers
have been used for diagnostics and for compliance determination for many
years, they are inherently less accurate than reference standards and do
not provide the accuracy required to assess the periodic performance of
a PEMS.

	We propose using daily calibration tests and quarterly linearity tests
as a way to periodically validate the performance of the PEMS between
RATAs.  (0016, 0024)

	Response:  See the response to Comment 54.  

	56.  Comment:  A significant concern is that testing with a portable
analyzer is not a highly accurate or repeatable process.  We suspect
that assessing the accuracy of a PEMS installation with a portable
analyzer would provide little or no benefit and could possibly result in
false indications regarding PEMS system performance that could lead to
unnecessary and expensive additional testing of the PEMS system.  (0036)

	Response:  See the response to Comment 50.

	57.  Comment:  Portable instruments are not reliably accurate for the
RAA, and false indications of PEMS inaccuracy will occur, resulting in
undue expense and the demise of PEMS because PEMS will become suspect in
the eyes of regulators and industry.  Despite the Clean Air Markets
Division’s contention that portable instruments are reliably accurate,
portable instruments are not as reliably accurate as EPA instrumental
test methods.  If portable instruments were as reliably accurate, then
CAMD would allow portable instruments for the certification of CEMS, as
would the Emission Measurement Center under Part 60.  Portable
instruments are not allowed for the RATA because they are not reliably
accurate, and they should not be allowed for the RAA. 

	An RAA cannot be taken lightly, and failure to pass an RAA would be a
serious matter.  Relying upon a portable instrument with unreliable
accuracy would not be prudent.  We believe the RAA requirement should be
eliminated for PEMS since portable instruments are not reliably accurate
and utilizing instrumental reference methods would increase the cost
even more with no appreciable increase in the accuracy or
reliability of the PEMS. 

	A comparison of a periodic portable instrument to two instrumental
reference methods was conducted by MeadWestvaco at the request of CAMD
as part of an evaluation for CAMD developed PEMS requirements.  The
portable instrument was determined not to be reliably accurate in
comparison to the instrumental reference method.  The CAMD contends that
subsequent field tests prove that portable instruments are reliably
accurate for the RAA of PEMS despite the facts of the study conducted by
the third party contractor at MeadWestvaco.  (0028) 

	Response:  The commenter did not submit information from the
MeadWestvaco test to show that the portable analyzer had insufficient
accuracy relative to the reference method.  The commenter also feels
that dropping the quarterly audits and trusting PEMS to be accurate
would be better than the proposed requirement.  We disagree; however, in
cases where the source or tester prefers using the reference method,
this option is available.  The quarterly audit may be performed with
portable analyzers meeting the requirements of ASTM D6522 or by using a
reference method.

	58.  Comment:   Proposed PS-16 includes the requirement that quarterly
RAAs be conducted with a portable analyzer.  Based on the high moisture,
particulate matter, volume, and temperature nature of the emission
streams from the state-of-the-art thermal oxidizer control equipment for
which our company utilizes PEMS, we do not believe such portable
analyzers are technically feasible or appropriate.  The high volume
emission streams from the thermal oxidizers at facilities such as
ethanol plants are high in moisture (frequently near 50 percent).  These
gas streams are also the result of two stages of combustion, the first
being the dryers and the second the thermal oxidation system. 
We have observed that a significant percentage (> 10 percent) of the NOx
from these systems is present as NO2 at the measurement point.  Since
portable analyzers do not typically have sufficient gas conditioning
systems for this environment, it is likely that condensation and hence
scrubbing of the NO2 will occur.  This will lead to an underestimation
of the true NOx concentrations and result in a failed annual RATA. 
Further, we do not believe such portable analyzers would be able to meet
the stringent RA requirements EPA has proposed in Section 13.1.  

	Attached is a letter from Dan Despen, the president of Interpoll
Laboratories, Inc., a leading testing firm, stating his opinion that the
use of such portable analyzers for sources such as those at our
facilities would be infeasible for the purposes EPA has proposed.  Accordingly, we
believe that EPA should eliminate from PS-16 the requirements related to
quarterly RAAs and that accuracy of PEMS should be assessed through
RATAs consistent with the guidelines published by EPA’s Emission
Measurement Center in January 1996.  (0034, 0035)

	Response:  See the response to Comment 50.

	59.  Comment:  We ask that the requirement for a quarterly audit using
a portable analyzer be removed.  The background section states that PEMS
can be used where process parameters have a predictable relationship to
emissions.  When proposing a PEMS, an operator should be required to
either demonstrate that process instrumentation will track with
equipment deterioration or that the equipment deterioration will have
minimal impact on the PEMS accuracy.  Alaska North Slope experience on
uncontrolled turbines has demonstrated that predictive correlations
using process parameters are stable with time.  As long as the
instrumentation is maintained and equipment hardware design is
unchanged, PEMS have been found to be stable.  (0026)  

	Response:  The sensor evaluation system gives a general daily
indication of process sensor condition.  However, other periodic checks are
needed that are similar to those established for CEMS.  We have not been
provided with information that shows PEMS have better long-term
reliability than CEMS.  Therefore, PS-16 has adopted the periodic QA
checks that CEMS are subject to. 

Ongoing QA Tests Table in Section 9.1

	60.  Comment:  The table entitled “Ongoing Quality Assurance Tests”
in Section 9.1 includes a sensor evaluation check and a sensor
evaluation alert test. The difference between these two tests is not
clear.  Additional guidance would be most helpful.  (0010)

|cc|,” but this is the condition when the bias test is
unacceptable and a bias adjustment factor needs to be applied.  We
propose wording to clarify this, such as: “davg ≤ |cc|,
bias test is passed” with a frequency to be done “after each
RATA.”  (0008, 0016, 0033)

	Response:  The commenter’s suggestion adds clarity, and the suggested
correction has been made.
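
The pass/fail condition the commenters describe can be sketched as a
short calculation.  This is a generic illustration of the CEMS-style
bias check, not the PS-16 text; the function name and the way the
t-value is supplied are assumptions:

```python
import math
import statistics

def bias_test(pems, rm, t_value):
    """Sketch of a CEMS-style bias check on paired PEMS/RM runs.

    Passes when the absolute mean difference (davg) does not exceed
    the absolute confidence coefficient (|cc|), where
    cc = t * s_d / sqrt(n).  The t-value for n - 1 degrees of freedom
    is supplied by the caller (e.g., from a t-table).
    """
    d = [p - r for p, r in zip(pems, rm)]      # paired differences
    davg = statistics.mean(d)
    cc = t_value * statistics.stdev(d) / math.sqrt(len(d))
    return abs(davg) <= abs(cc)
```

With nine paired runs, the two-tailed t-value for 8 degrees of freedom
at the 95 percent level is approximately 2.306.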

Number of RATA Runs 

	62.  Comment:  The PS-16 proposal requires more test runs for the
annual RATA than analogous specifications for CEMS.  The PS-16 proposal
should be revised to reflect analogous requirements – for example,
Section 8.4 of PS-2 can serve as the basis to identify that a minimum of
nine paired test runs are required.  (0019)

	Response:  Performance Specification 16 requires the same number of
tests as PS-2.  A minimum of nine paired test runs must be taken. 
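
For reference, the PS-2-style relative accuracy statistic computed from
those nine paired runs can be sketched as follows.  This is a generic
illustration (the function name and inputs are assumptions), not the
regulatory text:

```python
import math
import statistics

def relative_accuracy(pems, rm, t_value):
    """Sketch of the PS-2-style relative accuracy (RA) statistic.

    RA = (|mean difference| + |confidence coefficient|) divided by the
    mean of the reference method values, expressed in percent.
    t_value is the two-tailed t-statistic for n - 1 degrees of freedom.
    """
    d = [p - r for p, r in zip(pems, rm)]      # paired differences
    davg = statistics.mean(d)
    cc = t_value * statistics.stdev(d) / math.sqrt(len(d))
    return (abs(davg) + abs(cc)) / statistics.mean(rm) * 100.0
```

For example, nine PEMS readings that each run 2 ppm above RM readings
averaging 100 ppm give an RA of 2 percent (the confidence coefficient is
zero when the differences are constant).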

	63.  Comment:  The PS-16 proposal includes RA requirements of nine test
runs for O&M PEMS, which should be deleted from the proposal.  The
Section 2.1 summary also indicates that PEMS used for excess emissions
reporting require nine runs, but does not reference excess emissions
PEMS in the section that addresses relative accuracy tests (Section
8.2).  Other PEMS require at least 27 test runs – well in excess of the
9 test run minimum for analogous Appendix B specifications.  If EPA
believes that the additional runs are warranted to evaluate additional
statistical criteria for PEMS, then this rationale should be provided.  If
additional tests are deemed necessary, EPA should require this rigor
only for the initial performance test, with subsequent RATAs completed
using a minimum of nine runs.  In this scenario, if PEMS performance in
a subsequent audit indicates a performance failure, the failed audit
could be used as a trigger for additional test runs and PEMS data
analysis when the next RATA is completed.  (0019)

	Response:  See the response to Comment 62.  Yearly RATAs only require a
single set of 9 runs conducted under the normal key parameter operating
conditions.

Three Operating Conditions Unclear 

	64.  Comment:  The operating level requirements proposed – low,
normal, and high – are unclear and should be revised to require a minimum
of three runs each at three operating conditions across the operating
envelope.   In addition, the RATA requirements define operating levels
that are confusing, i.e., “low”, “normal”, and “high.” 
These labels do not relate to actual operations – e.g., “normal
load” may be “high load” for many processes or units.  In Section
8.2, reference to operating requirements should stipulate that the
paired test runs be completed at three operating levels that cover the
breadth of the typical operating envelope.  In addition, subsequent to
the initial performance certification, RATAs should have the flexibility
to complete the tests over a narrower operating range if process
conditions preclude operating a unit across the entire operating
envelope.  (0019)

	Response:  We have dropped “normal” load in favor of “mid”
load.  The yearly RATAs may be performed at only the normal operating
range of the key parameter. 

Definition of and Examples of PEMS in Section 3.6

	65.  Comment:  We recommend adding examples to the definition of PEMS
provided in Section 3.6.  Specific examples of PEMS, as well as examples
of systems that should NOT be considered PEMS, would be most helpful,
especially to owners and operators who may not be familiar with emerging
technologies.  One specific example that we offer for the Agency’s
consideration pertains to an emitting unit that uses CEMS to directly
measure stack pollutant concentration and parametric monitoring (input
air flow rate) to predict the stack flow rate.  The measured pollutant
concentrations are combined with the “predicted” stack flow rate to
determine mass emission rates in lb/hr.  

	An article in the docket entitled, “Thoughts on Predictive Emission
Monitoring from a Regulatory Perspective,” by Gus Eghneim describes
two main categories of PEMS.  The first category relies on physical
principles, analytical methods, and laws of nature to describe the
dynamics of the process while the second category of PEMS uses linear or
non-linear regression analysis of historical data to model the process
dynamics over a broad range of operating conditions.  It is unclear if
PS-16 is intended to apply to both categories of PEMS or only to the
second category.

	Sections 3.6 and 6.1.1 clearly state that simple relationships using
fewer than 3 variables may not be acceptable as PEMS and would require
the Administrator’s approval before use.  Additional information and
specific examples would be most helpful since the Agency’s rationale
for these statements is somewhat unclear.  (0009) 

	Response:  Performance Specification 16 gives performance criteria a
PEMS must meet.  Example systems are not given due to the range of
potential candidates.  Either of the categories of PEMS noted in the
Eghneim article is acceptable.  Any system that uses 3 or more
variables to predict an emission concentration or emission rate and
meets the performance requirements of PS-16 is acceptable.  The
example of the CEMS and a parameter system where the pollutant
concentration is measured and the flow rate predicted is not a good PEMS
candidate.  The flow rate prediction would be a parametric monitoring
tool and come under the provisions of PS-17 for parametric monitoring
systems.  

Applicability of PS-16

	66.  Comment:  As currently written, PS-16 applies to all PEMS used to
show compliance with an emission limitation, regardless of whether or
not continuous emission monitoring is required by the underlying rule or
permit condition which established the emission standard.  We suggest
the applicability language of PS-16 be modified to limit the
applicability of PS-16 to only those PEMS used as an alternative to CEMS
in order to satisfy continuous emission monitoring requirements.  (0009)

	Response:  There are no current regulations requiring the use of PS-16
to certify PEMS.  As such, all PEMS currently in use are grandfathered
under PS-16.  The new requirements will apply to all future PEMS, except
under Part 75, that are used as alternatives to CEMS or to satisfy new
requirements specifying the use of PEMS.  To facilitate such future
regulations, it is best not to strictly associate PS-16 with PEMS that
are used as alternatives to CEMS.

Potential Overlap of PS-16 and PS-17

	67.  Comment:  A significant number of our member facilities have
operations that are subject to parametric monitoring requirements under
a variety of federal rules, which conceivably could be affected by EPA's
PEMS proposal because the applicability is so difficult to determine
from the proposed rule as written.  While we think the Agency's intent
was a narrow scope of applicability, the proposed Scope and Application
section and the definition of PEMS in the proposed rule are very vague,
and the preamble discussion does not adequately explain the scope of the
proposed specification.  

	Therefore either EPA must provide a Supplemental Notice, with better
explanation for comment, or EPA must expressly confirm in the final rule
that (1) Performance Specification 16 is confined in its application to
prospective rules that expressly identify PEMS as alternatives, and (2)
that the rule does not cover parametric monitoring that is already
incorporated in the NSPS in 40 CFR 60 or Maximum Achievable Control
Technology (MACT) rules in 40 CFR 63 and which are not part of a
specific PEMS alternative. 

	Several important changes need to be incorporated into the final rule. 
The preamble language states that:

		if adopted as a final rule, all PEMS that will be used to comply with
		40 CFR Parts 60, 61, and 63 will be required to comply with PS-16.  In
		addition to new PEMS that are installed after the effective date of
		PS-16, other PEMS may also be required to comply with PS-16 at the
		discretion of the applicable regulatory agency or permit writer. 

	EPA needs to delete this language from the preamble to the final rule
and replace it to make apparent that PS-16 does not apply to all
different types of parametric monitoring, but only to those rules that
expressly provide for PEMS.  In addition, a parallel clarification needs
to be inserted in the definition of "PEMS" in section 3.6 of the
proposed rule.

	If EPA does not intend this narrow scope of applicability for PS-16,
then the Agency must publish a Supplemental Notice in the Federal
Register for comment with a more explicit definition of PEMS, including
specific examples as to what constitutes a PEMS or not, and more
discussion of the regulatory impact of imposing the PS requirements. 
Greater outreach to the regulated community by EPA is also necessary so
that stakeholders are aware of the potential impact the performance
specifications could have on compliance with existing and future rules. 
Finally, a broader scope would necessitate an Executive Order 12866
review as well as an analysis of the impact the rule would have on small
businesses as required by the Regulatory Flexibility Act.  (0025)

	Response:  We note in the final rule that Performance Specification 16
is confined in its application to prospective rules that expressly
identify PEMS, and that the rule does not cover parametric monitoring
that is already in place or will be in place in the future.  Parametric
monitoring systems are subject to PS-17, not PS-16; this will be
explicitly stated.  We also clarify the applicability to note that PS-16
will apply to PEMS installed under 40 CFR 60, 61, and 63 after the
effective date.  The definition of PEMS in Section 3.6 also reflects
this.

	68.  Comment:  We understand that EPA is also working on the
development of a performance specification for continuous parametric
monitoring systems (CPMS) in PS-17 and are concerned about potential
overlap or conflict between PS-16 and PS-17.  A device or sensor used to
satisfy parametric monitoring requirements is also very likely to be
used as a component of a PEMS.  This would be especially relevant to the
operation and maintenance PEMS but could also occur with the compliance
and market trading PEMS.  We request the Agency provide clarification
regarding which performance specification, PS-16 or PS-17, would apply
in these types of situations.  (0009, 0010)

	Response:  Performance Specification 16 will note that it is not
applicable to CPMS.  If a device or sensor that is used to satisfy the
requirements of a parametric monitoring system also serves as a
component of an affected PEMS, then it is subject to PS-16 as it affects
the PEMS.  Also see the response to Comment 67.

	69.  Comment:  PS-16 separates PEMS into two categories having different
levels of requirements.  The first category is referred to as “O&M
PEMS” which are used for excess emission reporting and as indicators
of control device operation.  The second category is referred to as
“compliance and market trading PEMS” which are used for continual
compliance standards or in a market trading program.  We appreciate the
agency acknowledging the difference in criticality between the two
categories of PEMS and adjusting the requirements in PS-16 accordingly. 
However, we are concerned that the O&M PEMS are likely to involve
continuous parametric monitoring devices whose performance will be
addressed in a future performance specification (PS 17). Please clarify
which performance specification would apply to these sensors, PS-16 or
PS-17.  (0010)

	Response:  Performance Specification 17 would apply to parametric
monitoring systems, those that have associated parametric limits. 
Performance Specification 16 applies to predictive emission monitoring
systems, those that have associated emission limits.  This difference
has been noted in PS-16.

Number of RM Tests Where Emissions are Steady Across Loads 

	70.  Comment:  Section 8.2.3 requires nine RM tests to be conducted at
each of three different modes of operation (low-level, normal-level, and
high-level) for initial certification of continual compliance and
market-trading PEMS.  For some sources, the variance or difference
between low, normal, and high level may be negligible with respect to
emissions, or the sources may operate under only one load level.  For
these sources, requiring 27 RM runs (nine runs at three load levels) to
certify the PEMS would be unnecessary and excessive.  We recommend that
sources be allowed to show no appreciable emission changes with respect
to low-, normal-, or high-level operation modes and sources that operate
under one mode of operation to perform initial certification at only one
load level.  We believe this, if adopted, would facilitate the use of
PEMS by reducing the burden of the proposed PS-16.   Another option
would be to allow operators to petition the permitting authority for a
waiver of the 27-run requirement and provide justification to support a
9-run RATA at a single operating level, representative of normal
operations.  (0010, 0015)

	Response:  See the response to Comment 37.  For the RA test, the key
parameter that affects emissions is varied at 3 levels, not the load
(unless it is the key parameter).  If there is little variation in this
parameter during normal operation, the source may petition the Agency
under 40 CFR 60.13(c) to allow a single 9-run RA test.  

PEMS Differences with RM Due to Response Times

	71.  Comment:  We request that EPA recognize that statistical tests
need to provide for potential differential response time issues between
PEMS and RM data. Correlation problems are likely since most PEMS have
zero response time.  The correlation tests need to apply the same kind
of response-time language to comparing the RM test data with the PEMS
being certified.  (0016)

	Response:  As with CEMS certifications, the differences in response
times between the PEMS and RM must be taken into account when comparing
data for the RA test.  This is discussed in Section 8.2.4 of PS-16. 

PEMS Use in Showing Exceedances 

	72.  Comment:  The PS-16 proposal suggests that it can be used to
evaluate PEMS for market-based programs, for determining continual
compliance, for measuring excess emissions or to indicate control device
operation and maintenance. The PEMS data that we have reviewed suggest
that PEMS are adequate to the task of providing reliable emissions
estimates for trading purposes and can be used to provide a reasonable
assurance of compliance. However, by definition, PEMS should not be
considered a direct method of compliance determination.  If the PEMS
output suggest an emission value above the limit, it should be treated
only as a possible exceedance, like an excursion under 40 CFR Part 64. 
This does not necessarily preclude the use of PEMS data as credible
evidence, but it should be understood that PEMS estimate emissions and a
variety of factors may influence the results.  (0018)

	Response:  We are not retaining applicability to market-based PEMS in
the final rule.  A PEMS predicts emissions rather than estimating them, and
those PEMS that meet the requirements of PS-16 predict emissions with a
high level of accuracy.  These predicted values are just as enforceable
as those measured by CEMS or RMs.  Discretion in handling non-compliance
events is beyond the scope of PS-16.

PEMS Use in Startups/Shutdowns

	73.  Comment:  For determining potential exceedances, the use of PS-16
should be limited to normal operation since startup and shut down events
are difficult for PEMS to simulate.  A combination of factors
contributes to this problem.  While only limited startup/shutdown data
is generally available for PEMS training purposes, the more critical
factor seems to be the dramatic changes in the operating parameters.  In
a PEMS evaluation conducted recently by CAMD, there was poor PEMS
agreement for both one-minute and hourly averages.  The report
summarized that the PEMS were unable to predict startups and shutdowns
accurately and that there was “no evidence to support the use of PEMS
predictions during startup and shutdown.”   For evaluating compliance,
it is recommended that the use of PEMS be limited to periods only when
the source is normally operated and excluding startup and shutdown
periods.  (0018)

	Response:  The continuous emission monitoring requirements in Parts 60,
61, and 63 do not require emissions to be measured during periods of
startup and shutdown.  Predictive emission monitoring systems may not be
good candidates for rules that require emissions be measured during
periods of startup and shutdown.  

	74.  Comment:  The PEMS we developed includes an evaluation of
startup/shutdown emissions.  This allows a source to calculate emissions
at all times, including during a startup/shutdown.  The basic
methodology is that startup/shutdown emissions are measured (one-minute
averages) and an emission factor is developed to reflect emissions
during a startup, and for all operation below the minimum tested load
condition.  The emission factor is in lb/million Btu and is used to
calculate lb/hr mass emissions based on fuel heat input (million
Btu/hr).  Further, since lb/million Btu is directly proportional to
ppmvd at any fixed oxygen correction level, then emissions in ppmvd can
be calculated as well.  It is highly recommended that all PEMS have a
methodology for the calculation of startup/shutdown emissions and that
these calculations be published in a PEMS implementation report for
review and approval by the Agency.  (0038)

	Response:  Since many PEMS will not be capable of accurately predicting
emissions during startups and shutdowns, and since the emissions they
are required to monitor generally will not include startups and
shutdowns, such measurements have not been addressed in PS-16.
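
The arithmetic the commenter describes, an lb/million Btu emission
factor scaled by fuel heat input to obtain lb/hr, and the
proportionality between lb/million Btu and ppmvd at a fixed oxygen
level, can be sketched as follows.  The concentration conversion uses
the EPA Method 19 dry F-factor relationship; the Fd value, molecular
weight, and function names below are illustrative assumptions:

```python
# Sketch of the startup/shutdown emission-factor arithmetic described
# in Comment 74.  Constants and example values are illustrative.

MW_NO2 = 46.01      # lb per lb-mole, NOx reported as NO2
MOLAR_VOL = 385.3   # scf per lb-mole at 68 deg F, 29.92 in. Hg

def mass_rate_lb_per_hr(ef_lb_per_mmbtu, heat_input_mmbtu_per_hr):
    """lb/hr mass emissions from an emission factor and fuel heat input."""
    return ef_lb_per_mmbtu * heat_input_mmbtu_per_hr

def ppmvd_from_ef(ef_lb_per_mmbtu, fd_dscf_per_mmbtu, o2_pct):
    """Convert lb/million Btu to ppmvd at the measured O2 level using
    the Method 19 dry F-factor relationship:
        E = C_d * Fd * 20.9 / (20.9 - %O2)
    solved for the concentration C_d and expressed in ppmvd."""
    c_d = ef_lb_per_mmbtu / (fd_dscf_per_mmbtu * 20.9 / (20.9 - o2_pct))
    return c_d * MOLAR_VOL / MW_NO2 * 1.0e6

# Example: 0.1 lb/MMBtu NOx at 100 MMBtu/hr heat input, natural gas
# (Fd approximately 8710 dscf/MMBtu), 15 percent O2 in the stack gas.
mass = mass_rate_lb_per_hr(0.1, 100.0)    # 10.0 lb/hr
ppm = ppmvd_from_ef(0.1, 8710.0, 15.0)    # roughly 27 ppmvd
```

Because the oxygen correction fixes the term in brackets, lb/million Btu
and ppmvd scale linearly with one another, which is the proportionality
the commenter relies on.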

PEMS for O&M Monitoring

	75.  Comment:  The EPA should eliminate reference to control device O&M
monitoring purposes from PS-16 and provide a definition for the other
three PEMS applications identified.  The proposal identifies four types
of PEMS applications: 

• Market-based programs, 

• Continual compliance determination with an emission limit, 

• Measurement of excess emissions, and 

• Performance indication for control device O&M. 

	The latter application does not predict emissions, and its inclusion
will cause confusion for parameter monitoring requirements that are
common for NSPS and MACT standards.  The EPA should eliminate reference
to using PEMS for O&M monitoring.  In discussing the different
applications for PEMS, EPA indicates that more rigorous criteria may be
warranted for some applications – such as units used for market-based
programs. We agree that different performance metrics may be warranted,
but also believe that the specification should clearly address only
those monitoring systems that are predicting emissions.  PS-16 should
not serve as the basis for validating the performance of parameter
monitoring systems, which are common in NSPS and MACT standards under 40
CFR, Parts 60 and 63. 

	While EPA may believe that PS-16 is not intended for application to
parameter monitoring and that no such link to PS-16 is included in the
governing regulations, history associated with implementing regulations
clearly indicates that published specifications and methods related to
emissions measurement or monitoring systems often are applied, or become
de facto standards, for applications beyond their “mandated” use. 
It is imperative that EPA acknowledge and address the potential for this
to occur.  In the PS-16 proposal, reference to systems that “indicate
control device operation and maintenance (O&M)” provides a directive
that will surely result in misapplication of PS-16 to applications where
parameter monitoring is required.  If EPA intends for such application
of PS-16, we are adamantly opposed to such an interpretation.  Such a
requirement will dramatically increase the monitoring burden and
associated cost for a number of standards, and will result in increased
stringency and rigor for some existing regulations without providing the
opportunity for public comment.  If EPA does not intend for PS-16 to be
applied in this manner, then revisions should be made to the proposal. 

	Another issue with the four identified applications of PEMS is that the
types of use are not discussed or defined.  A PEMS is defined as a
system that “predicts an emission concentration or emission rate.” 
Device O&M monitoring typically provides an indication of equipment
performance – not an emissions prediction.  There are many examples of
operating parameter monitoring in NSPS and MACT standards, such as
catalyst temperature monitoring or fuel rate monitoring, which could be
misconstrued as PEMS based on the proposal reference to O&M monitoring. 
Thus, including control device O&M monitoring as an application of PEMS
in the PS-16 proposal is inappropriate and should be deleted.  In
addition, if EPA includes different performance criteria for the types
of PEMS applications identified, then definitions should be added to
PS-16 to clearly differentiate the types of systems so that performance
criteria applicability can be properly identified and applied.  (0019)

	Response:  In proposed PS-16, emissions monitoring as an indication of
control device operation and maintenance and monitoring for excess
emissions represent the same systems.  In these cases, as the commenter
points out, a concentration or emission rate, not a parameter limit, is
specified.  We have dropped the reference to PEMS use for control device
operation and maintenance.  We have added language to PS-16 noting that
the specifications do not apply to parametric monitoring.  

	76.  Comment:  We recommend that PS-16 not differentiate between O&M
PEMS and compliance PEMS because all PEMS are used to determine
compliance with emission standards under 40 CFR Part 60 or the federally
recognized Title V permits. The notion that some emission monitoring
systems (PEMS or CEMS) are used just to document compliance with O&M
requirements as was done under the older requirements of 40 CFR Part 60
Subparts does not reflect the newer continuous compliance requirements
imposed under Title V permitting requirements necessitating the need to
document compliance with emission limitations if emission monitoring
systems are present.  Section 1.1 of PS-16 entitled “Does this
performance specification apply to me?” seems to have this concept
correctly reflected: “If you, the source owner or operator, intend to
use a predictive emission monitoring system (PEMS) to show compliance
with your emission limitation(s), you must use the procedures in this
performance specification (PS) to determine whether your PEMS has
acceptable performance.”  Nowhere in Section 1.1 does it mention using
the PEMS to document compliance with O&M requirements.  (0028) 

	Response:  We have dropped the proposed reference to O&M PEMS.  See the
response to Comment 75.  

PEMS with Less Than 3 Parameters

	77.  Comment:  The requirement to acquire special approval for PEMS
that use less than three input parameters is arbitrary and may limit
sensor advances and PEMS technology development.  Since the performance
criteria define whether a PEMS is adequate, a limitation on the number
of input parameters is unnecessary. 

	The definition of PEMS in Section 3.6 indicates that special approval
from the Administrator is required for systems that use fewer than three
parameters.  The EPA provides no basis for this restriction.  We
continue to participate in research investigating and advancing sensor
technology, and the PS-16 proposal may impose restrictions that limit
technology progress.  Since the audits and quality assurance criteria
provide a demonstration of technology adequacy, the arbitrary limitation
on parameter count is unnecessary. 

	We are aware of ongoing efforts related to identification and
optimization of parameters for equipment monitoring.  While current
understanding may not be able to specify one or two parameters that can
serve as the basis of a PEMS, investigative research and analysis in
this area is ongoing.  EPA should not artificially restrict this field
of endeavor. 

	It is possible that the requirement to use three or more parameters (or
otherwise petition for special dispensation) is drawn from Acid Rain
monitoring provisions.  For large Part 75 sources with voluminous
exhaust flows – and a requirement for determination of emission rate
– multi-parameter approaches may be the only viable approach now and
in the future.  However, PEMS have broader potential application and may
be applied to less complex units that only require reporting of
concentration rather than emission rate.  For less complex units,
prediction of a concentration rather than a rate, and the potential for
sensor advances, PEMS based on fewer parameters may become viable. 
Increasing the approval burden is not warranted and the PS-16 proposal
should eliminate reference to minimum parameter counts.  (0019) 

	Response:  At this time, the Agency is not aware of any two-variable
PEMS that shows reliable promise.  We are requiring Administrator
approval for PEMS with fewer than three input parameters to discourage
the use of simple, non-robust systems.  If research shows at a later
date that PEMS with fewer than three input variables produce acceptable
and reliable performance, then PS-16 can be amended to allow their use
without special approval.

Part-time PEMS Use

	78. Comment:  Several units using PEMS may operate very infrequently,
in which case it would be neither economical nor meaningful for these units
to comply with the initial certification and ongoing QA requirements
provided in the proposed PS-16.  I recommend that alternative
requirements or exemptions be provided in the final PS-16 for such
units.  (0020)

	Response:  Systems that will be operated very infrequently are best
addressed on a case-by-case basis.

Flexibility for Unique Operation Scenarios

	79.  Comment:  We support the agency’s goal of providing a uniform PS
for PEMS.  However, we are concerned that the draft PS could limit the
flexibility of State and Local agencies to respond to unique operation
scenarios at facilities.  We would encourage EPA to add language to
draft PS-16 that allows the State and Local agencies some limited
discretion when approving PEMS for use at existing facilities.  (0022)

	Response:  Discretion for alternatives to monitoring requirements has,
in some cases, already been delegated to the States by EPA.  Approval
for alternatives to monitoring requirements that are used to show
compliance with Federal rules has been reserved by the Agency.  Unique
operating scenarios are best addressed on a case-by-case basis.

Sensor Evaluation System 

	80.  Comment:  We recommend that the term “Sensor Validation
System” be used, since “Sensor Evaluation System” is technically an
incorrect PEMS term and all past PEMS documents published by EPA use
“Sensor Validation System.”  This would cause less confusion for
existing and new PEMS.  (0028) 

	Response:  We have chosen to use “sensor evaluation system” instead
of  “sensor validation system” because the latter term is the
prescribed name of the validation system of a leading PEMS manufacturer.


	81.  Comment:  We recommend that criteria for the ability of the
sensor evaluation system to detect failed sensors and to reconcile
failed sensors and other input values be established, as was done in the
January 1996 PEMS example requirements.  The proposed PS-16 is too vague
and will result in disparate requirements across the country when
implemented by the State and Local agencies.  Also, case-by-case
approval will burden the agencies even in cases of previously approved
sensor evaluation system types.  We recommend that the following
criteria be followed, as specified in the January 1996 PEMS Protocol: 

Prior to the initial RATA, a demonstration of the ability of the PEMS to
identify failed sensors and to reconcile failed sensors while
maintaining the accuracy of the PEMS to within 20% of the original PEMS
value will be performed.  This demonstration will be conducted over the
entire operating range of the emission unit.  The demonstration will
consist of: artificially failing each sensor and then ascertaining the
accuracy of the PEMS when utilizing the calculated sensor value;
artificially failing each combination of sensors and then ascertaining
the accuracy of the PEMS when utilizing
the calculated sensor values; and the ability of the PEMS to alert the
operator regarding the status of PEMS accuracy in the unlikely event of
sensor failure or failures.  The results of the demonstration will be
reported in the initial verification test report.  (0028) 

	Response:  The specifications for the 1996 PEMS protocol were written
around the most popular PEMS in use at that time.  Since then, the use
of other simpler PEMS has necessitated our expanding the criteria to fit
a variety of systems.  We have added flexibility to accommodate other
acceptable systems.  We do not believe a system to check for failed
sensors on a daily basis needs to be complicated to be effective.  We do
not believe this flexibility adds vagueness to the rule or results in
disparate applications.   
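The single-sensor portion of the demonstration described in Comment 81 can be sketched in code.  This is a hypothetical illustration only: `predict` and `reconcile` stand in for a real PEMS model and its reconciliation routine, and the weights, historical averages, and sensor values are invented.

```python
# Hypothetical sketch of the 1996-protocol demonstration: fail each sensor
# in turn, let the PEMS reconcile it, and verify the reconciled prediction
# stays within 20 percent of the original prediction.

def predict(inputs):
    # Stand-in PEMS model: a simple weighted sum of sensor inputs.
    weights = {"fuel_flow": 0.5, "o2": 2.0, "temp": 0.01}
    return sum(weights[name] * value for name, value in inputs.items())

def reconcile(inputs, failed):
    # Stand-in reconciliation: replace the failed sensor with a
    # calculated value (here, a fixed historical average).
    historical = {"fuel_flow": 100.0, "o2": 3.0, "temp": 900.0}
    patched = dict(inputs)
    patched[failed] = historical[failed]
    return patched

def demonstrate(inputs, tolerance=0.20):
    """Return {sensor: passed} for the single-sensor failure demonstration."""
    baseline = predict(inputs)
    return {
        sensor: abs(predict(reconcile(inputs, sensor)) - baseline)
                <= tolerance * abs(baseline)
        for sensor in inputs
    }

report = demonstrate({"fuel_flow": 102.0, "o2": 3.1, "temp": 905.0})
```

A full demonstration per the protocol would also fail each combination of sensors and repeat the check across the unit's entire operating range.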

	82.  Comment:  Sections 6.1.8 and 9.2 require the sensor evaluation
system to check the integrity of each PEMS input at least daily.  Given
the limited number of hours a peaking station must operate its
turbine-generators, we recommend daily sensor evaluation checks be
conducted only when the units have operated for more than one hour in
any calendar day.  (0024)

	Response:  We agree with the commenter and have added a note that the
sensor evaluation check is only required if the PEMS is operated for
more than one hour in any calendar day. 

Graphic Representation of PEMS Relationships

	83.  Comment:  We find the use of graphical representations crucial
for seeing the relationships between process sensors and emissions.  The
presentation of these graphs should be required to document the PEMS
provider's recommended extrapolation.  

	Response:  In Section 6.1.2, the PS does require that the integrity of
the parameter operating envelopes be demonstrated using graphs and data
from the PEMS development process.  Section 6.1.5 also states that if
your PEMS is developed on the basis of linear or nonlinear regression
analysis, you must make available the paired data (preferably in graphic
form) used to develop or train the model.

	84.  Comment:  A graph of the distribution of residuals (observed -
predicted emissions) should be prepared for all PEMS data used in model
development, in the units of the standard.  This plot will demonstrate
whether or not the residuals are distributed normally.  (0038)

	Response:  See the response to Comment 83.
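A minimal sketch of the residual analysis the commenter describes, using invented emission values (all numbers here are hypothetical):

```python
# Residuals are observed minus predicted emissions, in the units of the
# standard.  A rough normality screen: for approximately normal residuals,
# about two-thirds of the values fall within one standard deviation of
# the mean.
from statistics import mean, stdev

observed  = [41.2, 39.8, 40.5, 42.1, 38.9, 40.0, 41.5, 39.4]   # ppm (invented)
predicted = [40.8, 40.1, 40.2, 41.6, 39.5, 40.3, 41.0, 39.9]   # ppm (invented)

residuals = [o - p for o, p in zip(observed, predicted)]
m, s = mean(residuals), stdev(residuals)
within_1s = sum(abs(r - m) <= s for r in residuals) / len(residuals)

print(f"mean residual = {m:.3f} ppm, std dev = {s:.3f} ppm")
print(f"fraction within 1 std dev: {within_1s:.2f}")
```

In practice the residuals would be plotted as a histogram; a formal normality test could replace the one-standard-deviation screen used here.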

Sensor Envelopes

	85.  Comment:  We recommend that PEMS be allowed to be extrapolated by
5 percent beyond the range of the process sensor values encountered
during the RATA testing.  We recommend that PEMS not be limited to the
range of process sensor values encountered during PEMS RATA testing as
this may unduly limit the operating range of the PEMS compared to the
data used to create the PEMS.  The EPA clearly recognizes the need to
extrapolate as this is allowed for ambient conditions.  

	For backup fuels, no RATA testing of the PEMS for the backup fuel is
required if “the effects of the alternative fuel on predicted
emissions or diluent were addressed in the model training process.” 
This is identical to the case of the wider operating window for a
process sensor discussed above.  During the data gathering to create the
PEMS and the subsequent model building process, the effects of a wider
operating window for each process sensor and ambient sensor used in the
PEMS were taken into account.  (0028) 

	Response:  Extrapolation is allowed for ambient conditions because
yearly variations cannot be simulated during the model development
tests.  The PEMS operation is restricted to the sensor envelopes
encountered in the PEMS development, not the initial RA test.

	86.  Comment:  In no case should the PEMS be limited to the range of
process sensors encountered during the RATA tests.  Instead, PEMS should
be limited to the process sensor ranges encountered during the data
gathering to create the PEMS.  In some cases, RATA testing may be
limited due to the economics of the process unit on the day or days of
testing (i.e. must run at higher rates because of an unexpected outage
and can only vary fuel flow by 20 percent whereas during data gathering
fuel flow was varied by 50 percent).  Since the PEMS was developed over
a wider range than the subset range evaluated during the successful RATA
testing, penalizing the PEMS owner by ruling the PEMS as inaccurate
simply because the process sensor range was not available during the
RATA testing would unjustly punish the PEMS owner and not reflect the
effort used to create the PEMS for the wider range of operation. 

	Response:  We agree that PEMS operation is acceptable if process
sensors operate within the ranges encountered during the data gathering
to create the PEMS, not the ranges encountered during the RATA tests. 
We have ensured that PS-16 plainly says this.

	87.  Comment:  The EPA should allow some variation to the operating
ranges of any operating parameter as part of the QA evaluation.  It is
recommended that input parameters be allowed to vary ±15 percent outside
of the range during the initial test program to allow for some
instrument drift, measurement uncertainty, and operating flexibility. 
Our experience has shown this to be an acceptable range for the
prediction models to be considered accurate.  (0038)

	Response:  See the response to Comment 85.

Input Calibrations

	88.  Comment:  All sources should be required to calibrate input
parameter controls on a minimum of a quarterly basis, and document the
calibrations.  (0038)

	Response:  A good QA program would incorporate calibrations and
documentation on a regular basis.  The performance specification
requires sensor recalibrations per manufacturer recommendations.  These
recalibrations will vary depending upon the type of sensor.  We have not
specified a recalibration interval because of this, and we believe the
daily sensor evaluation check will detect malfunctioning sensors on an
ongoing basis.  

Detailing Neural Network PEMS Calculations

	89.  Comment:  Neural network PEMS are basically black-box prediction
algorithms, and it is not possible to provide the EPA with the
calculation algorithms for verification.  Quite often, neural network
PEMS are retrained prior to annual RATA testing, and therefore, one
cannot accurately determine the relative accuracy of the pre-training
PEMS.  Furthermore, PEMS calculations should always be published so that
one could have the ability to verify emissions predictions after the
fact.  For instance, facility data acquisition systems log operating
parameters used for the predictive emissions calculations.  When the
equation is published, EPA could manually verify a prediction as part of
a PEMS audit to demonstrate that the emissions are being calculated
accurately and correctly.  With neural networks, this type of audit is
not possible and hence, should not be allowed.  PEMS calculations should
always be published in a report to ensure that the same calculations are
being made throughout the year.  (0038)

	Response:  A PEMS is not allowed to be retrained before an initial or
annual RATA test.  The algorithms and trained models used in neural
network PEMS do not lend themselves to manual calculation checks. 
Understanding such complex neural network relationships and making spot
check calculations would be beyond the capabilities of most auditors and
report reviewers anyway.  We do not believe requiring such PEMS to
publish their predictive relationships would offer much benefit.

Data Substitution

	90.  Comment:  Data substitution of input parameters should not be
allowed. Although it is technically possible, for instance, to provide
an expected oxygen value as a function of fuel flow, it allows the
source to delay corrective action of its metering. Should a PEMS input
parameter signal be lost, the PEMS should be flagged as “input
parameter malfunction” and the system should be considered out of
control or out of service.  PEMS calculations would therefore be invalid
until the input signal was repaired.  In the event of an input signal
malfunction, it is recommended that EPA require a data substitution
routine whereby the source must substitute the maximum emission rate
from the previous 90 days (or year) for all hours that the unit is
operating, and the input signal(s) required for a calculation are out of
service.  (0038)

	Response:  Data substitution of input parameters is only allowed with
Administrator permission.  Permission would be contingent upon a
corrective action plan that limits the use of substitution data after a
fault.  The Agency has studied the data substitution capabilities of a
major PEMS system and has found the process to have merit.  The proposed
application of PS-16 for market-based monitoring has been dropped from
the final rule.
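The substitution routine the commenter recommends could be sketched as follows; the 90-day lookback and worst-case substitution come from the comment, while the record layout, field names, and values are invented for illustration:

```python
# When an input signal is out of service, substitute the maximum
# quality-assured emission rate from the previous 90 days for each
# affected operating hour.
from datetime import datetime, timedelta

def substitute(hourly_records, now, lookback_days=90):
    """hourly_records: list of (timestamp, emission_rate, inputs_ok)."""
    cutoff = now - timedelta(days=lookback_days)
    valid = [rate for ts, rate, ok in hourly_records if ok and ts >= cutoff]
    worst_case = max(valid)  # assumes at least one valid record in the window
    return [(ts, rate if ok else worst_case, ok)
            for ts, rate, ok in hourly_records]

now = datetime(2008, 9, 1)
records = [
    (now - timedelta(hours=3), 38.0, True),
    (now - timedelta(hours=2), 45.5, True),   # highest valid rate
    (now - timedelta(hours=1), 12.0, False),  # input signal malfunction
]
records = substitute(records, now)
```

The malfunctioning hour is reported at the 90-day maximum rather than at the (invalid) predicted value; valid hours pass through unchanged.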

PEMS Design

	91.  Comment:  A means analysis as a function of load conditions should
be prepared as part of the PEMS report.  This would show the min, max,
and mean value for all of the various operating parameters during the
test program.  For instance, with a gas-fired boiler, testing might take
place at 30, 40, …100 percent load conditions.  The means analysis
would show the min, max, and mean value for each operating parameter and
measured emission levels for every load condition to establish the
variation of each parameter across the operating range.  We have
examples of this for EPA review, if desired.  (0038)

	Response:  An analysis of means may be a useful tool in evaluating PEMS
and may be used in PEMS development.  However, we believe the
statistical tests that are currently required in PS-16 adequately
establish PEMS sufficiency.
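The means analysis suggested in Comment 91 might be sketched as below; the parameter names and values are invented for illustration:

```python
# Min, max, and mean of each operating parameter, grouped by load condition.
from collections import defaultdict

runs = [  # invented test-program data
    {"load": 30, "fuel_flow": 31.0, "o2": 5.1},
    {"load": 30, "fuel_flow": 29.5, "o2": 5.4},
    {"load": 100, "fuel_flow": 98.2, "o2": 2.9},
    {"load": 100, "fuel_flow": 101.0, "o2": 3.1},
]

by_load = defaultdict(lambda: defaultdict(list))
for run in runs:
    for name, value in run.items():
        if name != "load":
            by_load[run["load"]][name].append(value)

# summary[load][parameter] = (min, max, mean)
summary = {
    load: {name: (min(vals), max(vals), sum(vals) / len(vals))
           for name, vals in params.items()}
    for load, params in by_load.items()
}
```

The same grouping would be applied to the measured emission levels to show the variation of each parameter across the operating range.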

Specific Comments by Section

Section 1.0

	92.  Comment:  The applicability of PS 16, as currently proposed, is
very broad. We suggest limiting the applicability to only those PEMS
used to satisfy continuous emission monitoring requirements in lieu of
CEMS.  PS 16 should not be required for sources using standard emission
calculation methodologies (like AP-42 or unit specific emission factors)
as a means to demonstrate compliance with emission limits in permits or
state rules, which require periodic monitoring instead of continuous
emission monitoring.  An example would be emissions from a small boiler
that are calculated based upon fuel usage. The proposed version of PS-16
leaves it up to the discretion of the regulatory agency or permit writer
to apply PS-16 in these types of situations.  (0010)

	Response:  Predictive emission monitoring systems are used for
continuous monitoring applications, not periodic monitoring. 
Performance Specification 16 may be used in the example of the small
boiler emission being predicted from fuel usage if the prediction
contains at least three variables.

	93.  Comment:  EPA should clearly indicate in this section that PEMS
are not mandated and are used at the discretion of the owner/operator as
an alternative to regulatory requirements for CEMS.  The preamble to the
PS-16 proposal notes that PEMS are not mandated by any Federal rule, and
indicates that PEMS are sometimes used for steam generating units as an
alternative to CEMS.  We agree with this statement, and recommend that
EPA clearly state in this section that PEMS are a CEMS alternative for
application to combustion sources.  This is necessary to ensure that the
specification published in Appendix B of 40 CFR, Part 60 clearly
provides the context for PEMS use. In addition, it should be stated that
PEMS are used at the discretion of the owner/operator as an alternative,
pursuant to acceptance by the regulatory authority. 

	The General Information section in the preamble states “Predictive
emission monitoring systems are not currently required in any Federal
rule.  However, they may be used under the NSPS to predict nitrogen
oxides emissions from small industrial, commercial, and institutional
steam generating units.  In some cases, PEMS have been approved as
alternatives to CEMS for the initial 30-day compliance test at these
facilities. Various State and Local regulations are incorporating PEMS
as an emission monitoring tool.”  We understand that EPA is proposing
PS-16 to provide standard criteria for evaluating PEMS, which are
beginning to be implemented as an alternative to CEMS for some
applications.  However, the specification should clearly indicate that
PEMS are not mandated and that PEMS are used as an alternative to other
monitoring requirements, such as a requirement for CEMS.  In addition,
PEMS are still a developmental technology and the specification should
clearly indicate that PEMS are used at the discretion of the owner/
operator.  This context needs to be clearly stated in this section.
Since the explanatory text in the preamble is not included in the
specification that will be published in Appendix B of Part 60, EPA
should add the appropriate text within the body of PS-16 to ensure
clarity in regard to PEMS application.  (0019)

	Response:   Performance Specification 16 does not mandate PEMS but sets
the performance criteria that PEMS must meet for acceptability. 
Although PS-16 is published in Appendix B of Part 60 and applies to
parts 60, 61, and 63, it may be applied to PEMS wherever mandated.  

Section 1.1

	94.  Comment:   The proposal states that if your PEMS contains a
“diluent measuring component” then it must be tested as well. This
requirement does not match the requirements for CEMS.  Sources should
calculate the RATA results directly in terms of the standard as done
under Parts 60 and 75. Separate testing of the diluent component is not
necessary and should be removed.  (0018)

	Response:   The diluent component must be tested as part of the PEMS if
the terms of the standard include the diluent.  This has been clarified.

	95.  Comment:  The proposed performance specification should only apply
to PEMS developed and implemented after the promulgation date of the
final specification. Owners or operators with existing PEMS have already
come to some agreement with their local permitting authorities on the
accuracy, RATA validation, and associated QA/QC requirements, and
therefore, it would be overly burdensome to require, or imply, that an
existing PEMS would have to be re-developed and validated.  Based upon
the proposed standard the cost of the initial development and validation
of a PEMS would approach, or exceed, the cost to install a CEMS system. 
Owners or operators of existing PEMS should not be required to absorb
the added costs required to resubmit a second time.  (0026) 

	Response:  It is not the intent in promulgating PS-16 that
currently-certified PEMS be recertified.  Performance Specification 16
applies to new PEMS that are brought into operation after its
promulgation date. 

Section 1.1.1

	96.  Comment:  The proposed rule states, “We require that a relative
accuracy (RA) test and accompanying statistical tests be passed in the
initial certification test before your PEMS is acceptable for use in
demonstrating compliance with applicable requirements.”  Since the
source must be operating in order to conduct a RATA, the specification
should allow for an appropriate window of time after startup in which to
conduct the tests required for PEMS initial certification. This approach
is consistent with the majority of underlying regulations, which may
allow sources up to 180 days (after startup) to conduct initial
certification tests.  (0010)

	Response:  Predictive emission monitoring systems are subject to the
same certification timelines after source startup as CEMS.  Those
timelines are more appropriately discussed in the applicable regulations
and not the PS.  For the New Source Performance Standards, these
timelines are discussed in Part 60.13.   

Section 2.1.2

	97.  Comment:  Section 2.1.2 refers to Section 8.1.6 which does not
exist.  This should be corrected to refer to Section 8.2.  (0015)

	Response:  The commenter is correct.  This revision has been made.

Section 3.3 

	98.  Comment:  The definition of defective sensor should exclude the
text "operates outside the approved operating envelope".  A defective
sensor should only include the calibration drift outside of prearranged
limits or failure.  Sensors that read outside of the demonstrated
operational range are not defective.  The sensors are still providing
valid or "real" process data; the data are simply outside the range or
envelope where the PEMS was validated. 
(0026)

	Response:  We have clarified the definition to note that a defective
sensor may be functioning properly but its input is undefined by the
PEMS and the resulting predicted emission is not validated. 

Section 3.5

	99. Comment:  This section provides the following definition:
“Operating envelope means the defined range of a parameter input that
is established during PEMS development.  Emission data generated from
parameter inputs that are outside the operating envelope are not
considered quality assured and are therefore, unacceptable.”
Oftentimes, a device used to satisfy required parametric monitoring
requirements will also be used as part of a PEMS.  Thus, there is
potential for overlap and conflict between the PEMS operating envelope
and OPLs established during a compliance performance test.  It is
unclear how a source would deal with these types of situations. 
Additionally, it is unclear which performance specification (PS-16 or
the future PS-17) would apply to sensors used for both parametric
monitoring and as part of a PEMS.  (0010)

	Response:  See the response to Comment 70.  Where PEMS and parametric
input sensors overlap, the sensors that are part of the PEMS must comply
with PS-16, and the sensors that are part of the parametric system must
comply with PS-17.

	100.  Comment:  The EPA uses the phrase ”operating envelope” and we
generally understand the intent is to define those parameter values that
would be expected to occur during normal operation.  We seek
clarification, however, that it is not the EPA’s intention to require
sources to conduct reference method relative accuracy testing of the
PEMS at conditions that exceed permitted emissions limits.  We
understand that the PEMS must accurately predict exceedances, but have
concern that State and Local agencies may interpret the validation
requirements to include testing above the emission limits.  (0022)

	Response:  The commenter is correct.  It is not EPA’s intention to
require sources to conduct reference method relative accuracy testing of
a PEMS at conditions that exceed permitted emission limits.

Section 3.6

	101.  Comment:  The definition of PEMS in the last sentence reads: “A
PEMS may or may not predict emissions data that are corrected for
diluent.”  We request clarification as to the purpose or underlying
intent of this statement, specifically the wording “may or may not.”
 We propose wording to clarify this, such as: “A PEMS may predict
emissions data that are corrected for diluent if all relevant QA tests
are passed.”  (0016)

	Response:  We have clarified the statement to read:  “A PEMS may
predict emissions data that are corrected for diluent if the relative
accuracy and relevant QA tests are passed in the emission units
corrected for diluent.”

Section 3.9 

	102.  Comment:  The definition of RAA should be revised so that the
audit is required every “QA operating quarter” in accordance with
the definition in §72.2.  This would reduce the testing burden for
infrequently operated units and reduce unnecessary emissions that would
be incurred for a unit that would not otherwise operate but would simply
be brought online to meet the audit requirement.  It is recommended that
PS-16 also include grace periods similar to those in Part 75 to allow
sources time to perform tests (RAAs and RATAs) after coming on- line
after an outage.  (0018)

	Response:  The elements of the Part 72.2 definition of QA operating
quarter have been incorporated in the definition of RAA.  A grace period
similar to the allowance in Part 75 is better addressed in the general
monitoring provisions rather than in the PS.

	103.  Comment:  We ask that the requirement for a quarterly audit using
a portable analyzer be removed.  Our experience on uncontrolled turbines
has demonstrated that predictive correlations using process parameters
are stable with time.  As long as the instrumentation is maintained and
equipment hardware design is unchanged, PEMS have been found to be
stable.   (0026)

	Response:  See the response to Comment 55.  The quarterly audits are to
ensure that sensors and instrumentation are maintained and equipment
hardware design is unchanged. 

Section 3.10 

	104. Comment:  It is recommended that the RATA frequency be once every
four QA operating quarters consistent with the requirement for RATAs in
Part 75.  (0018)

	Response:  See the response to Comment 102.

	105.  Comment:  The definition of a RATA includes the following
statements; “The relative accuracy test audit means a RA test that is
performed at least once every four calendar quarters while the PEMS is
operating at the normal operating level.  The RATA must not be conducted
in consecutive quarters.”  The last sentence in this definition is not
necessary and we recommend that it be omitted.  There may be situations
in which a source would choose to conduct a RATA in consecutive quarters
(significant changes to the PEMS or major component replacement).  If
the agency is concerned about the elapsed time between RATAs being no
greater than 3 quarters, this is appropriately addressed in the first
sentence, which requires the RATA to be performed at least once every
four calendar quarters.  (0010)

	Response:  We believe the sentence that RATAs not be conducted in
consecutive quarters is important to keep RATAs dispersed and more
representative of PEMS performance over time.  The first sentence only
requires one RATA every four calendar quarters, which could result in
operating periods of six quarters between RATAs. 

Section 6.1.2

	106.  Comment:  We recommend rewording the second sentence of this
section as follows:  “Before you evaluate your PEMS through the
certification test, you must specify the input parameters your PEMS
uses, define their range of minimum and maximum values (operating
envelope), and demonstrate the integrity of the parameter operating
envelope using graphs and data from the PEMS development process to
provide justification for the parameter operating envelope.”  We
recommend the Agency allow sources to use any of the following tools to
justify an appropriate operating envelope: vendor information,
engineering calculations, PEMS test data and extrapolations, as
appropriate.  (0010)

	Response:  We have taken the commenter’s recommendation and reworded
the sentence as follows:  “Before you evaluate your PEMS through the
certification test, you must specify the input parameters your PEMS
uses, define their range of minimum and maximum values (operating
envelope), and demonstrate the integrity of the parameter operating
envelope using graphs and data from the PEMS development process, vendor
information, or engineering calculations, as appropriate.”

	107.  Comment:  I have concern that the requirement to demonstrate the
integrity of the parameter operating envelopes could be reviewed with
undue stricture. The validity of the relationship between the operating
parameters and emissions does not magically vanish simply because the
value of a parameter extends beyond that experienced during the training
period or exhibited during a performance test.  One would expect that
the correlations that exist within the range of the training data would
naturally extend, to a reasonable degree, beyond such an envelope.  The
validity of a PEMS parameter envelope is dictated by the strength of the
relationship, not the range of test points.  Limiting PEMS operation to
the range of the parameter inputs encountered during the certification
test is unnecessarily restrictive.  (0018)

	Response:   Section 6.1.2 only requires that the parameter operating
envelopes be clearly defined.  The validity of a PEMS parameter envelope
may be dictated by the strength of the relationship but that
relationship must be established up front and validated through graphs,
PEMS development data, vendor information, or engineering calculations. 
Parameter operating envelopes are only restricted to the ranges
experienced during the certification test when the operating envelopes
are not clearly defined before the certification test.  The envelopes
may, however, be expanded later when new information is provided to
support such expansions. 
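An operating-envelope check along these lines could be sketched as follows; the envelope limits and parameter names are invented for illustration:

```python
# Each PEMS input is compared with the envelope established during model
# development; predictions from out-of-envelope inputs are not quality
# assured.

ENVELOPES = {  # (minimum, maximum) per parameter, invented values
    "fuel_flow": (20.0, 110.0),
    "o2": (1.5, 8.0),
    "temp": (600.0, 1100.0),
}

def quality_assured(inputs, envelopes=ENVELOPES):
    """Return (ok, exceedances) for one hour of PEMS input data."""
    exceedances = [name for name, value in inputs.items()
                   if not (envelopes[name][0] <= value <= envelopes[name][1])]
    return (not exceedances, exceedances)

ok, bad = quality_assured({"fuel_flow": 115.0, "o2": 3.0, "temp": 900.0})
```

Here the fuel-flow input exceeds its envelope, so the hour's prediction would be flagged as not quality assured; expanding an envelope later, with supporting information, amounts to updating the limits in the table.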

	108.  Comment:  We find the proposed wording to be unclear as to
whether EPA is requiring operation within the bounds of data points used
to develop the model, or within the bounds of data points tested in the
certification RATA.  It is presumed that the requirement relates to data
points used to develop the model because the alternative interpretation
of restricting operations to the bounds of data points tested during the
certification RATA would lead to totally impractical operating
restrictions due to the inability to test through a full range of
uncontrollable conditions such as ambient temperature and humidity.  We
believe that a certain level of extrapolation beyond the actual data
points tested is both reasonable and appropriate and such extrapolation
should be expressly authorized.  For example, in Section 6.1.4, EPA
addresses the use of extrapolation to address issues such as changes in
absolute ambient humidity.  We urge EPA to allow the use of emission
data that is extrapolated to some percentage beyond the last data point
if graphic or mathematical relationships indicate that this data is
valid.  (0036)

	Response:  See the response to Comment 85.  The wording in Section
6.1.2 has been clarified to note that PEMS operation must remain within
the parameter envelopes established during model development.  

Section 6.1.4

	109.  Comment:  This section requires a detailed analysis of the
effects of ambient conditions upon the PEMS.  This particular section
contains helpful guidance for the development of a PEMS and would be
more appropriately contained in a guidance document instead of a PS. 
(0010)

	Response:  This section gives general information and suggestions on
how one might explain the effects of ambient conditions on a PEMS, if
they exist.  It is intentionally brief and non-prescriptive.

Section 6.1.7

	110.  Comment:  This section addresses sensor location and repair by
stating, “We recommend you install sensors in an accessible location
in order to perform repairs and replacements.  Permanently installed
platforms or ladders may not be needed.  If you install sensors in an
area which is not accessible, you may be required to shut down the
emission unit to repair or replace a sensor.”  As with Section 6.1.4,
this section provides helpful guidance and we believe it would be more
appropriately placed in a guidance document instead of the PS.  Sensor
accessibility issues, which could result in shutting down an emission
unit in order to make repairs, should be left to the discretion of the
PEMS owner/operator.  (0010)

	Response:   Even though sensor accessibility issues are left to the
discretion of the PEMS owner/operator, we believe the note is a helpful
reminder against future problems with PEMS operation due to inaccessible
failed sensors.  Equipment specifications are also an important element
of performance specifications and sensor siting precautions are added
here as helpful suggestions.  

	111.  Comment:  The PS contains the following statement:  “If
necessary after repairing or replacing a sensor, correct the process
data to match the data obtained from the originally tested sensor, or
conduct another RA test.”  Further guidance and explanation should be
provided to clarify “correct the process data to match the data
obtained from the originally tested sensor.”  It is unclear what type
of data corrections are required by this statement.  (0010)

	Response:  In cases where a reference value is set for a sensor that is
being replaced, the new sensor response may be adjusted to match the
original sensor output.    The statement in Section 6.1.7 has been
clarified.

	112.  Comment:  We agree that sensors should be calibrated as often as
needed.  However, calibration intervals should not be restricted by
manufacturer's recommendations.  Manufacturers can not conceive of all
possible operating environments nor can they be expected to customize
calibration intervals for each customer.  It is recommended that
manufacturer's recommended intervals be the starting point and that the
flexibility be allowed for the owner or operator to be able to change
the calibration interval based upon the owners or operators own
experience.  In the instances where the calibration interval is
different than the manufacture's recommended calibration interval,
documentation supporting the change would be maintained as a
recordkeeping requirement.  (0026) 

	Response:   The commenter’s recommendations are currently in the
performance specifications.  The sensors may be calibrated as frequently
as desired as long as the manufacturer recommended calibration interval
is not exceeded.  However, exceeding the manufacturer recommended
calibration interval because of owner or operator experience is best
handled on a case-by-case basis by petitions through the EPA Regional
Office or the Permitting Authority.

Section 6.1.8

	113.  Comment:  The proposal states that, “you must have prior
approval before you use reconciled data.”  In order to facilitate the
Agency’s stated goal to “establish standardized performance
requirements” for PEMS, I suggest including a standardized procedure
for making such a determination.  If the PEMS allows the parameter input
data to be manually invalidated and can calculate new results based on
reconciled inputs, then relative accuracy determinations can be made
based on various combinations of actual and reconciled data sets.  I
recommend that reconciliation be allowed only for combinations where the
PEMS was able to pass the relative accuracy requirement using reconciled
data at each load level during initial certification.  (0018)

	Response:  The EPA does not have sufficient information on potential
systems that reconcile data to establish standardized performance
requirements at this time.  We have evaluated the data reconciliation
system of a prominent PEMS provider and found that it works reliably.
However, the purpose of the underlying monitoring requirements and the
confidence in the reconciling techniques themselves favor approval of
this practice on a case-by-case basis. 

Section 6.1.9

	114.  Comment:  With respect to the concept of “parameter envelope
exceedances,” we ask that the Agency help the regulated community
understand the consequences of the data outside the operating range
before taking PS-16 to final promulgation, and how this condition
relates to the “operating envelope.”  Specifically, what are the
compliance consequences of parameter envelope exceedances?  Is data
collected under these conditions treated as missing data from continuous
emissions monitoring systems?  We would have concerns if parameter
envelope exceedances have an additional significance with respect to
compliance with the underlying standard.  (0022)

	Response:  Parameter envelope exceedances simply mean that data
collected under these conditions are out of the calibration range and
are not adequately quality assured.  The consequences of such a
condition depend upon the data quality objectives.

Section 6.2

	115.  Comment:  The criteria for a valid hourly average should allow
for a valid reporting hour with less than 100 percent data capture.  In
proposed Section 6.2, it is stated that “All valid data recorded by
the PEMS must be used to calculate the emission value.  For a valid
hourly average emission value, each 15-minute quadrant of the hour in
which the unit combusts any fuel must contain at least one valid
emission value.”  This implies that, at a minimum, a PEMS hourly
average must be based on four 15-minute data points and that all four
points are required for a valid hour.  The PEMS specifications from
PS-16 may be used for both Part 60 and Part 63 sources, and data
validation requirements from existing regulations include data averaging
criteria that are less stringent than the proposal.  PS-16 should
include similar requirements for data recording and averaging as those
currently in Parts 60 and 63. 

	Emissions monitoring requirements in §60.13(h) indicate: “…For
continuous monitoring systems other than opacity, 1-hour averages shall
be computed from four or more data points equally spaced over each
1-hour period.”  There are a number of examples where fewer than four
of the 15-minute data points are acceptable for a valid hour.  For
example, the NSPS for electric generating units (Subpart Da) indicates
at §60.47a(g):   “The 1-hour averages required under paragraph
§60.13(h) are expressed in ng/J (lb/million Btu) heat input and used to
calculate the average emission rates under §60.46a… At least two data
points must be used to calculate the 1-hour averages.” 

	This standard also includes requirements for less than 100 percent data
capture for valid hours when computing daily averages and requisite days
in the 30 day compliance period (see §60.47a(f)).  Similarly, under
Part 63, §63.8(g)(2) identifies the acceptable approach for CEMS data: 
“…Data from CEMS for measurement other than opacity, unless
otherwise specified in the relevant standard, shall be reduced to 1-hour
averages computed from four or more data points equally spaced over each
1-hour period, except during periods when calibration, quality
assurance, or maintenance activities pursuant to provisions of this part
are being performed.  During these periods, a valid hourly average shall
consist of at least two data points with each representing a 15-minute
period.” 

Alternatively, an arithmetic or integrated 1-hour average of CEMS data
may be used. 

Time periods for averaging are defined in §63.2. 

	The requirement for data averaging and valid hourly data implied in
PS-16 is excessive in comparison to the requirements in the general
provisions for both Parts 60 and 63.  The EPA should not impose PEMS
requirements that exceed the stringency of CEMS requirements under Part
60 or 63, and this should be clarified in PS-16.  (0019)

	Response:  The reference to valid hourly data in Section 6.2 has been
dropped.  This information is already addressed in the general
provisions to the regulations (e.g. §60.13).

Section 6.2

	116.  Comment:  To provide flexibility and to make the PEMS monitoring
requirements more in line with the requirements for CEMS, we ask that
the valid hour definition be made consistent with 40 CFR Part
75.10(d)(1).  This change would allow for calibration checks to be
conducted without losing complete hours.  (0026)  

	Response:  See the response to Comment 115.

Section 8.1

	117.  Comment:  A reference is made to "PEMS training" but there is no
definition on what that means or is required.  If the term is going to
be retained it should be defined in Section 3.  (0026)

	Response:  We have added a definition for PEMS training to Section 3.0.

Section 8.2.1

	118.  Comment:  The procedure for selecting the operating conditions
for the RM tests is unclear and, in many cases, would be impracticable. 
The section states that RM tests should be conducted at three operating
levels of the “key operating parameter that affects emissions” but
does not suggest how one might determine how to identify that “key”
parameter.  Also, what if the analysis suggests that the most critical
parameter affecting emissions is something like combustor discharge
pressure or exhaust temperature that cannot be readily altered?  I
recommend that this section be revised to read: 

…use the test methods in Appendix A of this part for the RM test.
Conduct the RM tests at three operating levels. The RM tests shall be
performed at a “low” load (or production) level between the minimum
safe, stable load and 50 percent of the maximum load, at the
“normal” load level, and at a “high” load level between 80
percent and the maximum load.  Alternatively, if practicable, you may
test at three levels of the key operating parameter (selected based on a
covariance analysis between each parameter and the PEMS output) equally
spaced within the normal range of the parameter.  (0018)

	Response:  We have made the commenter’s recommended change to Section
8.2.1.  Mention of “normal” load has been revised to “mid” load.

	119.  Comment:  For clarification purposes, we recommend rewording the
RM test requirement to "For sources that have a normal operational range
of greater than 50 percent but less than 80 percent, conduct tests at
three load conditions within the normal operational range of the unit." 
 (0026) 

	Response:  We have added clarity by noting that the three test levels
should span the operating range of the key parameter that affects
emissions, if practical. 

	120.  Comment:  This section contains a requirement to conduct the
tests at three “operating levels,” with the low level defined as
“minimum to 50 percent of maximum” and the high defined as 80
percent to maximum, which provides a very narrow window for the
mid-level.  We recommend that EPA look to the load range definitions
found at 40 CFR 75 for guidance when defining the operating levels for
PEMS testing, which provides more of a balance in the load ranges by
looking at the stable operating range for particular units.

	Response:  The high and low-level descriptions came from 40 CFR 75. 
These operating levels relate to the operating parameter that most
affects emissions and may or may not be the load levels.  The chosen
operating parameter should be evaluated over its range of variation as
the unit is operated in a stable manner.  We have clarified that the
mid-level is an intermediate level between the low and high levels.

Section 8.2.4 

	121.  Comment:  This section appears to be written to only include the
location requirements of Method 1 which is somewhat problematic because
other applicable test methods, such as Method 20 for combustion
turbines, may allow other siting locations. Furthermore, the section
appears to not allow or provide for the use of other EPA accepted
monitoring methodologies such as the use of multi-holed probes.  To
provide flexibility, we ask that the section be reworded to allow the
use of other accepted monitoring methodologies including multi-hole
sampling probes and sampling locations other than those defined by
Method 1.  (0026) 

	Response:  The reference method measurement locations in Section 8.2.4
were detailed assuming a 3-point traverse would be conducted without a
full Method 1 assessment.  Section 8.2.5 allows different traverses to
be used if they provide representative sampling. 
Multi-hole sampling probes are now allowed in the instrumental test
methods that are used as the RM.

	122.  Comment:  There are 2 occurrences of 8.2.4 and 8.2.5.  The second
occurrences of 8.2.4 and 8.2.5 should be re-numbered as 8.2.7 and 8.2.8,
respectively.  (0026)

	Response:  The commenter is correct; this correction has been made. 

Section 8.2.5

	123.  Comment:  As discussed in Section 8.2.4, we are requesting the
addition of multi-hole probes as an acceptable sampling method.  This
flexibility is consistent with 40 CFR 60.335(a)(4) for combustion
turbines that allows the use of multi-hole probes. 

Additionally, EPA has specified a measurement time of two minutes plus
twice the response time for each traverse point.  This sample time
appears inconsistent with methods such as Method 20 which calls for a
sampling time of the system response time plus 1 minute.  (0026) 

	Response:  Multi-hole probes are now allowed in Methods 6C, 7E, 10,
and 20, which are used as the RMs in PS-16.  The reference to a
measurement time of two minutes plus twice the response time per
traverse point only applies when showing that single-point RM testing is
acceptable.  The sample measurement times in the RMs should be followed
for the RA test.

Section 8.2.6

	124.  Comment:   Our experience with combustion sources, such as gas
turbines or process utility heaters, has been that emissions
measurements are relatively stable and valid readings are achieved after
three minutes (taking into account analyzer response time).  If a
multi-hole probe is used, the measurements do not start until readings
are stable.  After that point, whether the run length is 5 minutes or 1
hour, the emission values remain the same.  It is presumed that 7
minutes is required to take into account sample response time.  We
recommend that EPA take a more flexible approach and require a minimum
of 5 minutes of sample time after the reading has stabilized.  (0026) 

	Response:  The 7-minute sampling time per sampling point is a standard
procedure for RA testing of CEMS and is based on a 21-minute sample
collected at three traverse points rather than the sample response time.
 The response time must be taken into account before the sampling is
initiated.  

	125.  Comment:  We request clarification of the term “fuel type.” 
In many cases, boiler owners combust one or more opportunity fuels with
fossil fuels.  Is it EPA’s intention to require validation of the PEMS
under specific operating conditions?  If so, what procedures would have
to be followed in order to allow new fuel mixes into the operating
conditions?  (0022)

	Response:  Fuel type refers to different fuels a facility is allowed to
burn.  The effects of combusting each type of fuel on the predicted
emissions must be taken into account in the development of the model. 
Only those fuels that have been evaluated as such or are known not to
affect the predicted values may be used during the PEMS operation.

Section 8.3.1

	126.  Comment:  The bias test requirement should be removed. Because of
the snapshot nature of the RATA, the bias test may suggest a
“statistically significant” bias when no meaningful bias exists due
to the small sample size.  Not only would any random RM error due to
drift likely exhibit itself as a bias (since the test period is short),
but the operating parameters are likely to remain relatively constant.
Thus, a PEMS might seem to indicate a bias at the time of the RATA, but
these results might not be typical of the fuller normal range of
operating conditions. In the very least, the bias test should be based
on the RATA data from all levels so that it reflects a broader variation
of operating parameters.  (0018)

	Response:  Any short-term performance test is only a snapshot of the
actual system performance.  The bias test as prescribed has worked well
under the Part 75 program and we do not think it should be changed here.
 

Section 8.3.3

	127.  Comment:  The correlation analysis should be removed since the
core of the requirement is already addressed by the three-load RATA
requirement. However, there is also a problem with the statement that
the correlation analysis may be waived “if the emission concentration
is less than 50 percent of the applicable standard.” This provision
should be struck since the problem with the correlation analysis, that
there will be no correlation if the emissions exhibit little variation,
is independent of the level of emissions.  The statistics show no
deference to the emission limits.  Furthermore, why require a petition
to the Administrator instead of just automatically waiving the
correlation test when the RM measured emissions vary by less than 30
percent?  (0018)

	Response:  See the response to Comment 37.

Sections 8.3.4 

	128.  Comment:  Both Section 8.3.4 and 8.4 contain language instructing
the PEMS owner/operator to consult their reviewing or permitting
authority with jurisdiction over the emissions unit for additional
requirements.  We recommend EPA include all necessary statistical tests
in PS-16 and discourage efforts of state and local agencies to establish
their own, unique requirements for PEMS.  EPA stated in the Federal
Register notice:  “We are proposing PS-16 to provide regulatory
agencies a uniform procedure for assessing the capabilities of this new
monitoring tool.”  Leaving the current wording in Section 8.3.4 would
defeat the purpose of providing uniform performance criteria for PEMS. 
If the EPA considers it necessary to acknowledge that states or local
authorities may choose to promulgate standards in addition to the
requirements of PS 16, the language in these sections could be modified
to read:  “Consult your state or local regulations for additional
requirements.”  (0010)

	Response:   We agree with the commenter and have deleted the statement
referring  the source to their reviewing or permitting authority for
additional requirements.

Section 12.3.1

	129.  Comment:  “…d̄ is greater than the confidence coefficient
(cc)…”  Also, an equation showing how to compute the difference di
could be added for clarity.  (0021)

	Response:  To conserve space and minimize terminology, we did not
include equations for simple calculations such as the difference between
PEMS and RM values per run or comparing the absolute mean difference of
runs to the confidence coefficient to determine if a bias exists.
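
	As a minimal illustration of the run difference di, the mean
difference d̄, and the confidence coefficient cc discussed above
(variable names and example data are ours; the t-value shown assumes
nine runs, i.e., 8 degrees of freedom):

```python
import math

# Sketch of the conventional CEMS/PEMS bias-test statistics:
#   d_i = PEMS_i - RM_i,  cc = t * s_d / sqrt(n),
#   bias indicated when |mean(d)| > cc.
# The t-value of 2.306 is the 97.5th percentile for 8 degrees of
# freedom; in practice it is read from the t-table in the performance
# specification for the actual number of runs.

def bias_test(pems, rm, t_value=2.306):
    n = len(pems)
    d = [p - r for p, r in zip(pems, rm)]           # run-by-run differences
    d_bar = sum(d) / n                              # arithmetic mean difference
    s_d = math.sqrt(sum((x - d_bar) ** 2 for x in d) / (n - 1))
    cc = t_value * s_d / math.sqrt(n)               # confidence coefficient
    return d_bar, cc, abs(d_bar) > cc

# Hypothetical nine-run data set (ppm).
pems = [48.0, 47.5, 49.0, 48.2, 47.8, 48.5, 47.9, 48.1, 48.4]
rm   = [50.0, 50.2, 50.1, 49.8, 50.3, 49.9, 50.0, 50.1, 49.7]
d_bar, cc, biased = bias_test(pems, rm)
print(round(d_bar, 3), round(cc, 3), biased)
```

	Here the PEMS reads consistently low, so the absolute mean difference
exceeds the confidence coefficient and a bias is indicated.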

	130.  Comment:  The bias test should be removed.  In addition, as
written, the bias test is arbitrarily applied only to cases where the
PEMS is deemed to be statistically low in comparison to the RM.  If the
results of the bias test are considered meaningful, then a high bias is
just as statistically real as a low bias and should be likewise
corrected.  This could be accomplished by revising the section to read: 


the arithmetic mean of the differences determined in Equation 16-1 based
on the data for all levels.  If the absolute value of the arithmetic
mean of the differences (d̄) is greater than the confidence coefficient
(cc), subsequent PEMS values shall be adjusted as in Equation 16-5:

	PEMSi(adjusted) = PEMSi × BAF 				Eq. 16-5

Where BAF is the bias adjustment factor:

	BAF = 1 + |d̄| / (mean of the PEMS run values)		Eq. 16-6a

(0018)

	Response:  Corrections for positive biases are not allowed in RA
testing.  Methods or monitoring tools that generate data that is biased
high are used at the discretion of the source.  
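
	As a sketch of the low-bias adjustment at issue here, using the
conventional Part 75 form of the bias adjustment factor (BAF); the
numeric values are hypothetical:

```python
def bias_adjustment_factor(d_bar, monitor_mean):
    # Conventional Part 75-style BAF = 1 + |d_bar| / monitor_mean,
    # applied only when the PEMS reads low relative to the RM; positive
    # (high) biases are not corrected.
    return 1.0 + abs(d_bar) / monitor_mean

# Example: mean difference of -1.856 ppm against a mean PEMS reading of
# 48.2 ppm (hypothetical numbers).
baf = bias_adjustment_factor(-1.856, 48.2)
print(round(baf, 4))         # 1.0385
print(round(50.0 * baf, 2))  # a subsequent reading of 50.0, adjusted upward
```

	Because the factor is applied only to a low-biased monitor, the
adjustment always raises subsequent reported values, consistent with the
response above.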

Section 13.1

	131.  Comment:  The alternative RA criterion of 2 ppm for NSPS sources
should be raised.  For example, the alternative RA criterion for low
emission units is 15 ppm in Part 75.  The alternative 2 ppm is also
allowed in the proposal only if the emissions are below 10 percent of
the emission standard. This limitation should be removed since the
uncertainty associated with RM measurements at low levels is independent
of the emission standard.  (0018)

	Response:  We believe the 15 ppm allowance is too lenient for sources
that have low emission limits.

Equation 16-2

	132.  Comment:  The first summation appearing in Equation 16-2 should
be of di², not di.  The entire bracketed expression on the right-hand
side of the equation should be raised to the 0.5 power.  (0021, 0038)

	Response:  The commenter is correct, and the change has been made.
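
	With the correction in place, Equation 16-2 is the one-pass
computational form of the sample standard deviation of the run
differences.  A brief sketch (our notation) confirms that the corrected
form agrees with the direct definition:

```python
import math

# Corrected Equation 16-2 (sample standard deviation of the run
# differences d_i) in its one-pass computational form:
#   S_d = [ ( sum(d_i^2) - (sum(d_i))^2 / n ) / (n - 1) ] ** 0.5
# Variable names are ours; the equation numbering follows PS-16.

def s_d_one_pass(d):
    n = len(d)
    return ((sum(x * x for x in d) - sum(d) ** 2 / n) / (n - 1)) ** 0.5

def s_d_definition(d):
    n = len(d)
    mean = sum(d) / n
    return math.sqrt(sum((x - mean) ** 2 for x in d) / (n - 1))

# Hypothetical nine-run differences (ppm).
d = [-2.0, -2.7, -1.1, -1.6, -2.5, -1.4, -2.1, -2.0, -1.3]
print(abs(s_d_one_pass(d) - s_d_definition(d)) < 1e-12)  # True
```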

Equation 16-6a

	133.  Comment:  In Equation 16-6a, |d̄| should be replaced by just
d̄.  Otherwise, the bias correction will always be positive, which does
not make sense.  (0021) 

	Response:  The bias correction should always result in a higher PEMS
concentration.  Bias correction is only used for a low-biased PEMS, not
for a high-biased one.  

3.2  Comments on Correction of the Subpart J Coke Burn-off Equation

	134.  Comment:  The revised Part 60 subpart J coke burn equation in
60.106(3) at 70 FR 45614 does not agree with the equation that EPA has
previously published for Part 63 Subpart UUU.  There are differences in
algebraic signs on the terms and parentheses are in different places so
that the two equations cannot yield the same answer as was intended.  In
order to finally resolve this issue, we recommend that EPA adopt the
same coke burn equation for Subpart J that is provided in Subpart UUU
(Refinery MACT II) at 40 CFR section 63.1564(b)(4)(i).  Using the coke
burn equation from Subpart UUU in Subpart J makes sense for several
reasons.  First, most refineries are subject to the requirements of both
Subpart J and Subpart UUU.  Second, the equation in Subpart UUU
cross-references Subpart J and is provided for use in determining the
coke burn-off rate of those units that elect to comply with Subpart J as
the method for meeting the emission limitations of Subpart UUU.  Third,
the equation in Subpart UUU results in the correct burn-off rate. 
(0010, 0011, 0012, 0014)

	Response:  The coke burn equation in Subpart J was changed to match the
one in Subpart UUU of Part 63 in a recent separate rule amendment that
was finalized in the Spring of 2008.

3.3  Comments on Subpart BB (Kraft Pulp Mills) Amendment

	135. Comment:  We wish to support EPA’s proposed amendment which
would delete from the kraft pulp mill NSPS in §60.284 certain quality
assurance provisions of §60 Appendix F for continuous emission
monitors.  EPA states that it recognizes that those provisions were
added by mistake in an October 17, 2000, amendment.

Our member kraft pulp mills subject to the NSPS requirements report that
they sometimes have had difficulty convincing their state that the
provision was an error.  And, in a few cases, states have told mills
they were reluctant to correct their state requirements related to this
error until EPA publishes a correction deleting the requirement.  We
are, therefore, heartened by this amendment to correct EPA’s
inadvertent error.  (0017)

	Response:  This final correction was made in a separate rulemaking that
was published on September 21, 2006 (71 FR 55127).

	136.  Comment:   The non-controversial correction to Appendix F,
Procedure 1 is long overdue and should be made as soon as possible.  The
EPA should not tie this correction to taking final action on the PS-16
proposal and proposed corrections to PS-11; they are not related in any
manner.  (0021)

	Response:  See the response to Comment 135. 

3.4  Comment on PS-2 Amendment

	137.  Comment:  We support EPA’s proposal to reinstate RATA relief
for low emitting sources; however, the relief is very specific and helps
only those sources with SO2 limits expressed in lb/MM BTU.  We suggest
EPA consider broadening the allowance for low emitting units to include
sources with NOx and/or SO2 limits expressed in other units, such as
ppm.  Recognizing that this type of relief can indeed be very helpful
for low emitting units, circumstances may exist when the measured value
is below the detection limit of the reference method.  These types of
scenarios may require newer, alternative approaches, such as dynamic
spiking, in order to properly assess the accuracy of the system.  (0009,
0010)  

	Response:  This correction to PS-2 was made in a separate rulemaking
that was published on September 21, 2006 (71 FR 55127).  Your suggestion
to also add alternative units for low emitters is a good idea and we
would like to make such an amendment in a future rulemaking. 

3.5  Comments on PS-11 Revisions

General

	138.  Comment:  We support the proposed changes to Performance
Specification (PS) 11 and the proposed changes to Part 60, Appendix F
Procedure 2.  (0009, 0019)

	Response:  No response is necessary.

	139.  Comment:  We support the proposed changes to PS-11.  (0009,
0010)

	Response:  No response necessary.

	140.  Comment:  The Federal Register Notice did not adequately explain
the purpose and impacts of the proposed amendments or how the proposed
amendments correct the problems associated with PS-11 and Procedure 2. 
(0018)

	Response:  Because this action involved the correction of certain
errors found in PS-11 and Procedure 2 that do not affect the
requirements for testing nor alter the cost or other burdens of the
rule, and does not effect substantive changes to the rule itself, we did
not believe highly detailed explanations to be necessary.  Comments we
have received on the proposal tend to reinforce this view.  However, we
are providing further information in this response.

	Minor changes were made to Section 3.4 of PS-11, which defines the term
“Confidence Interval Half Range.”  The revised definition includes a
general reference to Section 12.3, which specifies the procedures for
calculating the confidence interval half range.  The revised definition
also clarifies that confidence interval half ranges must meet the
requirements specified in Section 13.2(2) of PS-11.

	In the second sentence of the introductory paragraph to Section 8.6,
the word “pairs” was changed to “sets” to avoid possible
confusion regarding the use of paired sampling trains.

	Several references to confidence interval half range in Section 12.3 of
PS-11 were revised to clarify that the confidence interval half range
pertains to the predicted PM concentration at the appropriate x value
specified for each correlation model.  These changes help to clarify how
confidence interval half ranges should be calculated and eliminate
potential confusion with other statistical parameters that can also be
characterized by confidence interval half ranges.  For the same reason,
similar changes were made to the references to tolerance interval half
range in Section 12.3 of PS-11.

	The definition of the variable in Equation 11-12 was revised to
clarify that it must be calculated at the 95 percent confidence level
for n-2 degrees of freedom.  The confidence level and degrees of freedom
had inadvertently
been omitted from the final rule.  The second term of Equation 11-22 was
corrected from “S2•S2" to “S2•S2•S2.”  In Equation 11-27,
the variable indicated under the radical sign has been corrected from
“Dmin” to “min.”  Finally, in the variable definitions
following Equation 11-31, the degrees of freedom for the variables un
and vdf have been corrected from "n-3" to "n-3."

	A subscript “0” was added to indicate that the parameter is based
on the log-transformed y
values.  Two new equations (Equations 11-40 and 11-41) have been added
for determining upper and lower confidence limits, respectively. 
Another equation (Equation 11-42) was added for transforming the
confidence limits back to the linear scale and determining confidence
interval half ranges.  Equations 11-43 and 11-44 have been added for
determining upper and lower tolerance limits, and Equation 11-45 has
been added for transforming the tolerance limits back to the linear
scale and determining tolerance interval half ranges.  Finally, we have
revised Section 12.3(5), which addresses power correlation models, to
incorporate the same changes that have been made to Section 12.3(4) in
the final amendments.  The revisions include renumbering Equation 11-42
as Equation 11-46 and renumbering Equation 11-43 as Equation 11-47 to
account for the new equations.

	In the final amendments, a note has been added following paragraph (v)
of Section 12.3(5) of PS-11.  The note concerns the application of
logarithmic, exponential, and power correlation models to calculate PM
concentrations using the response data from an operating PM CEMS.  These
models are undefined for PM CEMS response (x) values that are equal to
or less than zero.  The note also indicates that guidance materials will
be provided on how to address such situations.

	Because of the new equations added to Section 12.3(4) and the
renumbering of subsequent equations, Equation 11–44 becomes Equation
11-48 under the final amendments to PS-11.

	Several minor changes were made to Section 13.2 of PS-11 to clarify
that the acceptance criterion for confidence interval half range
percentages is that the percentages must be less than or equal to 10
percent.  Comparable changes were made to clarify that tolerance
interval half range percentages must be less than or equal to 25
percent.

	Two documents were added to the reference list presented in Section 16
of PS-11.  These two references provide the basis for the statistical
parameters listed in Table 1.  Table 1 also has been revised by
specifying a new set of values for one of the statistical parameters
(un’).  The values for un’ presented in Table 1 of the final rule
differed slightly from the values for that parameter found in standard
statistical reference books.  The values presented in the revised Table
1 are consistent with standard statistical references.  The revised
table also includes values for the parameter kT, which is used for
calculating tolerance interval half ranges.

	The final amendments include three revisions to Procedure 2.  In
Section 10.1, the reference to Section 16.5 has been clarified to
indicate that the referenced section is found in PS-11.  The other
changes concern the evaluation of absolute correlation audit (ACA) data.
 Section 10.4(3) of the final rule specifies that the average ACA value
must be within ±10 percent of the average audit value or within ±7.5
percent of the applicable standard.  Section 12.0 of the final rule
presents an equation (Equation 2-1) for determining if ACA values meet
the ±10 percent criterion.  However, that equation cannot be used to
determine if the ACA values meet the ±7.5 percent criterion.  To
correct this problem, we have renumbered Equation 2-1 as Equation 2-1a,
and added a new equation (Equation 2-1b) for determining if ACA data
satisfy the ±7.5 percent criterion.  We
have added references to these two equations to Section 10.4(3).  In
addition, we have clarified in Section 12.0(2) that Equation 2-1b must
be used when the reference standard value equals zero.  In such cases,
Equation 2-1a cannot be used because the denominator of the equation
equals zero, and the ACA accuracy value is undefined.
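
	The two acceptance checks described above can be sketched as follows
(our helper name and example values; the equation numbering follows
Procedure 2):

```python
# Sketch of the two ACA acceptance checks described above.
#   Eq. 2-1a: |CEMS_avg - ref_avg| / ref_avg * 100  <= 10 percent
#   Eq. 2-1b: |CEMS_avg - ref_avg| / standard * 100 <= 7.5 percent
# Eq. 2-1b is the only usable form when the reference standard value is
# zero, since Eq. 2-1a would then divide by zero.

def aca_passes(cems_avg, ref_avg, standard):
    if ref_avg != 0:
        if abs(cems_avg - ref_avg) / ref_avg * 100 <= 10.0:   # Eq. 2-1a
            return True
    return abs(cems_avg - ref_avg) / standard * 100 <= 7.5     # Eq. 2-1b

# Hypothetical audit averages against a standard of 30 (same units).
print(aca_passes(10.8, 10.0, 30.0))   # True  (8 percent error, Eq. 2-1a)
print(aca_passes(1.5, 0.0, 30.0))     # True  (5 percent of standard, Eq. 2-1b)
print(aca_passes(14.0, 10.0, 30.0))   # False (fails both criteria)
```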

	Regarding the impacts of the proposed amendments to PS-11, the
amendments do not change the requirements for testing, the procedures or
frequency for conducting audits, or any other requirements of PS-11 that
are associated with costs or burden.  For that reason, there are no
impacts associated with the amendments, and we did not discuss impacts
in the Federal Register Notice for the proposed amendments.  However, we
will clarify in the preamble to the final amendments that there are no
impacts associated with the changes to PS-11 and Procedure 2.

	141.  Comment:  PS-11 and Procedure 2 are unlawful.  The Utility Air
Regulatory Group (UARG) has challenged the lawfulness of PS-11 and
Procedure 2 and the court case is currently being held in abeyance.  The
proposed amendments do not resolve the fundamental problems with PS-11
and Procedure 2 that prompted UARG to file suit.  (0018)

	Response:  PS-11 and Procedure 2 themselves are not the subject of this
action, except to the extent of technical corrections made to them.  The
corrections made in the proposed amendments were not intended to address
petitioners’ alleged defects in PS-11 and Procedure 2, but to correct
errors which EPA identified subsequent to their promulgation.  We note
that comments submitted by RMB Consulting & Research, attached to this
commenter’s letter, are largely supportive of the corrections and
clarifications adopted in this rule and suggest some additional
corrections and modifications, which we have carefully considered and
responded to elsewhere in this document.

	142.  Comment:  We question the legality of guidance materials for
PS-11 and Procedure 2.  The commenter remarked that guidance is not a
legal means of establishing legally binding requirements, resolving
significant issues with rules, or responding to comments raised during a
rulemaking.  (0018)

	Response:  We agree that guidance cannot establish legally binding
requirements.  We do not agree that guidance cannot legitimately address
significant technical or implementation matters, such as where rules
deliberately provide flexibility for case-specific applications. 
Similarly, where public comments raise issues, guidance can often be
used to clarify further how specific applications can be fashioned to
respond to those issues or to further explain EPA’s interpretations of
rules to show that an issue may not in fact exist.  Because the
commenter did not provide details on any specific issues of concern, we
cannot address in this response the specific issues the commenter has
concerning the guidance developed for PS-11 and Procedure 2.

Sample Volume Audits

	143.  Comment:  Commenter OAR-2003-0074-0018 stated that the
requirement for sample volume audits (SVAs) specified in Section 10.3(3)
of Procedure 2 is inconsistent with the definition of SVA in Section 3.7
of PS-11.  The commenter explained that the term “sample volume
audit” is defined to apply only to extractive PM CEMS that measure the
mass of PM collected and the volume of the sample.  However, as
published, Section 10.3(3) could be interpreted to require SVAs for
extractive light-scattering PM CEMS that do not measure sample volume.

	Response:  We agree with the commenter that SVAs should not be required
for extractive light-scattering PM CEMS that do not measure sample
volume.  In Section 10.3(3), the phrase “. . . applicable PM CEMS with
an extractive sampling system” is meant to refer to extractive systems
that measure sample volume.  However, we can understand that this phrase
could be misinterpreted to include extractive PM CEMS that do not
measure sample volume.  Therefore, we have revised Section 10.3(3) of
Procedure 2 to specify that SVAs are required “. . . for applicable PM
CEMS with an extractive sampling system that measure sample volume.” 

	144.  Comment:  Commenter OAR-2003-0074-0018 indicated that the
denominator in Equation 2-4 of Procedure 2, Section 12.0(4), is
incorrect.  The commenter believes the variable in the denominator of
the equation should be the reference sample gas volume (VR) instead of
the full scale value (FS).  The commenter noted that replacing the
variable FS with VR in the denominator makes sense technically and is
consistent with the definition of SVA specified in PS-11.

	Response:  We agree with the commenter that the variable in the
denominator of the equation for determining SVA accuracy should be the
sample gas volume (VR) measured by the independent calibrated reference
device.  However, we have concluded that Equation 2-4, as currently
specified in Section 12.0(4) of Procedure 2, should continue to be used
for evaluating daily sample volume check error.  The purpose of daily
checks is to determine instrument stability rather than accuracy. 
Therefore, it is appropriate to use the full-scale value in the
denominator of the equation when evaluating daily sample volume checks. 
We have retained Equation 2-4 for that purpose and have added a new
Equation 2-5 for SVA accuracy.  In addition, we have revised the text of
Section 12.0(4) accordingly.  The revised section reads as follows:

	“(4)  How do I calculate daily sample volume check error and SVA
accuracy?  You must use Equation 2-4 to calculate, in percent, the
daily sample volume check error.  To calculate SVA accuracy, in percent,
you must use Equation 2-5.

	Daily sample volume check error (%) = |VM – VR| / FS × 100	(Eq. 2-4)

	SVA accuracy (%) = |VM – VR| / VR × 100	(Eq. 2-5)

Where: 

 

VM	= 	Sample gas volume determined/reported by your PM CEMS (e.g.,
dscm),

VR	= 	Sample gas volume measured by the independent calibrated reference
device (e.g., dscm) for the SVA or the reference value for the daily
sample volume check, and

FS	= 	Full-scale value.”
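The difference between the two calculations can be sketched in a short Python example.  The function names are illustrative, and the absolute-difference percent form is an assumption based on the variable definitions above rather than a quotation of Procedure 2.

```python
def daily_volume_check_error(vm, vr, fs):
    """Eq. 2-4: daily sample volume check error, as a percent of full scale.

    vm -- sample gas volume reported by the PM CEMS (e.g., dscm)
    vr -- reference value for the daily sample volume check (e.g., dscm)
    fs -- full-scale value of the volume measurement (e.g., dscm)
    """
    return abs(vm - vr) / fs * 100.0


def sva_accuracy(vm, vr):
    """Eq. 2-5: SVA accuracy, as a percent of the reference volume.

    vr -- sample gas volume measured by the independent calibrated
          reference device during the sample volume audit (e.g., dscm)
    """
    return abs(vm - vr) / vr * 100.0
```

Note that the same raw disagreement between VM and VR produces a smaller number under Equation 2-4, because full scale is normally larger than the audited volume; this is consistent with treating the daily check as a stability indicator and the SVA as an accuracy measure.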

Daily Drift Checks

	145.  Comment:  Commenter OAR-2003-0074-0018 stated that Equations 2-2
and 2-3 specified in Section 12.0(3) of Procedure 2 for determining
upscale and zero drift, respectively, are inconsistent with all other
equations published by EPA for measuring the drift of CEMS.  The
commenter explained that the variable in the denominator of the two
equations should be the span value instead of the upscale reference
standard value (RU).  The commenter cited Section 13.1 of PS-2 as an
example of a requirement to calculate CEMS drift in terms of span value.

	Response:  We acknowledge that other performance specifications require
evaluating drift in terms of the span value.  However, if we were to
require PM CEMS to be evaluated in terms of span value, we would have to
define span value in PS-11 and Procedure 2.  Because the various PM CEMS
technologies do not respond to PM concentrations in common units, it
would be difficult to define the span value in a way that would apply to
all PM CEMS.  Instead, when we developed PS-11 and Procedure 2, we
decided to require evaluating daily drift in terms of the upscale
reference value (RU).  In effect, the evaluation of daily drift, as
specified in PS-11 and Procedure 2, is performance-based rather than
technology-based and eliminates any problems associated with differences
in how PM CEMS technologies respond to emissions.  Although we
understand the commenter’s concern with maintaining consistency with
other performance specifications, we believe it is more appropriate to
evaluate PM CEMS drift in terms of the upscale reference value rather
than the span value.
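The effect of the denominator choice can be illustrated with a brief sketch.  The generic drift form |response – reference| / denominator × 100, and the RU and span values used, are illustrative assumptions, not values or equations taken from PS-11 or Procedure 2.

```python
def percent_drift(response, reference, denominator):
    # Generic drift form assumed for illustration:
    # absolute difference as a percent of the chosen denominator.
    return abs(response - reference) / denominator * 100.0


ru = 50.0       # upscale reference standard value, instrument units (assumed)
span = 200.0    # hypothetical span value; PS-11 does not define one
reading = 51.5  # daily upscale check response (assumed)

# The same raw drift of 1.5 units reads differently depending on the
# denominator: relative to RU it is 3.0 percent; relative to the
# hypothetical span it is 0.75 percent.
drift_vs_ru = percent_drift(reading, ru, ru)
drift_vs_span = percent_drift(reading, ru, span)
```

Because RU is defined for every PM CEMS technology while a common span value is not, normalizing to RU keeps the daily drift criterion meaningful across instruments that report in different units.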

