Supporting Statement for Information Collection Request for

Willingness to Pay Survey for §316(b) Existing Facilities Cooling Water
Intake Structures: Instrument, Pre-test, and Implementation

TABLE OF CONTENTS

PART A OF THE SUPPORTING STATEMENT

1.	Identification of the Information Collection
1(a)	Title of the Information Collection
1(b)	Short Characterization (Abstract)
2.	Need For and Use of the Collection
2(a)	Need/Authority for the Collection
2(b)	Practical Utility/Users of the Data
3.	Non-duplication, Consultations, and Other Collection Criteria
3(a)	Non-duplication
3(b)	Public Notice Required Prior to ICR Submission to OMB
3(c)	Consultations
3(d)	Effects of Less Frequent Collection
3(e)	General Guidelines
3(f)	Confidentiality
3(g)	Sensitive Questions
4.	The Respondents and the Information Requested
4(a)	Respondents
4(b)	Information Requested
(I)	Data items, including recordkeeping requirements
(II)	Respondent activities
5.	The Information Collected - Agency Activities, Collection Methodology, and Information Management
5(a)	Agency Activities
5(b)	Collection Methodology and Information Management
5(c)	Small Entity Flexibility
5(d)	Collection Schedule
6.	Estimating Respondent Burden and Cost of Collection
6(a)	Estimating Respondent Burden
6(b)	Estimating Respondent Costs
6(c)	Estimating Agency Burden and Costs
6(d)	Respondent Universe and Total Burden Costs
6(e)	Bottom Line Burden Hours and Costs
6(f)	Reasons for Change in Burden
6(g)	Burden Statement

PART B OF THE SUPPORTING STATEMENT

1.	Survey Objectives, Key Variables, and Other Preliminaries
1(a)	Survey Objectives
1(b)	Key Variables
1(c)	Statistical Approach
1(d)	Feasibility
2.	Survey Design
2(a)	Target Population and Coverage
2(b)	Sampling Design
(I)	Sampling Frames
(II)	Sample Sizes
(III)	Stratification Variables
(IV)	Sampling Method
(V)	Multi-Stage Sampling
2(c)	Precision Requirements
(I)	Precision Targets
(II)	Non-Sampling Errors
2(d)	Questionnaire Design
3.	Pretests and Pilot Tests
4.	Collection Methods and Follow-up
4(a)	Collection Methods
4(b)	Survey Response and Follow-up
5.	Analyzing and Reporting Survey Results
5(a)	Data Preparation
5(b)	Analysis
5(c)	Reporting Results
REFERENCES

ATTACHMENTS

Attachment 1: Full Text of Regional Stated Preference Survey Component
Attachment 2: Full Text of National Stated Preference Survey Component
Attachment 3: Federal Register Notice
Attachment 4: Description of Statistical Survey Design
Attachment 5: Description of Internet-based Sampling Alternative
Attachment 6: Telephone Screener Script
Attachment 7: Cover Letter to Mail Survey Recipients
Attachment 8: Post Card Reminder to Mail Survey Recipients

TABLES

Table A1: Geographic Stratification Design
Table A2: Schedule for Survey Implementation
Table A3: Total Estimated Bottom Line Burden and Cost Summary
Table B1: Number of Households and Household Sample for Each EPA Study Region
Table B2: Illustration of an External Scope Test



 PART A OF THE SUPPORTING STATEMENT

1.	Identification of the Information Collection

1(a)	Title of the Information Collection 

Willingness to Pay Survey for Section 316(b) Existing Facilities Cooling
Water Intake Structures:  Instrument, Pre-test, and Implementation

1(b)	Short Characterization (Abstract)

On February 16, 2004, the U.S. Environmental Protection Agency (EPA)
took final action on the Phase II rule governing cooling water intake
structures at existing facilities that are point sources; that, as their
primary activity, both generate and transmit electric power or generate
electric power for sale to another entity for transmission; that use or
propose to use cooling water intake structures with a total design
intake flow of 50 MGD or more to withdraw cooling water from waters of
the United States; and that use at least 25 percent of the withdrawn
water exclusively for cooling purposes. See 69 FR 41576 (July 9, 2004).
Industry and environmental stakeholders challenged the Phase II
regulations. On judicial review, the Second Circuit (Riverkeeper, Inc.
v. EPA, 475 F.3d 83, (2d Cir., 2007)) remanded several provisions of the
Phase II rule. Some key provisions remanded are as follows: EPA
improperly used a cost-benefit analysis as a criterion for determining
Best Technology Available (BTA), and EPA inappropriately used ranges in
setting performance expectations. In response, EPA suspended the Phase
II regulation in July 2007 pending further rulemaking. The U.S. Supreme
Court granted Entergy Corporation’s petition for writ of certiorari,
solely on the question of whether EPA had the authority under §316(b)
of the Clean Water Act to consider costs and benefits in
decision-making. On April 1, 2009, the Court, in Entergy Corp. v.
Riverkeeper Inc., decided that “EPA permissibly relied on cost-benefit
analysis in setting the national performance standards … as part of
the Phase II regulations.” EPA is now taking a voluntary remand of the
rule, thus ending Second Circuit review.

On June 1, 2006, EPA promulgated the 316(b) Phase III Rule for existing
manufacturers, small flow power plants (facilities that use cooling
water intake structures with a total design intake flow of less than 50
MGD to withdraw cooling water from waters of the United States; and that
use at least 25 percent of the withdrawn water exclusively for cooling
purposes), and new offshore oil and gas facilities. Offshore oil and gas
firms and environmental groups petitioned for judicial review, which was
to occur in the Fifth Circuit, but was stayed pending the Supreme Court
decision on the Phase II case. EPA has petitioned the Court for a
voluntary remand of the existing facilities portion of the Phase III
rulemaking. In developing the Phase III regulation, EPA began, but did
not complete, a similar stated preference survey effort.  The current
effort builds on that earlier work.

EPA is now combining the two phases into one rulemaking covering all
existing facilities. EPA will develop regulations to provide national
performance standards for controlling impacts from existing cooling
water intake structures (CWIS) under Section 316(b) of the Clean Water
Act (CWA). 

Under Executive Order 12866, EPA is required to estimate the potential
benefits and costs to society for economically significant rules.  To
assess the public policy significance or importance of the ecological
gains from the section 316(b) regulation for existing facilities, EPA
requests approval from the Office of Management and Budget to conduct a
stated preference survey.  Data from the stated preference survey will
be used to estimate the values (willingness to pay, or WTP) derived by
households for changes related to the reduction of fish losses at CWIS,
and to provide information to assist in the interpretation and
validation of survey responses.  As indicated in the prior literature
(Cummings and Harrison 1995; Johnston et al. 2003a, 2005), it is
virtually impossible to justify, theoretically, the decomposition of
empirical estimates of use and non-use values.  The survey will provide
the flexibility, however, to estimate nonuser values, using various
nonuser definitions drawn from responses to survey question 10.  The
structure of choice attribute questions will also allow the analysis to
separate value components related to the most common sources of use
values—effects on harvested recreational and commercial fish.  In
summary, the survey will provide estimates of total values (including
use and nonuse), will allow estimates of value associated with specific
choice attributes (following standard methods for choice experiments),
and will also allow the flexibility to provide some insight into the
relative importance of use versus non-use values in the 316(b) context.

Within rulemaking, among the most crucial concerns is the avoidance of
benefit (or cost) double counting.  Here, for example, the WTP estimates
will include use and non-use values among a representative population
sample.  These may overlap—to a potentially substantial extent—with
use value estimates that might be provided through some other methods,
including revealed preference methods that might be used to estimate use
values of recreational anglers for fish kill reductions (i.e., through
related improvements in fishing quality).  While using the proposed
stated preference value estimates for benefit estimation, particular
care will be given to avoid any possible double counting of values that
might be derived from alternative valuation methods.  In doing so, EPA
will rely upon standard theoretical tools for non-market welfare
analysis, as presented by authors including Freeman (2003) and Just et
al. (2004).  From a purely mechanistic perspective, survey results will
be used to derive total values following standard practice for choice
experiments (Adamowicz et al. 1998).
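The standard choice-experiment value derivation referenced above (Adamowicz et al. 1998) recovers marginal willingness to pay as a ratio of estimated coefficients. The following is a minimal sketch of that arithmetic only; the utility specification, function name, and coefficient values are hypothetical illustrations, not estimates from this survey:

```python
# Minimal sketch of the standard choice-experiment value derivation:
# with household utility specified as
#   U = beta_attr * attribute_level - beta_cost * annual_cost,
# marginal willingness to pay for the attribute is the coefficient ratio.
# All numbers below are hypothetical, not estimates from the survey.

def marginal_wtp(beta_attr: float, beta_cost: float) -> float:
    """Dollars of annual household cost traded for one unit of the attribute."""
    if beta_cost <= 0:
        raise ValueError("beta_cost must be positive (cost enters utility negatively)")
    return beta_attr / beta_cost

# Hypothetical conditional-logit coefficients: 0.04 utility per one-point
# gain in a fish population score, 0.02 utility lost per dollar of cost.
wtp_per_point = marginal_wtp(0.04, 0.02)  # 0.04 / 0.02 = $2.00 per point
```

Total values for a policy option then follow by summing such attribute-level WTPs over the changes that option delivers, which is what permits the analysis to avoid double counting values estimated by other methods.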

The target population for this stated preference survey is all
individuals from continental U.S. households who are 18 years of age or
older.  The population of households will be stratified into four study
regions: Northeast, Southeast, Inland, and Pacific. The Northeast survey
region includes the North Atlantic and Mid Atlantic 316(b) benefits
regions, the Southeast survey region includes the South Atlantic and
Gulf of Mexico 316(b) benefits regions, the Pacific region includes
states on the Pacific coast, and the Inland region includes all
non-coastal states.  In addition, EPA will administer a national version
of the survey that does not require stratification.  Respondents will be
recruited using a random digit dialing (RDD) sample (overlapping dual
frame; land and cell phone numbers) and a brief telephone questionnaire.
Participation in the survey is voluntary.  If the person in the
household agrees, the recruited respondent will be asked to complete the
survey questionnaire by mail.  EPA intends to administer the mail survey
to 5,720 households in order to achieve 2,288 completed responses
assuming a 40% completion rate for households agreeing to participate. 
EPA is also considering implementing the proposed survey as an
internet-based, choice experiment questionnaire.  This alternative
approach is described in Attachment 5.
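The mailed-sample figure above follows directly from the target number of completes and the assumed completion rate; the helper function below is our illustration, but the figures (2,288 target completes, 40% completion rate) are from the text:

```python
import math

# Sketch of the sample-size arithmetic stated above: to obtain a target
# number of completed questionnaires at an assumed completion rate, the
# mailed sample is scaled up by the inverse of that rate.

def mailed_sample(target_completes: int, completion_rate: float) -> int:
    """Questionnaires to mail so that expected completes meet the target."""
    return math.ceil(target_completes / completion_rate)

n_mailed = mailed_sample(2288, 0.40)  # -> 5720, matching the text
```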

For the selection of households, the population of households in the 48
states and the District of Columbia will be stratified by the four study
regions.  The sample is allocated to each region in proportion to the
total number of households in that region with the restriction that we
get at least 288 completed responses in each region.  A sample of 288
households completing the national survey version would be distributed
among the study regions based on the percentage of regional survey
sample to ensure that respondents to the national survey version are
distributed across the continental U.S.

Non-response bias has the
potential to occur due to households declining to participate in the
mail survey when contacted by phone or households which fail to return a
completed survey after agreeing to participate.  To conduct a
non-response study, EPA will use a brief telephone questionnaire to
collect sample characteristics in the course of the telephone contact. 
EPA will analyze the characteristics of the completed and non-completed
cases from the random digit dialing sample to determine whether there is
any evidence of significant non-response bias in the completed sample. 
This analysis will suggest whether any weighting or statistical
adjustment is necessary to minimize the non-response bias in the
completed sample.
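The proportional allocation with a per-region floor described above can be sketched as follows; this is our illustration, not EPA's allocation procedure, and the regional household counts are hypothetical placeholders:

```python
# Illustrative sketch of proportional allocation of the completed-response
# target across the four study regions, with the stated floor of 288
# completes per region.  Household counts are hypothetical placeholders.

def allocate(households, total, floor):
    """Proportional allocation with a per-region minimum."""
    pool = sum(households.values())
    alloc = {r: round(total * h / pool) for r, h in households.items()}
    # Lift any region below the floor; a real design would then rebalance
    # the remaining regions or accept a slightly larger overall total.
    return {r: max(n, floor) for r, n in alloc.items()}

regions = {"Northeast": 21_000_000, "Southeast": 26_000_000,
           "Inland": 49_000_000, "Pacific": 17_000_000}  # hypothetical counts
targets = allocate(regions, total=2288, floor=288)
```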

As part of the testing of the survey instrument, EPA conducted a series
of seven focus groups with 8-10 participants per group, with approval
from the Office of Management and Budget (OMB control # 2090-0028).  The
Agency conducted focus groups in several regions to account for the
potentially distinct information relevant to survey design.  These focus
groups were conducted following standard, accepted practices in the
stated preference literature, as outlined by Mitchell and Carson (1989),
Desvousges et al. (1984), Desvousges and Smith (1988) and Johnston et
al. (1995).  One of the focus groups incorporated individual cognitive
interviews, as detailed by Kaplowicz et al. (2004).  The focus groups
and cognitive interviews allowed EPA to better understand the public's
perceptions and attitudes concerning fishery resources, to frame and
define survey questions, to pretest draft survey questions, to test for
and reduce potential biases that may be associated with stated
preference methodology, and to ensure that both researchers and
respondents have similar interpretations of survey language and
scenarios.  In particular, cognitive interviews allowed for in-depth
exploration of the cognitive processes used by respondents to answer
survey questions, without the potential for interpersonal dynamics to
sway respondents’ comments (Kaplowicz et al. 2004).  Transcripts from
these seven focus groups can be found in the docket for this ICR (ICR #
2402.01).  EPA revised the survey based on the findings of the seven
focus groups.  These seven focus groups were conducted in addition to
focus groups conducted previously by EPA under ICR #2155.01 to test the
draft survey for the Phase III benefits analysis.  Transcripts from the
previously conducted focus groups for the Phase III analysis can be
found in the docket for EPA ICR #2155.02 (Besedin et al., 2005). 
Findings from these previous focus groups were also incorporated into
the development of the current survey.  

The total national burden estimate for all components of the survey is
1,938 hours.  The burden estimate is based on 9,533 participants in
telephone screening interviews and 2,288 respondents to the 5,720 mailed
questionnaires.  EPA assumes an average burden estimate of 5 minutes per
telephone screening participant and 30 minutes per mail survey
respondent including the time necessary to complete and mail back the
questionnaire.  Given an average wage rate of $20.42, the total
respondent cost is $39,583.
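The burden and cost figures above can be reproduced directly; the per-response time assumptions (5 and 30 minutes) and the $20.42 wage are from the text:

```python
# Reproduces the burden and cost arithmetic stated above.
phone_hours = 9_533 * 5 / 60      # telephone screening, 5 minutes each
mail_hours = 2_288 * 30 / 60      # mail questionnaires, 30 minutes each
total_hours = phone_hours + mail_hours   # about 1,938 hours
total_cost = total_hours * 20.42         # about $39,583 before rounding
```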

2.	Need For and Use of the Collection

2(a)	Need/Authority for the Collection

The project is being undertaken pursuant to
section 104 of the Clean Water Act dealing with research. Section 104 of
the Clean Water Act authorizes and directs the EPA Administrator to
conduct research into a number of subject areas related to water
quality, water pollution, and water pollution prevention and abatement.
This section also authorizes the EPA Administrator to conduct research
into methods of analyzing the costs and benefits of programs carried out
under the Clean Water Act.

This project is exploring how public values for fishery resources are
affected by fish losses from impingement and entrainment (I&E) mortality
at cooling water intake structures.  Understanding total public values
for fishery resources, including the more difficult to estimate non-use
values, is necessary to determine the full range of benefits associated
with reductions in I&E mortality losses, and whether the benefits of
government action to reduce I&E mortality losses at existing facilities
are commensurate with the costs of such actions. Because non-use values
may be substantial, failure to recognize such values may lead to
improper inferences regarding benefits and costs. The findings from this
study will be used by EPA to improve estimates of the economic benefits
of the section 316(b) regulation for existing facilities as required
under Executive Order 12866.

2(b)	Practical Utility/Users of the Data

 

EPA plans to use the results of the survey to improve estimates of the
economic benefits of the section 316(b) regulation for existing
facilities.  Specifically, the Agency will use the survey results to
estimate total values for preventing losses of fish through I&E
mortality at CWIS, following standard practices outlined in the
literature (Freeman 2003; Bennett and Blamey 2001; Louviere et al. 2000;
U.S. EPA 2000).  

3.	Non-duplication, Consultations, and Other Collection Criteria

3(a)	Non-duplication

There are many studies in the environmental economics literature that
quantify benefits or willingness to pay (WTP) associated with various
types of water quality and aquatic habitat changes. However, none of
these studies allows the isolation of non-market WTP associated with
quantified reductions in fish losses for forage fish.  Most available
studies estimate WTP for broader, and sometimes ambiguously defined,
policies that simultaneously influence many different aspects of aquatic
environmental quality and ecosystem services, but for which WTP
associated with fish or aquatic life alone cannot be identified.  Other
studies provide benefit estimates associated with improvements in fish
(or aquatic) habitat, but do not link this to well-defined and
quantified changes in affected or supported organisms.  Still other
studies address willingness to pay for changes in charismatic or
recreational species that have little relationship to the types of
forage fish that are the vast majority of species affected by cooling
water intake structures.

For example, choice experiment studies such as Hanley et al. (2006a,
2006b) and Morrison and Bennett (2004) estimate WTP for aquatic
ecosystem changes that affect fish, but the effects on fish are
quantified and valued solely in terms of the presence/absence of
different types of fish species. This approach renders associated
results unsuitable for 316(b) benefit estimation.  Also, many of these
studies were conducted outside the U.S. (e.g., the European Union or
Australia), making their use for benefit transfer to a U.S. policy
context more challenging.

Other studies have estimated the value of changes in catch rates or
populations of select recreational and commercial species, charismatic
species such as salmon, or changes in water quality that affect fish,
but none have specifically valued changes in forage fish populations.
For example, Olsen et al. (1991) conducted a survey of Pacific Northwest
residents, including both anglers and non-anglers, to determine their
WTP for doubling the size of the Columbia River Basin salmon and
steelhead runs.  EPA’s proposed survey approach differs from this
study and others like it (such as Cameron and Huppert 1989) in that it
would include respondents from various geographic regions in the United
States and would provide values for the full range of forage,
recreational, and commercial species affected by 316(b) regulations,
instead of valuing a few recreational species in one specific
geographical area.

Among available studies, the most closely related is Johnston et al.
(2010), which estimates total willingness to pay (WTP) for
multi-attribute aquatic ecosystem changes related to improvements in
forage fish in Rhode Island. Unlike other studies, the choice experiment
data of Johnston et al. (2010) allow estimation of WTP associated with
quantified changes in forage fish (e.g., WTP per fish or percentage
change in fish), holding other ecological effects constant.  That is,
unlike results provided by other studies in the literature, WTP
estimates of Johnston et al. (2010) are not confounded with values for
other changes including water quality, habitat, overall ecological
condition, charisma of species, etc. In addition, the choice experiment
of Johnston et al. (2010) addresses species such as alewife and blueback
herring that are neither subject to recreational or commercial harvest
in Rhode Island, nor are charismatic species.  Hence, the species
affected are a close analog to the forage fish affected in the 316(b)
policy context.

Although the methods and data of Johnston et al. (2010) allow estimation
of total values associated with specific improvements in forage and/or
recreational fish, the policy context and scale of the survey prevent
its direct use for analysis of national benefits of the 316(b)
regulation.  Specifically, Johnston et al. (2010) estimate Rhode Island
residents’ preferences for the restoration of migratory fish passage
over dams in the Pawtuxet and Wood-Pawcatuck watersheds. Hence, the case
study is for a watershed-level policy with statewide welfare
implications.  In contrast, 316(b) policies would have nationwide
implications, both on ecosystems and on affected facilities.  

3(b)	Public Notice Required Prior to ICR Submission to OMB

In accordance with the Paperwork Reduction Act (44 U.S.C. 3501 et seq.),
EPA published a notice in the Federal Register on July 21, 2010,
announcing that the survey questionnaire and sampling methodology were
available for comment.  A copy of the first Federal Register notice is
attached at the end of this document (EPA ICR #2402.01.02) (See
Attachment 3).  EPA received a number of comments on the proposed
information collection, which are summarized in the following
paragraphs.  Also see docket # EPA-HQ-OW-2010-0595.  EPA considered
relevant comments on the draft survey when developing the survey
questionnaire and sampling methodology for the current survey for
existing facilities.  

Some commenters expressed concern that the draft survey questionnaire
and sampling methodology would not provide accurate estimates of WTP. 
Some stated that the proposed stated preference survey would
overestimate WTP to prevent fish losses.  Another commenter argued the
opposite:  that the proposed contingent valuation survey is biased
against protecting ecosystems, and will drastically undervalue non-use
benefits.  EPA agrees that certain details of the stated preference
survey and supporting documentation for the July 21, 2010 Federal
Register notice required revision.  Following OMB approval of the focus
groups, EPA conducted seven focus groups (including one set of cognitive
interviews) to pretest the draft survey materials, to test for and
reduce potential biases that may be associated with stated preference
methodology, and to ensure that both researchers and respondents have
similar interpretations of survey language and scenarios.  As a result
of this extensive pre-testing, a number of revisions were made to the
draft survey that significantly improved its reliability and reduced its
potential for bias.  Hence, many of the survey design elements on which
commenters took issue have already been removed or changed.  In many
other instances, however, focus groups showed little cause for concern,
suggesting that many of the speculative claims raised by commenters do
not hold for the population being surveyed.  EPA also notes that the
number of focus groups and interviews conducted for this survey—and
the draft survey tested in 2005-06—far exceeds the number conducted
for most stated preference surveys found in the published literature.  

EPA notes that the survey proposed in this ICR is different in many ways
from the draft survey for the Phase III benefits analysis that was peer
reviewed in 2005 (Versar 2006).  While findings from pre-testing of the
previous draft survey were considered when developing the present
survey, the present survey has undergone various revisions based on
additional analysis and the results of recent focus groups.  Due to
these differences, EPA notes that many of the peer review comments
received on the Phase III draft survey are no longer relevant for the
survey proposed in this ICR.  The Agency, however, takes all peer review
comments very seriously, and has accounted for these comments in all
survey revisions and in the development of the present stated preference
survey.

One commenter argued that in conducting the stated preference survey,
EPA should use the willingness-to-accept (WTA) metric in place of or in
addition to WTP.  EPA notes that good practice guidelines for stated
preference surveys almost universally indicate the use of WTP
elicitation mechanisms over WTA elicitation mechanisms. This is due to
the potential for biases in WTA stated preference surveys that can be
ameliorated by the use of the WTP format.  WTP is also considered to be
the more conservative choice, but in most cases the divergence between
WTA and WTP, as predicted by theory, should be very small.  EPA follows
standard practice in proposing a WTP format in order to avoid these
biases, comply with guidance and practice in the stated preference
literature, and ensure a conservative benefit estimate.

Some commenters argued that hypothetical bias in the survey
questionnaire would inhibit respondents' ability to provide meaningful
survey responses.  EPA agrees that hypothetical bias is an important
concern with stated preference surveys—and has taken this very
seriously in survey development and testing.  EPA does not agree that
this inhibits respondents or that the literature suggests that
hypothetical bias is unavoidable; in fact, the published literature
includes approaches to mitigate such biases.  The Agency followed the
published literature in designing mitigation strategies to eliminate or,
at a minimum, reduce the potential for hypothetical bias.  This includes
explicitly designing the survey to maximize the consequentiality of
choice experiment questions through direct linkages to proposed EPA
regulatory efforts.  Moreover, the survey explicitly incorporates
elements such as a certainty follow-up question to enable mitigation of
any remaining hypothetical bias (Ready et al. 2010).  Focus group and
interview transcripts show that, when asked explicitly, respondents
almost universally indicated that their answers to choice questions in
the survey instrument would be identical if the same questions were
encountered in a binding referendum.  This indicates that there is
little evidence of hypothetical bias within the draft survey. 

A commenter argued that the respondents to the survey would be informed
and conditioned based on the information included in the survey and that
this would lead to the creation of preferences where none existed
before.  EPA believes that this result is entirely expected, and is
consistent with the academic literature.  The Agency emphasizes that the
sensitivity of values (and behavior) to information is true of both
market and non-market values (Bateman et al. 2002, p. 298), and in no
way invalidates the proposed non-market values that would be estimated
through the proposed stated preference approaches.  It is common
practice in such surveys to provide substantial information to survey
respondents.  The survey information was pretested extensively during
the seven focus group sessions and EPA revised or removed informational
elements which respondents found confusing or misleading.

Some commenters argued that the survey results will be unreliable
because the survey questionnaire contains inaccurate statements and
comparisons which overstate the resource impacts from baseline I&E
mortality and the regulatory options presented in the survey.  EPA
recognizes the importance of accurately characterizing resource and
regulatory impacts and notes that the survey is based on the best
biological and engineering data available.  Despite the general
observation that I&E impacts are small compared to other effects, I&E
has been shown to have measurable impacts on local fish populations and
communities.  Importantly, increases in fishery sustainability and fish
population values presented in the survey instrument are small. 
Overall, the Agency rejects the commenters' claims, which were based
largely on feedback from focus group participants, that impacts are
misrepresented, and emphasizes that the survey is explicitly designed
to provide respondents with an understanding of the proposed policies
that is as accurate as possible given the best available ecological
science.  

Commenters also questioned whether survey respondents would have
sufficient comprehension of the issues in order to provide meaningful
responses to the survey’s valuation questions.  EPA agrees that in
order to receive meaningful responses, a stated preference survey should
provide information to respondents about the hypothetical commodity so
that they understand and accept it and can give meaningful answers to
the valuation questions.  Given the importance of commodity
comprehension, EPA devoted considerable attention to comprehension of
the hypothetical commodity and related issues (e.g., understanding of
the payment vehicle and the ecological scores) during the recent focus
groups, as well as during the cognitive interviews and focus groups
conducted for the original version of the Phase III survey instrument
in 2005.  Focus group participants showed no difficulty understanding
the format of the payment vehicle and the fact that selecting Option A
or B would result in increased costs to their household.  They also
correctly understood this cost as ongoing. 

In addition to the comments regarding the payment vehicle, commenters
expressed concern regarding respondent comprehension of the ecological
scores used in the survey.  EPA emphasizes that the reaction and
understanding of likely respondents, as opposed to experts, is crucial
when testing the communication of ecological information in stated
preference surveys.  The survey has undergone substantial changes in the
way it communicates ecological information based on the results of the
seven focus group sessions. For example, in the revised survey, EPA
provides more precise information regarding the definition of “young
adult fish” on Page 4, which currently states: “After accounting for
the number of eggs and larvae that would be expected to survive to
adulthood, scientists estimate that the equivalent of about 1.1 billion
young adult fish (the equivalent of one year old) are lost each year in
Northeast coastal and fresh waters due to cooling water use.” 
Overall, focus group and interview participants’ statements implied
different opinions about the importance of preventing fish losses versus
increasing fish population or improving the condition of aquatic
ecosystems.  Also, there did not appear to be any confusion over the
fact that scores of 100 for various attributes are generally
unattainable through reductions in CWIS fish losses alone.  One
commenter argues that the stated preference survey fails to account for
effects on a number of non-fish species as well as effects on
threatened, endangered, and other protected species.  In response to
this comment, EPA agrees but notes that focus group respondents
suggested that additions to the survey’s length should be avoided. 
Thus, EPA will not be able to use these survey results to produce a
fully comprehensive benefits estimate, but will be able to state that a
potentially substantial category of benefits, non-use benefits, has
been included.  As is common in surveys, EPA has chosen to present policy
scenarios in simplified form to facilitate respondent comprehension, and
to encourage respondents to focus on the most important policy
characteristics related to fish losses.  Such simplification of the
survey helps to balance the provision of detailed policy information
against respondents’ cognitive abilities to consider a large number of
attributes simultaneously (Louviere et al. 2000).  

Some commenters stated that EPA did not sufficiently emphasize the
uncertainty associated with effects and costs of the proposed policies
presented within the survey.  EPA agrees that there is uncertainty
regarding the number of fish killed annually, as well as the effects and
costs of the regulatory policies presented within the survey.
The current existing facilities survey also notes this uncertainty
explicitly.  For example, the following statements are included
in the current survey version for the Northeast region:  “Although
scientists can predict the number of fish saved each year, the effect on
fish populations is uncertain.  This is because scientists do not know
the total number of fish in Northeast water and because many factors –
such as cooling water use, fishing, pollution, and water temperature –
affect fish populations”, and “Policy costs and effects depend on
many factors.”  EPA has also included debriefing questions in the
survey instrument that are designed to identify individuals whose
responses are based on incorrect interpretation of the environmental
changes described in the survey, including the uncertainty of the
expected changes.  EPA points out that debriefing sessions during focus
groups and cognitive interviews showed that respondents clearly
understood that the ecological changes described in the survey were
uncertain.  Furthermore, when asked, focus group respondents indicated
that they were comfortable making decisions in the presence of
uncertainty.

Another commenter questioned various components of EPA proposed sampling
methodology, experimental design, and methods for accounting for
non-response bias.  In response, EPA emphasizes that methods for WTP
estimation from ecological choice experiment data—of exactly the type
proposed by EPA—are very well established in the published literature.
 More broadly, when designing the proposed methods, EPA closely followed
accepted contemporary methods in the published literature for the
estimation of WTP distributions under statistical uncertainty.  In the
absence of concrete and established alternatives for the choice of
sampling weights, EPA has proposed a more conservative approach of
reliance on accepted and standard methods from the stated preference
literature.  EPA believes—following guidance in the literature and its
own guidance documents (Arrow et al. 1993; US EPA 2000) for a weighting
and extrapolation approach—that established stated preference methods
are capable of estimating reliable and accurate welfare measures, if
surveys and approaches are appropriately designed.  Regarding the
potential for non-response bias, EPA has proposed standard approaches
for non-response assessment and calibration, including tests and
corrections based on a small number of attitudinal and behavioral
questions combined with demographic characteristics.  These approaches
reflect current standard practice in the literature.
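As one concrete illustration of the weighting calibration described
above, post-stratification weights can be computed as the ratio of a
demographic cell's population share to its sample share.  This is a
sketch only; the cells and counts below are invented for illustration
and are not survey data.

```python
# Sketch of post-stratification weighting for non-response.  Each
# respondent in a demographic cell receives
#   weight = (population count) / (sample count),
# with both expressed per 1,000 so the ratio equals the share ratio.
# All counts are hypothetical.

population = {"18-34": 300, "35-54": 350, "55+": 350}  # per 1,000 adults
sample = {"18-34": 200, "35-54": 350, "55+": 450}      # per 1,000 returns

weights = {cell: population[cell] / sample[cell] for cell in population}

# Underrepresented cells are weighted up; overrepresented cells down.
print(weights["18-34"])  # 1.5
print(weights["35-54"])  # 1.0
```

In practice such weights would be applied when extrapolating sample
WTP estimates to the target population.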

Another commenter argued that although the Supreme Court held that
Clean Water Act section 316(b) does not prohibit the consideration of
costs in relation to the benefits of a proposed rule, it did not find
that cost-benefit analysis is required (Entergy Corp. v. Riverkeeper,
Inc.).  EPA has the authority to decide whether to conduct cost-benefit
analysis of proposed rule options.  The commenter, however, recognized
that Executive Order
12866, “Regulatory Planning and Review,” requires EPA to estimate
potential costs and benefits to society of proposed rule options. In
response to the commenter’s claim about the utility of cost-benefit
analysis in an environmental context, EPA notes that cost-benefit
analysis is only one tool that can be used to inform policy decisions.
EPA is conducting this survey because of Executive Order 12866,
“Regulatory Planning and Review,” which requires Federal Agencies to
conduct economic impact and cost-benefit analyses for all major rules. 
Furthermore, cost-benefit analysis requires a comprehensive estimate of
total social benefits, including non-use values.  The current
information collection would provide valuable information regarding
total social benefits of the 316(b) regulation for existing facilities,
thus enabling the Agency to perform cost-benefit analysis for the
regulation, if it should choose to, without ignoring a potentially
important category of benefits (non-use values), and to satisfy the
requirements of Executive Order 12866.

For a more detailed discussion of the issues raised by commenters on
this ICR, see EPA’s response to public comments on the Federal Register
notice published on November 23, 2004 (69 FR 68140).  For a discussion
of the issues raised by commenters on the previous Phase III survey ICR,
see EPA’s response to public comments on the Federal Register notice
published on June 9, 2005 (70 FR 33746).  For a discussion of issues
raised by commenters on the previous Phase III focus group ICR, see
EPA's response to public comments on the Federal Register notice
published on November 23, 2004 (69 FR 68140).

3(c)	Consultations

	The Principal Investigator for the stated-preference portion of this
effort is Dr. Robert Johnston.  Dr. Johnston is assisted by Dr. Elena
Besedin, a Senior Economist at Abt Associates Inc.  Dr. Erik Helm at the
U.S. Environmental Protection Agency serves as the project manager and a
contributor to this research.  

Robert J. Johnston is Director of the George Perkins Marsh Institute and
Professor of Economics at Clark University.  He is President-elect of
the Northeastern Agricultural and Resource Economics Association
(NAREA), on the Program Committee for the Charles Darwin Foundation, the
Science Advisory Board for the Communication Partnership for Science and
the Sea (COMPASS), and is the Vice President of the Marine Resource
Economics Foundation.  Professor Johnston has published extensively on
the valuation of non-market commodities (goods, services, and
resources), benefit cost analysis, and resource management.  His recent
research emphasizes coordination of ecological and economic models to
estimate ecosystem service values, with particular emphasis on the role
of aquatic ecological indicators.  He has also worked extensively in
methodologies for benefit transfer, including the use of meta-analysis. 
Professor Johnston’s empirical work on non-market valuation and
benefit transfer has contributed to numerous benefit cost analyses
conducted by federal, state and local government agencies in the US,
Canada and elsewhere. 

Elena Y. Besedin, a senior economist at Abt Associates Inc., specializes
in the economic analysis of environmental policy and regulatory
programs.  Her work to support EPA has concentrated on analyzing
economic benefits from reducing risks to the environment and human
health and assessing environmental impacts of regulatory programs for
many EPA program offices.  She has worked extensively on valuation of
non-market benefits associated with environmental improvements of
aquatic resources.  Dr. Besedin’s empirical work on non-market
valuation includes design and implementation of stated and revealed
preference studies and benefit transfer methodologies.  Her recent work
has focused on developing integrated frameworks to value changes in
ecosystem services stemming from environmental regulations. 

EPA notes that the current survey instrument is built upon an earlier
version that was peer reviewed in January 2006. It incorporates
recommendations received from the first peer review panel.  Because the
final product of this study meets the major technical work criteria
specified in the Peer Review Handbook (U.S. EPA 2006), the Agency also
plans to convene a peer-review panel to review the entire survey
process, including the survey instrument, study results, and EPA’s
final estimated results for the 316(b) Existing Facilities rulemaking,
after the survey is completed. 

3(d)	Effects of Less Frequent Collection

The survey is a one-time activity.  Therefore, this section does not
apply.

3(e)	General Guidelines

The survey will not violate any of the general guidelines described in 5
CFR 1320.5 or in EPA’s ICR handbook.

3(f)	Confidentiality

All responses to the survey will be kept strictly anonymous.  To ensure
that the final survey sample includes a representative and diverse
population of individuals, the survey questionnaire will elicit basic
demographic information, such as age, household size, employment status,
and income. However, the detailed survey questionnaire will not ask
respondents for personal identifying information, such as names, phone
numbers, or addresses.  Addresses will be collected during telephone
screening, but the telephone screening and mail survey databases will
not be matched or combined.  Prior to taking the survey, respondents
will be informed that their responses will be held strictly anonymous. 
The survey data will be made public only after it has been thoroughly
vetted to ensure that all potentially identifying information has been
removed.

3(g)	Sensitive Questions

The survey questionnaire will not include any sensitive questions
pertaining to private or personal information, such as sexual behavior
or religious beliefs.

4.	The Respondents and the Information Requested

4(a)	Respondents 

The target population for the Stated Preference Survey is all
individuals from continental U.S. households who are 18 years of age or
older.  Survey participants are recruited through random digit dialing:
EPA will identify a stratified random sample of 5,720 adults who
express willingness to participate in a mail survey when contacted by
telephone.  All of these recruits will be sent a copy of the mail
survey.  Approximately 2,288 of the 5,720 adults sent a survey are
expected to return a completed questionnaire.

For the selection of households, the population of households in the 48
states and the District of Columbia will be stratified by four study
regions.  There are a total of seven study regions for purposes of
evaluating the 316(b) existing facilities rule benefits. For the
purposes of the stated preference survey implementation, EPA uses four
geographic regions: Northeast, Southeast, Inland, and Pacific. The
Northeast region includes the North Atlantic and Mid Atlantic regions,
the Southeast region includes the South Atlantic and Gulf of Mexico
regions, the Pacific region includes states on the Pacific coast, and
the Inland region includes all non-coastal states. 

A sample of 2,000 households would complete a version of the survey
which specifically addresses policies within their region. The total
sample completing regional survey versions is allocated to each region
in proportion to the total number of households in that region, with
the restriction that at least 288 persons respond in each region.  This
is the number required to estimate the main effects and interactions
under an experimental design model.  The total sample size for each
region is much larger than the minimum sample size required for model
estimation for all but one region (Pacific).  An additional sample of 288
households will receive a national survey version which addresses
policies at the national scale.  This sample would be distributed among
the study regions based on the percentage of regional survey sample (as
shown in Table A1) to ensure that respondents to the national survey
version are distributed throughout the continental U.S.  Part B of this
document provides detail on sampling methodology. 
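The allocation rule described above (proportional to regional household
counts, subject to a 288-complete floor) can be sketched as follows.
The household counts are placeholders, not the figures EPA would
actually use, and the simplified rule below does not rescale the other
regions after the floor binds; Part B describes the actual design.

```python
# Sketch of proportional sample allocation with a per-region minimum.
# Household counts (in millions) are placeholders for illustration.

def allocate(total, households, minimum=288):
    """Allocate completes proportionally to households, then apply the
    per-region floor.  (Simplified: applying the floor can push the
    overall total slightly above `total`.)"""
    total_hh = sum(households.values())
    return {region: max(minimum, round(total * hh / total_hh))
            for region, hh in households.items()}

households = {"Northeast": 21, "Southeast": 28, "Inland": 37, "Pacific": 13}
alloc = allocate(2000, households)
print(alloc["Pacific"])  # 288: the proportional share falls below the floor
```

With these placeholder counts, only the Pacific region's proportional
share falls below the 288-complete minimum, mirroring the pattern in
Table A1.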

Table A1 shows the stratification design for the geographic regions
covered by the sample for this survey.  More detail on planned sampling
methods and the statistical design of the survey can be found in Part B
of this supporting statement.

Table A1: Geographic Stratification Design

Region	States Included	Sample Sizea	Percentage of Sample

Northeast	CT, DC, DE, MA, MD, ME, NH, NJ, NY, PA, RI, VT	417	21%

Southeast	AL, FL, GA, LA, MS, NC, SC, TX, VA	562	28%

Inland	AR, AZ, CO, IA, ID, IL, IN, KS, KY, MI, MN, MO, MT, ND, NE, NM,
NV, OH, OK, SD, TN, UT, WI, WV, WY	732	37%

Pacific	CA, OR, WA	288	14%

Total for Regional Survey Versions	U.S. (excluding AK and HI)	2,000
100%

National Survey Version	U.S. (excluding AK and HI)	288	-

a  Sample sizes presented in this table include only the 2,288
individuals returning completed mail surveys.



4(b)	Information Requested

(I)	Data items, including recordkeeping requirements

Individuals who agree to participate in the survey will be mailed a copy
of the survey.  The full text of the regional version of the mail survey
for the Northeast region is provided in Attachment 1 and the full text
of the national version of the mail survey is provided in Attachment 2. 
EPA revised the survey based on the findings of a series of seven focus
groups conducted as part of survey instrument development (OMB control #
2090-0028).  Additional information regarding focus group implementation
is provided in Section 5(b).  EPA has determined that all questions in
the survey are necessary to achieve the goal of this information
collection, i.e., to collect data that can be used to support an
analysis of the total benefits of the 316(b) regulation.

The following is an outline of the major sections of the survey.

Relative Importance of Issues Associated with Industrial Cooling Water. 
The first survey question asks respondents to rate the general
importance of (a) preventing the loss of fish caught by humans, (b)
preventing the loss of fish not caught by humans, (c) maintaining
ecological health in rivers, lakes, and bays, (d) keeping the cost of
goods and services low, (e) making sure there is enough government
regulation on industry, and (f) making sure there is not too much
government regulation on industry.  This question is designed to elicit
the respondent’s general preferences for regulation, reductions in
fish losses, and ecological health.  It also places respondents in the
mindset where they are cognizant of the range of issues associated with
the use of cooling water by industrial facilities.

Concern for Policy Issues.  The second survey question asks respondents
to rate the general importance of protecting aquatic ecosystems compared
to other issues that the government might address.  This question is
designed to remind respondents that there are other issues (such as
public safety, education, and health) to which the government could
direct funds, rather than spending these funds to prevent fish losses. 
Such questions are commonly used in introductory sections of stated
preference surveys (e.g., Mitchell and Carson 1984), in order to place
respondents in a mindset in which they are cognizant that there are
substitute goods and policy issues to which they might direct their
scarce household budgets.

Relative Importance of Effects.  Question 3 asks the respondent to rate
the importance of each of the effects captured by the five scores: (a)
commercial fish populations (b) fish populations (for all fish), (c)
fish losses prevented, (d) condition of aquatic ecosystems, and (e) cost
to my household.  This question is designed to promote understanding of
the scores by placing respondents in a mindset where they consider the
meaning of each score and consider their general preferences for effects
prior to considering specific policy options.  The question also
promotes understanding by telling the respondent that they can return to
previous pages for reminders of what the scores mean.  

Voting for Regulations to Prevent Fish Losses in the Respondent’s
Region.  Questions 4, 5, and 6 are “choice experiment”  or “choice
modeling” questions (Adamowicz et al. 1998; Bennett and Blamey 2001),
and ask respondents to choose how they would vote, if presented with two
hypothetical regulatory options (and a third “status quo” choice to
reject both options) for waters within the respondents’ region (e.g.,
Northeast waters).  Each of the multi-attribute options is characterized
by (a) commercial fish populations (in 3-5 years) (b) fish populations
(all fish; in 3-5 years), (c) fish saved per year (out of [total] fish
lost in water intakes), (d) condition of aquatic ecosystems (in 3-5
years), and (e) an unavoidable cost of living increase for the
respondent’s household.  Following standard choice experiment methods,
respondents choose the regulatory options that they prefer, based on
their preferences.  Respondents always have the option to vote for
neither option—providing the status quo option is necessary for
appropriate welfare estimation (Adamowicz et al. 1998).  Advantages of
choice experiments, and the many examples of the use of such approaches
in the literature, are discussed in later sections of this ICR. 
Following standard approaches (Opaluch et al. 1993, 1999; Johnston et
al. 2002a; 2002b, 2003b), respondents are instructed to answer each of
the three choice questions independently, and not to “add up or
compare programs across different pages.”  This is included to avoid
biases associated with sequence aggregation effects (Mitchell and Carson
1989).  EPA will also vary the order in which the policy option
attributes are presented across respondents, such as presenting
household cost first or presenting fish saved per year lower in the list
of choice question attributes.  While complete randomization is
impractical for the mail survey, the change in order would allow for a
potential test of ordering effects.
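Once choice responses are collected, marginal willingness to pay under
a standard linear conditional logit specification is recovered as the
negative ratio of an attribute coefficient to the cost coefficient.
The sketch below illustrates only this ratio step; the coefficient
values are hypothetical and are not fitted survey results.

```python
# Illustrative recovery of marginal WTP from conditional logit
# coefficients under a linear utility specification,
#   V = b_fish * fish_score + b_eco * eco_score + b_cost * cost.
# Coefficient values below are hypothetical.

coefs = {"fish_saved": 1.25, "ecosystem": 0.75, "cost": -0.25}

def marginal_wtp(attribute, coefs):
    """Dollars per unit of the attribute: -(attribute coef)/(cost coef)."""
    return -coefs[attribute] / coefs["cost"]

print(marginal_wtp("fish_saved", coefs))  # 5.0
print(marginal_wtp("ecosystem", coefs))   # 3.0
```

In an actual application the coefficients would be estimated from the
choice data (e.g., by maximum likelihood), and WTP confidence intervals
would be derived from the estimated coefficient covariances.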

Reasons for Voting “No Policy”.  Question 7 is a follow-up to the
prior voting questions, and asks respondents to identify the primary
reason for voting no, if they always voted for “no policy” in
questions 4-6.  It is designed to identify respondents whose “no
policy” responses are based on their budget constraint, respondents
who do not consider fish losses important enough to vote for a policy,
or respondents who ignored information presented in the survey and
answered questions based on their general convictions and principles. 
In an electronic survey format, respondents who voted for a policy would
not see this question, potentially reducing burden.

Respondent Certainty and Reasons for Voting.  Questions 8 and 9 are
follow-up questions to the prior voting.  Question 8 assesses the
certainty that respondents feel in their choice experiment responses,
following methods of Champ et al. (2004; 2009), Akter et al. (2009),
Kobayashi et al. (2010), and others.  It is designed to identify
respondents whose responses are based on incorrect interpretation of the
resource changes and the uncertainty of ecological outcomes from policy
options.  EPA will evaluate respondent understanding when completing
stated preference surveys using methods and approaches discussed in
Boyle (2003), Kaplowitz et al. (2004), Bateman et al. (2002), Powe
(2007) and others.  Question 9 asks respondents to rate the effect of
factors on their choices, and why they voted for or against the
regulatory programs.  Responses to such questions have been used in the
literature to successfully control for hypothetical bias. 

Recreational Experience.  Question 10 asks respondents how often they
participated in specific types of water-related recreational activities
within the last year.  This question can be used to identify non-users
of the fishery resource—thereby allowing the estimation of non-user
values for I&E mortality reductions. Examples of this approach to
estimation of non-user values are provided by Johnston et al. (2005a),
Whitehead et al. (1995), Croke et al. (1986), Olsen et al. (1991),
Cronin (1982), Whitehead and Groothuis (1992), and Mitchell and Carson
(1981).  

Demographics.  Questions 11-21 ask respondents to provide basic
demographic information, including age, gender, highest level of
education, household size, household composition, location, employment
status, and household income.  This information will be used in the
analysis of survey results, as well as in the nonresponse analysis.

Comments.  The survey offers respondents a chance to comment on the
survey.

The Agency will modify the survey instrument for each region relative to
the regional survey shown in Attachment 1 for the Northeast region as
follows:

Cover – The text on the cover reads “A Survey of Northeast Residents
(CT, DC, DE, MA, MD, ME, NH, NJ, NY, PA, RI, VT)”.  “Northeast
Residents” and the list of included states will be changed to match
the respondent’s region.

Cover – The photo on the cover is of a forage species (silversides)
found in the Northeast region.  If that species is not found in other
regions, EPA will replace the cover photo with a photo of a similar
forage species relevant to that region.

Page 1 – The survey states that it “asks for your opinions regarding
policies that would affect fish and habitat in the Northeast U.S.”
“Northeast U.S.” will be replaced with the respondent’s region or
simply, “the U.S.” for the national version.

Page 1 – Includes the statement that “Northeast fresh and salt
waters support billions of fish.” “Northeast” will be replaced
with the respondent’s region or “the U.S.” for the national
version and “salt water” will be removed from the Inland and Great
Lakes regions.

Page 2 - The survey states that “Cooling water use affects fresh and
salt waters throughout the Northeast US, but almost all fish losses are
in salt waters such as coastal bays.”  For the Southeast and Pacific
regions, which include both salt water and freshwater facilities,
reference to the Northeast region would be replaced with the name of the
respondent’s region or “the U.S.” for the national version of the
survey.  For the Inland region, which includes only freshwater
facilities, the reference to salt water will be removed.

Page 2 – A map of the Northeast region and facility locations is
presented.  A comparable map will be produced for each region or the
U.S., and the map included in the survey will correspond to the
respondent’s region or to the U.S.

Page 4 - This page provides an example for two Northeast coastal species
and includes a figure comparing estimates of young fish lost to the
total number of adult fish.  The estimate of total losses will be
modified to describe species and losses occurring within the
respondent’s region. 

Page 5 - The total loss figure (1.1 billion) included within the survey
corresponds to losses from facilities in Northeast coastal waters.  This
total would be replaced with the total losses for the respondent’s
region or total losses for the U.S. for the national version.  The pie
charts will be updated based on the regional (national) total while
maintaining the 25% and 95% reductions used in the example.

Page 7 - The table text describing each score will be changed to include
the current scores for the respondent’s region.

Page 8 - Text describing the commercial fish populations score and fish
populations (all fish) score will be changed to describe the current
score for the respondent’s region relative to its maximum. 

Page 10 – The included figure illustrates the location of
“Facilities Using Cooling Water Intake”.  The figure will be
replaced with a map showing facility locations within the respondent’s
region or in the U.S. for the national version. 

Pages 11-14 – Regional references within the table headings in
Questions 4, 5, and 6 (e.g., “Policy Effect NE Waters”, “Option A
NE Waters”) will be modified to refer to the policy effects and
options for the respondent’s region.  The “Fish Saved per Year”
score includes a note reminding the respondent of total fish lost within
the region due to I&E (e.g., 1.1 billion); this number will be changed
to reflect total losses within the respondent’s region.  The values
describing the current situation and Options A and B within experiment
questions 4, 5, and 6 will also vary across regions.

(II)	Respondent activities

EPA expects individuals to engage in the following activities during
their participation in the contingent valuation survey:  

Review the background information provided in the beginning of the
survey document.  

Complete the survey questionnaire and return it by mail.

A typical subject participating in the mail survey is expected to take
30 minutes to complete the survey.  This estimate is derived from focus
groups and cognitive interviews in which respondents were asked to
complete a survey of similar length and detail to the current survey.

5.	The Information Collected - Agency Activities, Collection
Methodology, and Information Management

5(a)	Agency Activities

The survey is being developed, conducted, and analyzed by Abt Associates
Inc. and is funded by EPA contract No. EP-C-07-023, which provides funds
for the purpose of analyzing the economic benefits of the proposed rule
for existing facilities subject to the section 316(b) regulation. Agency
activities associated with the survey consist of the following:

Development of the survey questionnaire and sampling design.

Printing of questionnaires.

Telephone screening of respondents.

Mailing of questionnaires.

Mailing of postcard reminders.

Follow-up with non-respondents to the mail survey/resending of
questionnaire if needed.

Data entry and cleaning of data file.

Analysis of survey results.  

Although not covered under this ICR, EPA will primarily use the survey
results to estimate the social value of changes in I&E mortality losses
of forage, recreational, and commercial species of fish, as part of the
Agency’s analysis of the benefits of the 316(b) rule for existing
facilities.  If reliable environmental data were to be developed for
population changes and other ecosystem impacts, social values for these
benefit types may also be assessed for the rulemaking using survey
results.

5(b)	Collection Methodology and Information Management

To pretest the survey questionnaire, EPA conducted a series of seven
focus groups, including one using cognitive interview methodologies
under a different ICR (OMB control # 2090-0028).  Focus groups provided
valuable feedback which allowed EPA to iteratively edit and refine the
questionnaire, and eliminate or improve imprecise, confusing, and
redundant questions.  Focus groups and cognitive interviews were
conducted following standard approaches in the literature, as outlined
by Desvousges et al. (1984), Desvousges and Smith (1988), Johnston et
al. (1995), Schkade and Payne (1994), Kaplowitz et al. (2004), and
Opaluch et al. (1993).  

EPA plans to implement the proposed survey as a mail choice experiment
questionnaire.  First, EPA will use a telephone screener to solicit
respondents to receive the mail questionnaire.  A dual frame, random
digit dial design will be used, including both landline and cell phone
numbers.  This ensures that cell-phone-only households have an
opportunity to participate in the survey, reducing potential coverage
bias.  The
telephone screener, shown in Attachment 6, includes not only an
invitation to participate in the mail survey, but also some attitudinal
and behavioral questions that are also included in the mail survey, as
well as demographic characteristics.  EPA will use this information for
the non-response analysis to compare respondents and non-respondents to
the mail questionnaire.

It is anticipated that approximately 60 percent of respondents to the
telephone screener will agree to participate in the mail survey.  EPA
will then send the mail survey to those who agree to participate, along
with a cover letter explaining the purpose of the survey.  All of these
respondents will receive a reminder postcard approximately one week
after the initial questionnaire mailing. The cover letter and postcard
reminder are included as Attachments 7 and 8, respectively.

Approximately three weeks after the reminder postcard, all those who
have not responded will receive a second copy of the questionnaire with
a revised cover letter.  The following week, another reminder postcard
will be sent.  Based on this approach to mail data collection, it is
anticipated that approximately 40 percent of those who agree to
participate in the mail survey will actually do so.  Since the desired
number of completed surveys is 2,288, it will be necessary to recruit
5,720 respondents by telephone who agree to participate (Dillman 2000).

Data quality will be monitored by checking submitted surveys for
completeness and consistency, and by asking respondents to assess their
own responses to the survey.  Question 8 asks respondents to rate their
understanding of the survey and their confidence in their responses. 
Questions 7 and 9 are designed to assess the presence or absence of
potential response biases by asking respondents to indicate their
reasoning and rate the influence of various factors on their responses
to the choice experiment questions.  Responses to the survey will be
stored in an electronic database.  This database will be used to
generate a data set for a regression model of total values for
reductions in fish I&E mortality by section 316(b) existing facilities.

To protect the confidentiality of survey respondents, the survey data
will be released only after it has been thoroughly vetted to ensure that
all potentially identifying information has been removed.  

Alternatively, EPA is considering implementing the proposed survey as an
internet-based, choice experiment questionnaire.  This alternative
method is described in Attachment 5.

5(c)	Small Entity Flexibility

This survey will be administered to individuals, not businesses.  Thus,
no small entities will be affected by this information collection.

5(d)	Collection Schedule

The schedule for implementation of the survey will be as follows:

Table A2: Schedule for Survey Implementation

Activity	Duration of Each Activity

Printing of questionnaires	Weeks 1 to 2 

Telephone screening of respondents	Weeks 1 to 5

Mailing of questionnaires	Weeks 3 to 8

Postcard reminder (one week after initial questionnaire mailing)	Weeks 4
to 9

Follow-up with non-respondents to the mail survey/resending of questionnaire if needed (4 weeks after initial questionnaire mailing)	Weeks 6 to 13

Data entry	Weeks 4 to 16

Cleaning of data file	Week 17

Delivery of data	Week 18



6.	Estimating Respondent Burden and Cost of Collection

6(a)	Estimating Respondent Burden

Subjects who participate in the survey and follow-up interviews will
expend time on several activities.  Based on administration of the mail
survey to 5,720 households, the national burden estimate for all
respondents is 1,938 hours, assuming that 9,533 households participate
in telephone screening and 2,288 respondents complete and return the
mail survey.

EPA estimates that telephone screening interviews will take 5 minutes
(0.08 hours) per interview for each of the 9,533 screened households. 
Based on pretests conducted in focus groups, EPA estimates that on
average each respondent mailed the survey will spend 30 minutes
reviewing the introductory materials and completing the survey
questionnaire.  Thus, the average burden per respondent is 30 minutes
(0.5 hours) for these 2,288 respondents.  

These burden estimates reflect a one-time expenditure in a single year.

6(b)	Estimating Respondent Costs

According to the Bureau of Labor Statistics, the average hourly wage for
private sector workers in the United States is $20.42 (2009$) (U.S.
Department of Labor 2009).  Assuming an average per-respondent burden of
0.08 hours for each of the 9,533 screening participants and an average
hourly wage of $20.42, the average cost per screening participant is
$1.70.  Therefore the total cost to participants in the telephone
screening phase would be $16,223.  Assuming an average per-respondent
burden of 0.5 hours for individuals mailed the survey and an average
hourly wage of $20.42, the average cost per respondent is $10.21.  Of
the 5,720 individuals receiving the mail survey, 2,288 are expected to
return their completed survey.  The total cost for all individuals that
return surveys would be $23,360.
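The burden and cost arithmetic above can be verified directly from the counts and wage stated in this section (a minimal sketch):

```python
# Respondent burden and cost arithmetic from this section (2009$ wage from BLS).
WAGE = 20.42  # average private-sector hourly wage

screened = 9533                    # households completing the telephone screener
screen_hours = screened * 5 / 60   # 5 minutes (0.08 hours) per interview

completes = 2288                   # respondents returning the mail survey
mail_hours = completes * 0.5       # 30 minutes (0.5 hours) per survey

total_hours = screen_hours + mail_hours
total_cost = total_hours * WAGE    # ~$39,583 after component-wise rounding

print(round(screen_hours), round(mail_hours), round(total_hours))
# 794 1144 1938
```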

EPA does not anticipate any capital or operation and maintenance costs
for respondents.

6(c)	Estimating Agency Burden and Costs

This project will be undertaken by Abt Associates Inc. with funding of
$446,000 from EPA contract EP-C-07-023, which provides funds for the
purpose of analyzing the economic benefits of the proposed rule for
existing facilities subject to the section 316(b) regulation.  Abt
Associates Inc. staff is expected to spend 1,155 hours pre-testing the
survey questionnaire and sampling methodology, conducting the survey,
and tabulating and analyzing the survey results. The cost of this
contractor time is $77,000.  In addition to the effort expended by
EPA’s contractors, EPA staff is expected to spend 320 hours managing
and reviewing this project and contributing to the analysis at a cost of
$31,000.  Agency and contractor burden is 1,475 hours, with a total cost
of $108,000, excluding the costs of survey printing, mailing, and
telephone screening.  Telephone screening is expected to take 8,400
hours and cost $247,000, while printing and mailing of the survey is
expected to take 161 hours and cost $122,000.  Thus, the total Agency
and contractor burden would be 10,036 hours at a total cost of $477,000.

6(d)	Respondent Universe and Total Burden Costs

EPA expects the total cost for survey respondents to be $39,583 (2009$),
based on a total burden estimate of 1,938 hours and an hourly wage of
$20.42.

6(e)	Bottom Line Burden Hours and Costs

The following table presents EPA’s estimate of the total burden and
costs of this information collection:

Table A3: Total Estimated Bottom Line Burden and Cost Summary

Affected Individuals	Total Burden	Total Cost (2009$)

Participants in Telephone Screening	794 hours	$16,223

Mail Survey Respondents	1,144 hours	$23,360

Total for Survey Respondents	1,938 hours	$39,583

EPA Staff	320 hours	$31,000

Telephone Screening 	8,400 hours	$247,000

Survey Printing and Mailing	161 hours	$122,000

EPA’s Contractors	1,155 hours	$77,000

Total Burden and Cost	11,974 hours	$516,583



6(f)	Reasons for Change in Burden

The survey is a one-time data collection activity.

6(g)	Burden Statement

EPA estimates that the public reporting and record keeping burden
associated with the mail survey will average 0.5 hours per respondent
(i.e., a total of 1,144 hours of burden divided among 2,288 survey
respondents).  Households included in telephone screening are expected
to average 0.08 hours per screening interview participant (i.e., a total
of 794 hours of burden divided among 9,533 screening interview
participants).  This results in a total burden estimate of 1,938 hours
including both the telephone screening interviews and mail survey. 
Burden means the total time, effort, or financial resources expended by
persons to generate, maintain, retain, or disclose or provide
information to or for a Federal agency.  This includes the time needed
to review instructions; develop, acquire, install, and utilize
technology and systems for the purposes of collecting, validating, and
verifying information, processing and maintaining information, and
disclosing and providing information; adjust the existing ways to comply
with any previously applicable instructions and requirements; train
personnel to be able to respond to a collection of information; search
data sources; complete and review the collection of information; and
transmit or otherwise disclose the information.  An agency may not
conduct or sponsor, and a person is not required to respond to, a
collection of information unless it displays a currently valid OMB
control number.  The OMB control numbers for EPA's regulations are
listed in 40 CFR part 9 and 48 CFR chapter 15. 

To comment on the Agency's need for this information, the accuracy of
the provided burden estimates, and any suggested methods for minimizing
respondent burden, including the use of automated collection techniques,
EPA has established a public docket for this ICR under Docket ID No.
EPA-HQ-OW-2010-0595, which is available for online viewing at
www.regulations.gov, or in person viewing at the Office of Water Docket
in the EPA Docket Center (EPA/DC), EPA West, Room 3334, 1301
Constitution Ave., NW, Washington, DC.  The EPA/DC Public Reading Room
is open from 8:30 a.m. to 4:30 p.m., Monday through Friday, excluding
legal holidays. The telephone number for the Reading Room is
202-566-1744, and the telephone number for the Office of Water Docket is
202-566-1752.  

Use www.regulations.gov to obtain a copy of the draft collection of
information, submit or view public comments, access the index listing of
the contents of the docket, and to access those documents in the public
docket that are available electronically.  Once in the system, select
“search,” then key in the docket ID number, EPA-HQ-OW-2010-0595. 

PART B OF THE SUPPORTING STATEMENT

1.	Survey Objectives, Key Variables, and Other Preliminaries		

1(a)	Survey Objectives

The overall goal of this survey is to explore how public values
(including non-use values) for fish and aquatic organisms are affected
by I&E mortality at cooling water intake structures (CWIS) located at
existing 316(b) facilities, as reflected in individuals’ willingness
to pay for programs that would prevent such losses.  EPA has designed
the survey to provide data to support the following specific objectives:

To estimate the total values, including non-use values, that individuals
place on preventing losses of fish and other aquatic organisms caused by
CWIS at existing 316(b) facilities.

To understand how much individuals value preventing fish losses,
increasing fish populations, improvements in aquatic ecosystems, and
increasing commercial and recreational catch rates.

To understand how such values depend on the current baseline level of
fish populations and fish losses, the scope of the change in those
measures, and the certainty level of the predictions.

To understand how such values vary with respect to individuals’
economic and demographic characteristics.

Understanding total public values for fish resources lost to I&E
mortality is necessary to determine the full range of benefits
associated with reductions in impingement and entrainment losses at
existing 316(b) facilities. Because non-use values may be substantial,
failure to recognize such values may lead to improper inferences
regarding policy benefits (Freeman 2003).

1(b)	Key Variables

The key questions in the survey ask respondents whether or not they
would vote for policies that would increase their cost of living, in
exchange for specified changes in: (a) I&E mortality losses of fish, (b)
commercial fish sustainability, (c) long-term fish populations, and (d)
condition of aquatic ecosystems.  More specifically, the choice
experiment framework allows respondents to view pairs of multi-attribute
policies associated with the reduction of I&E mortality losses. 
Respondents are asked to choose the program that they would prefer, or
to choose to reject both policies.  This follows well-established choice
experiment methodology and format (Adamowicz et al. 1998; Louviere et
al. 2000; Bennett and Blamey 2001; Bateman et al. 2002).  Important
variables in the analysis of the choice questions are how the respondent
votes, the amount of the cost of living increase, the number of fish
losses that are prevented, the sustainability of commercial fishing, the
change in fish populations, and the condition of aquatic ecosystems. 
Other important variables include whether or not the respondent is a
user of the affected aquatic resources, household income, and other
respondent demographics.

1(c)	Statistical Approach

EPA believes that a statistical survey approach is appropriate.  A
census approach is impractical because contacting all households in the
U.S. would require an enormous expense.  On the other hand, an anecdotal
approach is not sufficiently rigorous to provide a useful estimate of
the total value of fish loss reductions for the 316(b) case.  Thus, a
statistical survey is the most reasonable approach to satisfy EPA’s
analytic needs for the 316(b) regulation benefit analysis.

EPA has retained Abt Associates Inc. (55 Wheeler Street, Cambridge, MA
02138) as a contractor to assist in questionnaire design, sampling
design, and analysis of the survey results.  

1(d)	Feasibility

The survey instrument was repeatedly pre-tested during a series of seven
focus groups (conducted under a different ICR with OMB control #
2090-0028), in addition to the twelve focus groups conducted for the
Phase III survey (EPA-HQ-OW-2004-0020), and it will be subject to peer
review by reviewers in academia and government, so EPA does not
anticipate that respondents will have difficulty interpreting or
responding to any of the survey questions.  Additionally, since the
survey will be administered as a mail survey, it will be easily
accessible to respondents.  Thus, EPA believes that respondents will not
face any obstacles in completing the survey, and that the survey will
produce useful results.  EPA has dedicated sufficient funding (under EPA
contract No. EP-C-07-023) to design and implement the survey.  Given the
timetable outlined in Section A.5(d) of this document, the survey
results will be available for timely use in the final benefits analysis
for the 316(b) existing facilities rule.

2.	Survey Design

2(a)	Target Population and Coverage

The target population for this survey includes individuals from
continental U.S. households who are 18 years of age or older.  The
sample will be chosen to reflect the demographic characteristics of the
general U.S. population.

2(b)	Sampling Design

(I)	Sampling Frames

The sampling frame for this survey is the panel of individuals recruited
by random digit dialing (over-lapping dual frame; land and cell) to
participate in a mail survey.  The overall sampling frame from which
these individuals would be selected is the set of all individuals in
continental U.S. households who are 18 years of age or older and who
have listed phone numbers.  Individuals in the panel are recruited using
a list-assisted random digit dialing telephone methodology, thus
providing a probability-based starting sample of U.S. telephone
households.  

The telephone screening will use an overlapping dual frame design of
landline and cell phone numbers to increase the coverage of the sample. 
The inclusion of cell phone numbers will allow EPA to reach potential
respondents who only have cell phones and would be missed if only a
sample of landlines was selected.  The dual frame design also reduces
bias, because the cell-phone-only population skews heavily toward
lower-income and younger individuals who would otherwise be
underrepresented.  The cell and landline samples
will be weighted to correct for each individual’s probability of being
included in the survey.  People who live in a household with more than
one person are less likely to be chosen for the survey, and people with
more than one telephone line (e.g., both cell and landline) are more
likely to be chosen for the survey.  All landline respondents to the
telephone screener will be asked the number of eligible respondents in
their household and every respondent (cell and landline) will be asked
whether they have a landline only, a cell phone only, or both.  Next,
the survey data will be weighted to match national data (U.S. Census and
National Health Interview Survey) by a set of demographic variables
(e.g., age and gender) as well as phone usage (cell-only users,
landline-only users and people who use both cell phones and landline
phones). For January through June 2009, the National Health Interview
Survey indicated that 15.5 percent of households are landline only, 22.7
percent are cell phone only, and 59.4 percent have both landline and
cell phone service.  About two percent of households have no telephone
service.  It is anticipated that estimates for landline and cell phone
service from the January to June 2010 National Health Interview Survey
will be available for weighting purposes by the time the survey would be
implemented.
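As an illustration of the weighting step, the sketch below post-stratifies a screener sample to the NHIS phone-usage shares quoted above. The sample counts are hypothetical, and the actual procedure will also weight by demographic variables such as age and gender:

```python
# Post-stratification of a screener sample to NHIS phone-usage shares
# (Jan.-Jun. 2009 figures quoted above). Sample counts are hypothetical.
nhis = {"landline_only": 0.155, "cell_only": 0.227, "both": 0.594}
# Rescale so the three in-frame strata sum to 1; the ~2 percent no-phone
# population falls outside the dual telephone frame.
targets = {k: v / sum(nhis.values()) for k, v in nhis.items()}

sample = {"landline_only": 300, "cell_only": 250, "both": 1450}  # hypothetical
n = sum(sample.values())

# Stratum weight = target population share / observed sample share.
weights = {k: targets[k] / (sample[k] / n) for k in sample}

# Weighted stratum shares now reproduce the NHIS targets.
weighted_share = {k: weights[k] * sample[k] / n for k in sample}
```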

For discussion of techniques that EPA will use to minimize non-response
and other non-sampling errors in the survey sample, refer to Section
2(b)(II), below.

(II)	Sample Sizes

The intended sample size for the survey is 2,288 households, counting
only households that return completed mail surveys.  This sample size was
chosen to provide statistically robust regression results while
minimizing the cost and burden of the survey.  Given this sample size,
the level of precision (see section 2(c)) achieved by the analysis will
be more than adequate to meet the analytic needs of the benefits
analysis for the 316(b) regulation.  For further discussion of the level
of precision required by this analysis, see Section 2(c)(I) below.

(III)	Stratification Variables

The survey sample will be selected using a stratified selection process.
For the selection of households, the population of households in the
contiguous 48 states and the District of Columbia will be stratified by
the geographic boundaries of four study regions: Northeast, Southeast,
Inland, and Pacific.  As described previously, the Northeast region
includes the North Atlantic and Mid Atlantic 316(b) benefits regions,
the Southeast region includes the South Atlantic and Gulf of Mexico
316(b) benefits regions, the Pacific region includes states on the
Pacific coast, and the Inland region includes all non-coastal states.
The sample is allocated to each region in proportion to the total number
of households in that region, with at least 288 completed surveys in
each region.  This is the number required to estimate the main effects
and interactions under an experimental design model as described in
Section 4(a) of Part A. To accommodate this requirement the sample sizes
in other regions will be slightly reduced. A sample of 288 households
completing the national survey version would be distributed among the
study regions based on the percentage of regional survey sample (as
shown in Table A1) to ensure that respondents to the national survey
version are distributed across the continental U.S. 
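The allocation rule described above can be sketched as follows. Household counts are the ACS figures from Table B1; small differences from the table's sample column come from rounding:

```python
# Proportional allocation of 2,000 completed surveys with a per-region floor.
households = {
    "Northeast": 23_281_296,
    "Southeast": 31_378_122,
    "Inland":    40_852_983,
    "Pacific":   16_158_206,
}
TOTAL, FLOOR = 2000, 288  # total completes; minimum completes per region

pop = sum(households.values())
alloc = {r: round(TOTAL * h / pop) for r, h in households.items()}

# Raise any region below the floor and trim the excess from the largest region.
for r in list(alloc):
    if alloc[r] < FLOOR:
        deficit, alloc[r] = FLOOR - alloc[r], FLOOR
        big = max(alloc, key=alloc.get)
        alloc[big] -= deficit

print(alloc)  # {'Northeast': 417, 'Southeast': 562, 'Inland': 732, 'Pacific': 289}
```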

(IV)	Sampling Method

Using the stratification design discussed above, respondents will be
randomly selected.  First, a larger sample of households will be
selected through random digit dialing in each of the 4 regions.  The
household to which the selected telephone number belongs will be called
and the respondent will be asked whether he or she is willing to provide
information required by the survey.  If the person in the household
agrees, then a questionnaire will be mailed to that household.   If it
is assumed that 40% of the households which agreed to provide
information actually return a completed mail survey (completion rate)
then 5,720 questionnaires will need to be mailed to households.

If we assume that 60% of the households that are called say “yes”
they are willing to receive a questionnaire (the screening response
rate), then we need to screen 9,533 households.  A dual frame random
digit dialing design, including both landline and cell phone numbers,
will be used for telephone screening.  If we assume that 40% of the
households called will participate in the telephone screener, then we
will need to call 23,833 numbers.  The actual number of telephone
numbers selected through random digit dialing sampling will be larger
than the number of households called.  This is because we are expecting
that 40% of the telephone numbers initially selected will be screened
out as non-working or business numbers. Therefore the actual number of
telephone numbers selected will be around 39,722.  The quantity of
telephone numbers that we need to select in each region can be obtained
by allocating the total number in proportion to the total households in
that region. 
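The sample-size cascade in this subsection can be checked by working backward from the target number of completed surveys, applying each rate stated above in turn:

```python
# Working the sample-size cascade back from the target number of completes.
completes = 2288
mailed   = completes / 0.40   # 40% mail completion rate -> surveys mailed
screened = mailed / 0.60      # 60% agree-to-participate rate -> screeners completed
called   = screened / 0.40    # 40% screener participation rate -> households called
drawn    = called / 0.60      # 40% of drawn RDD numbers are non-working/business

print([round(x) for x in (mailed, screened, called, drawn)])
# [5720, 9533, 23833, 39722]
```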

For obtaining population-based estimates of various parameters, each
responding household will be assigned a sampling weight.  This weight
combines a base sampling weight which is the inverse of the probability
of selection of the household and then an adjustment for non-response. 
The weights will be used to produce estimates that are generalizable to
the population from which the sample was selected (e.g., percent of
population participating in water-based recreation such as fishing and
shellfishing). Proportional allocation of the sample to regions ensures
an equal probability sample.  To estimate total WTP for the quantified
environmental benefits of the 316(b) existing facilities rulemaking data
will be analyzed statistically using a standard random utility model
framework. 
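As a sketch of how the random utility framework yields WTP estimates: in a conditional logit fit to the choice data, each alternative's utility is a linear index of its attributes, choice probabilities follow the logit formula, and marginal WTP for an attribute is the negative ratio of its coefficient to the cost coefficient. The coefficients below are purely hypothetical:

```python
import math

# Hypothetical conditional-logit coefficients (illustration only).
beta_cost = -0.045   # utility per dollar of annual cost-of-living increase
beta_fish = 0.012    # utility per unit improvement in a fish-loss attribute

def logit_prob(v_chosen, v_all):
    """Probability of choosing the alternative with utility index v_chosen."""
    return math.exp(v_chosen) / sum(math.exp(v) for v in v_all)

# Marginal WTP = -(attribute coefficient) / (cost coefficient).
wtp_per_unit = -beta_fish / beta_cost   # dollars per unit of the attribute
```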

(V)	Multi-Stage Sampling

Multi-stage sampling will not be necessary for this survey.

2(c)	Precision Requirements

(I)	Precision Targets

Table B1, below, shows the target sample sizes for both the U.S.
(excluding Alaska and Hawaii) and each of the four EPA study regions. 
At the regional level, the 2,000 completed surveys, allocated across the
four regions as shown in Table B1, will provide estimates of population
percentages with margins of error ranging from 3.6 to 5.8 percentage
points at the 95% confidence level.  A sample of 288 households for the
national survey version (completed surveys) will provide estimates of
population percentages with a margin of error no greater than 5.8
percentage points at the 95% confidence level. 
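The quoted margins of error follow from the standard formula for an estimated population percentage at 95% confidence, using the conservative p = 0.5:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for an estimated proportion from n completed surveys."""
    return z * math.sqrt(p * (1 - p) / n)

print(round(margin_of_error(288) * 100, 1))  # 5.8 (percentage points)
print(round(margin_of_error(732) * 100, 1))  # 3.6
```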

Table B1: Number of Households and Household Sample for Each EPA Study
Region

Region	Household Population	Household Sample 

Northeast	23,281,296	417

Southeast	31,378,122	562

Inland	40,852,983	732

Pacific	16,158,206	288

Total for Regional Survey Versions	111,670,607	2,000

National Survey Version	111,670,607	288

Source: The number of households in each region was obtained based on
the estimated population size and average household size from the
2006-2008 American Community Survey (ACS).



(II)	Non-Sampling Errors

One issue that may be encountered in stated preference surveys is the
problem of protest responses.  Protest responses are responses from
individuals who reject the survey format or question design, even though
they may value the resources being considered (Mitchell and Carson
1989).  For example, some respondents may feel that any amount of I&E is
unacceptable, and choose not to respond to the survey.  To deal with
this issue, EPA has included several questions, including an open-ended
comments section, to help identify protest responses.  The use of such
methods to identify protest responses is well-established in the
literature (Bateman et al. 2002).  Moreover, many researchers (e.g.,
Bateman et al. 2002) suggest that a choice experiment format, such as
that proposed here, may ameliorate such responses (over the earlier
contingent valuation format).

A different type of non-sampling error is non-response bias. 
Non-response rates in this survey are affected by non-response at each
stage of sampling.  There are two types of non-respondents to this
survey: respondents unwilling to complete a questionnaire when initially
contacted by phone and respondents who agree but do not complete the
questionnaire.  EPA will conduct several additional activities to
minimize the potential for non-response bias in the current survey: 

The telephone interviews will be conducted by trained interviewers who
are experienced in encouraging response.  Calls will be placed on
different days of the week and at different times to maximize response. 

The mail survey will be designed to maximize response.  A mail survey
will be sent to all those who agree to participate.  Then, one week
later, a postcard reminder will be sent.  Three weeks later, a second
copy of the questionnaire with a revised cover letter will be sent to
non-respondents.  As final follow-up, another postcard reminder will be
sent one week later.

Response rates will be tracked on a daily basis for both the telephone
screener and mail survey.  If any unexpected declines are encountered,
corrective action can immediately be undertaken.

EPA will undertake non-response bias analysis as detailed in the
following section. 

If necessary, EPA will use appropriate weighting or other statistical
adjustment to correct the bias because of non-response.

Non-response Interviews

To determine whether there is any evidence of significant non-response
bias in the completed sample, EPA will compare the characteristics of
the completed and non-completed cases from the random digit dialing
sample. There are two main stages of non-response: non-response to the
telephone survey and non-response to the mail survey.

To assess the potential for telephone non-response bias, EPA will
collect sample characteristics in the course of the telephone survey as
follows: (1) prior to household contact, (2) at respondent selection for
the survey, and (3) after beginning the interview.  EPA will then
analyze the data available for respondents and non-respondents prior to
household contact and at respondent selection. Because the telephone
instrument is very short, we do not expect many people prematurely
terminating the call after the selected household member has started the
interview.

The drawn sample of random digit dialing numbers from the landline frame
that are released to telephone interviewers for contact attempts can be
coded for certain contextual characteristics.  The “hundreds bank”
from which the RDD number is drawn (e.g., 301-608-38xx) is classified
geographically according to the unit in which the majority of listed
numbers in that bank are located.  This permits us to classify all of
the numbers drawn by EPA region, State, and urbanicity (central city,
metro remainder, and non-metropolitan area).  It also permits us to
classify those numbers by demographic characteristics, such as
percentage of African-American and Hispanic, based on Census data linked
to telephone exchanges.  EPA will use these variables to compare the
characteristics of all dialed residential numbers for which an interview
was not completed with those for which an interview was completed, to
determine whether non-response bias is present.
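One way to carry out such a comparison is a Pearson chi-square test of homogeneity on a coded frame characteristic. The sketch below uses hypothetical counts for the urbanicity classification; the actual analysis would use the coded RDD frame described above:

```python
# Illustrative completed vs. non-completed comparison on urbanicity
# (counts are hypothetical; 2 x 3 Pearson chi-square statistic).
counts = {
    "completed":     {"central_city": 120, "metro_remainder": 300, "non_metro": 180},
    "not_completed": {"central_city": 260, "metro_remainder": 520, "non_metro": 220},
}

rows = list(counts)
cols = list(counts["completed"])
row_tot = {r: sum(counts[r].values()) for r in rows}
col_tot = {c: sum(counts[r][c] for r in rows) for c in cols}
n = sum(row_tot.values())

chi2 = sum(
    (counts[r][c] - row_tot[r] * col_tot[c] / n) ** 2
    / (row_tot[r] * col_tot[c] / n)
    for r in rows for c in cols
)
# Compare chi2 to the chi-square critical value with (2-1)*(3-1) = 2 df
# (5.99 at the 5% level) to judge whether the two distributions differ.
```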

Once contact is achieved with a household, one adult is selected in each
household as the designated respondent.  If there is more than one
eligible respondent per household, then a random selection is done for
the individual with the most recent/next birthday.  EPA will identify
the age and gender of this individual if callbacks are necessary.  This
information permits additional comparisons of any differences in the age
and gender of respondents in the completed sample and those in the
non-completed sample after household contact and respondent selection
have been made. 

The telephone screener includes not only an invitation to participate in
the mail survey, but also attitudinal questions (on the relative
importance of various government activities), behavioral questions (on
recreational activities), and demographic questions (see Attachment 6
for detail). EPA will use the telephone screening data to
compare mail survey respondents and non-respondents. The items of
information collected during telephone screening will help determine the
type of person that is likely to not respond to the survey and may help
in forming weighting classes for adjusting weights of respondents to
account for non-response and minimize the bias because of non-response. 

2(d)	Questionnaire Design

The information requested by the survey is discussed in Section 4(b)(I)
of Part A of the supporting statement.  The full text of the draft
questionnaire for the Northeast region is provided in Attachment 1 and
the full text of the draft questionnaire for the national survey version
is provided in Attachment 2.  

The following bullets discuss EPA’s reasons for including the
questions in the survey:

Relative Importance of Issues Associated with Industrial Cooling Water. 
EPA included this section to prepare respondents to answer the stated
preference questions by motivating respondents to consider the relative
importance of key issues associated with the use of cooling water by
industrial facilities.

Concern for Policy Issues.  EPA included this section to prepare
respondents to answer the stated preference questions by motivating
respondents to think about the relative importance of different policy
issues.

Relative Importance of Effects.  This section was included to promote
understanding of the metrics included in the stated preference questions
by asking respondents to consider their relative importance prior to
evaluating policy options and by encouraging respondents to re-read
previous pages for reminders if necessary.

Voting for Regulations to Prevent Fish Losses in the Respondent’s
Region (or Nationally).   The questions in this section are the key part
of the survey.  Respondents’ choices when presented with specific
fish-related resource changes within their region and household cost
increases are the main data that allow estimation of willingness-to-pay.
 The questions are presented in a choice experiment (A, B, or neither)
format because this is an elicitation format that has been successfully
used by a number of previous valuation studies (Adamowicz et al. 1998;
Bateman et al. 2002; Bennett and Blamey 2001; Louviere et al. 2000;
Johnston et al. 2002a, 2005; Opaluch et al. 1993).  Furthermore, many
focus group participants indicated that they have some previous
experience making choices within a framework in which they are asked to
vote for one of a series of options, and are comfortable with this
format.

Reasons for Voting “No Policy”. This question provides information
that will be used by EPA to identify protest responses.

Respondent Certainty and Reasons for Voting.  This section is designed
to identify respondents who incorrectly interpreted the choice questions
or the uncertainty of outcomes.  Responses to these questions are
important to successfully control for hypothetical bias.   

Recreational Experience. This question elicits recreational experience
data to test if certain respondent characteristics influence responses
to the referendum questions.  This question will also allow EPA to
identify resource non-users, for purposes of estimating non-user WTP (to
gauge the relative importance of non-use values to overall benefits).

Demographics.  Responses to these questions will be used to estimate the
influence of demographic variables on respondents’ voting choices, and
ultimately, their WTP to prevent I&E mortality losses of fish.  This
information will allow EPA to use regression results to estimate WTP for
populations in different regions affected by the 316(b) rule for
existing facilities.

Comments.  This section is primarily intended to help identify protest
responses, i.e., responses from individuals who rejected the format of
the survey or the way the questions were phrased.

3.	Pretests and Pilot Tests

EPA conducted extensive pretests of the survey instrument during a set
of seven focus groups (EPA ICR # 2090-0028), in addition to the twelve
focus groups conducted for the Phase III survey.  These focus groups
included individual cognitive interviews with survey respondents
(Kaplowitz et al. 2004), and think-aloud or verbal protocol analyses
(Schkade and Payne 1994).  Individuals in these focus groups completed
draft survey questionnaires and provided comments and feedback about the
survey format and content, their interpretations of the questions, and
other issues relevant to stated preference estimation.  Particular
emphasis in these survey pretests was on testing for the presence of
potential biases associated with poorly-designed stated preference
surveys, including hypothetical bias, strategic bias, symbolic (warm
glow) bias, framing effects, embedding biases, methodological
misspecification, and protest responses (Mitchell and Carson 1989). 
Based on focus group and cognitive interview responses, EPA made various
improvements to the survey questionnaire including changes to ameliorate
and minimize these biases in the final survey instrument. 

4.	Collection Methods and Follow-up

4(a)	Collection Methods

The survey will be administered as a mail survey.  Respondents
identified by random digit dialing who agree to participate will be
mailed a copy of the survey questionnaire.  Respondents will be asked to
mail the completed survey back to EPA.  

4(b)	Survey Response and Follow-up

The target response rate for the mail survey is 40 percent; that is, 40
percent of households that agree to participate in the mail survey are
expected to return a completed survey.  The overall response rate
relative to the general population will be lower, however, because the
telephone screening process is expected to achieve a response rate of 60
percent among contacted households.  The resolution rate (determining
the status of a telephone number), the screener response rate and the
survey completion rate will contribute to the overall response rate in
the survey.  The actual number of telephone numbers selected for
inclusion will be much larger to account for the exclusion of
non-working and business numbers.  The number of telephone numbers
needed for selection in each region can be obtained by allocating the
total number in proportion to the number of households in that region. 
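This proportional allocation can be sketched as follows; the region names match the four study regions, but the household counts and the total sample size are purely illustrative assumptions, not EPA figures.

```python
# Hypothetical sketch: allocating a total telephone-number sample across
# regions in proportion to household counts. Counts are illustrative.

def allocate_sample(total, households):
    """Allocate `total` numbers proportionally to household counts,
    using largest-remainder rounding so the shares sum to `total`."""
    total_hh = sum(households.values())
    raw = {r: total * n / total_hh for r, n in households.items()}
    alloc = {r: int(v) for r, v in raw.items()}
    # Give leftover units to the largest fractional remainders.
    leftover = total - sum(alloc.values())
    for r in sorted(raw, key=lambda r: raw[r] - alloc[r], reverse=True)[:leftover]:
        alloc[r] += 1
    return alloc

households = {"Northeast": 21_000_000, "Southeast": 28_000_000,
              "Inland": 40_000_000, "Pacific": 18_000_000}
print(allocate_sample(10_000, households))
```

The largest-remainder step simply ensures the rounded regional allocations still sum to the intended total.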

To improve the response rate, all mail survey recipients will receive a
reminder postcard approximately one week after the initial questionnaire
mailing. Then, approximately three weeks after the reminder postcard,
all those who have not responded will receive a second copy of the
questionnaire with a revised cover letter.  The following week, another
reminder postcard will be sent.  

As noted in Section 2(b), the survey sample will be selected using a
stratified selection process.  For the selection of households, the
population of households in the contiguous 48 states and the District of
Columbia will be stratified by the geographic boundaries of four EPA
study regions.  In addition, EPA will administer a national version of
the survey that does not require stratification.  We will keep track of
the response rates for each of the regional surveys and the national version
of the survey to ensure that the rates are reasonable.  We will also
look at the frame characteristics of non-respondents to determine if
there are any substantial biases in the estimates because of an
imbalance in the distribution of certain important subgroups in the
sample.  

5.	Analyzing and Reporting Survey Results

5(a)	Data Preparation

Since the survey will be administered as a mail survey, survey responses
will be manually entered into an electronic database after they are
returned.  After all responses have been entered, the database contents
will be converted into a format suitable for use with a statistical
analysis software package.  The mail survey, database management, and
data set conversion will be conducted by Abt Associates Inc.

All survey responses will be vetted for completeness.  Additionally,
respondents’ answers to the choice experiment questions will be tested
to ensure that they are internally consistent with respect to scope and
other expectations of neoclassical preference theory, such as
transitivity.  Responses that satisfy transitivity exhibit consistent
orderings when separate choices among policy options are compared. 
For example, if values for Policy 1 are greater than those for Policy 2,
and values for Policy 2 are greater than those for Policy 3, then values
for Policy 1 should also be greater than values for Policy 3.
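As an illustration, a minimal sketch (with hypothetical policy labels) of how such a transitivity check might flag a cyclic response pattern:

```python
# Hypothetical sketch: flagging intransitive response patterns. Each tuple
# (a, b) records that a respondent preferred policy a over policy b in
# some choice question. Policy labels are illustrative.
from itertools import permutations

def is_transitive(preferences):
    """Return False if the revealed pairwise preferences contain a cycle
    such as 1 > 2, 2 > 3, but 3 > 1."""
    prefs = set(preferences)
    labels = {x for pair in prefs for x in pair}
    for a, b, c in permutations(labels, 3):
        if (a, b) in prefs and (b, c) in prefs and (c, a) in prefs:
            return False
    return True

print(is_transitive([(1, 2), (2, 3), (1, 3)]))  # consistent ordering -> True
print(is_transitive([(1, 2), (2, 3), (3, 1)]))  # preference cycle -> False
```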

5(b)	Analysis

	Once the survey data has been converted into a data file, it will be
analyzed using statistical analysis techniques.  The following section
discusses the model that will be used to analyze the stated preference
data from the survey.

Analysis of Stated Preference Data

	The model for analysis of stated preference data is grounded in the
standard random utility model of Hanemann (1984) and McConnell (1990). 
This model is applied extensively within stated preference research, and
allows well-defined welfare measures (i.e., willingness to pay) to be
derived from choice experiment models (Bennett and Blamey 2001; Louviere
et al. 2000).  Within the standard random utility model applied to
choice experiments, hypothetical policy alternatives are described in
terms of attributes that focus groups (Johnston et al. 1995; Adamowicz
et al. 1998; Opaluch et al. 1993) reveal as relevant to respondents’
utility, or well-being.  One of these attributes would include a
mandatory monetary cost to the respondent’s household.  

	Applying this standard model to choices among policies to reduce I&E
mortality losses, EPA defines a standard utility function Ui(.) that
includes environmental attributes of an I&E reduction plan and the net
cost of the plan to the respondent.  Following standard random utility
theory, utility is assumed known to the respondent, but stochastic from
the perspective of the researcher, such that

Ui = v(Xi, D, Y-Fi) + εi	(1)

where:

Xi	=	a vector of variables describing attributes of I&E reduction plan i;

D	=	a vector characterizing demographic and other attributes of the respondent;

Y	=	disposable income of the respondent;

Fi	=	mandatory additional cost faced by the household under plan i;

v(.)	=	a function representing the empirically estimable component of utility;

εi	=	the stochastic or unobservable component of utility, modeled as an econometric error.

Equation (1) gives the utility of each I&E reduction plan i, as well as of any alternative plan k≠i considered by the respondent.  In this case, the respondent’s
choice set of potential policies also includes maintaining the status
quo. The random utility model presumes that the respondent assesses the
utility that would result from each I&E reduction plan i (including the
status quo), and chooses the plan that would offer the highest utility. 


When faced with k distinct plans defined by their attributes, the
respondent will choose plan i if the anticipated utility from plan i
exceeds that of all other k-1 plans.  Drawing from (1), the respondent
will choose plan i if

(v(Xi, D, Y-Fi) + εi) ≥ (v(Xk, D, Y-Fk) + εk) ∀ k≠i.	(2)

	If the εi are assumed independently and identically drawn from a type
I extreme value  (Gumbel) distribution, the model may be estimated as a
conditional logit model, as detailed by Maddala (1983), Greene (2003)
and others.  This model is most commonly used when the respondent
considers more than two options in each choice set (e.g., Plan A, Plan
B, Neither Plan), and results in an econometric (empirical) estimate of
the systematic component of utility v(.), based on observed choices
among different policy plans.  Based on this estimate, one may calculate
welfare measures (willingness to pay) following the well-known methods
of Hanemann (1984), as described by Freeman (2003) and others. Following
standard choice experiment methods (Adamowicz et al. 1998; Bennett and
Blamey 2001), each respondent will consider questions including three
potential choice options (i.e., Plan A, Plan B, Neither Plan)—choosing
the option that provides the highest utility as noted above.  Following
clear guidance from the literature, a “neither plan” or status quo
option is always included in the visible choice set, to ensure that WTP
measures are well-defined (Louviere et al. 2000).
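A minimal sketch of this estimation approach, using simulated rather than actual survey data and illustrative attribute names and coefficient values (not EPA's estimation code), might look as follows; marginal WTP is recovered as the negative ratio of an attribute coefficient to the cost coefficient, per Hanemann (1984):

```python
# Illustrative sketch: fitting a conditional logit on simulated choice
# data and recovering marginal WTP as -beta_attribute / beta_cost.
# Attribute names and true parameter values are assumptions.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_obs, n_alt = 2000, 3                       # choice occasions, options per set
beta_true = np.array([0.04, -0.02])          # [fish saved, cost]

X = np.stack([rng.uniform(0, 100, (n_obs, n_alt)),   # fish saved per year
              rng.uniform(0, 200, (n_obs, n_alt))],  # household cost
             axis=2)
util = X @ beta_true + rng.gumbel(size=(n_obs, n_alt))  # type I EV errors
choice = util.argmax(axis=1)                 # respondent picks highest utility

def neg_loglik(beta):
    v = X @ beta                             # systematic utility v(.)
    v -= v.max(axis=1, keepdims=True)        # numerical stability
    logp = v - np.log(np.exp(v).sum(axis=1, keepdims=True))
    return -logp[np.arange(n_obs), choice].sum()

beta_hat = minimize(neg_loglik, np.zeros(2), method="BFGS").x
wtp_per_fish = -beta_hat[0] / beta_hat[1]    # should be close to 0.04/0.02 = 2
print(beta_hat, wtp_per_fish)
```

With a well-behaved likelihood such as this, the estimated ratio converges to the true marginal WTP as the sample grows.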

	EPA also anticipates that respondents will consider more than one
choice question within the same survey, to increase information obtained
from each respondent.  This is standard practice within choice
experiment and dichotomous choice contingent valuation surveys (Poe et
al. 1997; Layton 2000).  While respondents will be instructed to
consider each choice question as independent of other choice questions,
it is nonetheless standard practice within the literature to allow for
the potential of correlation among questions answered within a single
survey by a single respondent.  That is, responses provided by
individual respondents may be correlated even though responses across
different respondents are considered independent and identically
distributed (Poe et al. 1997; Layton 2000; Train 1998).  

	There are a variety of approaches to such potential correlation. 
Following standard practice, EPA anticipates the estimation of a variety
of models to assess their performance.  Models to be assessed include
random effects and random parameters (mixed) discrete choice models, now
common in the stated preference literature (Greene 2003; McFadden and
Train 2000; Poe et al. 1997; Layton 2000).  Within such models, selected
elements of the coefficient vector are assumed normally distributed
across respondents, often with free correlation allowed among parameters
(Greene 2002). If only the model intercept is assumed to include a
random component, then a random effects model results.  If both slope
and intercept parameters may vary across respondents, then a random
parameters model is estimated.  EPA anticipates that such models will be
estimated using standard maximum likelihood for mixed conditional logit
techniques, as described by Train (1998), Greene (2002) and others. 
The performance of alternative mixed logit specifications will be
assessed by EPA using standard statistical measures of model fit and
convergence, as detailed by Greene (2002, 2003) and Train (1998).
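The mixed logit idea can be illustrated with a small simulation (all attribute values and parameters below are assumptions, not estimates): choice probabilities are averaged over random draws of the coefficient that varies across respondents.

```python
# Hypothetical sketch: simulated mixed logit choice probabilities, with a
# normally distributed coefficient on one attribute across respondents
# (cf. McFadden and Train 2000). All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[10.0, 20.0],      # Plan A: fish saved, household cost
              [30.0, 60.0],      # Plan B
              [0.0,  0.0]])      # Neither Plan (status quo)
mu, sigma = 0.05, 0.02           # mean, sd of the random fish coefficient
beta_cost = -0.03                # fixed cost coefficient

R = 10_000                       # simulation draws
beta_fish = rng.normal(mu, sigma, R)
v = beta_fish[:, None] * X[:, 0] + beta_cost * X[:, 1]   # R x 3 utilities
p = np.exp(v) / np.exp(v).sum(axis=1, keepdims=True)     # logit prob per draw
probs = p.mean(axis=0)           # mixed logit probabilities (sum to 1)
print(probs)
```

In estimation, this same simulated-probability construction enters the likelihood, which is then maximized over mu, sigma, and the fixed coefficients.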

Advantages of Choice Experiments

	Choice experiments following the random utility model outlined above
are favored by many researchers over other variants of stated preference
methodology (Adamowicz et al. 1998; Bennett and Blamey 2001), and may be
viewed as a “natural generalization of a binary discrete choice CV
[contingent valuation]” (Bateman et al. 2002, p. 271).  Advantages of
choice experiments include a capacity to address choices over a wide
array of potential policies, grounded in well-developed random utility
theory, and the similarity of the discrete choice context to familiar
referendum or voting formats (Bennett and Blamey 2001).  Compared to
other types of stated preference valuation, choice experiments are
better able to measure the marginal value of changes in the
characteristics or attributes of environmental goods, and avoid response
difficulties and biases (Bateman et al. 2002).  For example, choice
experiments may reduce the potential for ‘yea-saying’ and symbolic
biases (Blamey et al. 1999; Mitchell and Carson 1989), as many pairs of
multi-attribute policy choices (e.g., Plan A, Plan B, Neither) will
offer no clearly superior choice for a respondent wishing to express
solely symbolic environmental motivations.  For similar reasons choice
experiments may ameliorate protest responses (Bateman et al. 2002).  An
additional advantage of such methods is that they permit straightforward
assessments of the impact of resource scope and scale on respondents’
choices.  This will enable EPA to easily conduct scope tests and other
assessments of the validity of survey responses (Bateman et al. 2002, p.
296-342).  Finally, such methods are well-established in the stated
preference literature (Bennett and Blamey 2001).  Additional details of
choice experiment methodology (also called choice modeling) are provided
by Bennett and Blamey (2001), Adamowicz et al. (1998), Louviere et al.
(2000) and many other sources in the literature.  

An additional advantage of choice experiments in the present application
is that they are commonly applied to assess WTP for ecological resource
improvements of a type quite similar to those at issue in the 316(b)
policy case.  Examples of the application of choice experiments to
estimate WTP associated with changes in aquatic life and habitat include
Hoehn et al. (2004), Johnston et al. (2002b), and Opaluch et al. (1999),
among others.  EPA has drawn upon these and other examples of successful
choice experiment design to provide a basis for survey design in the
present case.

A final key advantage of choice experiments in the present application
is the ability to estimate respondents’ WTP for a wide range of
different potential outcomes of 316(b) policies, differentiated by their
attributes.  The proposed choice experiment survey versions will allow
different respondents to choose among a wide variety of hypothetical
policy options, some with larger and others with very small changes in
the presented attributes (annual fish losses, long-term fish
populations, recreational and commercial catch, ecosystem condition, and
household cost).  That is, because the survey is to be implemented as a
choice experiment survey, levels of attributes in choice scenarios will
vary across respondents (Louviere et al. 2000).  The experimental design
will also explicitly allow for variation in baseline population and
harvest levels, following standard practice in the literature (Louviere
et al. 2000; Bateman et al. 2002).  

Aside from providing the capacity to estimate WTP for a wide range of
policy outcomes, this approach also frees EPA from having to predetermine a single
policy outcome for which WTP will be estimated.  Given the potential
biological uncertainty involved in the 316(b) policy case, the ability
to estimate values for a wide range of potential outcomes is critical.

The ability to estimate WTP for a wide range of different policy
outcomes is a fundamental property of the choice experiment method
(Bateman et al. 2002; Louviere et al. 2000; Adamowicz et al. 1998).  For
the purpose of stated preference survey implementation, EPA will use
four geographic regions: Northeast, Southeast, Inland, and Pacific.  The
Northeast regional survey is included in this ICR as Attachment 1.  In
addition, EPA will administer a national version of the survey that is
included as Attachment 2.  EPA emphasizes that the survey versions
included in this ICR are for illustration only; they are but two of what
will ultimately be a large number of different survey versions covering
a wide range of potential policy outcomes as described in Attachment 4.
The experimental design (see below) will allow for survey versions
showing a range of different baseline and resource improvement levels,
where these levels are chosen to (almost certainly) bound the
“actual” levels.  Given that there will almost certainly be some
biological uncertainty regarding the specifics of the “actual”
baselines and improvements, the resulting valuation estimates will allow
flexibility in estimating WTP for a wide range of different
circumstances.  Additional details on the statistical (experimental)
design of the choice experiment are provided in later sections of this
ICR.

Comment on Survey Preparation and Pretesting

	Following standard practice in the stated preference literature
(Johnston et al. 1995; Desvousges and Smith 1988; Desvousges et al.
1984; Mitchell and Carson 1989), all survey elements and methods were
subjected to extensive development and pretesting in focus groups to
ameliorate the potential for survey biases (cf. Mitchell and Carson
1989), and to ensure that respondents have a clear understanding of the
policies and goods under consideration, such that informed choices may
be made that reflect respondents’ underlying preferences.  Following
the guidance of Arrow et al. (1993), Johnston et al. (1995), and
Mitchell and Carson (1989), focus groups were used to ensure that
respondents are aware of their budget constraints, the scope of the
resource changes under consideration, and the availability of substitute
environmental resources.  

As noted above, survey pretests included individual cognitive interviews
with survey respondents (Kaplowicz et al. 2004), and think-aloud or
verbal protocol analyses (Schkade and Payne 1994).  Individuals in these
pretests completed draft survey questionnaires and provided comments and
feedback about the survey format and content, their interpretations of
the questions, and other issues relevant to stated preference
estimation.  Based on their responses, EPA made improvements to the
survey questionnaire.  Of particular emphasis in these survey pretests
was testing for the presence of potential biases including hypothetical
bias, strategic bias, symbolic (warm glow) bias, framing effects,
embedding biases, methodological misspecification, and protest responses
(Mitchell and Carson 1989).  Based on focus group and cognitive
interview responses, EPA made various improvements to the survey
questionnaire including changes to ameliorate and minimize these biases
in the final survey instrument. Results from focus groups and cognitive
interviews provided evidence that respondents answer the stated
preference survey in ways appropriate for stated preference WTP
estimation, and that their responses generally do not reflect the biases
noted above.  

The number of focus groups used in survey design, seven (excluding the
12 focus groups conducted for the Phase III survey), exceeds the number
of focus groups used in typical applications of stated preference
valuation.  Moreover, EPA incorporated cognitive interviews as detailed
by Kaplowicz et al. (2004).  We note that the current survey instrument
is built upon an earlier version that was peer reviewed in January 2006
(Versar 2006) and it incorporates recommendations received from that
peer review panel.  Given this extensive effort in survey
design—applying state-of-the-art methods from the
literature—EPA believes that the survey design far exceeds standards that
are typical in the published literature.  The details of focus groups
conducted for the previous Phase III survey are discussed by EPA in a
prior ICR (#2155.01).

Econometric Specification

	Based on prior focus groups, expert review, and attributes of the
policies under consideration, EPA anticipates that four attributes will
be incorporated in the vector of variables describing attributes of an
I&E reduction plan (vector Xi), in addition to the attribute
characterizing unavoidable household cost Fi.   These attributes will
characterize the annual reduction in I&E losses (x1), anticipated
effects on fish populations (all fish) (x2), anticipated effects on
commercial fish populations (x3), and anticipated effects on aquatic
ecosystem condition (x4).  These variables will allow respondents’
choices to reveal the potential impact of both annual fish losses and
long-term population effects on utility.  Based on results of focus
groups and expert opinion, these will be presented as averages across
identified aggregate species groups.  The survey will also allow for
changes in baseline population levels, to assess whether WTP depends on
the “starting point” of fish populations.

	Although the literature offers no firm guidance regarding the choice of
specific functional forms for v(.) within choice experiment estimation,
in practice, linear forms are often used (Johnston et al. 2003b), with
some researchers applying more flexible (e.g., quadratic) forms
(Cummings et al. 1994).  Standard linear forms are anticipated as the
simplest form to be estimated by EPA, from which more flexible
functional forms (able to capture interactions among model variables)
will be derived and compared.  Anticipated extensions to the simple
linear model include more fully-flexible forms that allow for systematic
variations in slope and intercept coefficients associated with
demographic or other attributes of respondents.  Such variations may be
incorporated by appending the simple linear specification with quadratic
interactions between variables in vector D and the variables Xi and Fi
(cf. Johnston et al. 2003b).  

	One may also incorporate quadratic interactions between policy
attributes Xi and Fi, (cf. Johnston et al. 2002b).  Such quadratic
extensions of the basic linear model allow for additional flexibility in
modeling the relationship between policy attributes (including cost) and
utility, as suggested by Hoehn (1991) and Cummings et al. (1994).  EPA
anticipates estimating both simple linear specifications, as well as
more fully-flexible quadratic specifications following Hoehn (1991) and
Cummings et al. (1994), to identify those models which provide the most
satisfactory statistical fit to the data and correspondence to theory. 
EPA anticipates estimating all models within the mixed logit framework
outlined above.  Model fit will be assessed following standard practice
in the literature (e.g., Greene 2003; Maddala 1983).  Linear and
quadratic functional forms discussed here, as they are common practice
in the literature, are presented and discussed in many existing sources
(e.g., Hoehn 1991, Cummings et al. 1994, Johnston et al. 1999, and
Johnston et al. 2003b).

	For example, for each choice occasion, the respondent may choose Option
A, Option B, or Neither, where “neither” is characterized by 0
values for all attributes (except Baseline population levels). Assuming
that the model is estimated using a standard approximation for the
observable component of utility, an econometric specification of the
desired model (within the overall multinomial logit model) might appear
as:

v(·) = β0 + β1(Fish Saved) + β2(Change in Populations of All Fish) +
β3(Change in Commercial Fish Populations) + β4(Change in Condition of
Aquatic Ecosystem) + β5(Cost) + β6(Fish Saved)(Baseline) + β7(Change in
Populations of All Fish)(Baseline) + β8(Change in Commercial Fish
Populations)(Baseline) + β9(Change in Aquatic Ecosystem)(Baseline) +
β10(Cost)(Baseline) + β11(Fish Saved)(Change in Populations of All Fish)
+ β12(Fish Saved)(Change in Commercial Fish Populations) + β13(Fish
Saved)(Change in Aquatic Ecosystem) + β14(Change in Populations of All
Fish)(Change in Commercial Fish Populations) + β15(Change in Populations
of All Fish)(Change in Aquatic Ecosystem) + β16(Change in Commercial
Fish Populations)(Change in Aquatic Ecosystem)

Main effects are the terms with coefficients β1 through β5; all remaining terms are interactions.  This sample
specification—one of many to be estimated by EPA—allows one to
estimate the relative “main effects” of policy attributes (annual
reduction in I&E losses, long-term effects on populations of all fish,
long-term effects on commercial fish populations) on utility, effects on
aquatic ecosystem condition, as well as interactions between these main
effects.  This specification also allows EPA to assess the impact of
baseline fish populations on the marginal value of changes in other
model attributes.  In sum, specifications such as this allow WTP to be
estimated for a wide-range of potential policy outcomes, and allow EPA
to test for a wide-range of main effects and interactions within the
utility function of respondents.  Such flexible utility specifications
for stated preference estimation are recommended by numerous sources in
the literature, including Johnston et al. (2002b), Hoehn (1991), and
Cummings et al. (1994), and follow standard practice in choice modeling
outlined by Louviere et al. (2000) and others.
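A sketch of how such a design matrix with baseline and two-way interaction terms might be constructed (column names are shorthand for the survey attributes, and the data values are fabricated):

```python
# Hypothetical sketch of building the main-effect, attribute-by-baseline,
# and two-way interaction columns for two choice options. Values and
# column names are illustrative, not survey data.
import pandas as pd
from itertools import combinations

opts = pd.DataFrame({"fish_saved": [10, 25], "pop_all": [5, 15],
                     "pop_comm": [2, 8], "ecosystem": [1, 2],
                     "cost": [20, 60], "baseline": [100, 100]})

main = ["fish_saved", "pop_all", "pop_comm", "ecosystem", "cost"]
X = opts[main].copy()
for col in main:                              # attribute x baseline terms
    X[f"{col}_x_baseline"] = opts[col] * opts["baseline"]
for a, b in combinations(main[:-1], 2):       # two-way attribute interactions
    X[f"{a}_x_{b}"] = opts[a] * opts[b]
print(X.columns.tolist())                     # 5 main + 5 baseline + 6 pairwise
```

This yields sixteen coefficient columns (before the intercept), mirroring the count of terms in the specification above.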

Experimental Design

	Experimental design for the choice experiment surveys will follow
established practices.  Fractional factorial design will be used to
construct choice questions with an orthogonal array of attribute levels,
with questions randomly divided among distinct survey versions (Louviere
et al. 2000).  Based on standard choice experiment experimental design
procedures (Louviere et al. 2000), the number of questions and survey
versions will be determined by, among other factors: a) the number of
attributes in the final experimental design and complexity of questions,
b) the extent to which estimation of interactions and higher-level
effects is desired, and c) pretests revealing the number of choice
experiment questions that respondents are willing/able to answer in a
single survey session, and the number of attributes that may be varied
within each question while maintaining respondents’ ability to make
appropriate neoclassical tradeoffs.  

Based on the models proposed above and recommendations in the
literature, EPA anticipates an experimental design that allows for an
ability to estimate main effects, quadratic effects, and two-way
interactions between policy attributes (Louviere et al. 2000).  Choice
sets (Bennett and Blamey 2001), including variable level selection, will
be designed by EPA based on the goal of illustrating realistic policy
scenarios that “span the range over which we expect respondents to
have preferences, and/or are practically achievable” (Bateman et al.
2002, p. 259), following guidance in the literature.  This includes
guidance with regard to the statistical implications of choice set
design (Hanemann and Kanninen 1999) and the role of focus groups in
developing appropriate choice sets (Bennett and Blamey 2001).

Based on these guiding principles, the following experimental design
framework is proposed by EPA.  The experimental design will be conducted
by Abt Associates Inc.  The experimental design will allow for both main
effects and selected interactions to be efficiently estimated, based on
a choice experiment framework.  For a more detailed discussion of the
experimental design, refer to Attachment 4.

Each treatment (survey question) includes two choice Options (A and B),
characterized by four attributes and a cost variable that vary across
the two choice options (Commercial Fish Populations, Fish Populations
(all fish), Fish Saved per Year, Condition of Aquatic Ecosystems, and
Increase in Cost of Living of Your Household).  Hence, each treatment is
defined by ten attribute values (five for each of the two options).  Based on focus groups and
pretests, and guided by realistic ranges of attribute outcomes, EPA
allows for three different potential levels for Commercial Fish
Populations, Fish Populations (all fish), Fish Saved per Year, and
Condition of Aquatic Ecosystems, and allows for six different levels of
annual Household Cost for the regional or national choice questions. 
The survey versions also include two separate treatments of location, or
geographic scope; the first version refers to policies only within the
respondent’s region and the second version refers to national
policies. 

The number of combinations for each attribute may be summarized as
follows:

LocationA, LocationB								(2 levels)

Commercial Fish PopulationsA, Commercial Fish PopulationsB		(3 levels)

Fish Populations (all fish)A, Fish Populations (all fish)B			(3 levels)

Fish Saved per YearA, Fish Saved per YearB					(3 levels)

Condition of Aquatic EcosystemsA, Condition of Aquatic EcosystemsB	(3
levels)

CostA, CostB									(6 levels)

Beyond the levels specified above, each question will include a “no
policy” option, characterized by baseline levels for each attribute
including a household cost of $0.

Following standard practice, EPA constrained the design somewhat in
response to findings in seven focus groups and the prior literature. 
For example, the focus groups showed that respondents react negatively
and often protest when offered choices in which one option dominates the
other in all attributes.  Given that such choices provide negligible
statistical information compared to choices involving
non-dominant/dominated pairs, they are typically avoided in choice
experiment statistical designs. For example, Hensher and Barnard (1990)
recommend eliminating choice sets that include dominating or dominated
profiles, because such profiles generally provide no useful information.
 Following this guidance, EPA constrained the design to eliminate such
dominated/dominating pairs.  EPA also constrained the design to eliminate
the possibility of pairs in which, when looking across two options, one
of the options offers both a greater reduction in fish losses and a
smaller increase in the population.  The elimination of such nonsensical
(or non-credible) pairs is common practice, and is done to avoid protest
bids and confusion among respondents (Bateman et al. 2002).
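The enumeration and dominance screening described above can be sketched as follows; the level codes (0 = lowest) and the dominance rule (higher environmental levels better, lower cost better) are illustrative assumptions, and the final fractional design would select only a small subset of the surviving pairs.

```python
# Illustrative sketch: enumerating candidate A-vs-B option pairs from
# four 3-level attributes plus a 6-level cost, then dropping pairs where
# one option dominates the other. Level codes are placeholders.
from itertools import product

attr_levels = [range(3)] * 4 + [range(6)]     # four 3-level attributes + cost
options = list(product(*attr_levels))         # full factorial of profiles

def dominates(a, b):
    """a dominates b if a is at least as good on every attribute (higher
    environmental levels, lower cost) and the profiles differ."""
    at_least_as_good = [a[i] >= b[i] for i in range(4)] + [a[4] <= b[4]]
    return all(at_least_as_good) and a != b

pairs = [(a, b) for a, b in product(options, repeat=2)
         if a != b and not dominates(a, b) and not dominates(b, a)]
print(len(options), len(pairs))
```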

The resulting experimental design is characterized by 72 unique A vs. B
option pairs, where attribute levels for option A and B differ across
each of the pairs.  Each pair represents a unique choice modeling
question—with a unique set of attribute levels distinguishing options
A and B.  Following standard practice for mail surveys, these questions
will be randomly assigned to survey respondents, with each respondent
considering three questions. 

Information Provision

	According to Arrow et al. (1993, p. 4605), if “surveys are to elicit
useful information about willingness to pay, respondents must understand
exactly what it is they are being asked to value.”  It is also well
known that the provided information can influence WTP estimates derived
from stated preference survey instruments and that respondents must be
provided with sufficient information to make an informed assessment of
policy impacts on utility (e.g., Bergstrom and Stoll 1989; Bergstrom et
al. 1989; Hoehn and Randall 2002).  As stated clearly by Bateman et al.
(2002, p. 122), “[d]escribing the good and the policy context of
interest may require a combination of textual information, photographs,
drawings, maps, charts and graphs. …[V]isual aids are helpful ways of
conveying complex information…while simultaneously enhancing
respondents’ attention and interest.”  Given that many respondents
may not be fully familiar with the details of programs to reduce I&E
mortality losses and potential impacts on aquatic life, the survey will
include introductory figures to aid respondents’ comprehension of the
goods and policies addressed by the survey instrument, and to encourage
appropriate neoclassical tradeoffs in responding to choice experiment
questions. 

Following this guidance of Bateman et al. (2002) and prior examples of
Opaluch et al. (1993) and Johnston et al. (2002a), among others, EPA
extensively pretested all graphics used in the draft mail survey, to
ensure that these graphical elements were not prejudicial, and that they
did not bias responses.  Graphics judged to be prejudicial or confusing
to respondents during the seven focus groups and cognitive interviews
were revised or replaced. EPA acknowledges that certain types of
graphics can be prejudicial in certain contexts—and hence all
graphical elements were pretested extensively.  EPA found that focus
group respondents endorsed the use of graphics in the survey booklet and
indicated that they helped them to visualize how fish are entrained and
impinged, technological solutions, facilities locations, and ecosystem
effects.  Participants made such statements as, “Yeah, I’d rather
have them” and “I like on page 2 the graph and illustration because
the adage a picture is worth a thousand words”.  EPA also emphasizes
that there is no precedent or support in the literature for the total
elimination of graphics in survey instruments.  To the contrary, the
literature explicitly indicates that pictures and graphics may be
necessary and useful components of survey instruments in many cases
(Bateman et al. 2002).  EPA highlights that numerous peer-reviewed
surveys described in the literature include pictures and graphics both
in survey instruments and in introductory materials such as slide shows.
 For example, see Horne et al. (2005), Ready et al. (1995), Powe and
Bateman (2004), Duke and Ilvento (2004), Opaluch et al. (1993), Johnston
et al. (1999, 2002a, 2002b), and Mazzotta et al. (2002).  Bateman et al.
(2002) also includes examples of various types of survey materials
including pictures and graphical elements.  

Amelioration of Hypothetical Bias

EPA considers the amelioration of hypothetical bias to be a paramount
concern in survey design.  However, the agency acknowledges—based on
prior evidence from the literature—that hypothetical bias is not
unavoidable.  For example, not all research finds evidence of
hypothetical bias in stated preference valuation (Champ and Bishop 2001;
Smith and Mansfield 1998; Vossler and Kerkvliet 2003; Johannesson 1997),
and some research shows that hypothetical bias may be ameliorated using
cheap-talk, certainty adjustments, or other mechanisms (Champ et al.
1997; Champ et al. 2004; Cummings and Taylor 1999; Loomis et al. 1996). 


To obtain reliable estimates of WTP, the Agency tested and designed all
survey elements to promote incentive compatible preference elicitation
mechanisms. Incentive compatible stated preference surveys provide no
incentive for non-truthful preference revelation (Carson and Groves
2007). The literature is clear regarding the importance of incentive
compatibility in stated preference value elicitation and the role of
both question format and scenario consequentiality in ensuring this
property (Carson et al. 2000; Carson and Groves 2007; Collins and
Vossler 2009; Herriges et al. 2010; Johnston 2006; Vossler and Evans
2009).  It has been established that referendum-type stated preference
choices are incentive compatible given that certain conditions are met,
including the condition that responses are believed by respondents to be
consequential, i.e., potentially influencing public policy decisions
(Carson and Groves 2007; Herriges et al. 2010). 

The survey is explicitly designed to emphasize the importance of the
budget constraint and program cost.  For example, the survey asks
respondents to compare protecting aquatic ecosystems to other policy
issues for which the government could potentially ask households to
pay.  The survey itself includes explicit reminders of program cost
and the budget constraint. 

The survey has also been explicitly designed to maximize the
consequentiality of choice experiment questions, thereby maximizing
incentive compatibility (i.e., reducing strategic and hypothetical
biases), following clear guidance of Carson et al. (2000).  Elements
specifically designed to maximize consequentiality include: a)
explicitly mentioning that this survey is associated with assessment of
proposed policies that are being considered, b) numerous details
provided in the survey concerning specifics of the proposed policies,
and c) emphasis that the type of policy enacted will depend in part on
survey results and that respondents’ votes are important.  Johnston and Joglekar
(2005) show the capacity of such information to eliminate hypothetical
bias in choice-based stated preference WTP estimation.  

Focus groups and cognitive interviews indicated that respondents viewed
choices as consequential, that they considered their budget constraints
when responding to all questions, and that they would answer the same
way were similar questions asked in a binding referendum.  When asked if
they thought about the program cost in the same way as “money coming
out of their pocket,” the vast majority of focus group and interview
respondents indicated that they treated program costs the same way that
they would have if there were actual money consequences.  For example,
respondents made statements such as “No. [My vote] would have been the
same actually” and “If I believed that it was gonna affect
regulations, I think I would have voted the exact same way.”

EPA does not anticipate significant hypothetical bias in the proposed
survey based on focus group results. Focus group respondents took the
survey questions seriously and indicated that they thought that their
choices would actually influence policy.  Regarding the potential use of
cheap talk mechanisms or other devices to further address the potential
for hypothetical bias, the Agency emphasizes that the literature is
mixed as to their performance.  For example, the seminal work by
Cummings and Taylor (1999) shows that cheap talk is able to reduce
hypothetical biases.  Similar results are shown by Aadland and Caplan
(2003).  However, other authors (e.g., Cummings et al. 1995; List 2001;
Brown et al. 2003) find that a cheap talk script is only effective under
certain circumstances, and for certain types of respondents.  For
example, Cummings et al. (1995) find that a relatively short cheap talk
script actually worsens hypothetical bias, while a longer script appears
to ameliorate bias.  Brown et al. (2003) finds cheap talk only effective
at higher bid amounts—a result mirrored by Murphy et al. (2004). 
Still other authors find no effect of cheap talk, including Poe et al.
(2002).  Given the clearly mixed experiences with such mechanisms, EPA
is not convinced that cheap talk scripts are likely to provide a panacea
for hypothetical bias in the present case, although they appear to
reduce bias in a limited set of circumstances; accordingly, cheap talk
is not included in the survey.

Amelioration of Symbolic Biases and Warm-Glow Effects

Following clear guidance of Arrow et al. (1993) and others, EPA has
taken repeated steps to ensure that survey responses reflect the value
of the affected fish resources only, and do not reflect symbolic or warm
glow concerns (Mitchell and Carson 1989).  Following explicit guidance
of the NOAA Blue Ribbon Panel on Contingent Valuation (Arrow et al.
1993, p. 4609), EPA has explicitly designed all elements of the survey
to “deflect the general ‘warm glow’ of giving or the dislike of
‘big business’ away from the specific program that is being
valued.”  This was done in a variety of ways, based on prior examples
in the literature, such as asking respondents to reflect on importance
of attributes before making selections, and using a payment vehicle that
doesn’t raise trust issues (cost of living increase rather than an
electric bill increase). The focus group and cognitive interview results
indicated that most participants answered the choice questions based on
the effects discussed in the survey, not on a desire to help the
environment in general.

The survey includes clear language instructing respondents to consider
only the specific attributes in the survey, and not to base answers on
broader environmental concerns, including the statement that:

 “Scientists expect that effects on the environment and economy not
shown explicitly will be small.  For example, studies of industry suggest
that effects on employment will be close to zero.”

This is also consistent with the statement from Arrow et al. (1993) that
a referendum-type format may limit the warm-glow effect.  Some focus
group participants indicated that they were inclined to support
environmental causes and would like “to do a good thing” but still
considered the cost and effects under the policy options. For example,
respondents stated, “[…] if we can do something to help as long as
the price is right, then do it” and “I feel if it’s going to be
benefit everyone and be better for the economy, I’m OK with paying a
little bit more.” 

This evidence notwithstanding, EPA believes that it is important to
include follow-up questions to ensure that responses do not reflect
symbolic biases.  Question 9 in the survey instrument—which addresses
the rationale for choice responses given earlier in the draft
survey—explicitly tests for the presence of symbolic or warm-glow
biases.  Follow-up questions such as these are common in stated
preference survey instruments, to assess the underlying reasons for the
observed valuation responses (e.g., Mitchell and Carson 1984).

Assessing Scope Sensitivity

The NOAA Blue Ribbon Panel on Contingent Valuation (CV) (Arrow et al.
1993, p. 4605) states clearly that if “surveys are to elicit useful
information about willingness to pay, respondents must understand
exactly what it is they are being asked to value (or vote upon)…”
They further indicate that surveys providing “sketchy details” about
the results of proposed policies call “into question the estimates
derived therefrom,” and hence suggest a high degree of detail and
richness in the descriptions of scenarios.  Similar guidance is provided
by other key sources in the CV literature (e.g., Mitchell and Carson
1989; Louviere et al. 2000).  Among the reasons for this guidance is
that such descriptions tend to encourage appropriate framing and
sensitivity to scope.

Following Arrow et al. (1993), Mitchell and Carson (1989), and others,
while noting the clear limitations in scope tests discussed by Heberlein
et al. (2005), EPA believes that it is important that survey responses
in this case show sensitivity to scope.  This is one of the primary
reasons for the use of the choice experiment methodology, which is
better able to capture WTP differentials related to changes in resource
scope (Bateman et al. 2002).  Unlike open-ended questions, in which
scope insensitivity is a primary concern, EPA emphasizes that choice
experiments generally have shown much less difficulty with respondents
reacting appropriately to the scope and scale of resource changes. 
Moreover, as clearly noted by Bennett and Blamey (2001, p. 231),
“internal scope tests are automatically available from the results of
a [choice modeling] exercise.”   That is, within choice experiments,
sensitivity to scope is indicated by the statistical significance and
sign of parameter estimates associated with program attributes (Bennett
and Blamey 2001).  Internal scope sensitivity will therefore be assessed
through model results for the variables Commercial Fish Populations,
Fish Populations (all fish), Fish Saved per Year, and Condition of
Aquatic Ecosystems.  Statistical significance of these variables—along
with a positive sign—indicates that respondents, on average, are more
likely to choose plans with larger quantities of these variables.  
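The internal scope test described above can be illustrated with a minimal sketch. For a binary choice between two options, a conditional logit is equivalent to a logit on the (Option A minus Option B) attribute differences; internal scope sensitivity then corresponds to a positive estimated coefficient on the resource attribute and a negative coefficient on cost. The data below are simulated, and the attribute names and parameter values are hypothetical illustrations, not survey results.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated binary choices between Option A and Option B.  Each row holds
# the (A minus B) difference in one resource attribute and in cost.
n = 2000
x_fish = rng.integers(0, 3, n) - rng.integers(0, 3, n)  # e.g., Fish Saved level difference
x_cost = rng.uniform(-50.0, 50.0, n)                    # cost-of-living difference
X = np.column_stack([x_fish, x_cost])

# Hypothetical true preferences: positive weight on fish saved, negative on cost.
u = 0.8 * x_fish - 0.05 * x_cost
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-u))).astype(float)

# Fit the logit by Newton-Raphson (maximum likelihood).
beta = np.zeros(2)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    grad = X.T @ (y - p)                        # score vector
    H = X.T @ (X * (p * (1.0 - p))[:, None])    # observed information
    beta += np.linalg.solve(H, grad)

b_fish, b_cost = beta
print(f"beta_fish = {b_fish:.3f}, beta_cost = {b_cost:.3f}")
# Internal scope sensitivity: beta_fish > 0 and beta_cost < 0, with an
# implied marginal WTP of -beta_fish / beta_cost per attribute level.
print(f"implied WTP per level = {-b_fish / b_cost:.2f}")
```

The same logic carries over to the full choice model estimated from the survey data: the sign and statistical significance of each attribute coefficient constitute the internal scope test, and the ratio of an attribute coefficient to the cost coefficient gives the implied marginal WTP.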

In addition to internal scope tests implicit in all choice experiment
statistical analysis, EPA will also conduct external scope tests (cf.
Giraud et al. 1999).  The primary difference between internal and
external tests is that the former assesses sensitivity to scope across
choices of a single respondent, while the latter involves split-sample
assessments across different respondents.  Within a choice modeling
context, external scope tests are generally considered “stronger,”
although also more likely to be confounded by differences in the implied
choice frame (Bennett and Blamey 2001).  A variety of options for
external scope tests exist, depending on the structure of the stated
choice questions under consideration.

In the present case, attribute-by-attribute external scope tests will be
conducted over a split sub-sample of respondents considering a specific
set of choices, with all attributes held constant across the considered
choices except the scope of the attribute for which the test is to be
conducted.  For example, to conduct an external scope test for
reductions in annual fish losses, one would present a set of choices
that is identical for two respondent groups, except that one group
considers a choice with a greater
reduction in fish losses.  Assessing the choices over this split sample
allows for an external test of scope.  To illustrate this test, consider
the following stylized choice between Option A and Option B.  The
generic labels “Level 0”, “Level 1”, and “Level 2” are used
to denote attribute levels, where for all attributes Level 2 > Level 1 >
Level 0.


Table B2:  Illustration of an External Scope Test

Variable                                        Option A                        Option B

Fish Saved per Year                             Sample 1: Fish Saved Level 1    Fish Saved Level 0
                                                Sample 2: Fish Saved Level 2

Commercial Fish Populations                     Level 0                         Level 0

Fish Populations (all fish)                     Level 0                         Level 0

Condition of Aquatic Ecosystems                 Level 0                         Level 0

Increase in Cost of Living for Your Household   Cost Level 1                    Cost Level 0



In the above example, only Fish Saved per Year and Cost vary across the
choice options.  Because both Fish Saved per Year (at Level 1 and Level
2) and Cost are higher in Option A than in Option B, neither option is
dominant.  In the illustrated split-sample test, respondent sample 1
views the choice with Fish Saved per Year at Level 1, while respondent
sample 2 views an otherwise identical choice with Fish Saved per Year at
Level 2, where Level 2 > Level 1.  If responses are externally sensitive
to scope in Fish Saved per Year, this will manifest in a greater
proportion of sample 2 respondents choosing Option A than sample 1
respondents.  This hypothesis may be easily assessed using a test of
equal proportions across the two sub-samples, and provides a simple
attribute-by-attribute test of external scope.  Analogous tests may be
conducted for all attributes within the choice experiment design, using
parallel methods. EPA emphasizes that the formal applicability of the
above-noted scope test is contingent upon the specific choice frame
implied by levels of other attributes in the choice question.  This is a
characteristic of nearly all external scope tests applied in choice
experiment frameworks (Bennett and Blamey 2001).
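The test of equal proportions mentioned above can be sketched using only the standard library. A pooled two-sample z-test compares the share of Option A votes across the two sub-samples; the respondent counts below are hypothetical numbers chosen purely for illustration.

```python
import math

def equal_proportions_test(k1, n1, k2, n2):
    """Two-sample z-test of H0: p1 == p2, using the pooled proportion.

    k1, k2: respondents choosing Option A in samples 1 and 2;
    n1, n2: sub-sample sizes.  Returns (z, one-sided p-value for p2 > p1).
    """
    p1, p2 = k1 / n1, k2 / n2
    p_pool = (k1 + k2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # One-sided alternative: scope sensitivity predicts a higher share of
    # Option A votes in sample 2, which saw the larger fish-loss reduction.
    p_value = 0.5 * math.erfc(z / math.sqrt(2))
    return z, p_value

# Hypothetical counts: 120 of 300 chose Option A at Level 1,
# versus 156 of 300 at Level 2.
z, p = equal_proportions_test(120, 300, 156, 300)
print(f"z = {z:.2f}, one-sided p = {p:.4f}")
```

Rejecting the null of equal proportions in the predicted direction would indicate external sensitivity to scope for the tested attribute.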

Split-sample tests such as those proposed above often require the
addition of question versions to the experimental design, to accommodate
the specific structural needs of the attribute-by-attribute external
scope test.  Otherwise, confounding effects of other varying attributes
(including demographic information) can render results of scope tests
ambiguous.  In the present case, the proposed tests would require the
addition of up to six unique question versions to the experimental
design, enabling scope tests for the three non-cost attributes within
the 316(b) choice experiment scenarios.  If scope tests in additional
question frames are desired (e.g., the same scope test illustrated
above, but given Level 1 for commercial fish populations, fish
population (all fish), and aquatic ecosystem condition attributes),
still additional question versions would be added.  While small numbers
of questions added to the experimental design should have minimal
impacts on overall efficiency (e.g., orthogonality of the design),
larger numbers may have a more significant impact.  Hence, given
constraints on the total number of survey respondents, there is a
potential empirical tradeoff between the number of external scope tests
that may be conducted and the efficiency of the experimental design and
statistical analysis.
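The efficiency tradeoff noted above can be made concrete with a small sketch: starting from an orthogonal main-effects array, appending scope-test question versions that hold most attributes fixed introduces correlation among the design columns. The array, attribute labels, and added rows below are hypothetical, not the actual 316(b) experimental design.

```python
import numpy as np

# Hypothetical orthogonal main-effects array (an L9) for four three-level
# attributes: Fish Saved, Commercial Fish, All Fish, Ecosystem Condition.
design = np.array([
    [0, 0, 0, 0],
    [0, 1, 1, 1],
    [0, 2, 2, 2],
    [1, 0, 1, 2],
    [1, 1, 2, 0],
    [1, 2, 0, 1],
    [2, 0, 2, 1],
    [2, 1, 0, 2],
    [2, 2, 1, 0],
])

def max_abs_corr(d):
    """Largest absolute pairwise correlation among attribute columns;
    zero indicates a fully orthogonal design."""
    c = np.corrcoef(d, rowvar=False)
    return float(np.max(np.abs(c[~np.eye(c.shape[0], dtype=bool)])))

before = max_abs_corr(design)

# Scope-test versions: vary only the first attribute, hold the rest at level 0.
scope_rows = np.array([[1, 0, 0, 0],
                       [2, 0, 0, 0]])
after = max_abs_corr(np.vstack([design, scope_rows]))

print(f"max |corr| before: {before:.3f}, after: {after:.3f}")
```

A few such additions leave the correlations small, but each further scope-test version moves the design further from orthogonality, which is the empirical tradeoff described above.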

Communicating Uncertainty to Respondents

EPA believes that the role of risk and uncertainty is an important issue
to be addressed in the development of benefits estimates, and points out
that the literature provides numerous examples of cases in which
appropriate survey design, including focus groups, was used to
successfully address such concerns.  For example, as stated by
Desvousges et al. (1984), “using contingent valuation to estimate the
benefits of hazardous waste management regulations requires detailed
information on how and the extent to which respondents understand risk
(or probability) and how government regulatory actions might change
it… Using focus groups helped make this determination…”  EPA also
emphasizes that all regulatory analyses involve uncertainty of some type
(Boardman et al. 2001).  

The ecological outcome of I&E reductions is subject to considerable
uncertainty.  EPA believes that it is important that survey respondents
be aware of this uncertainty, and that their responses reflect the
knowledge that the resource changes reflected in the survey are
scientific estimates.  However, EPA is also aware of the clear advice
from the choice modeling literature (e.g., Bennett and Blamey 2001;
Louviere et al. 2000) to avoid cognitive burden on respondents.  Hence,
the proposed survey materials clearly indicate the uncertainty involved
with the described resource changes in choice modeling scenarios, yet do
so in a way designed to minimize cognitive burden.

For example, prior to answering choice experiment questions, respondents
are told:

 “Although scientists can predict the number of fish saved each year,
the effect on fish populations is uncertain. This is because scientists
do not know the total number of fish in Northeast waters and because
many factors – such as cooling water use, fishing, pollution and water
temperature – affect fish populations.” 

This statement clearly indicates the uncertainty involved with
scientific estimates of the outcomes of I&E regulations.  This is
followed by a further reminder of uncertainty:

 “Depending on the type of technology and other factors, effects on
fish and ecosystems may be different – even if the annual reduction in
fish losses is similar.”

Focus group and cognitive interview participants understood that the
ecological changes described in the survey were uncertain, and most
participants were comfortable making decisions in the presence of this
uncertainty. Their responses indicated that they understood this
uncertainty based on the information presented in the introductory
material and considered it when evaluating policy options. Respondents
made such statements as: “My guess is that it did come from studies
but I have a healthy dose of skepticism about the accuracy of it. I
don’t think it’s been in any way skewed purposefully, but I know
that this is a best guess, reasonable guess perhaps”, “it shows me
that they are being honest for the most part. You know, you can't
obviously be accurate on everything, but this is a kind of a best
guess”, “[…] They had more numbers on the commercial fish
population. The rest was more of a guesstimate”, and “you don’t
know the exact number and nobody knows.”

In previous focus groups conducted for the Phase III survey, EPA tested
alternative versions of the Phase III survey instrument in which choice
experiment attributes were presented as 90% confidence ranges, rather
than as point estimates.  Focus group respondents were explicitly asked
whether the ranges were helpful in understanding the uncertainty of
estimates presented in the choice question or whether they were a source
of confusion.  Seven out of the eight respondents interviewed on that
occasion indicated that the use of ranges was more confusing than the
use of point estimates.  Furthermore, respondents were comfortable
making decisions in the presence of this uncertainty. 

5(c)	Reporting Results

The results of the survey will be made public as part of the benefits
analysis for the 316(b) regulation for existing facilities.  Provided
information will include summary statistics for the survey data,
extensive documentation for the statistical analysis, and a detailed
description of the final results.  The survey data will be released only
after it has been thoroughly vetted to ensure that all potentially
identifying information has been removed.

REFERENCES

Aadland, D., and A.J. Caplan. 2003. “Willingness to Pay for Curbside
Recycling with Detection and Mitigation of Hypothetical Bias.”
American Journal of Agricultural Economics 85(2): 492-502.

Adamowicz, W., P. Boxall, M. Williams, and J. Louviere. 1998. “Stated
Preference Approaches for Measuring Passive Use Values: Choice
Experiments and Contingent Valuation.” American Journal of
Agricultural Economics 80(1): 64-75.

Akter, S., R. Brouwer, L. Brander, and P. Van Beukering. 2009.
“Respondent Uncertainty in a Contingent Market for Carbon Offsets.”
Ecological Economics 68(6): 1858-1863.

Arrow, K., R. Solow, E. Leamer, P. Portney, R. Radner, and H. Schuman.
1993. “Report of the NOAA Panel on Contingent Valuation.” Federal
Register 58 (10): 4602-4614.

Bateman, I.J., R.T. Carson, B. Day, M. Hanemann, N. Hanley, T. Hett, M.
Jones-Lee, G. Loomes, S. Mourato, E. Ozdemiroglu, D.W. Pearce, R.
Sugden, and J. Swanson. 2002. Economic Valuation with Stated Preference
Surveys: A Manual. Northampton, MA:  Edward Elgar.

Bennett, J. and R. Blamey, eds. 2001. The Choice Modelling Approach to
Environmental Valuation. Northampton, MA: Edward Elgar.

Bergstrom, J.C. and J.R. Stoll. 1989. “Application of experimental
economics concepts and precepts to CVM field survey procedures.”
Western Journal of Agricultural Economics 14(1): 98-109.

Bergstrom, J.C., J.R. Stoll, and A. Randall. 1989. “Information
effects in contingent markets.” American Journal of Agricultural
Economics 71(3): 685-691.

Besedin, Elena, Robert Johnston, Matthew Ranson, and Jenny Ahlen, Abt
Associates Inc. 2005.  “Findings from 2005 Focus Groups Conducted
Under EPA ICR #2155.01.” Memo to Erik Helm, U.S. EPA/OW, October 18,
2005.  See docket for EPA ICR #2155.02.

Blamey, R.K., J.W. Bennett, and M.D. Morrison. 1999. “Yea-saying in
Contingent Valuation Surveys.”  Land Economics 75: 126-141.

Boardman, A.E., D.H. Greenberg, A.R. Vining, and D.L. Weimer. 2001.
Cost-Benefit Analysis: Concepts and Practice, 2nd edition. Upper Saddle
River, NJ: Prentice Hall.

Boyle, K.J. 2003. “Contingent valuation in practice.” In A Primer on
Nonmarket Valuation. Edited by P.A. Champ, K.J. Boyle, and T.C. Brown,
Kluwer Academic Publishers.

Brown, T. C., I. Ajzen, and D. Hrubes. 2003. “Further Tests of
Entreaties to Avoid Hypothetical Bias in Referendum Contingent
Valuation.” Journal of Environmental Economics and Management 46(2):
353-361.

Bunch, D.S., and R.R. Batsell. 1989. “A Monte Carlo Comparison of
Estimators for the Multinomial Logit Model.” Journal of Marketing
Research 26: 56-68.

Cameron, T.A., and D.D. Huppert. 1989. “OLS versus ML Estimation of
Non-market Resource Values with Payment Card Interval Data.” Journal
of Environmental Economics and Management 17: 230-246.

Carson, R.T., and T. Groves. 2007. “Incentives and informational
properties of preference questions.” Environmental and Resource
Economics 37(1): 181-210.

Carson, R.T., T. Groves, and M.J. Machina. 2000.  “Incentive and
Informational Properties of Preference Questions.” Working Paper,
Department of Economics, University of California, San Diego.

Champ P.A., and R.C. Bishop.  2001.  “Donation Payment Mechanisms and
Contingent Valuation:  An Empirical Study of Hypothetical Bias.”
Environmental and Resource Economics 19(4): 383-402.

Champ, P.A., R.C. Bishop, T.C. Brown, and D.W. McCollum. 1997. “Using
Donation Mechanisms to Value Non-use Benefits from Public Goods.”
Journal of Environmental Economics and Management 33(2): 151-162.

Champ, P.A., R. Moore, and R. C. Bishop. 2004. “Hypothetical Bias: The
Mitigating Effects of Certainty Questions and Cheap Talk.” Selected
paper prepared for presentation at the American Agricultural Economics
Association Annual Meeting, Denver, Colorado.

Champ, P.A., R. Moore, and R.C. Bishop. 2009. “A Comparison of
Approaches to Mitigate Hypothetical Bias.” Agricultural and Resource
Economics Review 38(2): 166-180.

Collins, J.P., and C.A. Vossler. 2009. “Incentive compatibility tests
of choice experiment value elicitation questions.” Journal of
Environmental Economics and Management 58(2): 226-235.

Croke, K., R.G. Fabian, and G. Brenniman. 1986. “Estimating the Value
of Improved Water Quality in an Urban River System.” Journal of
Environmental Systems 16(1): 13-24.  

Cronin, F.J. 1982. “Valuing Nonmarket Goods Through Contingent
Markets.” Pacific Northwest Laboratory, PNL 4255, Richland, WA.

Cummings, R.G., and G.W. Harrison. 1995. “The Measurement and
Decomposition of Non-use Values:  A Critical Review.” Environmental
and Resource Economics 5: 225-247.

Cummings, R. G., G.W. Harrison, and L.L. Osborne. 1995. “Can the Bias
of Contingent Valuation Surveys Be Reduced?” Economics working paper,
Columbia, SC: Division of Research, College of Business Administration,
Univ. of South Carolina.

Cummings, R.G., P.T. Ganderton, and T. McGuckin. 1994. “Substitution
Effects in CVM Values.” American Journal of Agricultural Economics
76(2): 205-214.

Cummings, R.G., and L.O. Taylor. 1999. “Unbiased Value Estimates for
Environmental Goods: A Cheap Talk Design for the Contingent Valuation
Method.” American Economic Review 89(3): 649-665.

Desvousges, W.H., and V.K Smith. 1988. “Focus Groups and Risk
Communication:  the Science of Listening to Data.” Risk Analysis 8:
479-484.

Desvousges, W.H., V.K. Smith, D.H. Brown, and D.K. Pate. 1984. “The
Role of Focus Groups in Designing a Contingent Valuation Survey to
Measure the Benefits of Hazardous Waste Management Regulations.”
Research Triangle Institute:  Research Triangle Park, NC.

Dillman, D.A. 2000. Mail and Internet Surveys: The Tailored Design
Method.  New York: John Wiley and Sons.

Duke, J.M., and T.W. Ilvento. 2004. “A Conjoint Analysis of Public
Preferences for Agricultural Land Preservation.” Agricultural and
Resource Economics Review 33(2): 209-219.

Entergy Corp. v. Riverkeeper Inc., 129 S. Ct. 1498, 1505 (2009)

Freeman, A.M., III. 2003.  The Measurement of Environmental and Resource
Values: Theory and Methods. Washington, DC: Resources for the Future.

Giraud, K.L., J.B. Loomis, and R.L. Johnson. 1999. “Internal and
external scope in willingness-to-pay estimates for threatened and
endangered wildlife.” Journal of Environmental Management 56: 221-229.

Greene, W.H. 2002. NLOGIT Version 3.0 Reference Guide.  Plainview, NY: 
Econometric Software, Inc.

Greene, W.H. 2003. Econometric Analysis.  5th ed., Prentice Hall, Upper
Saddle River, NJ.  

Haab, T.C., and K.E. McConnell. 2002. Valuing Environmental and Natural
Resources:  The Econometrics of Non-market Valuation. Cheltenham, UK:
Edward Elgar.

Hanemann, W.M. 1984. “Welfare Evaluations in Contingent Valuation
Experiments with Discrete Responses.” American Journal of Agricultural
Economics 66(3): 332-41.

Hanemann, W.M., and B. Kanninen. 1999. “The Statistical Analysis of
Discrete-Response CV Data.”  In Valuing Environmental Preferences:
Theory and Practice of the Contingent Valuation Method in the US, EU,
and Developing Countries. Edited by I.J. Bateman and K.G. Willis, Oxford
University Press, Oxford, UK.

Hanley, N., S. Colombo, D. Tinch, A. Black, and A. Aftab. 2006a.
“Estimating the benefits of water quality improvements under the Water
Framework Directive: are benefits transferable?” European Review of
Agricultural Economics 33(3): 391-413.

Hanley, N., R. E. Wright, and B. Alvarez-Farizo. 2006b. “Estimating
the Economic Value of Improvements in River Ecology using Choice
Experiments: An Application to the Water Framework Directive.” Journal
of Environmental Management 78(2):183-193.

Heberlein, T.A., M.A. Wilson, R.C. Bishop, and N.C. Schaeffer. 2005.
“Rethinking the Scope Test as a Criterion in Contingent Valuation.”
Journal of Environmental Economics and Management 50(1): 1-22.

Heckman, J. J. 1979. “Sample Selection Bias as a Specification
Error.” Econometrica 47(1): 153-161.

Hensher, D.A., and P.O. Barnard. 1990. “The Orthogonality Issue in
Stated Choice Designs.” In Spatial Choices and Processes. Edited by M.
Fischer, P. Nijkamp, and Y. Papageorgiou. Amsterdam: North-Holland,
265-278.

Herriges, J., C. Kling, C. Lieu, and J. Tobias. 2010. “What are the
consequences of consequentiality?” Journal of Environmental Economics
and Management 59(1): 67-81.

Hoehn, J. P. 1991. “Valuing the Multidimensional Impacts of
Environmental Policy: Theory and Methods.”  American Journal of
Agricultural Economics 73(2): 289-299.

Hoehn, J.P., F. Lupi, and M.D. Kaplowitz. 2004.  Internet-Based Stated
Choice Experiments in Ecosystem Mitigation:  Methods to Control Decision
Heuristics and Biases.  In Proceedings of Valuation of Ecological
Benefits: Improving the Science Behind Policy Decisions, a workshop
sponsored by the US EPA National Center for Environmental Economics and
the National Center for Environmental Research.

Hoehn, J.P., and A. Randall. 2002. “The Effect of Resource Quality
Information on Resource Injury Perceptions and Contingent Values.”
Resource and Energy Economics 24: 13-31.

Horne, P., P.C. Boxall, and W.L. Adamowicz. 2005.  “Multiple-use
management of forest recreation sites: a spatially explicit choice
experiment.”  Forest Ecology and Management 207(1/2): 189-99.

Johannesson, M.  1997.  “Some Further Experimental Results on
Hypothetical Versus Real Willingness to Pay.” Applied Economics
Letters 4: 535-536.

Johnston, R.J., E.T. Schultz, K. Segerson and E.Y. Besedin. 2010.
“Bioindicator-Based Stated Preference Valuation for Aquatic Habitat
and Ecosystem Service Restoration”, In Bennett, J. ed. International
Handbook on Non-Marketed Environmental Valuation.  Cheltenham, UK:
Edward Elgar, forthcoming.

Johnston, R.J. 2006. “Is Hypothetical Bias Universal? Validating
Contingent Valuation Responses Using a Binding Public Referendum.”
Journal of Environmental Economics and Management 52(1):469-481.

Johnston, R.J., and D.P. Joglekar. 2005. “Validating Hypothetical
Surveys Using Binding Public Referenda: Implications for Stated
Preference Valuation.” American Agricultural Economics Association
(AAEA) Annual Meeting, Providence, July 24-27.

Johnston, R.J., J.J. Opaluch, M.J. Mazzotta, and G. Magnusson. 2005. 
“Who Are Resource Non-users and What Can They Tell Us About Non-use
Values?  Decomposing User and Non-user Willingness to Pay for Coastal
Wetland Restoration.”  Water Resources Research 41(7),
doi:10.1029/2004WR003766.

Johnston, R.J., E.Y. Besedin, and R.F. Wardwell. 2003a. “Modeling
Relationships Between Use and Non-use Values for Surface Water Quality:
A Meta-Analysis.” Water Resources Research 39(12): 1363.

Johnston, R.J., S.K. Swallow, T.J. Tyrrell, and D.M. Bauer. 2003b.
“Rural Amenity Values and Length of Residency.” American Journal of
Agricultural Economics 85(4): 1000-1015.

Johnston, R.J., G. Magnusson, M. Mazzotta, and J.J.Opaluch. 2002a.
“Combining Economic and Ecological Indicators to Prioritize Salt Marsh
Restoration Actions.” American Journal of Agricultural Economics
84(5): 1362-1370.

Johnston, R.J., S.K. Swallow, C.W. Allen, and L.A. Smith. 2002b.
“Designing Multidimensional Environmental Programs: Assessing
Tradeoffs and Substitution in Watershed Management Plans.” Water
Resources Research 38(7): IV1-13.

Johnston, R.J., S.K. Swallow, and T.F. Weaver. 1999. “Estimating
Willingness to Pay and Resource Trade-offs With Different Payment
Mechanisms:  An Evaluation of a Funding Guarantee for Watershed
Management.” Journal of Environmental Economics and Management 38(1):
97-120.

Johnston, R.J., T.F. Weaver, L.A. Smith, and S.K. Swallow. 1995.
“Contingent Valuation Focus Groups:  Insights From Ethnographic
Interview Techniques.” Agricultural and Resource Economics Review
24(1): 56-69.

Just, R.E., D.L. Hueth, and A. Schmitz. 2004. The Welfare Economics of
Public Policy: A Practical Approach to Project and Policy Evaluation.
Edward Elgar Publishing, Cheltenham, UK and Northampton, MA.

Kaplowitz, M.D., F. Lupi, and J.P. Hoehn. 2004. “Multiple Methods for
Developing and Evaluating a Stated-Choice Questionnaire to Value
Wetlands.”  Chapter 24 in Methods for Testing and Evaluating Survey
Questionnaires, eds. S. Presser, J.M. Rothgeb, M.P. Couper, J.T.
Lessler, E. Martin, J. Martin, and E. Singer.  New York:  John Wiley and
Sons.

Kobayashi, M., K. Rollins, and M.D.R. Evans. 2010. “Sensitivity of WTP
Estimates to Definition of 'Yes': Reinterpreting Expressed Response
Intensity.” Agricultural and Resource Economics Review 39(1): 37-55.

Kuhfeld, W.F. 2009. “Experimental Design: Efficiency, Coding and
Choice Designs.” SAS Institute.
http://support.sas.com/techsup/tnote/tnote_stat.html#market.

Layton, D.F. 2000.  “Random coefficient models for stated preference
surveys.” Journal of Environmental Economics and Management 40(1):
21-36.

List, J.A. 2001. “Do Explicit Warnings Eliminate the Hypothetical Bias
in Elicitation Procedures? Evidence from Field Auctions for
Sportscards.” American Economic Review 91(5): 1498-1507.

Loomis, J., T. Brown, B. Lucero, and G. Peterson. 1996. “Improving
Validity Experiments of Contingent Valuation Methods: Results of Efforts
to Reduce the Disparity of Hypothetical and Actual Willingness to
Pay.” Land Economics 72(4): 450-461.

Louviere, J.J., D.A. Hensher, and J.D. Swait. 2000. Stated Preference
Methods:  Analysis and Application. Cambridge, UK: Cambridge University
Press.

Maddala, G.S. 1983. Limited-Dependent and Qualitative Variables in
Econometrics. Econometric Society Monographs No. 3. Cambridge:
Cambridge University Press.

Mazzotta, M.J., J.J. Opaluch, G. Magnuson, and R.J. Johnston. 2002.
“Setting Priorities for Coastal Wetland Restoration: A GIS-Based Tool
That Combines Expert Assessments And Public Values.” Earth System
Monitor 12(3): 1-6.

McConnell, K.E. 1990. “Models for Referendum Data:  The Structure of
Discrete Choice Models for Contingent Valuation.” Journal of
Environmental Economics and Management 18(1): 19-34.

McFadden, D., and K. Train. 2000. “Mixed Multinomial Logit Models for
Discrete Responses.” Journal of Applied Econometrics 15(5): 447-470.

Mitchell, R.C., and R.T. Carson. 1981. An Experiment in Determining
Willingness to Pay for National Water Quality Improvements. Preliminary
draft of a report to the U.S. Environmental Protection Agency.
Washington, DC: Resources for the Future, Inc.

Mitchell, R.C., and R.T. Carson. 1984. A Contingent Valuation Estimate
of National Freshwater Benefits: Technical Report to the U.S.
Environmental Protection Agency.  Washington, DC: Resources for the
Future.

Mitchell, R.C., and R.T. Carson. 1989. Using Surveys to Value Public
Goods: The Contingent Valuation Method. Resources for the Future,
Washington, D.C.

Morrison, M., and J. Bennett. 2004. “Valuing New South Wales Rivers for
Use in Benefit Transfer.” Australian Journal of Agricultural and
Resource Economics 48(4): 591-611.

Murphy, J.J., T. Stevens, and D. Weatherhead. 2004. “Is Cheap Talk
Effective at Eliminating Hypothetical Bias in a Provision Point?”
Working Paper No. 2003-2. Department of Resource Economics, University
of Massachusetts, Amherst.

Olsen, D., J. Richards, and R.D. Scott. 1991.
“Existence and Sport Values for Doubling the Size of Columbia River
Basin Salmon and Steelhead Runs.” Rivers 2(1): 44-56.

Opaluch, J.J., T.A. Grigalunas, M. Mazzotta, R.J. Johnston, and J.
Diamantedes. 1999. Recreational and Resource Economic Values for the
Peconic Estuary.  Prepared for the Peconic Estuary Program.  Peace Dale,
RI: Economic Analysis Inc.  124 pp.

Opaluch, J.J., S.K. Swallow, T. Weaver, C. Wessells, and D. Wichelns.
1993. “Evaluating impacts from noxious facilities:  Including public
preferences in current siting mechanisms.” Journal of Environmental
Economics and Management 24(1): 41-59.

Poe, G.L., J.E. Clark, D. Rondeau, and W.D. Schulze. 2002. “Provision
Point Mechanisms and Field Validity Tests of Contingent Valuation.”
Environmental and Resource Economics 23: 105-131.

Poe, G.L., M.P. Welsh, and P.A. Champ. 1997. “Measuring the Difference
in Mean Willingness to Pay when Dichotomous Choice Contingent Valuation
Responses are not Independent.” Land Economics 73(2): 255-267.

Powe, N.A. 2007. Redesigning Environmental Valuation: Mixing Methods
within Stated Preference Techniques. Cheltenham, UK: Edward Elgar.

Powe, N.A., and I.J. Bateman. 2004. “Investigating Insensitivity to
Scope:  A Split-Sample Test of Perceived Scheme Realism.” Land
Economics 80(2): 258-271.

Ready, R.C., P.A. Champ, and J.L. Lawton. 2010. “Using Respondent
Uncertainty to Mitigate Hypothetical Bias in a Stated Choice
Experiment.” Land Economics 86(2): 363-381.

Ready, R.C., J.C. Whitehead, and G.C. Blomquist. 1995. “Contingent
Valuation When Respondents are Ambivalent.” Journal of Environmental
Economics and Management 29(2): 181-196.

Schkade, D.A., and J.W. Payne. 1994. “How People Respond to Contingent
Valuation Questions: A Verbal Protocol Analysis of Willingness to Pay
for an Environmental Regulation.” Journal of Environmental Economics
and Management 26: 88-109.

Smith, V.K., and C. Mansfield. 1998. “Buying Time: Real and
Hypothetical Offers.” Journal of Environmental Economics and
Management 36: 209-224.

Train, K. 1998. “Recreation Demand Models with Taste Differences Over
People.” Land Economics 74(2): 230-239. 

U.S. Department of Labor, Bureau of Labor Statistics. 2009. Table 1:
Civilian workers, by major occupational and industry group. September
2009. http://www.bls.gov/news.release/ecec.t01.htm.

U.S. EPA. 2000.  Guidelines for Preparing Economic Analyses. (EPA
240-R-00-003). U.S. EPA, Office of the Administrator, Washington, DC,
September 2000.

U.S. EPA. 2006.  Peer Review Handbook 3rd Edition. (EPA 100-B-06-002).
U.S. EPA, Science Policy Council, Washington, DC, 2006.

Versar. 2006. Comments Summary Report: Peer Review Package for
“Willingness to Pay Survey Instrument for §316(b) Phase III Cooling
Water Intake Structures.” Prepared by Versar Inc., Springfield, VA.

Viscusi, W.K., J. Huber, and J. Bell. 2008. “The Economic Value of
Water Quality.” Environmental and Resource Economics 41(2): 169-187.

Vossler, C.A., and M.F. Evans. 2009. “Bridging the gap between the
field and the lab: Environmental goods, policy maker input, and
consequentiality.” Journal of Environmental Economics and Management
58(3):338-345.

Vossler, C.A., and J. Kerkvliet. 2003. “A Criterion Validity Test of
the Contingent Valuation Method:  Comparing Hypothetical and Actual
Voting Behavior for a Public Referendum.” Journal of Environmental
Economics and Management 45(3): 631-649.

Whitehead, J.C., G.C. Blomquist, T.J. Hoban, and W.B. Clifford. 1995.
“Assessing the Validity and Reliability of Contingent Values: A
Comparison of On Site Users, Off Site Users, and Non-users.” Journal
of Environmental Economics and Management 29(2): 238-251.

Whitehead, J.C., and P.A. Groothuis. 1992. “Economic Benefits of
Improved Water Quality: a Case Study of North Carolina's Tar Pamlico
River.” Rivers 3: 170-178.

Attachment 1: Full Text of Regional Stated Preference Survey Component
(Northeast Regional Example)

Fish and Aquatic Habitat

A Survey of Northeast Residents

(CT, DC, DE, MA, MD, ME, NH, NJ, NY, PA, RI, VT)

Human Activities, Aquatic Habitat and Fish

This survey asks for your opinions regarding policies that would affect
fish and habitat in the Northeast U.S.  Your answers will help the
government decide which policies will be enacted. Background information
in this survey was provided by the National Marine Fisheries Service,
U.S. Environmental Protection Agency, U.S. Geological Survey and other
state and federal offices.

Northeast fresh and salt waters support billions of fish. These include
fish that are used by humans, as well as forage fish that are not used
by humans, but serve as food for larger fish, birds, and animals. 

This survey concerns proposed policies that would reduce fish losses
caused by cooling water use by industrial facilities, including
factories and power plants. These policies would benefit aquatic
ecosystems but would increase the costs of some goods and services you
buy, including electricity and common household products.

How Does Cooling Water Affect Fish?

The water that industrial facilities use to cool equipment is pumped
from bays, rivers, and lakes. The largest amount is used by power plants
that produce electricity.

What Kinds of Fish are Affected?

Cooling water use is not the largest cause of fish loss in most areas
(fishing causes greater losses), but it has affected some fish
populations.

About 1/6 of the fish lost are species caught by commercial and
recreational fishermen. Examples include striped bass, flounder, and
cod.

The other 5/6 of the fish lost are forage species that are not caught
by humans but are part of the food web. Examples include killifish,
silverside, and stickleback.

Question 1. When thinking about how industrial facilities use cooling
water, please rate the importance of the following to you.  Check one
box for each.

	Not Important		Somewhat Important

Preventing the loss of fish that are caught by humans	☐1	☐2	☐3	☐4	☐5

Preventing the loss of fish that are not caught by humans	☐1	☐2	☐3	☐4	☐5

Maintaining the ecological health of rivers, lakes and bays	☐1	☐2	☐3	☐4	☐5

Keeping the cost of goods and services low	☐1	☐2	☐3	☐4	☐5

Making sure there is enough government regulation of industry	☐1	☐2	☐3	☐4	☐5

Making sure there is not too much government regulation of industry	☐1	☐2	☐3	☐4	☐5



How Many Fish are Affected?

After accounting for the number of eggs and larvae that would be
expected to survive to adulthood, scientists estimate that the
equivalent of about 1.1 billion young adult fish (equivalent to
one-year-old fish) are lost each year in Northeast coastal and fresh
waters due to cooling water use.

Scientists can predict the number of these fish that will be saved under
different policies.  This number ranges from 0.3 to 1.0 billion fish
saved per year. 

For commercial fish species, losses of young fish in cooling water
intakes vary by species, from the equivalent of less than 0.1% to about
10% of total populations.

Yearly effects vary among fish species within this range. For example,
the number of young fish lost in cooling water intakes relative to the
total number of fish in the water is relatively high for some species,
but low for others.

 

Although scientists can predict the number of fish saved each year, the
effect on fish populations is uncertain. This is because scientists do
not know the total number of all fish in Northeast waters and because
many factors – such as cooling water use, fishing, pollution and water
temperature – affect fish populations. 

New Regulations are Being Proposed to Protect Fish

Advanced filters and closed cycle cooling are already in use at many
facilities and are proven technologies. New regulations would require a
mix of advanced filters and closed cycle cooling at all
facilities—with reductions in fish losses between 25% and 95%.



How important are these issues to you?

While these policies would reduce fish losses, they would also increase
the costs of producing many goods and services — these costs would be
passed on to consumers like you.

Question 2. Compared to other issues that the government might
address—such as public safety, education and health—how important is
protecting aquatic ecosystems to you? Check one box. 

	Not Important		Somewhat Important

Protecting aquatic ecosystems is	☐1	☐2	☐3	☐4	☐5



The government needs to know whether households are willing to pay the
costs of these new policies. 

This survey will ask you to compare policies with different effects on
cooling water use, fish, and costs to your household. You will be asked
to vote for the options you prefer.

You will also have the opportunity to support the current situation,
with no new policies, and no new costs to your household.

This survey is similar to a public vote

The next part of this survey will ask you to consider different types of
policies to protect fish, and indicate how you would vote. Effects of
each possible policy will be described using the following scores:

Commercial Fish Populations

Fish Populations

Fish Saved

Condition of Aquatic Ecosystems

A score between 0 and 100 showing the ecological condition of affected
areas, compared to the most natural waters in the Northeast. The score
is determined by many factors, including water quality and temperature,
the health of aquatic species, and habitat conditions. Higher scores
mean the area is more natural. The current score in Northeast waters is
48.

$

Cost per Year

How much the policy will cost your household, in unavoidable price
increases for products and services you buy, including electricity and
common household products.



How Would You Rate the Importance of These Effects?

Question 3. When considering policies that affect how facilities use
cooling water, how important to you are effects on each of the following
scores?  Check one box for each.  (For reminders of what the scores
mean, please see page 7).

	Not Important		Somewhat Important

Effect on commercial fish populations	☐1	☐2	☐3	☐4	☐5

Effect on the fish populations (for all fish)	☐1	☐2	☐3	☐4	☐5

Effect on fish saved	☐1	☐2	☐3	☐4	☐5

Effect on the condition of aquatic ecosystems	☐1	☐2	☐3	☐4	☐5

Effect on cost to my household	☐1	☐2	☐3	☐4	☐5



The next questions will ask you to choose between different policy
options that would affect fish losses in cooling water systems. You will
be given choices and asked to vote for the choice you prefer by checking
the appropriate box. Questions will look similar to the sample on the
next page. 

SAMPLE QUESTION

Questions will look like the sample below.

Policy Effect	Current Situation	Option A	Option B

Commercial Fish Populations (in 3-5 Years)	43%	45%	47%
(100% is populations that allow for maximum harvest)

Fish Populations, all fish (in 3-5 Years)	31%	34%	38%
(100% is populations without human influence)

Fish Saved per Year (out of 1.1 billion fish lost in water intakes)	0% (no change in status quo)	25% (0.3 billion fish saved)	50%

Condition of Aquatic Ecosystems (in 3-5 Years)	48%	48%	50%
(100% is pristine condition)

$ Increase in Cost of Living for Your Household	$0 (no cost increase)	$24 per year ($2 per month)	$36 per year ($3 per month)



HOW WOULD YOU VOTE?

☐ I would vote for	☐ I would vote for	☐ I would vote for

As you vote, please remember—

The map below shows the facilities and areas that would be affected by
the proposed policies.

The policy options (A and B) given to you each require a different mix
of advanced filters and closed cycle cooling in different areas, so
effects on fish are different.  

You will be shown different questions, with different combinations of
technology and different costs.

Depending on the policies chosen, costs to your household could range
from $0 per month to a maximum of $6 per month.

Depending on the type of technology required and other factors, effects
on fish and ecosystems may be different—even if the annual reduction
in fish losses is similar.

Consider each pair of policy options separately—do not add them up or
compare programs from different pages.

Scientists expect that effects on the environment and economy not shown
explicitly will be small. For example, studies of industry suggest that
effects on employment will be close to zero.

Your votes are important. Answer all questions as if this were a real,
binding vote.

Question 4. Assume that Options A and B would require a different mix of
filters and closed cycle cooling in different areas.  Assume all types
of fish are affected. How would you vote?

Policy Effect (NE Waters)	Current Situation (No policy)	Option A	Option B

Commercial Fish Populations (in 3-5 Years)	43%	44%	45%
(100% is populations that allow for maximum harvest)

Fish Populations, all fish (in 3-5 Years)	31%	32%	35%
(100% is populations without human influence)

Fish Saved per Year (out of 1.1 billion fish lost in water intakes)	0% (no change in status quo)	25% (0.3 billion fish saved)	50%

Condition of Aquatic Ecosystems (in 3-5 Years)	48%	49%	50%
(100% is pristine condition)

$ Increase in Cost of Living for Your Household	$0 (no cost increase)	$24 per year ($2 per month)	$36 per year ($3 per month)



HOW WOULD YOU VOTE?

(CHOOSE ONE ONLY)

☐ I would vote for NO POLICY	☐ I would vote for OPTION A	☐ I would vote for OPTION B

POLICIES COULD REQUIRE DIFFERENT COMBINATIONS OF TECHNOLOGY 

Now you will be asked to consider a new set of policy options for
Northeast waters.  As you vote, please remember—

Questions 5 and 6 present new sets of policy options. These options
require a different mix of technologies in different areas. 

Each question is a separate vote. Questions 5 and 6 cannot be directly
compared to each other, or to Question 4.

Do not add up effects or costs across different questions.

Policy costs and effects depend on many factors. Saving more fish does
not necessarily mean that all effects will improve.

Question 5. Assume that Options A and B would require a different mix
of filters and closed cycle cooling in different areas.  Assume all
types of fish are affected. How would you vote?

Policy Effect (NE Waters)	Current Situation (No policy)	Option A	Option B

Commercial Fish Populations (in 3-5 Years)	43%	45%	47%
(100% is populations that allow for maximum harvest)

Fish Populations, all fish (in 3-5 Years)	31%	33%	36%
(100% is populations without human influence)

Fish Saved per Year (out of 1.1 billion fish lost in water intakes)	0% (no change in status quo)	25% (0.3 billion fish saved)	75%

Condition of Aquatic Ecosystems (in 3-5 Years)	48%	48%	52%
(100% is pristine condition)

$ Increase in Cost of Living for Your Household	$0 (no cost increase)	$36 per year ($3 per month)	$60 per year ($5 per month)



HOW WOULD YOU VOTE?

(CHOOSE ONE ONLY)

☐ I would vote for NO POLICY	☐ I would vote for OPTION A	☐ I would vote for OPTION B



Question 6. Assume that Options A and B would require a different mix
of filters and closed cycle cooling in different areas.  Assume all
types of fish are affected. How would you vote?

Policy Effect (NE Waters)	Current Situation (No policy)	Option A	Option B

Commercial Fish Populations (in 3-5 Years)	43%	50%	47%
(100% is populations that allow for maximum harvest)

Fish Populations, all fish (in 3-5 Years)	31%	39%	36%
(100% is populations without human influence)

Fish Saved per Year (out of 1.1 billion fish lost in water intakes)	0% (no change in status quo)	95% (1.0 billion fish saved)	50%

Condition of Aquatic Ecosystems (in 3-5 Years)	48%	52%	52%
(100% is pristine condition)

$ Increase in Cost of Living for Your Household	$0 (no cost increase)	$72 per year ($6 per month)	$24 per year ($2 per month)



HOW WOULD YOU VOTE?

(CHOOSE ONE ONLY)

☐ I would vote for NO POLICY	☐ I would vote for OPTION A	☐ I would vote for OPTION B



Question 7.  If you always voted for NO POLICY in questions 4-6, what
was the primary reason?  Check one. (Skip this question if you voted for
Option A or B in any question above.)

____ The cost to my household was too high

____ Preventing fish losses is not important to me

____ I do not trust the government to fix the problem

____ I would rather spend my money on other things

____ I did not believe the choices were realistic

____ Since the problem was created by private facilities, they should
fix it without passing costs on to consumers

Question 8.  Indicate how strongly you agree with the following
statements about questions 4 - 6 and the information provided. Check one
box for each.

Question 9.  How much did the following factors affect your answers to
questions 4 – 6?  Check one box for each row.



	Moderate Effect		Very Large Effect

Question 10. How many days did you participate in the following during
the last year?  For trips longer than one day, please count each day
separately. Check one box for each row.

	Number of days you did the activity during the past year

	0	1-5	6-10	11-15	16+

Boating / Canoeing / Kayaking	☐1	☐2	☐3	☐4	☐5

Swimming / Going to the Beach	☐1	☐2	☐3	☐4	☐5

Recreational Fishing (Fresh Water)	☐1	☐2	☐3	☐4	☐5

Recreational Fishing (Salt Water)	☐1	☐2	☐3	☐4	☐5

Shellfishing / Crabbing	☐1	☐2	☐3	☐4	☐5

Scuba Diving / Snorkeling	☐1	☐2	☐3	☐4	☐5



The following questions ensure that all groups are fairly represented.
All answers are anonymous and confidential.

What is your age?	years

What is your gender?   Male    Female

What is the highest level of education that you have completed?

Less than high school	One or more years of college

High school or equivalent	Bachelor’s Degree

High school + technical school	Graduate Degree

How many people live in your household?	

How many of these people are 16 years of age or older? ____

How many of these people are 6 years of age or younger? ____

What is your zip code?	

What town and state do you live in? Town:			State:____

Are you currently employed?	Yes	No

What category comes closest to your total household income?

☐ Less than $10,000	☐ $60,000 to $79,999

☐ $10,000 to $19,999	☐ $80,000 to $99,999

☐ $20,000 to $39,999	☐ $100,000 to $249,999

☐ $40,000 to $59,999	☐ $250,000 or more



If you have any comments on this survey, please write them below:

Thank you for your participation in this important survey!

Attachment 2: Full Text of National Stated Preference Survey Component

Fish and Aquatic Habitat

A Survey of US Households

Human Activities, Aquatic Habitat and Fish

This survey asks for your opinions regarding policies that would affect
fish and habitat in the U.S.  Your answers will help the government
decide which policies will be enacted. Background information in this
survey was provided by the National Marine Fisheries Service, U.S.
Environmental Protection Agency, U.S. Geological Survey and other state
and federal offices.

U.S. fresh and salt waters support billions of fish. These include
fish that are used by humans, as well as forage fish that are not used
by humans, but serve as food for larger fish, birds, and animals. 

This survey concerns proposed policies that would reduce fish losses
caused by cooling water use by industrial facilities, including
factories and power plants. These policies would benefit aquatic
ecosystems but would increase the costs of some goods and services you
buy, including electricity and common household products.

How Does Cooling Water Affect Fish?

The water that industrial facilities use to cool equipment is pumped
from bays, rivers, and lakes. The largest amount is used by power plants
that produce electricity.

What Kinds of Fish are Affected?

Cooling water use is not the largest cause of fish loss in most areas
(fishing causes greater losses), but it has affected some fish
populations.

About 1/3 of the fish lost are species caught by commercial and
recreational fishermen. Examples include striped bass, flounder, and
cod.

The other 2/3 of the fish lost are forage species that are not caught
by humans but are part of the food web. Examples include killifish,
silverside, and stickleback.

Question 1. When thinking about how industrial facilities use cooling
water, please rate the importance of the following to you.  Check one
box for each.

	Not Important		Somewhat Important

Preventing the loss of fish that are caught by humans	☐1	☐2	☐3	☐4	☐5

Preventing the loss of fish that are not caught by humans	☐1	☐2	☐3	☐4	☐5

Maintaining the ecological health of rivers, lakes and bays	☐1	☐2	☐3	☐4	☐5

Keeping the cost of goods and services low	☐1	☐2	☐3	☐4	☐5

Making sure there is enough government regulation of industry	☐1	☐2	☐3	☐4	☐5

Making sure there is not too much government regulation of industry	☐1	☐2	☐3	☐4	☐5



How Many Fish are Affected?

After accounting for the number of eggs and larvae that would be
expected to survive to adulthood, scientists estimate that the
equivalent of about 2.7 billion young adult fish (equivalent to
one-year-old fish) are lost each year in U.S. coastal and fresh waters
due to cooling water use.

Scientists can predict the number of these fish that will be saved under
different policies.  This number ranges from 0.7 to 2.6 billion fish
saved per year. 

For commercial fish species, losses of young fish in cooling water
intakes vary by species, from the equivalent of less than 0.1% to about
10% of total populations.

Yearly effects vary among fish species within this range. For example,
the number of young fish lost in cooling water intakes relative to the
total number of fish in the water is relatively high for some species,
but low for others.

 

Although scientists can predict the number of fish saved each year, the
effect on fish populations is uncertain. This is because scientists do
not know the total number of all fish in U.S. waters and because many
factors – such as cooling water use, fishing, pollution and water
temperature – affect fish populations. 

New Regulations are Being Proposed to Protect Fish

Advanced filters and closed cycle cooling are already in use at many
facilities and are proven technologies. New regulations would require a
mix of advanced filters and closed cycle cooling at all
facilities—with reductions in fish losses between 25% and 95%.



How important are these issues to you?

While these policies would reduce fish losses, they would also increase
the costs of producing many goods and services — these costs would be
passed on to consumers like you.

Question 2. Compared to other issues that the government might
address—such as public safety, education and health—how important is
protecting aquatic ecosystems to you? Check one box. 

	Not Important		Somewhat Important

Protecting aquatic ecosystems is	☐1	☐2	☐3	☐4	☐5



The government needs to know whether households are willing to pay the
costs of these new policies. 

This survey will ask you to compare policies with different effects on
cooling water use, fish, and costs to your household. You will be asked
to vote for the options you prefer.

You will also have the opportunity to support the current situation,
with no new policies, and no new costs to your household.

This survey is similar to a public vote

The next part of this survey will ask you to consider different types of
policies to protect fish, and indicate how you would vote. Effects of
each possible policy will be described using the following scores:

Commercial Fish Populations

Fish Populations

Fish Saved

Condition of Aquatic Ecosystems

A score between 0 and 100 showing the ecological condition of affected
areas, compared to the most natural waters in the U.S. The score is
determined by many factors, including water quality and temperature,
the health of aquatic species, and habitat conditions. Higher scores
mean the area is more natural. The current score in U.S. waters is 48.

$

Cost per Year

How much the policy will cost your household, in unavoidable price
increases for products and services you buy, including electricity and
common household products.



How Would You Rate the Importance of These Effects?

Question 3. When considering policies that affect how facilities use
cooling water, how important to you are effects on each of the following
scores?  Check one box for each.  (For reminders of what the scores
mean, please see page 7).

	Not Important		Somewhat Important

Effect on commercial fish populations	☐1	☐2	☐3	☐4	☐5

Effect on the fish populations (for all fish)	☐1	☐2	☐3	☐4	☐5

Effect on fish saved	☐1	☐2	☐3	☐4	☐5

Effect on the condition of aquatic ecosystems	☐1	☐2	☐3	☐4	☐5

Effect on cost to my household	☐1	☐2	☐3	☐4	☐5



The next questions will ask you to choose between different policy
options that would affect fish losses in cooling water systems. You will
be given choices and asked to vote for the choice you prefer by checking
the appropriate box. Questions will look similar to the sample on the
next page. 

SAMPLE QUESTION

Questions will look like the sample below.

Policy Effect	Current Situation	Option A	Option B

Commercial Fish Populations (in 3-5 Years)	53%	55%	57%
(100% is populations that allow for maximum harvest)

Fish Populations, all fish (in 3-5 Years)	39%	42%	42%
(100% is populations without human influence)

Fish Saved per Year (out of 2.7 billion fish lost in water intakes)	0% (no change in status quo)	25% (0.7 billion fish saved)	50%

Condition of Aquatic Ecosystems (in 3-5 Years)	46%	46%	48%
(100% is pristine condition)

$ Increase in Cost of Living for Your Household	$0 (no cost increase)	$24 per year ($2 per month)	$48 per year ($4 per month)



HOW WOULD YOU VOTE?

☐ I would vote for	☐ I would vote for	☐ I would vote for

As you vote, please remember—

The map below shows the facilities and areas that would be affected by
the proposed policies.

The policy options (A and B) given to you each require a different mix
of advanced filters and closed cycle cooling in different areas, so
effects on fish are different.  

You will be shown different questions, with different combinations of
technology and different costs.

Depending on the policies chosen, costs to your household could range
from $0 per month to a maximum of $8 per month.

Depending on the type of technology required and other factors, effects
on fish and ecosystems may be different—even if the annual reduction
in fish losses is similar.

Consider each pair of policy options separately—do not add them up or
compare programs from different pages.

Scientists expect that effects on the environment and economy not shown
explicitly will be small. For example, studies of industry suggest that
effects on employment will be close to zero.

Your votes are important. Answer all questions as if this were a real,
binding vote.

Question 4. Assume that Options A and B would require a different mix of
filters and closed cycle cooling in different areas.  Assume all types
of fish are affected. How would you vote?

Policy Effect (US)	Current Situation (No policy)	Option A	Option B

Commercial Fish Populations (in 3-5 Years)	53%	55%	57%
(100% is populations that allow for maximum harvest)

Fish Populations, all fish (in 3-5 Years)	39%	40%	43%
(100% is populations without human influence)

Fish Saved per Year (out of 2.7 billion fish lost in water intakes)	0% (no change in status quo)	25% (0.7 billion fish saved)	50%

Condition of Aquatic Ecosystems (in 3-5 Years)	46%	47%	48%
(100% is pristine condition)

$ Increase in Cost of Living for Your Household	$0 (no cost increase)	$24 per year ($2 per month)	$48 per year ($4 per month)



HOW WOULD YOU VOTE?

(CHOOSE ONE ONLY)

☐ I would vote for NO POLICY	☐ I would vote for OPTION A	☐ I would vote for OPTION B

POLICIES COULD REQUIRE DIFFERENT COMBINATIONS OF TECHNOLOGY 

Now you will be asked to consider a new set of policy options for U.S.
waters.  As you vote, please remember—

Questions 5 and 6 present new sets of policy options. These options
require a different mix of technologies in different areas. 

Each question is a separate vote. Questions 5 and 6 cannot be directly
compared to each other, or to Question 4.

Do not add up effects or costs across different questions.

Policy costs and effects depend on many factors. Saving more fish does
not necessarily mean that all effects will improve.

Question 5. Assume that Options A and B would require a different mix
of filters and closed cycle cooling in different areas.  Assume all
types of fish are affected. How would you vote?

Policy Effect (US)	Current Situation (No policy)	Option A	Option B

Commercial Fish Populations (in 3-5 Years)	53%	55%	57%
(100% is populations that allow for maximum harvest)

Fish Populations, all fish (in 3-5 Years)	39%	41%	43%
(100% is populations without human influence)

Fish Saved per Year (out of 2.7 billion fish lost in water intakes)	0% (no change in status quo)	25% (0.7 billion fish saved)	75%

Condition of Aquatic Ecosystems (in 3-5 Years)	46%	46%	50%
(100% is pristine condition)

$ Increase in Cost of Living for Your Household	$0 (no cost increase)	$36 per year ($3 per month)	$60 per year ($5 per month)



HOW WOULD YOU VOTE?

(CHOOSE ONE ONLY)

( ) I would vote for NO POLICY

( ) I would vote for OPTION A

( ) I would vote for OPTION B



Question 6. Assume that Options A and B would require a different mix
of filters and closed cycle cooling in different areas.  Assume all
types of fish are affected. How would you vote?

Policy Effect	Current Situation (No policy)	Option A	Option B

[US map graphics omitted]

Commercial Fish Populations (in 3-5 years; 100% is populations that allow for maximum harvest)	53%	60%	57%

Fish Populations (all fish) (in 3-5 years; 100% is populations without human influence)	39%	47%	44%

Fish Saved per Year (out of 2.7 billion fish lost in water intakes)	0%, no change in status quo	95%, 2.6 billion fish saved	50%

Condition of Aquatic Ecosystems (in 3-5 years; 100% is pristine condition)	46%	50%	50%

Increase in Cost of Living for Your Household	$0, no cost increase	$96 per year ($8 per month)	$48 per year ($4 per month)



HOW WOULD YOU VOTE?

(CHOOSE ONE ONLY)

( ) I would vote for NO POLICY

( ) I would vote for OPTION A

( ) I would vote for OPTION B



Question 7.  If you always voted for NO POLICY in questions 4-6, what
was the primary reason?  Check one. (Skip this question if you voted for
Option A or B in any question above.)

____ The cost to my household was too high

____ Preventing fish losses is not important to me

____ I do not trust the government to fix the problem

____ I would rather spend my money on other things

____ I did not believe the choices were realistic

____ Since the problem was created by private facilities, they should
fix it without passing costs on to consumers

Question 8.  Indicate how strongly you agree with the following
statements about questions 4 - 6 and the information provided. Check one
box for each.

Question 9.  How much did the following factors affect your answers to
questions 4 – 6?  Check one box for each row.



[Rating grid omitted; response columns included "Moderate Effect" and "Very Large Effect."]

Question 10. How many days did you participate in the following during
the last year?  For trips longer than one day, please count each day
separately. Check one box for each row.

	Number of days you did the activity during the past year

	0	1-5	6-10	11-15	16+

Boating / Canoeing / Kayaking	(1)	(2)	(3)	(4)	(5)

Swimming / Going to the Beach	(1)	(2)	(3)	(4)	(5)

Recreational Fishing (Fresh Water)	(1)	(2)	(3)	(4)	(5)

Recreational Fishing (Salt Water)	(1)	(2)	(3)	(4)	(5)

Shellfishing / Crabbing	(1)	(2)	(3)	(4)	(5)

Scuba Diving / Snorkeling	(1)	(2)	(3)	(4)	(5)



The following questions ensure that all groups are fairly represented.
All answers are anonymous and confidential.

What is your age? ____ years

What is your gender?   Male    Female

What is the highest level of education that you have completed?

Less than high school	One or more years of college

High school or equivalent	Bachelor’s Degree

High school + technical school	Graduate Degree

How many people live in your household?	

How many of these people are 16 years of age or older? ____

How many of these people are 6 years of age or younger? ____

What is your zip code?	

What town and state do you live in? Town:			State:____

Are you currently employed?	Yes	No

What category comes closest to your total household income?

( ) Less than $10,000	( ) $60,000 to $79,999

( ) $10,000 to $19,999	( ) $80,000 to $99,999

( ) $20,000 to $39,999	( ) $100,000 to $249,999

( ) $40,000 to $59,999	( ) $250,000 or more



If you have any comments on this survey, please write them below:

Thank you for your participation in this important survey!

Attachment 3: Federal Register Notice

ENVIRONMENTAL PROTECTION AGENCY

[EPA-HQ-OW-2010-0595; FRL - ] 

Agency Information Collection Activities; Submission to OMB for Review
and Approval;  Willingness to Pay Survey for §316(b) Existing
Facilities Cooling Water Intake Structures (New), EPA ICR No. 2402.01,
OMB Control No. 2040-NEW 

AGENCY:	Environmental Protection Agency (EPA).

ACTION:	Notice.

SUMMARY:	In compliance with the Paperwork Reduction Act (PRA) (44 U.S.C.
3501 et seq.), this document announces that an Information Collection
Request (ICR) has been forwarded to the Office of Management and Budget
(OMB) for review and approval.  This is a request for a new collection. 
The ICR, which is abstracted below, describes the nature of the
information collection and its estimated burden and cost. 

DATES:  Additional comments may be submitted on or before [insert date
30 days after publication in the Federal Register].  

ADDRESSES:  Submit your comments, identified by Docket ID No.
EPA-HQ-OW-2010-0595 to: 

(1) EPA by one of the following methods:

www.regulations.gov: Follow the on-line instructions for submitting
comments.

Email:  OW-Docket@epa.gov, Attention Docket ID No. EPA-HQ-OW-2010-0595

Mail: Water Docket, Environmental Protection Agency, Mailcode: 28221T,
1200 Pennsylvania Ave., NW., Washington, DC 20460, Attention Docket ID
No. EPA-HQ-OW-2010-0595. Please include a total of 3 copies.

Hand Delivery: Water Docket, EPA Docket Center, EPA West, Room 3334,
1301 Constitution Ave., NW., Washington, DC, Attention Docket ID No.
EPA-HQ-OW-2010-0595. Such deliveries are only accepted during the
Docket’s normal hours of operation and special arrangements should be
made.

(2) OMB by mail to: Office of Information and Regulatory Affairs, Office
of Management and Budget (OMB), Attention: Desk Officer for EPA, 725
17th Street, NW, Washington, DC 20503.

FOR FURTHER INFORMATION CONTACT:  Erik Helm, Office of Water, Office of
Science and Technology, Engineering and Analysis Division, Economic and
Environmental Assessment Branch, 4303T, Environmental Protection Agency,
1200 Pennsylvania Ave., NW, Washington, DC 20460; telephone number:
202-566-1049; fax number: 202-566-1053; email address:
Helm.Erik@epa.gov.

SUPPLEMENTARY INFORMATION:  EPA has submitted the following ICR to OMB
for review and approval according to the procedures prescribed in 5 CFR
1320.12. On July 21, 2010 (75 FR 42438), EPA sought comments on this ICR
pursuant to 5 CFR 1320.8(d).  EPA received five comments during the
comment period, which are addressed in the ICR.  Any additional comments
on this ICR should be submitted to EPA and OMB within 30 days of this
notice.

	EPA has established a public docket for this ICR under Docket ID No.
EPA-HQ-OW-2010-0595 which is available for online viewing at
www.regulations.gov, or in person viewing at the Water Docket in the EPA
Docket Center (EPA/DC), EPA West, Room 3334, 1301 Constitution Ave., NW,
Washington, DC.  The EPA/DC Public Reading Room is open from 8:30 a.m.
to 4:30 p.m., Monday through Friday, excluding legal holidays. The
telephone number for the Reading Room is 202-566-1744, and the telephone
number for the Water Docket is 202-566-1752.

	Use EPA’s electronic docket and comment system at
www.regulations.gov, to submit or view public comments, access the index
listing of the contents of the docket, and to access those documents in
the docket that are available electronically.  Once in the system,
select “docket search,” then key in the docket ID number identified
above.   Please note that EPA’s policy is that public comments,
whether submitted electronically or in paper, will be made available for
public viewing at www.regulations.gov as EPA receives them and without
change, unless the comment contains copyrighted material, CBI, or other
information whose public disclosure is restricted by statute.  For
further information about the electronic docket, go to
www.regulations.gov. 

Title:  Willingness to Pay Survey for Section 316(b) Existing Facilities
Cooling Water Intake Structures: Instrument, Pre-test, and
Implementation 

ICR numbers: EPA ICR No. 2402.01, OMB Control No. 2040-NEW.  

ICR status:  This ICR is for a new information collection activity.  An
Agency may not conduct or sponsor, and a person is not required to
respond to, a collection of information, unless it displays a currently
valid OMB control number.  The OMB control numbers for EPA's regulations in title 40 of the CFR, after appearing in the Federal Register when approved, are listed in 40 CFR part 9 and are displayed either by publication in the Federal Register or by other appropriate means, such as on the related collection instrument or form, if applicable.  The display of OMB control numbers in certain EPA regulations is consolidated in 40 CFR part 9.

Abstract:   Section 316(b) of the Clean Water Act (CWA) requires EPA to
ensure that the location, design, construction, and capacity of cooling
water intake structures (CWIS) reflect the best technology available
(BTA) to protect aquatic organisms from being killed or injured by
impingement or entrainment.  At issue here is the regulation of existing steam electric and manufacturing facilities.

Under Executive Order 12866, EPA is required to estimate the potential benefits and costs to society of the proposed options for significant rules.  To assess the importance of the ecological gains from the
section 316(b) regulation, EPA requests approval from the OMB to conduct
a stated preference survey.  Data from the associated stated preference
survey will be used to estimate values (willingness to pay, or WTP)
derived by households for changes related to the reduction of fish
losses at CWIS, and to provide information to assist in the
interpretation and validation of survey responses.  EPA has designed the survey to provide data to support the following specific objectives: [a] to estimate the total values that individuals place on preventing losses of fish and other aquatic organisms caused by 316(b) facilities; [b] to understand how much individuals value preventing fish losses, increasing fish populations, and increasing commercial and recreational catch rates; [c] to understand how such values depend on the current baseline level of fish populations and fish losses, the scope of the change in those measures, and the certainty level of the predictions; and [d] to understand how such values vary with respect to individuals’ economic and demographic characteristics.

The key elicitation questions ask respondents whether or not they would
vote for policies that would increase their cost of living, in exchange
for specified multi-attribute changes in [a] impingement and entrainment
losses of fish, [b] commercial fish populations, [c] long-term
populations of all fish, and [d] condition of aquatic ecosystems.  The
respondents’ stated preferences with respect to levels of
environmental goods and cost to households, when used in conjunction
with other information collected in the survey on the use of the
affected aquatic resources, household income, and other demographics,
can be analyzed statistically (using a mixed logit framework) to
estimate total WTP for the quantified environmental benefits of the
316(b) rulemaking. Data analysis and interpretation are grounded in a standard random utility model.

The welfare values that can be derived from this stated preference
survey along with those that are estimated apart from the survey effort
will offer insight into the composition of the value people place on the
316(b) environmental impacts.  WTP estimates derived from the survey may overlap, to a potentially substantial extent, with estimates produced by other valuation methods. Therefore, particular care will
be given to avoid any possible double counting of values that might be
derived from alternative valuation methods.

Burden Statement: The annual public reporting and recordkeeping burden
for this collection of information is estimated to average 5 minutes per
telephone screening participant and 30 minutes per mail survey
respondent including the time necessary to complete and mail back the
questionnaire. Burden means the total time, effort, or financial
resources expended by persons to generate, maintain, retain, or disclose
or provide information to or for a Federal agency.  This includes the
time needed to review instructions; develop, acquire, install, and
utilize technology and systems for the purposes of collecting,
validating, and verifying information, processing and maintaining
information, and disclosing and providing information; adjust the
existing ways to comply with any previously applicable instructions and
requirements which have subsequently changed; train personnel to be able
to respond to a collection of information; search data sources; complete
and review the collection of information; and transmit or otherwise
disclose the information.   

	The ICR provides a detailed explanation of the Agency’s estimate,
which is only briefly summarized here:

	Respondents:  Individuals from US households

Estimated total number of potential respondents: 9,533 for telephone
screening and 2,288 for mailed questionnaires.

	Frequency of response:  one-time response.

	Estimated total average number of responses for each respondent: 
one-time response.

	Estimated total burden hours: 1,938 hours.

Estimated total costs: $ 39,583.  EPA estimates that there will be no
capital and operating and maintenance cost burden to respondents.

Dated: ________________      

__________________________________

John Moses, Director, Collection Strategies Division. 

Attachment 4: Description of Statistical Survey Design

The following represents an anticipated experimental design for survey
implementation, along with the associated number of completed surveys
that will be required. Part B of this supporting statement provides
detail on the sampling design. The proposed design and sampling plan is
based on standard design and sampling theory for choice experiments and
population surveys, as outlined by Louviere et al. (2000), Kuhfeld
(2009) and Dillman (2000).  EPA notes that the anticipated experimental
design described here is preliminary and may be subject to refinement during design evaluations, for example to address dominant or dominated pairs, to ensure ecological feasibility, and to remove attribute combinations that do not provide information for estimation.

The purpose of the 316(b) survey is to calculate average per household
parameters (e.g., willingness to pay and choice probabilities) within a
given survey population. No sub-population estimates are required. The
anticipated experimental design for the choice experiment includes two
multi-attribute choice options or alternatives, A and B, together with a
fixed status quo or “no policy” option.  Options A and B are
characterized by levels for the following six attributes:

Location in A and B (x1A; x1B) – 2 possible levels

Fish Saved per Year in A and B (x2A; x2B) – 3 possible levels

Commercial Fish Populations in A and B (x3A; x3B) – 3 possible levels

Fish Populations (all fish) in A and B (x4A; x4B) – 3 possible levels

Aquatic Ecosystem Condition in A and B (x5A; x5B) – 3 possible levels

Cost in A and B (x6A; x6B) -  6 possible levels

This implies an experimental design characterized by [2 × 3^4 × 6] for each alternative, or [2^2 × 3^8 × 6^2] for alternatives A and B combined.
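The dimension arithmetic above can be checked directly. The following sketch (not part of the design software) simply multiplies the attribute level counts listed in the text:

```python
# Illustrative check of the full factorial design dimensions.
# Attribute level counts are taken from the list in the text above.
levels = {"location": 2, "fish_saved": 3, "commercial_pop": 3,
          "all_fish_pop": 3, "ecosystem": 3, "cost": 6}

per_alternative = 1
for n in levels.values():
    per_alternative *= n          # 2 x 3^4 x 6 = 972 profiles per alternative

combined = per_alternative ** 2   # alternatives A and B jointly: 2^2 x 3^8 x 6^2

print(per_alternative, combined)  # 972 944784
```

The 72-profile main effects plan described next is thus a tiny fraction of the 944,784-profile full factorial.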

To construct a preliminary main effects design that is sufficiently
flexible to estimate alternative specific main effects and response
patterns (i.e., a non-generic design), we begin with the smallest
available 100% efficient linear main effects plan for a full 2^2 × 3^8 × 6^2 design. This treats each attribute of each alternative as a separate
design element.  The result is a design with 72 profiles, with
attributes labeled following the above notation, and levels indicated by
integers 1...N, where N for each attribute is the number of levels
identified above:



Table 1: Set of 72 Design Profiles

Obs	x1A	x2A	x3A	x4A	x5A	x6A	x1B	x2B	x3B	x4B	x5B	x6B

1	1	1	1	1	1	1	1	1	1	1	1	1

2	1	1	1	1	2	2	2	2	2	2	2	2

3	1	1	1	3	1	6	2	2	3	3	1	3

4	1	1	1	3	3	5	2	1	1	2	3	2

5	1	1	2	2	1	3	1	1	3	2	2	3

6	1	1	2	2	2	1	2	3	1	1	2	4

7	1	1	2	2	3	4	1	2	2	2	1	1

8	1	1	2	2	3	6	1	3	2	3	3	6

9	1	1	3	1	1	2	1	3	2	3	2	5

10	1	1	3	1	2	3	2	1	3	1	3	6

11	1	1	3	3	2	4	1	2	1	3	3	4

12	1	1	3	3	3	5	2	3	3	1	1	5

13	1	2	1	1	2	4	2	3	3	2	1	6

14	1	2	1	2	2	5	1	3	3	3	2	1

15	1	2	1	2	3	2	1	3	1	1	3	3

16	1	2	1	3	1	3	2	3	2	2	3	4

17	1	2	2	1	1	5	2	1	2	3	1	4

18	1	2	2	1	3	1	2	2	3	2	3	5

19	1	2	2	3	1	2	1	2	1	1	1	6

20	1	2	2	3	2	6	1	1	1	2	2	5

21	1	2	3	1	3	4	1	1	1	3	2	3

22	1	2	3	2	1	1	1	2	3	3	3	2

23	1	2	3	2	2	6	1	1	2	1	1	2

24	1	2	3	3	3	3	1	2	2	1	2	1

25	1	3	1	1	3	6	2	2	3	1	2	4

26	1	3	1	2	1	4	2	1	2	1	3	5

27	1	3	1	2	2	3	1	2	1	3	1	5

28	1	3	1	3	3	1	1	1	2	3	2	6

29	1	3	2	1	2	5	2	2	2	1	3	3

30	1	3	2	1	3	3	2	3	1	3	1	2

31	1	3	2	3	1	4	2	3	3	1	2	2

32	1	3	2	3	2	2	2	1	3	3	3	1

33	1	3	3	1	1	6	1	3	1	2	3	1

34	1	3	3	2	1	5	2	2	1	2	2	6

35	1	3	3	2	3	2	1	1	3	2	1	4

36	1	3	3	3	2	1	2	3	2	2	1	3

37	2	1	1	1	1	1	2	1	1	1	1	1

38	2	1	1	1	2	2	1	2	2	2	2	2

39	2	1	1	3	1	6	1	2	3	3	1	3

40	2	1	1	3	3	5	1	1	1	2	3	2

41	2	1	2	2	1	3	2	1	3	2	2	3

42	2	1	2	2	2	1	1	3	1	1	2	4

43	2	1	2	2	3	4	2	2	2	2	1	1

44	2	1	2	2	3	6	2	3	2	3	3	6

45	2	1	3	1	1	2	2	3	2	3	2	5

46	2	1	3	1	2	3	1	1	3	1	3	6

47	2	1	3	3	2	4	2	2	1	3	3	4

48	2	1	3	3	3	5	1	3	3	1	1	5

49	2	2	1	1	2	4	1	3	3	2	1	6

50	2	2	1	2	2	5	2	3	3	3	2	1

51	2	2	1	2	3	2	2	3	1	1	3	3

52	2	2	1	3	1	3	1	3	2	2	3	4

53	2	2	2	1	1	5	1	1	2	3	1	4

54	2	2	2	1	3	1	1	2	3	2	3	5

55	2	2	2	3	1	2	2	2	1	1	1	6

56	2	2	2	3	2	6	2	1	1	2	2	5

57	2	2	3	1	3	4	2	1	1	3	2	3

58	2	2	3	2	1	1	2	2	3	3	3	2

59	2	2	3	2	2	6	2	1	2	1	1	2

60	2	2	3	3	3	3	2	2	2	1	2	1

61	2	3	1	1	3	6	1	2	3	1	2	4

62	2	3	1	2	1	4	1	1	2	1	3	5

63	2	3	1	2	2	3	2	2	1	3	1	5

64	2	3	1	3	3	1	2	1	2	3	2	6

65	2	3	2	1	2	5	1	2	2	1	3	3

66	2	3	2	1	3	3	1	3	1	3	1	2

67	2	3	2	3	1	4	1	3	3	1	2	2

68	2	3	2	3	2	2	1	1	3	3	3	1

69	2	3	3	1	1	6	2	3	1	2	3	1

70	2	3	3	2	1	5	1	2	1	2	2	6

71	2	3	3	2	3	2	2	1	3	2	1	4

72	2	3	3	3	2	1	1	3	2	2	1	3



Following common examples in the environmental economics literature, we
anticipate three choice questions per survey.  This will allow the 72
profiles to be included (orthogonally blocked) in 24 unique survey
booklets. Monte Carlo experiments indicate that approximately 6
completed responses are required for each profile in order to achieve
large sample statistical properties for choice experiments (Louviere et
al. 2000, p. 104, citing Bunch and Batsell 1989).  Following this
guidance, the above design will require 24×6 = 144 completed surveys,
or 6 completed surveys for each unique survey booklet.  This will
provide a total of 432 profile responses.
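The blocking arithmetic above can be sketched as a few lines of code; the counts mirror the text exactly:

```python
# Check of the blocking arithmetic: 72 profiles, 3 choice questions per
# booklet, 6 completed surveys per booklet (per Louviere et al. 2000).
profiles = 72
questions_per_booklet = 3
completes_per_booklet = 6

booklets = profiles // questions_per_booklet           # 24 unique booklets
completed_surveys = booklets * completes_per_booklet   # 144 completed surveys
profile_responses = completed_surveys * questions_per_booklet  # 432 responses
```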

Sample Sizes for Maximum Acceptable Sampling Error

The goal of the choice experiment is to estimate regression coefficients
from mixed or conditional logit models that may be used to estimate
willingness to pay for multi-attribute policy alternatives, or the
likelihood of choosing a given multi-attribute alternative, following
standard random utility modeling procedures (Haab and McConnell 2002).
Required sample sizes to estimate population parameters with given
degrees of freedom are drawn from standard statistical theory, as
described by Dillman (2000).
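In the random utility framework, marginal WTP for an attribute is the negative ratio of that attribute's coefficient to the cost coefficient. The sketch below illustrates the computation with invented coefficient values; these are not estimates from this survey, and the actual analysis would use mixed logit estimates:

```python
# Hypothetical illustration of recovering WTP from logit coefficients:
# WTP_k = -beta_k / beta_cost.  All numbers are invented for illustration.
beta = {"fish_saved_share": 1.8,   # utility per unit change in the attribute
        "ecosystem_share": 1.2,
        "cost": -0.04}             # utility per dollar of annual cost

wtp = {k: -v / beta["cost"] for k, v in beta.items() if k != "cost"}
# e.g. wtp["fish_saved_share"] == 45.0 dollars per unit change per year
```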

The maximum acceptable sampling error for predicting response
probabilities (the likelihood of choosing a given alternative) in the
present case is ±10%, assuming a true response probability of 50%
associated with a utility indifference point.  Given the survey
population size, this level of precision requires a minimum sample size
of approximately 96 observations. The number of observations (completed surveys) required to obtain large sample properties for the choice experiment design provides more than enough observations to achieve this required precision for population parameters.  The choice
experiment design requires 432 completed profile responses from 144
surveys. Statistical theory shows that 400 assumed-independent profile
responses will allow a 50% choice probability to be estimated within
±4.9 percent, at a 95% confidence level. With only a single choice
question per survey (144 responses), the same probability and confidence
level can be obtained within approximately ±8.2  percent; this is also
within acceptable limits.
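The precision figures quoted above follow from the standard normal-approximation margin of error for a proportion; a minimal check:

```python
import math

# Margin of error for an estimated choice probability p at confidence z,
# matching the +/-4.9% (n=400) and +/-8.2% (n=144) figures in the text.
def margin(n, p=0.5, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

print(round(margin(400), 3))   # 0.049, i.e. +/-4.9 percent
print(round(margin(144), 3))   # 0.082, i.e. +/-8.2 percent

# Minimum n for a +/-10% margin at p = 0.5: n = z^2 p(1-p) / e^2
n_min = (1.96 ** 2) * 0.25 / 0.10 ** 2   # about 96 observations
```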

Experimental Design Allowing Two-Way Interactions with Location (x1A;
x1B)

In order to estimate a parallel design that allows for interactions with
policy location, and insulates main effects from these interactions, we
follow the suggested design strategies of Louviere et al. (2000). 
First, we estimate the smallest available 100% efficient linear main
effects plan for non-location factors (X2 – X6) across both
alternatives A and B, [3^8 × 6^2].  This results in a main effects design
with 36 profiles:

Table 2: Set of 36 Design Profiles

Obs	x2A	x3A	x4A	x5A	x6A	x2B	x3B	x4B	x5B	x6B

1	1	1	1	1	1	1	1	1	1	1

2	1	1	2	2	2	2	2	2	1	2

3	1	1	2	3	4	3	2	1	1	6

4	1	1	3	2	6	3	1	2	2	4

5	1	2	1	1	4	2	1	3	2	5

6	1	2	2	2	3	1	3	1	3	5

7	1	2	2	3	5	3	3	2	3	1

8	1	2	3	3	2	1	1	3	1	3

9	1	3	1	2	6	3	3	1	2	3

10	1	3	1	3	3	2	2	3	3	4

11	1	3	3	1	1	2	3	2	3	6

12	1	3	3	1	5	1	2	3	2	2

13	2	1	1	1	5	2	3	1	1	4

14	2	1	2	2	5	2	1	3	3	3

15	2	1	3	2	1	3	2	3	3	5

16	2	1	3	3	3	1	3	1	2	2

17	2	2	1	1	3	3	2	2	1	3

18	2	2	2	3	1	1	1	2	2	4

19	2	2	3	2	4	2	2	1	2	1

20	2	2	3	3	6	2	3	3	1	6

21	2	3	1	2	2	1	1	1	3	6

22	2	3	1	3	4	3	1	2	3	2

23	2	3	2	1	2	3	3	3	2	1

24	2	3	2	1	6	1	2	2	1	5

25	3	1	1	3	2	2	3	2	2	5

26	3	1	1	3	6	1	2	3	3	1

27	3	1	2	1	3	3	1	3	2	6

28	3	1	3	1	4	1	3	2	3	3

29	3	2	1	2	1	3	3	3	1	2

30	3	2	1	2	5	1	2	2	2	6

31	3	2	2	1	6	2	1	1	3	2

32	3	2	3	1	2	3	2	1	3	4

33	3	3	2	2	4	1	3	3	1	4

34	3	3	2	3	1	2	2	1	2	3

35	3	3	3	2	3	2	1	2	1	1

36	3	3	3	3	5	3	1	1	1	5



This design is then repeated for each of the four possible combinations
of values for x1A and x1B (i.e., 1,1; 2,2; 1,2; 2,1) creating a
candidate design of 36×4 = 144 profiles.  The resulting design fully
insulates all main effects (eliminates aliasing) from interactions with
location attributes x1A and x1B, and allows associated location
interactions to be estimated independent of main effects.  The number of
required survey versions (48) is calculated by dividing the 144 profiles
by the three choice questions per survey.  The total number of completed
surveys required (288) is based on a requirement of six completed
surveys per survey version (48) to estimate the main effects and
interactions.  As above, the number of observations (completed surveys)
required to obtain large sample properties for the choice experiment
design provides sufficient observations to obtain the required precision for
population parameters.  Also as above, this candidate design may be
subject to refinements based on design evaluations, as described by
Louviere et al. (2000) and Kuhfeld (2009), but this is unlikely to alter
design dimensions.
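The counts in the interaction design can likewise be verified in a few lines; the figures match the text exactly:

```python
# Check of the interaction-design arithmetic: the 36-profile main effects
# plan is repeated for each of the four (x1A, x1B) location combinations.
base_profiles = 36
location_combos = [(1, 1), (2, 2), (1, 2), (2, 1)]

candidate_profiles = base_profiles * len(location_combos)  # 144 profiles
survey_versions = candidate_profiles // 3                  # 48 versions
completed_surveys = survey_versions * 6                    # 288 surveys
```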

Attachment 5: Description of Internet-based Sampling Alternative

EPA is also considering the use of an internet-based (i.e., web-based)
sampling approach as an alternative to the mail survey described in Part
B of the supporting statement.  The survey content, pre-testing, review
process, and data analysis would be identical to the mail survey
described in Parts A and B of the supporting statement.  To support
implementation of the web-based alternative, EPA would retain a vendor
that maintains a preexisting representative national panel.  This
attachment provides a brief summary of the web-based alternative,
focusing on differences from the mail survey approach described in Part
B of the supporting statement.

Sampling Frame

The sampling frame for the web-based survey is the panel of individuals
previously recruited by a vendor to participate in online surveys.  The
overall sampling frame would be the set of all individuals in
continental U.S. households who are 18 years of age or older and who
have listed phone numbers.  Individuals in the panel would most likely be recruited using a list-assisted random digit dialing telephone
methodology, thus providing a probability-based starting sample of U.S.
telephone households.  Panels of this type are routinely supplemented to account for panel attrition.  The preexisting panel would have to closely track the U.S. population on age, race, Hispanic
ethnicity, geographical region, employment status, and other demographic
elements.  If differences between the panel and the U.S. population do
exist, sample weights using U.S. Census demographic benchmarks could be
developed to reduce error due to non-coverage of non-telephone
households and to reduce bias due to non-response and other non-sampling
errors.   
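Such sample weights are typically simple post-stratification ratios: each respondent in a demographic cell is weighted by the cell's population share divided by its sample share. The cell shares below are invented for illustration, not Census figures:

```python
# Sketch of post-stratification weights: weight = pop share / sample share.
# The age-group shares here are hypothetical, for illustration only.
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}
sample_share     = {"18-34": 0.20, "35-54": 0.35, "55+": 0.45}

weights = {cell: population_share[cell] / sample_share[cell]
           for cell in population_share}
# Under-covered cells get weights above 1 (e.g. 18-34 -> 1.5);
# over-covered cells get weights below 1.
```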

Sample Size

The intention is to obtain 2,288 completed web-based surveys from the
preexisting panel.  The same geographically stratified design will be implemented, with the same regions and sample sizes as described in Part B of the supporting statement under the mail survey
approach.  Within each region, the sample will be stratified by
demographic variables including age, education, Hispanic ethnicity,
race, gender, and household income.  The actual number of panel members
contacted for this survey will be determined by the vendor’s knowledge
of within-panel response rates.  Most likely, the stratification design used in the web-based survey will systematically oversample groups that
tend to have lower response and consistency rates so that the
demographic characteristics of respondents who provide valid completed
surveys will mirror the Census demographic benchmarks more closely. 
This potential oversampling correction technique will reduce error due
to non-coverage of non-telephone households in the original panel member
recruitment process, and will also reduce bias due to non-response and
other non-sampling errors.

Survey Response

The target response rate for the survey among panel members that are
sampled is 80 percent.  However, the actual response rate compared to
the general population is lower, given that the telephone recruitment
process for a preexisting panel has an expected response rate in the 30
to 40 percent range.  This will produce a lower cumulative response
rate.  Panel members who are selected for the survey will be contacted
to inform them that the survey is available online and reminders will be
sent to panel members who do not complete the survey within a short
period of time.  
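The cumulative response rate is the product of the stage-wise rates. Taking the midpoint of the 30 to 40 percent recruitment range quoted above (an assumption for illustration) together with the 80 percent within-panel target:

```python
# Cumulative response rate = recruitment rate x within-panel survey rate.
recruitment_rate = 0.35   # assumed midpoint of the 30-40% range in the text
survey_rate = 0.80        # within-panel target stated in the text

cumulative = recruitment_rate * survey_rate   # about 0.28, i.e. 28 percent
```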

Although the cumulative response rate for this survey may be low
compared to response rates typically achieved using other survey methods
(e.g., phone and mail surveys), the web-based format facilitates
geographic and demographic stratification, which will substantially
improve the quality of the survey.

Non-response and Other Non-Sampling Error Corrections

Approaches to minimize non-response bias in internet surveys are
established in the literature (Dillman 2000) and include initial RDD
probability sampling, careful panel management, and effective survey
sampling procedures.  The representativeness of a preexisting panel survey sample is generally comparable to that of telephone surveys sponsored by various Federal agencies.  Nonetheless, EPA would conduct several
additional activities to minimize the potential for non-response bias if
the survey were implemented through the use of a preexisting panel, including: (1) use of a stratified survey sample to mitigate the effect of non-response bias, (2) use of non-response follow-up interviews to increase the overall response rate, and (3) statistical correction for unobserved heterogeneity that might be present in the data from respondents interviewed for this study.

Non-response Interviews

In order to minimize non-response bias from individuals who chose not to
respond during some stage of panel recruitment, EPA will conduct
non-response follow-up interviews within the initial sample for all
stages of panel attrition.  Respondents will include individuals who
were contacted but did not join the preexisting panel, who agreed to
join the panel but did not connect to the vendor’s website, who
connected but did not complete the first survey, who completed the first
survey but did not complete any following surveys, and who completed
following surveys but did not complete the 316(b) survey.  The end
result of this effort will be a direct measurement of non-response bias
and EPA will use the information to increase the effective cumulative
response rate by using a weighted response rate formulation.  

Statistical Correction for Unobserved Heterogeneity

EPA is also considering the use of a statistical technique to correct
for unobserved heterogeneity, based on work by Heckman (1979). 
This correction would be conducted as a complement to a non-response
study effort.  The technique will provide the probabilities of
participation for each subpopulation grouping (as defined by Census long
form data, voting data, and other data sources) included in the initial
sample frame.  These probabilities of participation will demonstrate the
extent of self-selection bias in the actual survey sample.  The
selection correction technique will take fully into account each stage
of the panel recruitment and sampling process: that is, all stages in
between construction of the sample frame for panel recruitment and
interviewing of the panel respondents.  

The technique could be used to produce both adjusted and unadjusted
estimates of WTP.  Because the technique yields regression estimates
that explain survey outcomes using subpopulation data related to
neighborhood characteristics, it can be used to identify the sources of
differences between the adjusted and unadjusted estimates.

As described above, EPA will use statistical techniques to correct for
unobserved heterogeneity in the survey sample and non-response follow-up
interviews to increase the effective survey response rate.  Combined,
these techniques will reduce error due to non-coverage of non-telephone
households in the original panel recruitment process, and also reduce
bias due to non-response and other non-sampling errors.  For an example
of a similar research effort that successfully used this web-based
survey approach to evaluate WTP for changes in water quality, see
Viscusi et al. (2008). 

Data Preparation

Since the survey will be administered as an online survey, survey
responses will be automatically entered into an electronic database at
the time they are submitted.  After all responses are submitted, the
database contents will be converted into a format suitable for use in
EPA’s statistical analysis. 

Attachment 6: Telephone Screener Script

Fish and Aquatic Habitat Telephone Screener

Hello, this is _________________ calling from Abt Associates.  We are
conducting an important survey of U.S. residents for the Environmental
Protection Agency, or EPA. This is not a sales call. 

SL1	[ASK IF SAMPLE=LANDLINE]

Could you please tell me how many people, age 18 and older, live in this
household?

IF NEEDED:  This study will help us to better understand the value of
environmental protection and public programs. Any answers you give are
kept strictly confidential.

IF ASKED:  This is a short survey which should take no more than five
minutes. 

0	None					THANK AND SCREEN OUT

1	One					

__  Number of persons 18+		SKIP TO SL1c		

98	Refused (VOL)			THANK AND TERMINATE – Soft Refusal

99	Refused (VOL)			THANK AND TERMINATE – Hard Refusal

SL1b	[ASK IF SAMPLE=LANDLINE]

May I speak with that person?

1	Rspn on line		SKIP TO SL2

2	Rspn called to phone	SKIP TO SL1d

3	Rspn unavailable		SCHEDULE CALLBACK

8	Refused			THANK AND TERMINATE – Soft Refusal

9	Refused  			THANK AND TERMINATE – Hard Refusal

SL1c	[ASK IF SAMPLE=LANDLINE]

In order to select just one person to interview, may I please speak to
the person in your household, age 18 or older, who (has had the most
recent/will have the next) birthday?

1	Rspn on line		SKIP TO SL2

2	Rspn called to phone		

3	Rspn unavailable		SCHEDULE CALLBACK

8	Refused			THANK AND TERMINATE – Soft Refusal

9	Refused			THANK AND TERMINATE – Hard Refusal

SL1d	[ASK IF SAMPLE=LANDLINE]

Hello, this is _________________ calling from Abt Associates.  We are
conducting an important survey of U.S. residents for the Environmental
Protection Agency, or EPA.  Could we begin now?

IF ASKED:  This is a short survey which should take no more than five
minutes. 

1	Yes			

2	No, no time			SCHEDULE CALLBACK

8	Refused			THANK AND TERMINATE – Soft Refusal

9	Refused  			THANK AND TERMINATE – Hard Refusal

SL2	[ASK IF SAMPLE=LANDLINE]

Do you have a cell phone in addition to the line on which we’re
speaking right now?

1	Yes, also have cell phone

2	No, this is only phone		SKIP TO SA2

8	(VOL) Don’t know			THANK AND END, screen out

9	Refused				THANK AND END, soft refusal

SC1	[ASK IF SAMPLE = CELL]

Are you in a safe place to talk right now?

IF NEEDED: Are you driving?  [IF RESPONDENT ANSWERS YES, DRIVING,
SCHEDULE CALLBACK AND TERMINATE - CODE AS 2]

IF ASKED:  This is a short survey which should take no more than five
minutes. 

1	Yes, safe place to talk					

2	No, call me later			SCHEDULE CALLBACK

3	No, CB on land-line			RECORD NUMBER, schedule call back

4	Cell phone for business only	THANK & END - BUSINESS#

8	Refused				THANK AND TERMINATE – Soft Refusal

9	Refused				THANK AND TERMINATE – Hard Refusal

SC2	[ASK IF SAMPLE=CELL]

Are you at least 18 years old?

IF NEEDED:  This study will help us to better understand the value of
environmental protection and public programs. Any answers you give are
kept strictly confidential.

1	Yes					

2	Yes, no time				SCHEDULE CALLBACK

3	No					SCREEN OUT

8	Refused				THANK AND TERMINATE – Soft Refusal

9	Refused				THANK AND TERMINATE – Hard Refusal

SC3	[ASK IF SAMPLE=CELL]

Do any other people age 18 or older regularly ANSWER your cell phone, or
just you?

INTERVIEWER: This question refers to the physical phone and not to their
calling plan.

1	Yes, others

2	No, just respondent			SKIP TO SC4

9	Don’t know/Refused		SKIP TO SC4

SC3b	[ASK IF SAMPLE=CELL]

How many other people age 18 or older regularly answer your cell phone? 

	

__	[ENTER NUMBER 1-10]

99	Don’t know/Refused

SC4	[ASK IF SAMPLE=CELL]

Not counting any that are used strictly for business purposes, are there
other cell phones that you use regularly, or is it just the one? 

1	Yes, use other cell phones

2	No					SKIP TO SC5

9	Don’t know/Refused		SKIP TO SC5

SC4b	[ASK IF SAMPLE=CELL]

How many other cell phones do you use regularly, excluding those used
only for business purposes? 

__	[ENTER NUMBER 1-10]

99	Don’t know/Refused

 

SC5	[ASK IF SAMPLE=CELL]

Not counting (this/these) cell phone(s), do you also have a regular
land-line phone at home?

1	Yes, has a regular phone at home	

2	No, cell is only phone		SKIP TO SA2

7	Don’t know (VOL)			THANK AND TERMINATE – Screen out

8	Refused (VOL)			THANK AND TERMINATE – Soft Refusal

9	Refused (VOL)			THANK AND TERMINATE – Hard Refusal

SA1	[ASK BOTH SAMPLES]

Of all of the phone calls that you or your family receive, are… (READ
LIST) 

1	all or almost all calls received on cell phones,

2	some received on cell phones and some received on land lines, or 

3	very few or none on cell phones.

8	(VOL) Don’t know

9	(VOL) Refused 

SA2	[ASK BOTH SAMPLES]

Record gender from observation.  (ASK ONLY IF NECESSARY) 

1	Male

2	Female

Q1	[ASK BOTH SAMPLES]

First, compared to other issues that the government might address, such
as public safety, education, and health, how important is protecting
aquatic ecosystems to you, on a scale of 1 to 5, where 1 is “not
important” and 5 is “very important”? (MATCHES QUESTION 2 IN THE
MAIL SURVEY) 

1     2     3     4     5	(VOL) Don’t Know	(VOL) Refused

Q2	[ASK BOTH SAMPLES]

People have ideas about the extent to which the government should be
involved in protecting the environment.  On a scale of 1 to 5 where 1 is
“not at all involved” and 5 is “highly involved”, how involved
do you think the government should be in environmental protection? 

1     2     3     4     5	(VOL) Don’t Know	(VOL) Refused



Q3	[ASK BOTH SAMPLES]

Could you please tell me if you participated in each of the following
activities during the last year? (DO NOT ROTATE)

1	Yes

2	No

8	(VOL) Don’t Know

9	(VOL) Refused

Boating, canoeing, or kayaking		1	2	8	9

Swimming/going to the beach		1	2	8	9

Freshwater recreational fishing 	1	2	8	9

Saltwater recreational fishing		1	2	8	9

Shellfishing or crabbing		1	2	8	9

Scuba diving or snorkeling		1	2	8	9

Now, I have just a few questions for classification purposes. 

D1	[ASK BOTH SAMPLES]

What is your age?

________ Years        Don’t Know       Refused

D2	[ASK BOTH SAMPLES]

What is the highest level of education that you have completed?  Is
it… (READ LIST)

1	Less than high school

2	High school or equivalent

3	High school and technical school

4	One or more years of college

5	Bachelor’s degree

6	Graduate degree

D3	[ASK BOTH SAMPLES]

Including everyone living in your household, which of the following
categories best describes your total household income before taxes?  Is
it … (READ LIST)

1	$10,000 or less,

2	Between $10,001 and $20,000,

3	Between $20,001 and $35,000,

4	Between $35,001 and $50,000,

5	Between $50,001 and $75,000,

6	Between $75,001 and $100,000, or

7	More than $100,000

8	(VOL) Don’t know

9	(VOL) Refused

D4	[ASK BOTH SAMPLES]

Are you of Hispanic or Latino origin?

1	Yes

2	No

8	(VOL) Don’t know

9	(VOL) Refused

D5	[ASK BOTH SAMPLES]

Which of the following racial categories describes you?  You may select
more than one.  Would it be… (READ LIST – MULTIPLE RECORD)

1	American Indian or Alaskan Native,

2	Asian,

3	Black or African American,

4	Native Hawaiian or Other Pacific Islander, or

5	White

6	(VOL) Hispanic / Latino

8	(VOL) Other

9	(VOL) Refused

D6	[ASK BOTH SAMPLES]

Do you… (READ LIST)

1	Rent your home or apartment

2	Own your own home

3	Live with family or friends and pay part of the rent or mortgage

4	Live with family or friends and do not pay rent

7 	(VOL) Other, Specify

8	(VOL) Don’t know

9	(VOL) Refused

RECRUITMENT:

In addition to this telephone survey, we are also conducting a mail
survey to help U.S. policymakers better understand your priorities
for the environment and the use of public funds.  This survey will be
mailed to households along with a free reply envelope.

The survey takes most people about 30 minutes to complete.

I1	[ASK BOTH SAMPLES]

Would you be willing to participate in this mail survey?

	

1	Yes

2	No			THANK FOR PARTICIPATION AND TERMINATE

3	Don’t know		THANK FOR PARTICIPATION AND TERMINATE

4	Refused		THANK FOR PARTICIPATION AND TERMINATE

I2	[ASK BOTH SAMPLES]

To what name and address should we mail the survey?  (RECORD NAME,
ADDRESS, CITY, STATE, AND ZIP CODE AND REPEAT BACK)

I3	[ASK BOTH SAMPLES]

And is this the best number to reach you if we have a problem
mailing your survey?

	

1	Yes			SKIP TO CLOSING

2	No			

	

I4	[ASK BOTH SAMPLES]

What is the best number to reach you? (RECORD NUMBER AND REPEAT BACK)

CLOSING:

Thank you.  The survey will be sent to you in approximately one to two
weeks.  The packet will include information about the study, how to
return your survey, and additional contact information in case you have
questions.  Thank you very much for your time, and have a great
evening/day.

Attachment 7: Cover Letter to Mail Survey Recipients

<date>

<given name> <surname> 

<address>

<city>, <state>  <zip code>-<zip+4>

Dear <title> <surname>:

A few days ago you agreed to take a short survey regarding environmental
protection and government regulations in the Northeast U.S.  Thank you
for agreeing to participate; the survey booklet is enclosed with this
letter.  Your answers to the survey will help officials from the
Environmental Protection Agency (EPA) better understand the value of
policies that would affect the future of fish and aquatic habitat in
the Northeast.  By filling out this survey, you will be participating in
an important study that will help government officials understand your
priorities for the environment and regulations.

Your responses to this survey are very important.  Over time, human
activities have caused many changes in the Northeast’s rivers, streams,
and bays.  The Environmental Protection Agency is considering policies
that could affect the quality of fish and aquatic habitat in these
areas.  These policies can have different effects and costs.  Because of
this, it is important to know what types of policies are supported by
Northeast residents.

All answers to the survey are confidential. Once we have received your
survey, we will delete your name from all lists, so that your responses
can never be traced back to you.  Of course, your participation is
voluntary and you may refuse to answer any or all questions.  

 

We hope that you find this survey important and interesting, and we
thank you for your assistance with this project.  We would greatly
appreciate it if you could return the survey in the near future.

Sincerely,

Mary T. Smith, Director	

U. S. Environmental Protection Agency

Engineering and Analysis Division

Attachment 8: Post Card Reminder to Mail Survey Recipients

FRONT

BACK

The stated preference format was chosen because it is the only method
that allows the estimation of non-use values, which are potentially
significant in this case.

 Non-user values are by definition non-use values.  Users can also hold
non-use values.  However, users’ non-use values may differ from
non-users’ values due to familiarity with the resource.  Thus, non-user
values are not necessarily representative of average non-use values.  

 The environmental attributes to be compared against the cost of living
increases were designed based on the Johnston et al. (2009)
Bioindicator-Based Stated Preference Valuation (BSPV) method, which was
developed to promote ecological clarity and closer integration of
ecological and economic information within SP studies. In contrast to
traditional SP valuation, BSPV employs a more structured and formal use
of ecological indicators to characterize and communicate
welfare-relevant changes. It begins with a formal basis in ecological
science, and extends to relationships between attributes in
respondents’ preference functions and those used to characterize
policy outcomes. Specific BSPV guidelines ensure that survey scenarios
and resulting welfare estimates are characterized by: (1) a formal basis
in established and measurable ecological indicators, (2) a clear
structure linking these indicators to attributes influencing
individuals’ well-being, (3) consistent and meaningful interpretation
of ecological information, and (4) a consequent ability to link welfare
measures to measurable and unambiguous policy outcomes. The welfare
measures provided by the BSPV method can be unambiguously linked to
models and indicators of ecosystem function, are based on measurable
ecological outcomes, and are more easily incorporated into benefit-cost
analysis. 
This methodology was developed in part to address the EPA Science
Advisory Board’s call for improved quantitative linkages between
ecological services and economic valuation of those services.

 The assumed response rate is based on similar stated preference
studies conducted by Johnston et al. (2010) in the Northeast U.S. 
Actual response rates could vary across study regions. 

 EPA plans to complete focus group testing of the instrument before the
second Federal Register notice of this information collection request. 
The inclusion of all four attributes is an important aspect of focus
group testing.  If randomly selected focus group participants find this
number of attributes cognitively challenging, the number will be
reduced; if cognitive issues prove minimal, all four will remain.

 Alternatively, the smallest possible main effects plan for all
attributes (72 profiles) may be folded over to make another 72 profiles,
for a combined set of 144 profiles.  This protects main effects from all
unobserved two-way interactions, but does not guarantee that specific
interactions can be estimated.  The required sample size is the same as
that which would guarantee that location interactions can be estimated.
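The fold-over idea described above can be illustrated with a deliberately small two-level example (the actual 72-profile plan uses multi-level attributes, so this is a sketch of the principle rather than the EPA design). Reversing every level in every run of a resolution III fraction and pooling the two halves yields a design whose main effects are orthogonal to all two-way interactions:

```python
# Toy fold-over illustration: a 2^(3-1) fraction with generator C = A*B.
import itertools
import numpy as np

# Base fraction: 4 runs over factors A and B, with C aliased to A*B.
base = np.array([[a, b, a * b] for a, b in itertools.product([-1, 1], repeat=2)])

# Fold-over: flip the sign of every level in every run.
foldover = -base
combined = np.vstack([base, foldover])  # 8 runs in total

# Main effects in the combined design are clear of all two-way
# interactions: each interaction column is orthogonal to each
# main-effect column.
for i, j in itertools.combinations(range(3), 2):
    interaction = combined[:, i] * combined[:, j]
    for k in range(3):
        assert interaction @ combined[:, k] == 0
```

In this toy case the combined 8 runs recover the full 2^3 factorial, which is why the protection holds; for the 72-profile main effects plan the analogous fold-over doubles the design to 144 profiles without guaranteeing that any specific interaction is estimable.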


[Survey booklet graphic: choice-question response boxes reading “If you
prefer Option A, check this box”; “If you prefer Option B, check this
box”; and “If you do not want A or B, check this box.”]

The government is considering policies that would require new measures
to protect fish.

One policy would require advanced filters that block fish from entering
cooling water facilities. Requiring advanced filters could reduce fish
losses by about 25%. 

Another possibility is closed cycle cooling, which recycles and reuses
cooling water so that less water is needed. Requiring closed cycle
cooling could reduce fish losses by 95% and also reduce thermal
discharge. However, costs are higher than for advanced filters. 

[Graphic labels: “Smaller effect on Striped Bass”; “Larger effect on
Winter Flounder”]

Cooling water use affects fresh and salt waters throughout the US, but
almost all fish losses are in salt waters such as coastal bays. 

OMB Control No. 2040-XXXX    

Approval expires XX/XX/XX   

Abt SRBI

Government Services Division

8403 Colesville Road, Suite 820

Silver Spring, MD 20910

[Survey booklet graphic: “Smallest Fish Losses per Year” (losses less
than 0.1% of the total fish population) versus “Largest Fish Losses per
Year”; food web panel showing micro-organisms, fish, and
birds/mammals.]

Natural factors such as weather have always influenced fish, but in
recent years human activities have had an increasing effect.

Activities that affect fish include fishing, pollution, commercial and
residential development, and the extraction of cooling water at
industrial facilities.  

Declines in fish can affect the condition of ecological systems, food
webs, and related human uses such as fishing.

How Fish Are Affected by Water Intake

The equipment that pumps the cooling water kills small fish and fish
eggs.

Juvenile fish and eggs move through screens and into the cooling system
where they are killed by high temperature.

Large fish may be injured or killed against screens or filters. 

Pumping warm water back into the environment (called thermal discharge)
also affects ecological systems.


Last week a survey was mailed to you concerning environmental protection
and government regulations in the Northeast U.S.  If you have already
returned your completed survey, please accept our sincere thanks.  

If you have not yet completed your survey, we ask that you please do so
today.  You are one of a select few who have been chosen to
participate—your answers will help us understand your priorities for
the environment and regulations in the Northeast U.S. 

If you have misplaced your survey, please contact Ryan Stapler at
(617) 520-3524 or ryan_stapler@abtassoc.com for a replacement.  

Regards,

Mary T. Smith

Environmental Protection Agency

Cooling water use affects fresh and salt waters throughout the Northeast
US, but the majority of fish losses are in salt waters such as coastal
bays. 

[Survey booklet graphics: “Cooling Water Intake Screen”; “Total Fish
Population” with annual losses of 10% versus losses of less than 0.1%;
“Larger effect on Winter Flounder”; “Smaller effect on Striped Bass”.]

