                           Supporting Statement for:
                                       
Willingness To Pay Survey for Santa Cruz River Management Options in Southern Arizona


1. Identification of the Information Collection.

	1(a) Title of the Information Collection

Willingness To Pay Survey for Santa Cruz River Management Options in Southern Arizona (New), EPA #2484.01, OMB #2080-NEW

	1(b) Short Characterization/Abstract

The USEPA Office of Research and Development is investigating public values for scenarios of change for perennial reaches of the effluent-dominated Santa Cruz River, Arizona. These values will be estimated via a willingness to pay mail survey instrument. Two effluent-dominated perennial reaches are considered in the survey. The "South" reach starts at an outfall in Rio Rico, AZ, and flows northward through Tumacácori National Historical Park. The "North" reach, fed by two outfalls in northwest Tucson, Arizona, flows northwest through Marana, AZ. Elsewhere north of where the channel crosses the border with Mexico, about 5 miles east of Nogales, the Santa Cruz River is ephemeral. For each of the South and North reaches, two different scenarios of change are considered. The first is a reduction in flow length, with associated decreases in cottonwood-willow riparian forest, a rare forest type in the region. The second is an increase in water quality to allow full contact recreation, such as submersion, at normal flow levels. The baseline flow length and forest acreages, as well as the forest acreages that would be associated with reduced flow lengths, are derived from natural science modeling. A choice experiment framework is used with statistically designed tradeoff questions, where options to maintain flow length and forest, or to increase effluent water quality, are posed as increases in a yearly household tax. Each choice question allows a zero cost "opt out" option. The choice experiment is designed to allow isolation of the value of a marginal change for each reach. A few additional questions are also included to further understand the motivations for respondent choices, as well as their river-related recreation behavior. Several pages of background introduce the issue to respondents. Limited sociodemographic questions are included to gauge how well the sample respondents represent the target population. 
Samples of the two major metropolitan areas in southern Arizona, Phoenix and Tucson, will receive the survey. 

2. Need for and use of the Collection 

	2(a) Need/Authority for the Collection
Current ORD research revolves around the theme of sustainability (USEPA, 2013a). An overarching goal cited on the EPA website for sustainability research is:
	"EPA Sustainable communities research is providing decision tools and data for 	communities to make strategic decisions for a prosperous and environmentally 	sustainable future, and providing the foundation to better understand the balance between 	the three pillars of sustainability- environment, society and economy" (USEPA, 2013b). 
As part of including public input for finding the "balance" of sustainability, this survey research will estimate public values for potential management changes to the Santa Cruz River in southern Arizona. The Santa Cruz watershed is a subject of continuing research collaboration between USGS, ORD, and other partners, with a peer-reviewed research plan published by Norman et al. (2010). ORD is also collaborating with recipients of a recent National Science Foundation grant engaging in Santa Cruz River natural and social science research, led by the University of Arizona (NSF, 2010). The survey will gather public value information on Santa Cruz River management scenarios to complement partnering natural science research.    

	2(b) Practical Utility/Users of the Data

The primary reason for the proposed survey is public value research. The Santa Cruz River is a case study of a waterway highly impacted by human modifications. However, it still provides potentially valuable ecological goods, such as rare riparian habitat and recreational opportunities, for the regional population. As a secondary purpose, the survey results may be informative to local decision-makers considering Santa Cruz River management options. Water scarcity in the region raises periodic debates on the best uses of effluent.

3. Non duplication, Consultations, and Other Collection Criteria 

	3(a) Non duplication

Willingness to pay survey research has occurred for changes to other US rivers, including other southwestern US rivers (e.g. Weber and Stewart, 2009; Berrens et al., 2000). While techniques of benefit transfer could be applied to the results of prior studies in an attempt to gain a limited sense of the public value for Santa Cruz River management options, there are a number of hurdles to value estimates derived by benefit transfer (e.g. Desvousges et al., 1992; Brouwer, 2000). Differences in the ecological good being considered, local availability of substitute resources, and local tastes and preferences are some of the limitations in transferring estimates from one study or group of studies to a new valuation context. The proposed survey incorporates natural science modeling of the relationship between surface water and riparian forest. The ecological changes respondents would value are tailored to the Santa Cruz River and have been explicitly defined in the survey, and the survey has been extensively pretested on the regional population to minimize cognitive problems.

EPA has not identified any other studies that would consider public values for Santa Cruz River options considered in the proposed survey. The survey options were specifically designed to encompass changes to both the South and North effluent-dominated perennial reaches. The options were also carefully defined to specify miles of flow, acreage of riparian forest, and safety of water contact for different types of recreation. The language, graphics, and question formats in the survey were carefully pretested. However, there is prior willingness to pay survey research pertaining to the South reach of the Santa Cruz River (Frisvold and Sprouse, 2006). Although full documentation and results of that survey are neither available nor published, further information on this survey was sought before initiating plans for this new collection (personal communication with G. Frisvold, August, 2009). It was determined that the prior survey, even if results become available, would not be sufficient for the Santa Cruz research goals of this study. This survey will ask respondents to choose between numerically defined levels of environmental attributes matching natural science quantitative modeling (alongside varying cost levels). In contrast, the prior survey asked a more general question of willingness to pay "to permanently preserve the Santa Cruz River habitat as it is today", using photos to describe a baseline perennial stream habitat, and a changed habitat without perennial flow. Unfortunately the specific marginal change to be valued was not defined, limiting the reliability of respondent data as well as clouding how the data could be applied to a specific management scenario. 
Furthermore, a major geographic difference is that the prior survey focused on the South Santa Cruz only, whereas this survey considers management changes to both the South and North Santa Cruz River, and the reaches have significant differences in both vegetation and proximity to the Tucson urban area. 

	3(b) Public Notice Required Prior to ICR submission to OMB

This is the first of two Federal Register notices.

	3(c) Consultations

The principal investigator for this effort is Matthew Weber, postdoctoral researcher at USEPA, ORD, Western Ecology Division, Corvallis, OR. The principal investigator has past direct experience with willingness to pay survey research, with a study estimating public values for management changes for the river and riparian area of the Rio Grande in Albuquerque, New Mexico (Weber and Stewart, 2009). Previously approved OMB surveys were consulted in designing this survey, in particular a NOAA coral reef valuation study (OMB # 0648-0585) and an EPA study on fish and aquatic habitat impacts from cooling water intake structures (OMB # 2020-0283). The survey instrument booklet format and several questions were adapted from the EPA study. The principal investigator participated in a workshop amongst stated preference survey practitioners working on federal government projects, convened by NOAA and Stratus Consulting in June of 2012 (NOAA and Stratus Consulting, 2012). This workshop was a helpful forum for comparing notes in willingness to pay survey design, with an emphasis on strategies for presenting ecological goods in a way meaningful to the lay public. Several completed or working draft willingness to pay survey instruments were presented for group discussion. 

This survey uses an explicit approach to defining the ecological commodities to be valued, following guidance in Boyd and Banzhaf (2007). Dr. Paul L. Ringold, research ecologist at USEPA, ORD was consulted for his experience identifying publicly valued stream commodities and metrics (Ringold et al., 2009; Ringold et al., 2013). Although the need to clearly define the ecological good to be valued is established advice for stated preference research (Arrow et al., 1993), Boyd and Krupnick (2009) found that practitioners sometimes use only vague language to define water-related ecological commodities.

This survey includes natural science modeling outcomes which quantify the relationship between riparian forest acreage and surface flow extent in both the South and North Santa Cruz River, two river areas with different hydrogeological conditions. The natural science modeling is based on dissertation research at Arizona State University (White, 2011) and further analysis by Arizona State University Professor Dr. Juliet Stromberg (personal communication, May, 2013). Drs. White and Stromberg were consulted throughout development of the natural science background provided in the survey instrument as well as the ecological changes the survey poses. 

Additional consultations were conducted to ensure that the changes in river flow and forest extent posed in the survey are realistically aligned with potential changes in Santa Cruz River management, to facilitate respondent acceptance of the scenarios. Dr. Thomas Meixner and Ph.D. student Rewati Niraula at the University of Arizona were consulted on potential surface streamflow extents under different effluent release scenarios for the North and South reaches of the Santa Cruz River. The most recent Arizona Department of Environmental Quality (2010) report summarizing water quality status for the South and North reaches of the river was reviewed. Persons with either research or management interests in the Santa Cruz River (USGS, University of Arizona, Pima County, and Tumacácori National Historical Park) were consulted on the range of likely treated wastewater releases into the South and North reaches of the Santa Cruz River, and the current state of water quality.  

	3(d) Effects of Less Frequent Collection

Without this collection the public values for Santa Cruz River management scenarios being modeled by collaborating natural scientists could not be estimated. There would then be a gap in understanding the public welfare relevance of the associated natural science studies. Furthermore, results from the case study would not be available to the willingness to pay literature. Finally, there would also be less information on public values regarding Santa Cruz River management available to the local community for their planning purposes. 

	3(e) General Guidelines

The survey will not violate any of the general guidelines described in 5 CFR 1320.5 or in
EPA's ICR handbook.

	3(f) Confidentiality

All responses to the survey will be kept confidential. The surveys will be processed, including data entry, by the principal investigator; nobody else will have a record of who has responded or the answers of any given respondent. A list of the addresses of the members of the sample who have responded versus those who have not will be maintained in order to more efficiently mail reminders and replacement surveys. This will be a single file, accessible to and updated only by the principal investigator. To protect confidentiality in survey results, each respondent will be identified by a numeric code in that file rather than their name or address. The survey questions do not ask for any personally identifiable information and personally identifiable information will not be entered in the results even if volunteered by the respondent, for example in the comments section. In the cover letter, respondents will be informed that their responses will be kept confidential. After the data collection is complete, the respondent status file will be deleted, and only the numeric code assigned to each respondent will remain. After data entry is complete, the surveys themselves will be destroyed.

The USEPA office location (the Western Ecology Division of USEPA) and the USEPA electronic file system used by the principal investigator are highly secure. A keycard possessed only by USEPA employees and contractors is necessary to enter the building. The principal investigator is then in a separate keyed office space within the secure building. The computer system where the personal names and addresses associated with respondent numeric codes will be stored during the process of data entry is a secure server requiring the principal investigator's personal login username and password. At the conclusion of data entry, the file linking personal names and addresses to respondent codes will be destroyed (along with the hard copy survey responses themselves), and only respondent codes will remain. 

	3(g) Sensitive Questions

In focus groups, two questions were found to be sensitive by some participants: the question of racial category and the question of income category. However, when the research need for these questions was explained (gauging how well different groups are represented in survey results), participants accepted it as a worthwhile reason for asking them. This rationale now prominently appears preceding those questions within the survey: "We need the following questions to ensure votes from all groups have been fairly represented in this survey". Confidentiality of responses is then reiterated.

4. The Respondents and the Information Requested 

	4(a) Respondents/SIC Codes

The target respondents for this survey are household representatives 18 years or older in the two most populated urban areas of Arizona, the Phoenix metro area and the Tucson metro area. A sample of household representatives 18 years or older in each metro area will be contacted by mail following the multiple-contact protocol in Dillman (2009). A response rate of 30% will be targeted. To increase the response rate, several contacts will be used, including a prenotice to all recipients, a reminder postcard, and a followup mailing. The target number of responses from the Phoenix and Tucson metro areas is 250 households each, or 500 households total.  

	4(b) Information Requested

		(i) Data items, including record keeping requirements

The current draft survey has also been uploaded to the Federal Register (note that the page numbers are out of sequence in the electronic file; they are sequenced so that they will print correctly double-sided). The survey is divided into 4 main parts. The first part is background for the choice questions. The second part is the choice questions themselves. The third part is questions designed to understand why respondents answered the choice questions as they did; these include attitudinal questions as well as recreational preference questions. The fourth part is designed to assess whether major sociodemographic categories of the received sample are representative of the population sampled. There are no record keeping requirements.

		(ii) Respondent Activities

The following respondent activities are envisioned. Participants will read the cover letter and survey, respond to the survey questions, and return the survey using a provided postage paid envelope. Focus group and cognitive interview participants typically took no longer than 30 minutes to complete the survey, so 30 minutes per response is the estimated burden for the average respondent. 

5. The Information Collected - Agency Activities, Collection Methodology, and Information Management

	5(a) Agency Activities

Development of the survey questionnaire through focus group and cognitive interview pretesting is occurring under the separate ICR# 2090-0028. Pretest techniques follow standard approaches in the qualitative methods literature (Morgan and Krueger, 1998; Rubin and Rubin, 2005), as well as guidance in the economics literature for the specific purposes of pretesting a willingness to pay survey (Johnston et al., 1995; Kaplowitz et al., 2001; Hoehn et al., 2003).

Under this ICR, agency activities will include: 
      * Develop the choice experiment design
      * Obtain a representative sample mailing list for each of the two target metro area populations, Phoenix and Tucson
      * Printing of questionnaires
      * Mailing of prenotices
      * Mailing of cover letters and questionnaires
      * Reminder mailings
      * Follow-up mailings and replacement questionnaires to non-respondents as needed
      * Data entry and quality assurance of data file
      * Analysis of survey results, including characterization of nonresponse and potential degree of nonresponse bias
      * Modeling choice experiment results with a standard multinomial logit approach
      * Reporting survey results

	5(b) Collection Methodology and Management

The proposed survey is a choice experiment questionnaire delivered and returned by mail. Standard multi-contact mail survey methods will be used to increase the response rate (Dillman, 2009, pg. 242). The desired number of completed surveys is 250 in each of the Phoenix and Tucson metro areas, with a target response rate of 30% in each area. Thus it will be necessary to contact approximately 834 households in each metro area.

Data quality will be monitored by checking returned survey responses for consistency, and by assessing any comments made on the survey or returned with the survey that signal strategic responses or respondent confusion. Coded survey data will not include any identifying information of the respondents. Returned survey data will be coded and used as the dataset for multinomial logit regression modeling.

	5(c) Small Entity Flexibility

This survey will be administered to individuals, not businesses. Thus, no small entities
will be affected by this information collection.

	5(d) Collection Schedule
			
A breakdown of the expected collection schedule is as follows:

   * Week 1: Printing surveys
   * Week 2: First contact mailing for pilot survey, notifying that a survey will be mailed in 1-2 weeks
   * Week 3 and 4: Pilot survey mailing
   * Week 5 and 6: Pilot survey reminder postcards mailing
   * Week 7 through 9: Data entry of pilot survey results. Revising estimation of the beta vector (coefficients on utility variables, see part B of the supporting statement). Adjusting the choice experiment and cost levels for the main survey mailing based on the beta vector estimated from the pilot survey
   * Week 10: First contact mailing for main survey mailing, notifying that a survey will be mailed in 1-2 weeks
   * Week 11 and 12: Main survey mailing
   * Week 13 and 14: Main survey reminder postcards mailing
   * Week 15 through 18: Main survey additional reminders and replacement surveys as necessary to reach target response rate
   * Week 19 to 20: Data entry

The schedule above is staged such that if response rates are higher or lower than expected, the appropriate number of replacement surveys will be printed and mailed to most efficiently use funds.

6. Estimating The Burden and Cost of the Collection

	6(a) Estimating Respondent Burden

For a typical respondent, a conservative estimate of their time to review and respond to survey questions is 30 minutes. Assuming the target of 500 people total respond to the survey, the burden is 250 hours. This would be a one-time expenditure of their time. 

	6(b) Estimating Respondent Costs
		(i) Estimating Labor Costs

The Bureau of Labor Statistics reports average wage rates for some metropolitan areas, with the most recent data being May 2011 (Bureau of Labor Statistics, 2011). The average hourly wage for all occupations in the Phoenix metro area was $21.61, or an average cost of $10.81 per participant for the estimated 30-minute response. The average hourly wage for all occupations in the Tucson metro area was $20.55, or an average cost of $10.28 per participant. Assuming 250 participants in each metro area fill out the survey, the total estimated respondent labor cost is $5,270. 
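The labor cost figures above follow directly from the 30-minute burden estimate and the cited wage rates; a minimal arithmetic check (wage figures are those cited above):

```python
from decimal import Decimal, ROUND_HALF_UP

HOURS_PER_RESPONSE = Decimal("0.5")   # 30-minute estimated burden
RESPONDENTS_PER_AREA = 250
wages = {"Phoenix": Decimal("21.61"), "Tucson": Decimal("20.55")}  # BLS, May 2011

# Cost per participant: half an hour at the area's average wage, rounded to cents
cent = Decimal("0.01")
per_participant = {area: (HOURS_PER_RESPONSE * w).quantize(cent, ROUND_HALF_UP)
                   for area, w in wages.items()}

# Total respondent labor cost across both metro areas
total = sum(HOURS_PER_RESPONSE * w * RESPONDENTS_PER_AREA for w in wages.values())

print(per_participant)  # Phoenix: 10.81, Tucson: 10.28
print(total)            # 5270.000
```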

		(ii) Estimating Capital and Operations and Maintenance Costs

There are no anticipated capital, operations or maintenance costs associated with this collection.

		(iii) Capital/Start-up Operating and Maintenance (O&M) Costs

There are no anticipated capital, operations or maintenance costs associated with this collection.

		(iv) Annualizing Capital Costs

There are no anticipated capital, operations or maintenance costs associated with this collection.

	6(c) Estimating Agency Burden and Cost

The various aspects of the survey mailing are assumed to be done by the principal investigator, with an associated hourly wage rate of $32.50. Preparing survey mailings, tracking nonrespondents, sending new mailings as needed, and data entry are anticipated to amount to 8 weeks total or 320 hours of work. Agency labor cost would be 320 hours times $32.50 per hour or $10,400. 

	6(d) Estimating the Respondent Universe and Total Burden and Costs

Assuming 250 participants in each of the Phoenix and Tucson metro areas fill out the survey, the total labor cost will be $5,270.

	6(e) Bottom Line Burden Hours and Cost Tables

Item                                    | Quantity                                         | Cost
----------------------------------------|--------------------------------------------------|------------------------
Public Burden                           |                                                  |
  Time burden: 0.5 hours per respondent | 500 persons                                      | 250 hours; $5,270 labor
Agency Burden                           |                                                  |
  Time burden                           | Entire project                                   | 320 hrs; $10,400 labor
  Mailing list                          | 1,000 names in Tucson area & 1,000 names in Phoenix area (accounts for % of incorrect addresses) | $800
  Prenotice letter paper and printing   | 1,700 pieces                                     | $100
  Prenotice envelopes                   | 1,700 pieces                                     | $150
  Prenotice postage (bulk mail)         | 1,700 pieces                                     | $700
  Color surveys paper and printing      | 2,000 pieces (includes estimated replacements)   | $2,900
  Printing return envelopes 10.5" x 7.5"| 2,000 pieces (includes estimated replacements)   | $450
  Outgoing envelopes 11.5" x 8.75"      | 2,000 pieces (includes estimated replacements)   | $300
  Outgoing survey postage (bulk mail)   | 2,000 pieces (includes estimated replacements)   | $1,900
  Return survey postage (bulk mail)     | 500 pieces                                       | $500
  Reminder postcard paper & printing    | 1,700 pieces                                     | $100
Total                                   |                                                  | $23,570

The estimated respondent burden for this study is 250 hours and $5,270. The estimated agency cost for this study is 320 hours and $10,400. Agency costs besides labor hours total $7,900 for the mailing list, paper, printing, and postage.
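As a cross-check on the bottom line, the itemized costs can be summed directly (item labels abbreviated from the table above):

```python
# Non-labor agency costs itemized in the bottom line table (labels abbreviated)
mailing_costs = {
    "mailing lists": 800,
    "prenotice paper and printing": 100,
    "prenotice envelopes": 150,
    "prenotice postage": 700,
    "survey paper and printing": 2900,
    "return envelope printing": 450,
    "outgoing envelopes": 300,
    "outgoing survey postage": 1900,
    "return survey postage": 500,
    "reminder postcards": 100,
}
non_labor = sum(mailing_costs.values())

respondent_labor = 5270          # 500 responses x 0.5 hours at area average wages
agency_labor = 320 * 32.50       # 320 hours at $32.50/hour
grand_total = respondent_labor + agency_labor + non_labor

print(non_labor)    # 7900
print(grand_total)  # 23570.0
```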

	6(f) Reasons for Change in Burden

The survey is a one-time data collection activity.

	6(g) Burden Statement

The annual public reporting and recordkeeping burden for this collection of information is estimated to average 0.5 hours per response.  Burden means the total time, effort, or financial resources expended by persons to generate, maintain, retain, or disclose or provide information to or for a Federal agency.  This includes the time needed to review instructions; develop, acquire, install, and utilize technology and systems for the purposes of collecting, validating, and verifying information, processing and maintaining information, and disclosing and providing information; adjust the existing ways to comply with any previously applicable instructions and requirements; train personnel to be able to respond to a collection of information; search data sources; complete and review the collection of information; and transmit or otherwise disclose the information.  An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number.  The OMB control numbers for EPA's regulations are listed in 40 CFR part 9 and 48 CFR chapter 15.     

  To comment on the Agency's need for this information, the accuracy of the provided burden estimates, and any suggested methods for minimizing respondent burden, including the use of automated collection techniques, EPA has established a public docket for this ICR under Docket ID Number EPA-HQ-ORD-2013-0282, which is available for online viewing at www.regulations.gov, or in person viewing at the Office of Research & Development (ORD) Docket in the EPA Docket Center (EPA/DC), EPA West, Room 3334, 1301 Constitution Avenue, NW, Washington, D.C.  The EPA Docket Center Public Reading Room is open from 8:30 a.m. to 4:30 p.m., Monday through Friday, excluding legal holidays.  The telephone number for the Reading Room is (202) 566-1744, and the telephone number for the ORD Docket is (202) 566-1752.  An electronic version of the public docket is available at www.regulations.gov.  This site can be used to submit or view public comments, access the index listing of the contents of the public docket, and to access those documents in the public docket that are available electronically.  When in the system, select "search," then key in the Docket ID Number identified above.  Also, you can send comments to the Office of Information and Regulatory Affairs, Office of Management and Budget, 725 17th Street, NW, Washington, D.C. 20503, Attention: Desk Officer for EPA.  Please include the EPA Docket ID Number EPA-HQ-ORD-2013-0282 and OMB Control Number 2080-NEW in any correspondence.


Part B of Supporting Statement

1. Survey Objectives, Key Variables, And Other Preliminaries
(a) Survey Objectives

The objectives of the survey are bulleted below:
   * To estimate public values for changing the extent of the flow mileage and associated forest vegetation acreage along the effluent-dominated Santa Cruz River.
   * To estimate public values for full contact recreation, such as submersion, in the effluent-dominated Santa Cruz River, as a change from partial body contact recreation such as wading.
   * To provide a case study for estimating public values for modifying river attributes of a waterway highly impacted by urban processes. 
   * To compare estimated public values for changing attributes of two different reaches of the Santa Cruz River, the South and the North.
   * To compare estimated public values between two population centers, the Phoenix metro area and the Tucson metro area. 
                                                          
(b) Key Variables

The survey asks respondents whether they would choose a permanent tax increase for their household in exchange for changes in Santa Cruz River attributes. The key variables are:

North Santa Cruz Flow and Forest: This is a bundled variable with two numeric values for each level specific to the North Santa Cruz. One number shows miles of surface flow, and another number shows acres of cottonwood/willow forest associated with that flow. The survey considers three possible levels of this variable. The future baseline level is termed the "expected future". Two different increases from the expected future are posed, the largest of which is the same as the current condition. 

North Santa Cruz Full Body Contact: This is a binary "Yes" or "No" variable specific to the surface flows in the North Santa Cruz. A "Yes" means the surface flow would be considered safe for full body contact at normal flow levels, including submersion. A "No" means the surface flow would be considered safe only for partial body contact, i.e. wading, at normal flow levels. The Expected Future and Current Condition are both "No", but the survey poses the possibility of a "Yes".

South Santa Cruz Flow and Forest: This is a bundled variable with two numeric values for each level specific to the South Santa Cruz. One number shows miles of surface flow, and another number shows acres of cottonwood/willow forest associated with that flow. The survey considers three possible levels of this variable. The future baseline level is termed the "expected future". Two different increases from the expected future are posed, the largest of which is the same as the current condition. 

South Santa Cruz Full Body Contact: This is a binary "Yes" or "No" variable specific to the surface flows in the South Santa Cruz. A "Yes" means the surface flow would be considered safe for full body contact at normal flow levels, including submersion. A "No" means the surface flow would be considered safe only for partial body contact, i.e. wading, at normal flow levels. The Expected Future and Current Condition are both "No", but the survey poses the possibility of a "Yes".

Tax increase per year: Each change varying from the expected future of reduced flows and cottonwood/willow acreages in the North and South has an associated cost. These cost levels currently vary from $0 for the Expected Future to as high as $50 (subject to change based on pilot survey results). The cost levels are not tied to actual cost estimates for the changes, but rather are designed to bracket values. That is, the design goal is to set cost levels such that some respondents accept them and others do not. A pilot survey will be used to test whether the cost levels posed in the survey should be revised.

(c) Statistical Approach

The statistical approach to analyzing survey results will be a multinomial logit model (Ben-Akiva and Lerman, 1985).
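In the multinomial logit model, the probability of choosing an alternative is its exponentiated deterministic utility normalized over all alternatives in the choice set. The sketch below illustrates the choice probability calculation for one hypothetical choice question; the coefficient values and attribute levels are purely illustrative, not estimates from this study:

```python
import math

def mnl_choice_probabilities(utilities):
    """Multinomial logit: P(i) = exp(V_i) / sum_j exp(V_j)."""
    exp_v = [math.exp(v) for v in utilities]
    denom = sum(exp_v)
    return [e / denom for e in exp_v]

# Hypothetical linear utility V = b_flow*miles + b_contact*contact - b_cost*tax
# for the three options in one choice question; all numbers are illustrative.
beta_flow, beta_contact, beta_cost = 0.05, 0.8, 0.03
options = [
    {"miles": 6,  "contact": 0, "tax": 0},   # expected future ("opt out")
    {"miles": 12, "contact": 0, "tax": 20},  # maintain flow and forest
    {"miles": 6,  "contact": 1, "tax": 30},  # full body contact water quality
]
V = [beta_flow * o["miles"] + beta_contact * o["contact"] - beta_cost * o["tax"]
     for o in options]
probs = mnl_choice_probabilities(V)
print([round(p, 3) for p in probs])  # three probabilities summing to 1
```

Estimation then finds the beta vector that maximizes the likelihood of the choices respondents actually made; the beta vector from the pilot survey is what informs the revised cost levels described in the collection schedule.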

(d) Feasibility

The survey has been extensively pretested, as described below, to reduce cognitive difficulties for respondents. The principal investigator has research funding to cover the costs of the survey. 

2. Survey Design
(a) Target Population And Coverage

The target population is households of the Phoenix and Tucson metropolitan areas in southern Arizona. Respondent coverage will be such that each household in each respective area has an equal probability of being selected to receive a survey, based on the mailing list.

(b) Sample Design
(i) Sampling Frame

The sample frame is households of the Phoenix and Tucson metropolitan areas in southern Arizona. The sample will be drawn such that each household in each respective area has an equal probability of being selected to receive a survey, based on the best available mailing lists, obtained from companies specializing in preparing mailing lists for surveys.

(ii) Sample Size

There is no single correct sample size choice, since there will be uncertainty in the estimates regardless of sample size. This survey design will utilize the rule of thumb available from the developers of Sawtooth Software (Orme, 1998), a software package popular for designing choice sets. This formula was also recently utilized by NOAA (OMB # 0648-0585). The rule of thumb formula for a minimum sample size is:

(n × t × a) / c ≥ 500

Where:
n = minimum number of respondents
t = number of choice questions
a = number of alternatives per task (not including the "status quo" option)
c = number of "analysis cells."

When considering main effects, c is equal to the largest number of levels for any single attribute. If considering all two-way interactions, c is equal to the largest product of levels of any two
attributes (Orme, 1998). The sample size will be based on a main effects model. A minimum of 3 choice questions will be in each survey, with two options each (not counting "expected future"), and the maximum number of levels for any single attribute is 3. Thus the minimum sample size "n" is 250 for each population to be sampled, and this is our target number of respondents for both the Phoenix and Tucson metro areas. This value is larger than the 200 minimum suggested when the intent is to compare subgroups (Orme, 1998, pg. 67). Bateman et al. (2002; pg. 110) recommend a sample size of 500 to 1,000 (for each subgroup) for close-ended contingent valuation questions, but also note that a smaller sample size can be used if one collects more information per respondent (as with replications in choice experiments). With a target response rate of 30%, approximately 834 households in each metro area will receive a survey.
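The sample size arithmetic above can be reproduced directly from the rule of thumb, using the stated design parameters (3 choice questions, 2 non-status-quo alternatives, a maximum of 3 levels per attribute):

```python
import math

def orme_min_sample(t, a, c, threshold=500):
    """Orme (1998) rule of thumb (n * t * a) / c >= threshold, solved for n."""
    return math.ceil(threshold * c / (t * a))

# 3 choice questions, 2 alternatives per task, max 3 levels for any attribute
n = orme_min_sample(t=3, a=2, c=3)
print(n)  # 250

# Households to contact per metro area given the 30% target response rate
households = math.ceil(n / 0.30)
print(households)  # 834
```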

(iii) Stratification Variables

The Phoenix and Tucson metro areas will be treated as different populations.

(iv) Sampling Method

A sample mailing list for each of the two metro areas will be purchased from a mail survey support company. The company will be given instructions to prepare the sample such that each household in each metro area has an equal chance of being chosen (a simple random sample approach).

(v) Multi-Stage Sampling

Not applicable.

(c) Precision Requirements
(i) Precision Targets

Louviere et al. (2000) provide a formula, based on elementary statistical theory, for the minimum sample size n needed to predict a proportion with a target accuracy and confidence level, assuming a large sample frame population, a simple random sampling strategy, and independence of the choice occasions contributed by each respondent:

n ≥ [q / (r p a²)] [Φ⁻¹((1+α)/2)]²

where p is the population proportion, predicted to within a×100 percent of its true value with probability α or greater, q = 1 − p, r is the number of replications (choice occasions per respondent), and Φ⁻¹(·) is the inverse cumulative standard normal distribution function. This study targets plus or minus 10% for predicting the population proportions to be estimated, with a probability of 0.95. Assuming a population proportion of 0.3 and a minimum r of 3, the minimum sample size is 174 respondents, less than the 250 sample size planned for each metro area.
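The calculation can be sketched as follows; the parameter values passed in below are illustrative only and are not the study's planning values:

```python
from math import ceil
from statistics import NormalDist

def louviere_minimum_n(p, a, alpha, r):
    """Minimum respondents to predict a population proportion p to within
    a*100 percent (relative) with confidence alpha, given r independent
    choice occasions per respondent (after Louviere et al., 2000)."""
    q = 1.0 - p
    z = NormalDist().inv_cdf((1.0 + alpha) / 2.0)  # standard normal quantile
    return ceil(q / (r * p * a * a) * z * z)

# Illustrative inputs: p = 0.5, r = 5 replications, a = 0.10
# (plus or minus 10%), alpha = 0.95.
print(louviere_minimum_n(p=0.5, a=0.10, alpha=0.95, r=5))  # 77
```

`NormalDist().inv_cdf` is the standard normal quantile function, so `inv_cdf((1 + 0.95) / 2)` returns the familiar 1.96.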

(ii) Nonsampling error

With a target response rate of 30%, a large percentage of the sample will be nonrespondents. If the preferences of nonrespondents differ markedly from those of respondents, nonresponse bias will affect the results. A nonresponse analysis will be conducted by comparing the sociodemographics of respondents with those of the sampling frame. Any sociodemographic groups underrepresented in the responses will be described in the results write-up. If respondents and nonrespondents have markedly different sociodemographics, the sociodemographic vector of the sampling frame (rather than of the respondents) will be fed into the model to obtain valuation estimates for an "average" household (Morrison, 2000).
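A minimal sketch of this Morrison (2000)-style adjustment follows: the fitted model is simply evaluated at the sampling frame's sociodemographic means rather than the respondents' means. All coefficients and means below are hypothetical.

```python
# Hypothetical fitted coefficients: the attribute's utility weight is
# allowed to shift with sociodemographics (illustrative functional form).
beta_income, beta_age = 0.004, -0.01
beta_cost = -0.05

def wtp_for_attribute(beta_attr, socio, socio_betas, beta_cost):
    """Household WTP for one attribute level: shift the attribute
    coefficient by sociodemographic interactions, then divide by
    the (negative) cost coefficient."""
    shifted = beta_attr + sum(b * s for b, s in zip(socio_betas, socio))
    return -shifted / beta_cost

respondent_means = (55.0, 52.0)  # income ($1,000s), age -- respondents
frame_means = (48.0, 45.0)       # income, age -- full sampling frame

# Evaluating at frame means instead of respondent means yields the
# valuation estimate for an "average" household of the target population.
print(wtp_for_attribute(0.8, respondent_means, (beta_income, beta_age), beta_cost))
print(wtp_for_attribute(0.8, frame_means, (beta_income, beta_age), beta_cost))
```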


(d) Questionnaire Design

The current draft survey is also uploaded to the federal register (note that the page numbers are out of sequence in the electronic file; they are ordered so that the file will print correctly double-sided). Below is a description of the sections and questions.

PART 1
Background. The cover photo shows the result of a reduction in perennial stream length recently documented on the Santa Cruz River. This change has already occurred and is not a change considered in the survey, but it pictorially shows a "before" and "after" view at a single location. Page 2 shows the Santa Cruz River within the broader river network of Arizona, for perspective. Page 3 shows where the treated wastewater flows of the South and North reaches are located in southern Arizona, provides background on how they came to exist, and introduces why these resources might be relevant to the respondent. Page 4 describes the partitioning of treated wastewater releases and the relatively small fraction that is consumptively used by plants or evaporates in the perennial river ecosystem. Page 5 describes the type of riparian forest that will be a topic in the survey and some of the wildlife found in that type of habitat. Pages 6, 7, and 8 describe the attributes that appear in the later choice questions. The "Expected Future" with no intervention is described in contrast to potential management changes. The first attribute covers management changes that would retain more of the current flow miles and riparian forest acres in the North or South. Representative photos show the riparian area with and without perennial water, along with the specific numeric changes in flow mileage and forest acres that the survey will consider. Photos are shown for both South and North since the vegetation character differs; this difference is also described in the text and numerically in terms of acres of forest. Page 8 describes the second attribute of management changes, the safety of direct contact with the water in the South and North. The term "Full Body Contact" is equivalent to "swimmable" water quality, but since the river reaches are typically not deep enough to swim in, the more familiar "swimmable" term is not used because it could be misleading. Page 9 is an example vote showing the format of the choice question, and page 10 describes how the attribute levels will be presented.

PART 2
Question 1: This question begins the respondent's process of weighing the relative importance of the different attributes that appear in the choice experiment in questions 2 through 6.

Questions 2 through 6: These questions comprise the choice experiment portion of the survey, where respondents choose between different cost levels and different marginal changes in river-related attributes. There is always an opt-out zero cost option. Following standard choice experiment techniques, the options (also known as profiles) participants choose between will be a fraction of the theoretically possible combinations of attributes, selected to yield the most informative preference information. There will be different survey versions, each with five questions (also known as "replications"), an efficient way of allowing a sufficient number of tradeoffs for model estimation. Different survey versions allow "blocking" the still-large number of tradeoff questions into different groups. These practices save expense and also reduce the sample size and associated public burden.

Choice Experiment Design
All possible choice profile tradeoffs could be presented to respondents, but this would be an inefficient way to gauge preferences. Instead, "fractional factorial" designs are the standard approach (Louviere et al., 2000). Statistical software will be used to develop the most efficient choice experiment design, given a total number of choice sets and a provisional beta vector (Kuhfeld, 2010). Essentially, the software searches for the questions, subject to the constraint on the number of choice sets, likely to yield the most preference information. The computer-generated design will be manually checked for any potentially dominating choices, or scenarios that may seem implausible to respondents. The total number of choice sets must be at least as large as the number of parameters to be estimated, and is typically much larger. A number of choice sets that allows each level to occur an equal number of times is also desirable, for balance (Kuhfeld, 2010; pg. 78).
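For intuition, the sketch below enumerates a small full factorial and checks level balance. The attributes and level counts here are hypothetical; the actual design will come from statistical design software (Kuhfeld, 2010), not from code like this.

```python
from collections import Counter
from itertools import product

# Hypothetical attributes and level counts, for illustration only.
levels = {"flow": [0, 1, 2], "contact": [0, 1], "cost": [0, 1, 2]}

# The full factorial is every combination of every level.
full_factorial = list(product(*levels.values()))
print(len(full_factorial))  # 3 * 2 * 3 = 18 possible profiles

def level_counts(design, attr_index):
    """How often each level of one attribute appears in a design."""
    return Counter(profile[attr_index] for profile in design)

# A balanced design shows each level of an attribute equally often;
# the full factorial is trivially balanced (each flow level appears 6 times).
print(level_counts(full_factorial, 0))
```

A fractional factorial keeps only a subset of these 18 profiles; the same `level_counts` check can verify that a candidate fraction preserves level balance.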

PART 3
Questions 7 through 14: These questions shed light on the motivations for respondents' answers and test for inconsistencies in their responses. Question 9 helps identify "protest bids," that is, occasions when people choose not to pay for philosophical reasons rather than because the price is too high. Recreation behavior questions 12 and 13 provide insight into recreational preferences specific to the resources being considered.

PART 4
Questions 15 through 22: These are sociodemographic questions that allow comparing the achieved sample with the target population as a gauge of representativeness.

3. PRETESTS AND PILOT TESTS
Pretests

The survey content and format have undergone extensive pretesting. Several phases of qualitative survey development have occurred and are ongoing under a different ICR (ICR # 2090-0028). The earliest phases of survey development did not present study participants with a survey; instead, the intention of those research sessions was to identify the most relevant variables to include. The first phase of qualitative cognitive interviews occurred in October 2011, held with a convenience sample of 12 neighborhood presidents in Tucson. In the spring of 2012, there were 10 focus groups in southern Arizona: 8 in Tucson, 1 in Rio Rico, and 1 in Tubac. Focus group participants were recruited from the general public by a market research contractor using standard market research methods, including paying participants an incentive fee as compensation for the opportunity cost of their time. At this stage participants were still not asked to take or react to a survey; instead they were asked to help ORD identify the attributes important to them about southern Arizona rivers and streams generally, and the Santa Cruz River in particular.

The information from the above qualitative research was used to develop a first draft of a survey focusing on the North Santa Cruz, the reach nearest to southern Arizona population centers. This version was pretested in the fall of 2012. Cognitive interviews were conducted with 17 persons, all recruited from the general population of Tucson by a market research contractor, again paying incentive fees. The survey draft included the key variables identified by the earlier qualitative research, but the list of attributes appeared to be near the limit of what respondents could effectively consider. In addition, two attributes both dealing with forest acreage were easily confused. At the conclusion of the pretest, participants were asked to comment on the relative appeal of preserving the North Santa Cruz River versus the South Santa Cruz River, based on a map of their respective locations and representative photos of the two areas. Many participants preferred maintaining the South Santa Cruz location despite it being farther from Tucson. Based on these pretests it was decided to narrow the scope of attribute types, and to feature both the North and South so as to include two different effluent-dominated sections of the same river.

For the revised survey, the principal investigator requested and received updated natural science modeling of the relationship between surface water and forest acreage for the North and South Santa Cruz River (personal communication, J. Stromberg, May 2013). In the spring of 2013, an additional 2 focus groups and 8 cognitive interviews were held, with one cognitive interview still pending. One focus group and 2 of the interviews were with people living in the Phoenix area (the pending interview is also with a Phoenix-area resident); the other focus group and the remaining interviews were with persons living in the Tucson area. The survey booklet format and improved debriefing questions were adapted from a prior EPA survey (OMB # 2020-0283). The current draft survey instrument has also been uploaded to the federal register (note that the page numbers are out of sequence in the electronic file; they are ordered so that the file will print correctly double-sided).

The current draft reflects several edits made during the latest round of pretests (which is still ongoing). A primary change was revising the description of full contact versus partial contact recreation. It was originally posed as "Safe" or "Unsafe" for contact recreation. In the early sessions it was found that people frequently felt compelled to vote for "Safe," presuming there to be a significant public health hazard if the water were left "Unsafe." The labeling in the choice experiment matrix was therefore changed to the current "Yes" vs. "No" language, along with a fuller description of the attribute, which helped participants gain a more accurate grasp of the issue. Another change was the inclusion of more background describing the "expected future" and the competing options. In particular, the revisions emphasized the difference between North and South forest, which had not been well conveyed to some respondents despite photo visual aids. In addition, color coding for attributes that remain at "expected future" levels was used to make differences between choice questions clear at a glance; previously, many participants described confusion because the series of choice questions appeared to be the same.

Pilot Test 

A pilot survey will be mailed to a subset of the Phoenix and Tucson samples. This will not represent an additional burden, but rather a designated fraction of the total mailing. It will allow the survey to be adjusted for any problems that surface in this initial wave of returns before committing to the full mailing. The entire beta parameter vector (see the econometric specification section below) may be revised based on analysis of the results; in particular, the cost levels may need adjustment to efficiently bracket values. Costs will not vary between the Phoenix and Tucson samples, to guard against "starting point bias" as a potential explanation for any statistically significant differences in values between Phoenix and Tucson respondents.

4. COLLECTION METHODS AND FOLLOW-UP.
(a) Collection Methods

A mail survey collection method is selected due to its frequent and successful use in the choice experiment literature and its relatively low cost.

(b) Survey Response And Follow-up

Multiple contact methods (Dillman, 2009; pg 242) will be used. Those who have already responded will be tracked in a spreadsheet to ensure follow-up mailings are sent only to those who have not yet responded.

5. ANALYZING AND REPORTING SURVEY RESULTS
(a) Data Preparation

All data entry will be conducted by the Principal Investigator. Debriefing question responses that are at odds with voting question responses will be used to flag respondent confusion, and those data will not be used. Responses indicating protest of the payment vehicle used in this survey (i.e., philosophical objection to increased taxes) will not be used. Responses from persons less than 18 years of age, as indicated by the "what year were you born" question, will not be used.

(b) Analysis

A standard multinomial logit model, as described by Ben-Akiva and Lerman (1985), will be fit to the data. Let U denote household utility (well-being). Consider U to be a function of a vector z_in of attributes of alternative i, as perceived by household respondent n. Variation in preferences between individuals is partially explained by a vector S_n of sociodemographic characteristics for person n.

U_in = V(z_in, S_n) + ε(z_in, S_n) = V_in + ε_in

The "V" term is known as indirect utility and "ε" is an error term treated as a random variable (McFadden 1974), making utility itself a random variable. An individual is assumed to choose the option that maximizes his or her utility. The choice probability of any particular option (Expected Future, Option A, or Option B) is the probability that the utility of that option is the greatest across the choice set C_n:

P(i | C_n) = Pr[V_in + ε_in ≥ V_jn + ε_jn, for all j ∈ C_n, j ≠ i]

If the error terms are assumed to be independently and identically distributed, and if this distribution can be assumed to be Gumbel, the above can be expressed in terms of the logistic distribution:

P_n(i) = e^(μV_in) / Σ_{j∈C_n} e^(μV_jn)

The summation is over all options j in the choice set C_n. The assumption of independent and identically distributed error terms implies independence of irrelevant alternatives, meaning the ratio of choice probabilities for any two alternatives is unchanged by the addition or removal of other unchosen alternatives (Blamey et al. 2000). The "μ" term is a scale parameter; a convenient value may be chosen for it without affecting valuation results if the marginal utility of income is assumed to be linear. The analyst must specify the deterministic portion of the utility equation "V," with subvectors z and S. The vector z comes from the choice experiment attributes, and the vector S comes from the attitudinal, recreational, and sociodemographic questions in the survey. Econometric software will be used to estimate the regression coefficients for z and S, with a linear-in-parameters model specification. These coefficients are then used to estimate the average household value of a change from one level of a particular attribute to another. The welfare of a change is given by (Holmes & Adamowicz 2003):

Welfare ($) = (1/β_c)[V^0 - V^1]

where β_c is the coefficient on cost, V^0 is the indirect utility of the initial scenario, and V^1 that of the change scenario.
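The probability and welfare formulas above can be sketched numerically; all utility values and coefficients below are illustrative, not estimates:

```python
from math import exp

def mnl_probabilities(v, mu=1.0):
    """Multinomial logit choice probabilities:
    P(i) = e^(mu*V_i) / sum_j e^(mu*V_j)."""
    weights = [exp(mu * vi) for vi in v]
    total = sum(weights)
    return [w / total for w in weights]

# Illustrative indirect utilities for Expected Future, Option A, Option B.
v = [0.0, 0.6, 0.3]
probs = mnl_probabilities(v)
print([round(p, 3) for p in probs])  # Option A is the most likely choice

# Welfare of moving from scenario V^0 to V^1 (Holmes & Adamowicz, 2003),
# with an illustrative negative cost coefficient.
beta_cost = -0.05
v0, v1 = 0.0, 0.6
welfare = (1.0 / beta_cost) * (v0 - v1)
print(welfare)  # positive dollars per household for the improvement
```

Note the signs: because β_c is negative, a utility-increasing change (V^1 > V^0) yields a positive dollar welfare measure.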

Econometric Specification

A main effects utility function is hypothesized for each of the Phoenix and Tucson metro areas. A generic format of the indirect utility function to be modeled is:

V = β_0 + β_1(North Flow & Forest Change) + β_2(North Full Contact Recreation Change) + β_3(South Flow & Forest Change) + β_4(South Full Contact Recreation Change) + β_5(Cost)
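A sketch of this linear-in-parameters specification with entirely hypothetical coefficient values; the implied marginal willingness to pay for an attribute is the familiar ratio -β_attr/β_cost:

```python
# Hypothetical coefficients for the generic specification above.
betas = {
    "const": 0.2,
    "north_flow_forest": 0.5,
    "north_full_contact": 0.3,
    "south_flow_forest": 0.4,
    "south_full_contact": 0.25,
    "cost": -0.05,
}

def indirect_utility(x):
    """V = b0 + sum of coefficient * attribute level for one option."""
    return betas["const"] + sum(betas[k] * v for k, v in x.items())

# One hypothetical option: North flow/forest retained, South made
# safe for full contact, at a $40 annual household cost.
option_a = {"north_flow_forest": 1, "north_full_contact": 0,
            "south_flow_forest": 0, "south_full_contact": 1, "cost": 40}
print(indirect_utility(option_a))

# Implied marginal WTP for the North flow & forest attribute
# is -beta_attr / beta_cost, approximately $10 here.
print(-betas["north_flow_forest"] / betas["cost"])
```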

(c) Reporting Results
The results will be written up and submitted to a peer-reviewed environmental journal. 



References

Arizona Department of Environmental Quality. 2010. Draft 2010 Status of Water Quality in Arizona 305(b) Assessment and 303(d) Listing Report. http://www.azdeq.gov/environ/water/assessment/assess.html. Retrieved March, 2013.

Arrow, K., Solow, R., Leamer, E., Portney, P., Radner, R., Schuman, H., 1993. Report of the NOAA panel on contingent valuation. Federal Register 58(10): 4602-14.

Bateman, I.J., R.T. Carson, B. Day, M. Hanemann, N. Hanley, T. Hett, M. Jones-Lee, G. Loomes, S. Mourato, E. Ozdemiroglu, D.W. Pearce, R. Sugden, and J. Swanson. 2002. Economic Valuation with Stated Preference Surveys: A Manual. Northampton, MA: Edward Elgar.

Ben-Akiva, M., and S. R. Lerman. 1985. Discrete choice analysis. MIT Press, Cambridge, Massachusetts.

Berrens, R. P., A. K. Bohara, C. L. Silva, D. Brookshire, and M. McKee. 2000. Contingent values for New Mexico instream flows with test of scope, group-size reminder and temporal reliability. Journal of Environmental Management 58:73 - 90.

Blamey, R. K., J. W. Bennett, J. J. Louviere, M. D. Morrison, and J. Rolfe. 2000. A test of policy labels in environmental choice modelling studies. Ecological Economics 32:269 - 286.

Boyd, J., Banzhaf, S., 2007. What are ecosystem services? The need for standardized environmental accounting units. Ecological Economics 63 (2 - 3), 616 - 626.

Boyd, J., Krupnick, A., September, 2009. The Definition and Choice of Environmental
Commodities for Nonmarket Valuation Resources For the Future Discussion Paper 09-35. 60p.

Brouwer, R. 2000. Environmental value transfer: state of the art and future prospects. Ecological Economics 32(1): 137-152.

Bureau of Labor Statistics. 2011. http://www.bls.gov/oes/current/oessrcma.htm. Retrieved March, 2013.

Desvousges, W.H., Naughton, M.C., and G.R. Parsons. 1992. Benefit transfer: conceptual problems in estimating water quality benefits using existing studies. Water Resources Research 28 (3), 675 - 683.

Dillman, D.A., J.D. Smyth, and L.M. Christian. 2009. Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method. Third Edition. John Wiley & Sons, Inc., Hoboken, N.J.

Frisvold, G., and T.W. Sprouse. 2006. Willingness to Pay for Binational Effluent. Water Sustainability Program. http://wsp.arizona.edu/node/277. Retrieved May, 2012. 

Hoehn, J.P., Lupi, F., Kaplowitz, M.D., July, 2003. Untying a Lancastrian bundle: valuing ecosystems and ecosystem services for wetland mitigation. Journal of Environmental Management 68(3): 263-272. 

Holmes, T. P., and W. L. Adamowicz. 2003. Attribute-based methods. Pages 171 - 220 in P. A. Champ, K. J. Boyle, and T. C. Brown, editors. A primer on nonmarket valuation. Chap . 6. Kluwer Academic Publishers, The Netherlands.

Johnston, R.J., Weaver, T.F., Smith, L.A., Swallow, S.K., April, 1995. Contingent Valuation Focus Groups: Insights from Ethnographic Interview Techniques. Agricultural and Resource Economics Review, 56-68.

Kaplowitz, M.D., Hoehn, J.P., February, 2001. Do focus groups and individual interviews reveal the same information for natural resource valuation? Ecological Economics 36(2): 237-247. 

Kuhfeld, W.F. 2010. Marketing Research Methods in SAS. SAS 9.2 Edition, MR-2010. Available for download at: http://support.sas.com/techsup/technote/mr2010.pdf. 

Louviere, J.J., D.A. Hensher, and J.D. Swait. 2000. Stated Choice Methods: Analysis and Application. Cambridge University Press. 402 p.

McFadden, D. 1974. Conditional logit analysis of qualitative choice behavior. Pages 105 - 142 in P. Zarembka, editor. Frontiers in econometrics. Chap. 4. Academic Press, New York.

Morgan, D.L., and R.A. Krueger. 1998. Focus Group Kit (6 volumes). Sage Publications, Thousand Oaks, CA.

Morrison, M. 2000. Aggregation biases in stated preference studies. Australian Economic Papers 39:215 - 230.

NOAA Office of Habitat Conservation and Office of Response and Restoration, and Stratus Consulting. 2012. Ecosystem Valuation Workshop (binder prepared for workshop participants). Dates: June 6-7, 2012. Location: Asheville, N.C. 

Norman, L.M.; N. Tallent-Halsell, W. Labiosa, M. Weber, A. McCoy, K. Hirschboeck, J. Callegary, C. van Riper III, and F. Gray. 2010. Developing an Ecosystem Services Online Decision Support Tool to Assess the Impacts of Climate Change and Urban Growth in the Santa Cruz Watershed; Where We Live, Work, and Play. Sustainability 2(7):2044-2069. 

NSF. 2010. Press Release 10-182: NSF Awards Grants for Study of Water Sustainability and Climate. http://www.nsf.gov/news/news_summ.jsp?cntn_id=117819. Retrieved April, 2013.

Orme, B. 1998. Sample Size Issues for Conjoint Analysis Studies. Sawtooth Software Research Paper Series, Sawtooth Software, Inc.

Ringold, P.L., Boyd, J.W., Landers, D., Weber, M., Meeting Date: July 13 to 16, 2009. Report from the Workshop on Indicators of Final Ecosystem Services for Streams. EPA/600/R-09/137. 56 p. http://www.epa.gov/nheerl/arm/streameco/index.html

Ringold, P.L., J. Boyd, D. Landers, and M. Weber. 2013. What data should we collect? A framework for identifying indicators of ecosystem contributions to human well-being. Frontiers in Ecology and the Environment 11: 98 - 105.

Rubin, H.J., Rubin, I.S., 2005. Qualitative Interviewing. 2nd Edition. Sage Publications. Thousand Oaks, CA.

USEPA. 2013a. Research Programs: Science for a Sustainable Future. http://www.epa.gov/ord/research-programs.htm. Retrieved April, 2013.

USEPA. 2013b. Sustainability. http://www.epa.gov/sustainability/. Retrieved April, 2013.

Weber, M., Stewart, S., 2009. Public Valuation of River Restoration Options on the Middle Rio Grande. Restoration Ecology 17(6):762-771.

White, M.S. 2011. Effluent-Dominated Waterways in the Southwestern United States:
Advancing Water Policy through Ecological Analysis. Ph.D. Dissertation, Arizona State University. 244p.



