                           Supporting Statement for:
                                       
Willingness To Pay Survey for Santa Cruz River Management Options in Southern Arizona


1. Identification of the Information Collection.

	1(a) Title of the Information Collection

Willingness To Pay Survey for Santa Cruz River Management Options in Southern Arizona (New), EPA #2484.01, OMB #2080-NEW

	1(b) Short Characterization/Abstract

The USEPA Office of Research and Development (ORD) is investigating public values for scenarios of change for perennial reaches of the effluent-dominated Santa Cruz River, Arizona. These values will be estimated with a willingness to pay mail survey instrument. Two effluent-dominated perennial reaches are considered in the survey: a "South" reach, which starts at an outfall in Rio Rico, AZ and flows northward through Tumacácori National Historical Park, and a "North" reach, fed by two outfalls in northwest Tucson, AZ, which flows northwest through Marana, AZ. Elsewhere north of where the channel crosses the border with Mexico, about 5 miles east of Nogales, the Santa Cruz River is ephemeral. For each of the South and North reaches, two different scenarios of change are considered. The first is a reduction in flow length and the associated decrease in cottonwood-willow riparian forest, a rare forest type in the region. The second is an increase in water quality sufficient to allow full contact recreation, such as submersion, at normal flow levels. The baseline flow length and forest acreages, as well as the forest acreages that would be associated with reduced flow lengths, are derived from natural science information and modeling. A choice experiment framework is used with statistically designed tradeoff questions, in which options to maintain flow length and forest, or to increase effluent water quality, are posed as increases in a yearly household tax. Each choice question allows a zero-cost "opt out" option. The choice experiment is designed to allow isolation of the value of marginal change for each reach. A few additional questions are included to further probe the motivations for respondent choices, as well as their river-related recreation behavior. Several pages of background introduce the issue to respondents. Limited sociodemographic questions are included to gauge how well the sample respondents represent the target population. 
Samples of the two major metropolitan areas in southern Arizona, Phoenix and Tucson, will receive the survey. 
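The random-utility model underlying a choice experiment of this kind can be sketched as follows. The attribute names, levels, and parameter values below are invented for illustration and are not the survey's actual design; the sketch simulates choices among three options per question, fits a conditional logit model, and recovers marginal willingness to pay as the negative ratio of an attribute coefficient to the cost coefficient.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated choice sets with invented attributes (NOT the survey's
# actual design): miles of perennial flow, whether full contact
# recreation is safe (0/1), and annual household cost in dollars.
n_sets, n_alts = 2000, 3
miles = rng.uniform(0, 10, (n_sets, n_alts))
contact = rng.integers(0, 2, (n_sets, n_alts)).astype(float)
cost = rng.uniform(0, 100, (n_sets, n_alts))
X = np.stack([miles, contact, cost], axis=2)      # (sets, alts, attrs)

# "True" preference weights, used only to simulate respondents.
true_beta = np.array([0.30, 1.00, -0.05])
utility = X @ true_beta + rng.gumbel(size=(n_sets, n_alts))
choice = utility.argmax(axis=1)                   # chosen alternative

def choice_probs(beta):
    v = X @ beta
    v -= v.max(axis=1, keepdims=True)             # numerical stability
    e = np.exp(v)
    return e / e.sum(axis=1, keepdims=True)

# Conditional logit fit by Newton-Raphson (Fisher scoring).
beta = np.zeros(3)
for _ in range(25):
    p = choice_probs(beta)
    xbar = (p[:, :, None] * X).sum(axis=1)        # expected attributes
    grad = (X[np.arange(n_sets), choice] - xbar).sum(axis=0)
    dev = X - xbar[:, None, :]
    fisher = np.einsum("tj,tja,tjb->ab", p, dev, dev)
    beta += np.linalg.solve(fisher, grad)

# Marginal willingness to pay = -(attribute coeff.) / (cost coeff.)
wtp_per_mile = -beta[0] / beta[2]
print(f"estimated WTP per mile of flow: ${wtp_per_mile:.2f} per year")
```

With the simulated weights above, the recovered willingness to pay per mile should be near the true ratio 0.30/0.05 = $6 per household per year; the actual survey analysis would estimate such ratios from respondent choices rather than simulated ones.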

2. Need for and Use of the Collection 

	2(a) Need/Authority for the Collection
Current ORD research revolves around the theme of sustainability (USEPA, 2013a). An overarching goal cited on the USEPA website for sustainability research is:
	"EPA Sustainable communities research is providing decision tools and data for 	communities to make strategic decisions for a prosperous and environmentally 	sustainable future, and providing the foundation to better understand the balance between 	the three pillars of sustainability- environment, society and economy" (USEPA, 2013b). 
As part of including public input for finding the "balance" of sustainability, this survey research will estimate public values for management of the Santa Cruz River in southern Arizona. The Santa Cruz watershed is a subject of continuing research collaboration between USGS, ORD, and other partners, with a peer-reviewed research plan published by Norman et al. (2010). ORD is also collaborating with recipients of a recent National Science Foundation grant engaging in Santa Cruz River natural and social science research, led by the University of Arizona (NSF, 2010). The survey will gather public value information on Santa Cruz River management scenarios to complement partnering natural science research.    

	2(b) Practical Utility/Users of the Data

The primary reason for the proposed survey is public value research. The Santa Cruz River is a case study of a waterway highly impacted by human modifications and partly within an urban area. Nevertheless, it still provides potentially valuable ecological goods, such as rare riparian habitat and recreational opportunities, for the regional population. As a secondary purpose, the survey results may inform local decision-makers considering Santa Cruz River management options; water scarcity in the region raises periodic debate over the best uses of effluent.

3. Non-duplication, Consultations, and Other Collection Criteria 

	3(a) Non-duplication

Willingness to pay survey research has been conducted for other US rivers, including other southwestern US rivers (e.g., Weber and Stewart, 2009; Berrens et al., 2000). While benefit transfer techniques could be applied to the results of prior studies in an attempt to gain a limited sense of the public value for Santa Cruz River management options, value estimates derived by benefit transfer face a number of hurdles (e.g., Desvousges et al., 1992; Brouwer, 2000). Differences in the ecological good being considered, local availability of substitute resources, and local tastes and preferences are some of the limitations in transferring estimates from one study, or group of studies, to a new valuation context. The proposed survey incorporates natural science modeling of the relationship between surface water and riparian forest. The ecological changes respondents would value are tailored to the Santa Cruz River and explicitly defined in the survey, and the survey has been extensively pretested on the regional population to minimize cognitive problems.

EPA has not identified any other studies that would address public values for the Santa Cruz River options considered in the proposed survey. The survey options were specifically designed to encompass changes to both the South and North effluent-dominated perennial reaches, and were carefully defined to specify miles of flow, acreage of riparian forest, and safety of water contact for different types of recreation. The language, graphics, and question formats in the survey were carefully pretested. There is, however, prior willingness to pay survey research pertaining to the South reach of the Santa Cruz River (Frisvold and Sprouse, 2006). Although full documentation and results of that survey are neither available nor published, further information on it was sought before initiating plans for this new collection (personal communication with G. Frisvold, August, 2009). It was determined that the prior survey, even if results become available, would not be sufficient for the Santa Cruz research goals of this study. This survey will ask respondents to choose between numerically defined levels of environmental attributes matching natural science quantitative modeling (alongside varying cost levels). In contrast, the prior survey asked a more general question of willingness to pay "to permanently preserve the Santa Cruz River habitat as it is today", using photos to describe a baseline perennial stream habitat and a changed habitat without perennial flow. Unfortunately, the specific marginal change to be valued was not defined, limiting the reliability of respondent data as well as the ability to apply results to the gradient of possible management scenarios. 
Furthermore, a key geographic difference is that the prior survey focused on the South Santa Cruz only, whereas this survey considers management changes to both the South and North Santa Cruz River, and the reaches have significant differences in both vegetation and proximity to the Tucson urban area. 

	3(b) Public Notice Required Prior to ICR submission to OMB

This is the second of two Federal Register notices. The first public notice period was 60 days and closed on July 8, 2013 (78 FR 26773, May 8, 2013).

	3(c) Consultations

The principal investigator for this effort is Matthew Weber, postdoctoral researcher at USEPA, ORD, Western Ecology Division, Corvallis, OR. The principal investigator has prior direct experience with willingness to pay survey research, having estimated public values for management changes to the river and riparian area of the Rio Grande in Albuquerque, New Mexico (Weber and Stewart, 2009). Previously approved OMB surveys were consulted in designing this survey, in particular a NOAA coral reef valuation study (OMB #0648-0585) and a USEPA study on fish and aquatic habitat impacts from cooling water intake structures (OMB #2020-0283). The survey instrument booklet format and several questions were adapted from the USEPA study. The principal investigator participated in a workshop among stated preference survey practitioners working on federal government projects, convened by NOAA and Stratus Consulting in June 2012 (NOAA and Stratus Consulting, 2012). This workshop was a helpful forum for comparing notes on willingness to pay survey design, with an emphasis on strategies for presenting ecological goods in a way meaningful to the lay public. Several completed or working draft willingness to pay survey instruments were presented for group discussion. 

Although the need to clearly define the ecological good to be valued is established advice for stated preference research (Arrow et al., 1993), Boyd and Krupnick (2009) found that practitioners sometimes use only vague language to define water-related ecological commodities.
This survey takes an explicit approach to defining the ecological commodities to be valued, following concepts described in Boyd and Banzhaf (2007). The ecological goods to be valued were derived from extensive focus group research during survey development. Throughout the focus group research, Dr. Paul L. Ringold, a research ecologist at USEPA ORD, was consulted for his experience identifying publicly valued stream commodities and metrics (Ringold et al., 2009; Ringold et al., 2013).

It should be noted that although all changes in the survey are theoretically possible, the survey does not attempt to research any specific policy proposed by a group or interest. In general, the attributes were chosen primarily by selecting a subset of the river and stream attributes expressed in locally convened focus groups for which plausible scenarios and choice information could be developed for the Santa Cruz River. In terms of actual local policy discussions, possibilities for changing flow and forest appear to have been discussed more than changes that would allow safe full body contact.

This survey includes natural science modeling outcomes that quantify the relationship between riparian forest acreage and surface flow extent in both the South and North Santa Cruz River, two river areas with different hydrogeological conditions. The natural science modeling is based on dissertation research at Arizona State University (White, 2011) and further analysis by Arizona State University Professor Dr. Juliet Stromberg (personal communication, May, 2013). Drs. White and Stromberg were consulted during development of the natural science background provided in the survey instrument, as well as the ecological changes the survey poses. Dr. Thomas Meixner and Ph.D. student Rewati Niraula at the University of Arizona were consulted on potential surface streamflow extents under different effluent release scenarios for the North and South reaches of the Santa Cruz River. The most recent Arizona Department of Environmental Quality (2010) report summarizing water quality status for the South and North reaches of the river was reviewed. Persons with either research or management interests in the Santa Cruz River (USGS, University of Arizona, Pima County, and Tumacácori National Historical Park) were consulted on the range of likely treated wastewater releases into the South and North reaches, and on the current state of water quality. Notably, zero effluent releases into the North or South Santa Cruz River are considered extremely unlikely, and accordingly the survey does not consider this possibility. Region 9 of US EPA was also consulted on the project, including Santa Cruz Watershed contact Jared Vollmer. 

Informal courtesy review comments were specifically solicited from three expert reviewers, two with stated preference experience and one with experience with Santa Cruz River issues. Their comments are included in Appendix 1. It should be noted that their comments should not be taken as indicating support of methodological decisions made in this ICR proposal, nor were the reviewers aware at the time that their comments might be included as part of a Federal Register notice. The first reviewer was Dr. V. Kerry Smith, Distinguished Sustainability Scientist at the Global Institute of Sustainability, and Regents' Professor in the Department of Economics at Arizona State University. Dr. Smith is a renowned expert in environmental economics techniques, including stated preference survey design. Dr. Smith made 10 points in his comments, and a follow-up teleconference was held with him to discuss the points and how the survey could be improved. Point 1) regards an insufficiently described payment vehicle. Revisions were made to pg. 6 describing that there are competing demands for water in southern Arizona, and that the payments would compensate for not selling the water to these competing demands. Further description of the taxes, such as how to address taxes typically being progressive (depending on income), was not added; pre-tests did not reveal doubt from respondents regarding the scenarios and the taxes posed, so adding complexity could be counterproductive and might actually invite suspicion. Point 2) is that the draft had too many choice panels. In the best of cases, each respondent is asked only one question, to maintain independence of responses. However, asking multiple questions, or "replications", is standard in choice experiment design given how much investment has been made in each survey to describe management options. 
There are concerns that the quality of responses declines as the number of replications increases. Based on Dr. Smith's suggestion, the 5 panels were reduced to 4 per survey, the lower end of the range of replications suggested by Bateman et al. (2002: pg 253). As Dr. Smith suggests, the experimental design will be inspected and modified as necessary to eliminate better, cheaper options within the same survey block, which can be confusing for respondents. Point 3) regards the description of the "Expected Future". This is similar to the complaint made by other reviewers, including Pima County (described below): the justification for reduced river flows was poorly described. Revisions were made to pg. 6 describing that there are competing demands for water in southern Arizona besides supporting instream flows. Furthermore, as Dr. Smith suggested, a specific time horizon was added for a more complete description of the Expected Future. Finally, there is now a summary graphic on pg. 8 for the flow and forest options. Point 4) deals with consequentiality: respondents must believe their responses matter and that the scenarios are real. Dr. Smith recommended stronger consequentiality language. Throughout pre-tests, respondents conveyed a strong sense that the scenarios were real, so the survey language was not edited. Point 5) notes the minimal description of ecological impacts of water other than the forest. Indeed, this is a weakness of the survey background, reflecting a general need for more robust "ecological production functions" describing specified and various impacts from environmental changes (USEPA 2009; e.g. pg 4). What makes this survey possible is the rare availability of a modeled relationship between water flow and forest change for the Santa Cruz River. Point 6) remarks on the awkwardness of the "full body contact" attribute name. 
In pre-tests ORD found that subjects understood this attribute as described, and thus ORD chose not to "rebrand" so central a feature of the background despite the wordiness. This attribute is meant to capture preferences for more direct recreational contact, in contrast to the ecological attribute of flow & forest (which does mention the presence of fish); thus fish are not mentioned in this attribute description. Point 7) suggests mentioning "alternative choices" rather than "tradeoffs" on pg 13; this change has been made. Point 8) suggests being clearer in the recreation question; the question has been modified to say "in the last 12 months". Point 9) addresses question 13, which is too direct and whose options may not be complete. This question was dropped in favor of collecting more detail on river-related recreational behavior, pursuant to Dr. Hoehn's comments (see below). Point 10) offers suggestions for further sociodemographic variables. These were all incorporated, with the exception of home ownership and boat ownership, for which there was insufficient space. Dr. Smith's last comment was to sketch how the results were designed to be meaningful for policy analysis, which is contained in other portions of this ICR supporting statement (Dr. Smith had not reviewed the supporting statement, only the survey itself).
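The dominance screening Dr. Smith suggested (removing any option that is at least as good on every attribute and cheaper than another option in the same block) can be sketched as follows. The attribute names and the example choice set are hypothetical, not taken from the actual experimental design.

```python
# A minimal dominance check for candidate choice sets, using
# hypothetical attribute dictionaries (flow miles, forest acres,
# safe contact 0/1, annual cost) -- not the actual survey design.

def dominates(a, b):
    """True if option `a` is at least as good as `b` on every
    attribute, at no greater cost, and strictly better somewhere."""
    at_least_as_good = (a["flow"] >= b["flow"]
                        and a["forest"] >= b["forest"]
                        and a["contact"] >= b["contact"]
                        and a["cost"] <= b["cost"])
    strictly_better = (a["flow"] > b["flow"] or a["forest"] > b["forest"]
                       or a["contact"] > b["contact"] or a["cost"] < b["cost"])
    return at_least_as_good and strictly_better

def flag_dominated(choice_set):
    """Return indices of options dominated by another option in the set."""
    return [j for j, b in enumerate(choice_set)
            if any(dominates(a, b)
                   for i, a in enumerate(choice_set) if i != j)]

# Example: option 1 offers strictly more of everything for less money,
# so option 0 is dominated and the set should be repaired or replaced.
candidate = [
    {"flow": 5, "forest": 200, "contact": 0, "cost": 40},
    {"flow": 8, "forest": 300, "contact": 1, "cost": 30},
    {"flow": 0, "forest": 100, "contact": 0, "cost": 0},   # opt-out
]
print(flag_dominated(candidate))   # option 0 is flagged here
```

A design generator would run such a check over every block and re-draw or swap alternatives until no flags remain, which is one way to implement the inspection described above.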

The second invited reviewer was Dr. John Hoehn, Professor of Environmental and Natural Resource Economics in the Department of Agricultural Economics at Michigan State University. Dr. Hoehn is also a renowned expert in environmental economics, including stated preference surveys. In particular, Dr. Hoehn has co-authored articles describing the crucial insights that arise during qualitative research in survey development (Kaplowitz and Hoehn, 2001; Hoehn et al., 2003). Dr. Hoehn's comments are attached in Appendix 1. A follow-up teleconference with Dr. Hoehn was held to talk through his comments and discuss how the survey could be improved. Dr. Hoehn's comments are broken into 5 points, prefaced by a note on the important role of pre-testing (he was at that point unaware of the pre-testing ORD had done). Point 1) is a series of questions about wastewater. These questions are in large part addressed by the survey background or the survey attributes; for example, the survey mentions that wastewater treatment would eliminate odor, and there is an attribute that captures the safety of water contact. The term treated wastewater was widely understood in pre-tests, and ORD wished to be plain about the source of the water rather than use a term such as "recycled water". Point 2) notes that the background material is quite lengthy. Dr. Hoehn suggests some areas to cut back information; however, in pre-tests subjects supported maintaining the level of information given in the survey, including the information on infiltration, since the state of the aquifer was a factor respondents considered. A problem with the draft implicit in Dr. Hoehn's comments was the impression that one could pay to get more water into the river. Instead, the intention is to investigate willingness to pay to maintain different extents of flow. Revisions were made to pg. 6 and a summary graphic was added on pg. 8 to make this clear. 
Point 3) asks why the photos on pg 7 aren't titled "with" and "without" water. The reason is that in pre-tests this led to the erroneous conclusion that zero water in the river was a possible scenario, which is not actually considered in the survey. Point 4) asks why a sample question is needed (pg 9). In initial pre-tests without a sample question, respondents were occasionally confused about where and how to answer, since choice experiments are not commonly found in surveys; thus a sample question was included. Point 5) notes that options in the South seem like a bargain compared with the North. The sample survey price levels were designed this way because, in pre-tests, people seemed to greatly prefer maintaining conditions closer to Tucson. Making prices cheaper for the South allows better estimation of the public values for ecological commodities in both locations: the more difficult the alternatives are to decide between, the more preference information ORD gains for the utility model. A further point raised in discussion was possibly changing question 13 to be more experientially based, such as asking the frequency of listed recreational activities rather than a "rate the importance of..." question. Responses might then be ways to organize willingness to pay results, or even predictors of willingness to pay. Based on this suggestion, a new experiential recreation question #11 was added, and the prior question #13 was dropped (see Dr. Smith point 9 above). 

Comments were also solicited from the Sonoran Institute, a southern Arizona non-profit with a specific program focused on the Santa Cruz River. The Sonoran Institute had also been consulted at various times earlier in the project, and had provided the draft cover photo sequence (which has been replaced in the revised survey, see Pima County comments below). Sonoran Institute comments are attached in Appendix 1. A teleconference was held to discuss how the survey could be improved based on Sonoran Institute comments. Most of the comments are suggestions to reduce the impression that the survey presents real, immediate options that authorities are considering. ORD explained that the wording was a methodological issue to ensure consequentiality for respondents (see Dr. Smith's point 4 above, which recommended even stronger consequentiality language). The less hypothetical the scenario appears to be, the less invitation there is for strategic behavior from respondents. An example of strategic behavior is a respondent claiming a willingness to pay larger than their actual willingness to pay. Like other reviewers, the Sonoran Institute noted the confusing language regarding why river flows would be cut back. This was addressed by changes to pg 6 and pg 8 as described above. Other minor errors noted by the Sonoran Institute were also corrected. 

During the first public notice period comments were received from Pima County, attached in Appendix 1. Although Pima County had been contacted at an earlier point of survey development, the Federal Register notice served to advertise the project more widely to additional Pima County staff. Through a series of follow-up teleconferences, ORD and Pima County discussed the issues raised and how the survey could be improved as a result. Pima County partitioned its submitted comments into four separate points. The first point focuses on whether the outcomes considered in the survey, safe full body contact in particular, are feasible for the prices listed. In follow-up, ORD explained that the methodology is not meant to pose actual estimated prices for the changes considered, but rather prices designed to bracket public values for those changes. The topic of whether safe full body contact is an appropriate standard for effluent-dominated waters was also raised. In follow-up, ORD described that the purpose of the survey is not to investigate a change to a regulation, but to conduct public value research for a potential change. A desire for safe full body contact in waterways was a frequent theme in public focus groups, and is also a long-time theme in environmental willingness to pay research (e.g., Carson and Mitchell, 1993). The second point notes that the survey does not fully describe the current and expensive upgrades to wastewater treatment. The survey does in fact mention this upgrade (as well as the upgrade to the treatment plant further south), so that respondents aware of these upgrades would not confound them with the changes considered in the survey. Focus groups and survey pre-tests found the odor associated with wastewater to be a particularly important factor; thus the fact that treatment upgrades will address odor is noted in two separate places in the survey, pg. 3 and pg. 6. 
More background on the wastewater upgrades could have been considered were the background material not already lengthy. The third point remarks that the survey neglects the benefits of using treated wastewater either off-channel as reclaimed water, or in-channel as a means of aquifer recharge. In follow-up, ORD agreed that all possible uses of treated wastewater could not be considered in the survey, and that only a few key issues are investigated. However, the recharge impact of in-channel wastewater was something about which the public displayed marked curiosity; thus there is a dedicated figure and explanation on pg 4 describing the recharge implications of in-channel use of treated wastewater. The figure on pg 4 was pre-tested and supported by pre-test subjects. The fourth point questions the survey frame: why Phoenix would be surveyed while Santa Cruz County would not. In follow-up, ORD described the experimental design goal of testing "market extent" for the Santa Cruz River as the reason for surveying the Phoenix area, which is indeed relatively far from the Santa Cruz River reaches described. Ideally Santa Cruz County would also be surveyed, but as a cost-saving measure only the two largest population centers of Arizona are proposed for the sample frame.  

Four additional issues were discussed in the course of conversation with Pima County that did not appear in their submitted comments. The first was that the cover photo sequence in the draft survey could be interpreted as sensationalistic, potentially casting Santa Cruz River management in a negative light, when in fact there has been considerable investment in and attention to Santa Cruz River conditions. Thus, photos from pg 7 were copied onto the cover, which still capture a demonstration of river conditions with and without water. The second topic was the potential impression that one could pay more in order to receive more wastewater releases to the river, mirroring Dr. Hoehn's comments (described above). The intention of the survey is to elicit willingness to pay to maintain different extents of river flow and forest, not willingness to pay to create new extents. Since this was insufficiently described in the draft survey, a graphic was added on pg 8 summarizing the information on the preceding pages, and the scenario description on pg 6 was revised. The third topic was the insufficient draft language describing the reason for river flows being cut back in the Expected Future, which other reviewers also remarked on. The draft language has now been revised to state that there are competing demands for water resources in southern Arizona besides supporting instream flows, and that the funds collected through taxes would be used as compensation for not selling the wastewater for off-channel purposes. The fourth topic was whether the estimated current extent of river flow for the North reach was accurate. In follow-up email dialogue with Pima County, including the previously consulted natural scientists Drs. Thomas Meixner and Juliet Stromberg, uncertainty in the length of flow was noted, since there are agricultural return flows as well as potentially increased infiltration resulting from water quality changes associated with the wastewater treatment upgrade. By necessity, the survey uses the best available knowledge.

	3(d) Effects of Less Frequent Collection

Without this collection, the public values for potential Santa Cruz River management scenarios could not be estimated. The management scenarios are specifically designed to relate to prevalent themes of public interest as derived from focus groups. Furthermore, the collection is a rare opportunity to link environmental value research with a highly developed natural science model of surface flow and accompanying forest vegetation available for the Santa Cruz River. In addition, the case study is of a highly impacted urban area, a situation for which few environmental valuation references are available. The survey will compare values for two contrasting ecosystem services at two separate locations, allowing numerous points of comparison of research interest. An additional point of research interest is the investigation of the "market extent" of the Santa Cruz River, achieved by comparing valuation results for Tucson residents, who are relatively nearby, with those for Phoenix residents, who are relatively far away. Finally, without the collection there would be less information on public values regarding Santa Cruz River management available to the local community for their planning purposes. 

	3(e) General Guidelines

The survey will not violate any of the general guidelines described in 5 CFR 1320.5 or in
EPA's ICR handbook.

	3(f) Confidentiality

All responses to the survey will be kept confidential. The surveys will be processed, including data entry, by the principal investigator; nobody else will have a record of who has responded or the answers of any given respondent. A list of the addresses of the members of the sample who have responded versus those who have not will be maintained in order to more efficiently mail reminders and replacement surveys. This will be a single file, accessible to and updated only by the principal investigator. To protect confidentiality in survey results, each respondent will be identified by a numeric code in that file rather than their name or address. The survey questions do not ask for any personally identifiable information and personally identifiable information will not be entered in the results even if volunteered by the respondent, for example in the comments section. In the cover letter, respondents will be informed that their responses will be kept confidential. After the data collection is complete, the respondent status file will be deleted, and only the numeric code assigned to each respondent will remain. After data entry is complete, the surveys themselves will be destroyed.
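The numeric-code scheme described above can be sketched in a few lines. The file names, fields, and example addresses below are illustrative assumptions; the actual tracking-file layout is not specified in this document.

```python
import csv
import random

# Illustrative sample addresses (hypothetical, for the sketch only).
addresses = ["100 Main St, Tucson AZ 85701",
             "200 Elm St, Phoenix AZ 85001"]

# Assign unique, non-sequential numeric codes so a code by itself
# reveals nothing about mailing order or respondent identity.
random.seed(42)
codes = dict(zip(addresses,
                 random.sample(range(100000, 999999), len(addresses))))

# Single tracking file, accessible only to the principal investigator,
# used to target reminder and replacement mailings; it is deleted
# once data collection is complete.
with open("respondent_status.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["code", "address", "responded"])
    for addr, code in codes.items():
        writer.writerow([code, addr, "no"])

# Survey results are stored under the numeric code only; names and
# addresses never enter the results file.
with open("results.csv", "w", newline="") as f:
    csv.writer(f).writerow(["code", "q1_choice", "q2_choice"])
```

Keeping the address-to-code mapping in one file, separate from the results file, is what allows the link to personal information to be destroyed at the end of data entry while the coded responses remain usable.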

The USEPA ORD office location (the Western Ecology Division of USEPA) and the USEPA ORD electronic file system used by the principal investigator are highly secure. A keycard possessed only by ORD employees and contractors is necessary to enter the building, and the principal investigator occupies a separate keyed office space within the secure building. The computer system where the personal names and addresses associated with respondent numeric codes will be stored during data entry is a secure server requiring the principal investigator's personal login username and password. At the conclusion of data entry, the file linking personal names and addresses to respondent codes will be destroyed (along with the hard copy survey responses themselves), and only respondent codes will remain. 

	3(g) Sensitive Questions

In focus groups, two questions were found to be sensitive by some participants: the question of racial category, and the question of income category. However, it was found that describing the research need for these questions, namely gauging how well different groups are represented in survey results, was accepted by these participants as a worthwhile reason for asking them. The reason for these questions now appears prominently preceding those questions within the survey: "We need the following questions to ensure votes from all groups have been fairly represented in this survey". Confidentiality of responses is then reiterated.

4. The Respondents and the Information Requested 

	4(a) Respondents/SIC Codes

The target respondents for this survey are household representatives 18 years or older in the two most populated urban areas of Arizona, the Phoenix metro area and the Tucson metro area. A sample of household representatives in each metro area will be contacted by mail following the multiple contact protocol in Dillman (2009). A response rate of 30% will be targeted. To increase response rates from the sample, several contacts will be used, including a prenotice to all recipients, a reminder postcard, and a follow-up mailing. The target responses from the Phoenix and Tucson metro areas are 250 households each, or 500 households total.  
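The mailing counts implied by the sampling plan above follow from simple arithmetic: a 30% response rate and 250 completed surveys per metro area. The ceiling rounding below is our assumption; the actual mailing plan may differ.

```python
import math

target_completes = 250   # completed surveys sought per metro area
response_rate = 0.30     # targeted response rate stated in the text

mailed_per_metro = math.ceil(target_completes / response_rate)
total_mailed = 2 * mailed_per_metro   # Phoenix + Tucson

print(mailed_per_metro, total_mailed)  # 834 per metro, 1668 total
```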

	4(b) Information Requested

		(i) Data items, including record keeping requirements

The current draft survey has also been uploaded to the Federal Register docket (note that the page numbers are out of sequence in the electronic file; they are sequenced so that they will print correctly double-sided). The survey is divided into 4 main parts. The first part is background for the choice questions. The second part is the choice questions themselves. The third part is questions designed to understand the context for why respondents answered the choice questions as they did. These questions include attitudinal questions as well as recreational preferences questions. The fourth part is designed to assess whether major sociodemographic categories of the received sample are representative of the population sampled. There are no record keeping requirements.

		(ii) Respondent Activities

The following respondent activities are envisioned. Participants will read the cover letter and survey, respond to the survey questions, and return the survey using a provided postage paid envelope. Focus group and cognitive interview participants typically took no longer than 30 minutes to complete the survey, so 30 minutes per response is the estimated burden for the average respondent. 

5. The Information Collected - Agency Activities, Collection Methodology, and Information Management

	5(a) Agency Activities

Development of the survey questionnaire through focus group and cognitive interview pretesting is occurring under the separate ICR# 2090-0028. Pretest techniques follow standard approaches in the qualitative methods literature (Morgan and Krueger, 1998; Rubin and Rubin, 2005), as well as guidance in the economics literature for the specific purposes of pretesting a willingness to pay survey (Johnston et al., 1995; Kaplowitz et al., 2001; Hoehn et al., 2003).

Under this ICR, agency activities will include: 
      * Develop the choice experiment design
      * Obtain a representative sample mailing list for each of the two target metro area populations, Phoenix and Tucson
      * Printing of questionnaires
      * Mailing of prenotices
      * Mailing of cover letters and questionnaires
      * Reminder mailings
      * Follow-up mailings and replacement questionnaires to non-respondents as needed
      * Data entry and quality assurance of data file
      * Analysis of survey results, including characterization of nonresponse and potential degree of nonresponse bias
      * Modeling choice experiment results with a standard multinomial logit approach
      * Reporting survey results

	5(b) Collection Methodology and Management

The proposed survey is a choice experiment questionnaire delivered and returned by mail. Standard multi-contact mail survey methods will be used to increase the response rate (Dillman, 2009, pg. 242). The desired number of completed surveys is 250 in each of the Phoenix and Tucson metro areas, with a target response rate of 30% in each area. Thus it will be necessary to contact approximately 834 households in each metro area.
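The contact count follows directly from the completion target and the assumed response rate; a quick arithmetic check:

```python
import math

# Households to contact per metro area: completes needed divided by the
# assumed 30% response rate, rounded up to a whole household.
target_completes = 250   # desired completed surveys per metro area
response_rate = 0.30
contacts_per_metro = math.ceil(target_completes / response_rate)
print(contacts_per_metro)  # 834
```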

Data quality will be monitored by checking returned survey responses for consistency, and by assessing any comments made on the survey or returned with the survey that signal strategic responses or respondent confusion. Coded survey data will not include any identifying information about the respondents. Returned survey data will be coded and used as the dataset for multinomial logit regression modeling.

	5(c) Small Entity Flexibility

This survey will be administered to individuals, not businesses. Thus, no small entities will be affected by this information collection.

	5(d) Collection Schedule
			
A breakdown of the expected collection schedule is as follows:

   * Week 1: Printing surveys
   * Week 2: First contact mailing for pilot survey, notifying that a survey will be mailed in 1-2 weeks
   * Week 3 and 4: Pilot survey mailing
   * Week 5 and 6: Pilot survey reminder postcards mailing
   * Week 7 through 9: Data entry of pilot survey results. Revising estimation of the beta vector (coefficients on utility variables, see part B of the supporting statement). Adjusting the choice experiment and cost levels for the main survey mailing based on the beta vector estimated from the pilot survey
   * Week 10: First contact mailing for main survey mailing, notifying that a survey will be mailed in 1-2 weeks
   * Week 11 and 12: Main survey mailing
   * Week 13 and 14: Main survey reminder postcards mailing
   * Week 15 through 18: Main survey additional reminders and replacement surveys as necessary to reach target response rate
   * Week 19 to 20: Data entry

The schedule above is staged such that if response rates are higher or lower than expected, the appropriate number of replacement surveys will be printed and mailed to most efficiently use funds.

6. Estimating The Burden and Cost of the Collection

	6(a) Estimating Respondent Burden

For a typical respondent, a conservative estimate of their time to review and respond to survey questions is 30 minutes. Assuming the target of 500 people total respond to the survey, the burden is 250 hours. This would be a one-time expenditure of their time. 

	6(b) Estimating Respondent Costs
		(i) Estimating Labor Costs

The Bureau of Labor Statistics reports average wage rates for some metropolitan areas, with the most recent data being May 2012 (Bureau of Labor Statistics, 2013). The average hourly wage for all occupations in the Phoenix metro area was $21.75, or an average cost per participant of $10.88. The average hourly wage for all occupations in the Tucson metro area was $20.45, or an average cost per participant of $10.23. Assuming 250 participants in each metro area fill out the survey, the total estimated respondent labor cost is $5,275. 

		(ii) Estimating Capital and Operations and Maintenance Costs

There are no anticipated capital, operations or maintenance costs associated with this collection.

		(iii) Capital/Start-up Operating and Maintenance (O&M) Costs

There are no anticipated capital, operations or maintenance costs associated with this collection.

		(iv) Annualizing Capital Costs

There are no anticipated capital, operations or maintenance costs associated with this collection.

	6(c) Estimating Agency Burden and Cost

The various aspects of the survey mailing are assumed to be done by the principal investigator, with an associated hourly wage rate of $32.50. Preparing survey mailings, tracking nonrespondents, sending new mailings as needed, and data entry are anticipated to amount to 8 weeks total or 320 hours of work. Agency labor cost would be 320 hours times $32.50 per hour or $10,400. 

	6(d) Estimating the Respondent Universe and Total Burden and Costs

Assuming 250 participants in each of the Phoenix and Tucson metro areas fill out the survey, the total labor cost will be $5,275.

	6(e) Bottom Line Burden Hours and Cost Tables

Item | Quantity | Cost
Public Burden: time burden (0.5 hours per respondent) | 500 persons | 250 hours; $5,275 labor
Agency Burden: time burden | Entire project | 320 hrs; $10,400 labor
Mailing list | 1,000 names in Tucson area & 1,000 names in Phoenix area (accounts for % of incorrect addresses) | $800
Prenotice letter paper and printing | 1,700 pieces | $100
Prenotice envelopes | 1,700 pieces | $150
Prenotice postage (bulk mail) | 1,700 pieces | $700
Color surveys paper and printing | 2,000 pieces (includes estimated replacements) | $2,900
Printing return envelopes (10.5" x 7.5") | 2,000 pieces (includes estimated replacements) | $450
Outgoing envelopes (11.5" x 8.75") | 2,000 pieces (includes estimated replacements) | $300
Outgoing survey postage (bulk mail) | 2,000 pieces (includes estimated replacements) | $1,900
Return survey postage (bulk mail) | 500 pieces | $500
Reminder postcard paper & printing | 1,700 pieces | $100
Total | | $23,575

The estimated respondent burden for this study is 250 hours and $5,275. The estimated agency cost for this study is 320 hours and $10,400. Agency costs besides labor hours total $7,900 for the mailing list, paper, printing, and postage.

	6(f) Reasons for Change in Burden

The survey is a one-time data collection activity.

	6(g) Burden Statement

The annual public reporting and recordkeeping burden for this collection of information is estimated to average 0.5 hours per response.  Burden means the total time, effort, or financial resources expended by persons to generate, maintain, retain, or disclose or provide information to or for a Federal agency.  This includes the time needed to review instructions; develop, acquire, install, and utilize technology and systems for the purposes of collecting, validating, and verifying information, processing and maintaining information, and disclosing and providing information; adjust the existing ways to comply with any previously applicable instructions and requirements; train personnel to be able to respond to a collection of information; search data sources; complete and review the collection of information; and transmit or otherwise disclose the information.  An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number.  The OMB control numbers for EPA's regulations are listed in 40 CFR part 9 and 48 CFR chapter 15.     

  To comment on the Agency's need for this information, the accuracy of the provided burden estimates, and any suggested methods for minimizing respondent burden, including the use of automated collection techniques, EPA has established a public docket for this ICR under Docket ID Number EPA-HQ-ORD-2013-0282, which is available for online viewing at www.regulations.gov, or in person viewing at the Office of Research & Development (ORD) Docket in the EPA Docket Center (EPA/DC), EPA West, Room 3334, 1301 Constitution Avenue, NW, Washington, D.C.  The EPA Docket Center Public Reading Room is open from 8:30 a.m. to 4:30 p.m., Monday through Friday, excluding legal holidays.  The telephone number for the Reading Room is (202) 566-1744, and the telephone number for the ORD Docket is (202) 566-1752.  An electronic version of the public docket is available at www.regulations.gov.  This site can be used to submit or view public comments, access the index listing of the contents of the public docket, and to access those documents in the public docket that are available electronically.  When in the system, select "search," then key in the Docket ID Number identified above.  Also, you can send comments to the Office of Information and Regulatory Affairs, Office of Management and Budget, 725 17th Street, NW, Washington, D.C. 20503, Attention: Desk Officer for EPA.  Please include the EPA Docket ID Number EPA-HQ-ORD-2013-0282 and OMB Control Number 2080-NEW in any correspondence.


Part B of Supporting Statement

1. Survey Objectives, Key Variables, And Other Preliminaries
(a) Survey Objectives

The objectives of the survey are bulleted below:
   * To estimate public values for changing the extent of the flow mileage and associated forest vegetation acreage along the effluent-dominated Santa Cruz River.
   * To estimate public values for full contact recreation in the effluent-dominated Santa Cruz such as submersion, as a change from partial body contact recreation such as wading.
   * To provide a case study for estimating public values for modifying river attributes of a waterway highly impacted by urban processes. 
   * To compare estimated public values for changing attributes of two different reaches of the Santa Cruz River, the South and the North.
   * To compare estimated public values between two population centers, the Phoenix metro area and the Tucson metro area. 
                                                          
(b) Key Variables

The survey asks respondents whether they would choose a permanent tax increase for their household in exchange for changes in Santa Cruz River attributes. The key variables are:

North Santa Cruz Flow and Forest: This is a bundled variable with two numeric values for each level, specific to the North Santa Cruz. One number shows miles of surface flow, and another number shows acres of cottonwood/willow forest associated with that flow. The survey considers three possible levels of this variable. The future baseline level is termed the "Expected Future"; two options for maintaining larger extents of flow and forest are also posed, the larger of which is the same as the current condition. 

North Santa Cruz Full Body Contact: This is a binary "Yes" or "No" variable specific to the surface flows in the North Santa Cruz. A "Yes" means the surface flow would be considered safe for full body contact at normal flow levels, including submersion. A "No" means the surface flow would be considered safe only for partial body contact, i.e., wading, at normal flow levels. The Expected Future and Current Condition are both "No", but the survey poses the possibility of a "Yes".

South Santa Cruz Flow and Forest: This is a bundled variable with two numeric values for each level, specific to the South Santa Cruz. One number shows miles of surface flow, and another number shows acres of cottonwood/willow forest associated with that flow. The survey considers three possible levels of this variable. The future baseline level is termed the "Expected Future"; two options for maintaining larger extents of flow and forest are also posed, the larger of which is the same as the current condition. 

South Santa Cruz Full Body Contact: This is a binary "Yes" or "No" variable specific to the surface flows in the South Santa Cruz. A "Yes" means the surface flow would be considered safe for full body contact at normal flow levels, including submersion. A "No" means the surface flow would be considered safe only for partial body contact, i.e., wading, at normal flow levels. The Expected Future and Current Condition are both "No", but the survey poses the possibility of a "Yes".

Tax increase per year: Each change varying from the expected future of reduced flows and cottonwood/willow acreages in the North and South has an associated cost. These cost levels currently vary from $0 for the Expected Future to as high as $60 (subject to change based on pilot survey results). These cost levels are not tied to actual cost estimates for the changes, but rather are designed to bracket values. That is, the design goal is to set cost levels such that some people agree to them and some do not. A pilot survey will be used to test whether the cost levels posed in the survey should be revised. The proposed draft survey and choice experiment design use 6 different possibilities for cost (including a zero cost option).

(c) Statistical Approach

The statistical approach to analyzing survey results will be a multinomial logit model (Ben-Akiva and Lerman, 1985).

(d) Feasibility

The survey has been extensively pretested, as described below, to reduce cognitive difficulties for respondents. The principal investigator has research funding to cover the costs of the survey. 

2. Survey Design
(a) Target Population And Coverage

The target population is households of the Phoenix and Tucson metropolitan areas in southern Arizona. Respondent coverage will be such that each household in each respective area has an equal probability of being selected to receive a survey, based on the mailing list.

(b) Sample Design
(i) Sampling Frame

The sample frame is households of the Phoenix and Tucson metropolitan areas in southern Arizona. A sample will be such that each household in each respective area has an equal probability of being selected to receive a survey, based on best available mailing lists, available from companies specializing in preparing mailing lists for surveys.

(ii) Sample Size

There is no single correct sample size choice, since there will be uncertainty in the estimates regardless of sample size. This survey design will utilize the rule of thumb available from the developers of Sawtooth Software (Orme, 1998), software popular for designing choice sets. This formula was also recently utilized by NOAA (OMB # 0648-0585). The rule-of-thumb formula for a minimum sample size is:

(n x t x a) / c >= 500

Where:
n = minimum number of respondents
t = number of choice questions
a = number of alternatives per task (not including the "status quo" option)
c = number of "analysis cells."

When considering main effects, c is equal to the largest number of levels for any single attribute. If considering all two-way interactions, c is equal to the largest product of levels of any two attributes (Orme, 1998). The sample size will be based on a main effects model. A minimum of 4 choice questions will be in each survey, with two options each (not counting the "expected future"), and the maximum number of levels for any single attribute is 3. Thus the minimum sample size "n" is 188 for each population to be sampled, and our target number of respondents for both the Phoenix and Tucson metro areas is 250. This value is larger than the 200 minimum suggested when the intent is to compare subgroups (Orme, 1998, pg. 67). Bateman et al. (2002; pg. 110) recommend a sample size of 500 to 1,000 (for each subgroup) for close-ended contingent valuation questions, but also note that a smaller sample size can be used if one collects more information per respondent (as with replications in choice experiments). With a target response rate of 30%, approximately 834 households in each metro area will receive a survey to reach the target of 250 respondents.
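Applying the rule of thumb with the design values above (t = 4 questions, a = 2 non-status-quo alternatives, c = 3 levels):

```python
import math

# Orme (1998) rule of thumb: (n * t * a) / c >= 500, solved for n.
t = 4  # choice questions per respondent
a = 2  # alternatives per task (excluding the status quo option)
c = 3  # largest number of levels for any single attribute (main effects)

n_min = math.ceil(500 * c / (t * a))
print(n_min)  # 188
```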

(iii) Stratification Variables

The Phoenix and Tucson metro areas will be treated as different populations.

(iv) Sampling Method

A sample mailing list for each of the two metro areas will be purchased from a mail survey support company. The company will be given instructions to prepare the sample such that each household in each metro area has an equal chance of being chosen (a simple random sample approach).

(v) Multi-Stage Sampling

Not applicable.

(c) Precision Requirements
(i) Precision Targets

Louviere et al. (2000: pg 262) provide a formula, based on elementary statistical theory, for the minimum sample size "n" for a target accuracy and confidence level when predicting a proportion. The formula assumes a large sampling-frame population, a simple random sampling strategy, and independent choice occasions for each respondent:

n >= (q / (r p a^2)) [Φ^-1((1+α)/2)]^2

where "p" is the population proportion (choice probability), predicted to within "a" percent of the true value with probability "α" or greater, "q" = 1 - p, "r" is the number of replications (choice occasions per respondent), and "Φ^-1(.)" is the inverse cumulative normal distribution function. This study will use a target of plus or minus 10% for predicting the population proportions to be estimated, with a probability of 0.95. Assuming a population proportion of 0.5, and an r of 4, the minimum sample size is 96 respondents, less than the 250 sample size planned for each metro area.
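As a check, the stated values can be substituted into the formula above (a minimal sketch; the Python standard library's inverse normal CDF stands in for Φ^-1):

```python
from statistics import NormalDist

# Louviere et al. (2000) minimum sample size with the values stated above:
# p = 0.5, accuracy a = 10%, confidence alpha = 0.95, r = 4 replications.
p, a, alpha, r = 0.5, 0.10, 0.95, 4
q = 1 - p
z = NormalDist().inv_cdf((1 + alpha) / 2)  # inverse normal CDF, ~1.96
n_min = (q / (r * p * a ** 2)) * z ** 2
print(round(n_min))  # 96
```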

The survey will be blocked into 9 versions (see Part B 2(d) below). Louviere et al. (2000), citing Bunch and Batsell (1989), recommend at least 6 respondents per block to satisfy large-sample statistical properties; our expectation is an average of 250/9, or about 28, responses per block, much higher. Nonetheless, during follow-up mailings to achieve the target overall response rate, it will also be ensured that enough responses are being received from each of the 9 survey versions.

(ii) Nonsampling error

With a target response rate of 30% there will be a large percentage of nonrespondents. If the preferences of nonrespondents differ markedly from those of respondents, nonresponse bias will affect the results. A nonresponse analysis will be conducted by comparing the sociodemographics of the respondents with those of the sampling frame. A description of any sociodemographic groups that were underrepresented in the responses will be included in the results write-up. If respondents and nonrespondents have markedly different sociodemographics, the sociodemographic vector of the sampling frame (as opposed to that of the respondents) will be fed into the model to obtain valuation estimates for an "average" household (Morrison, 2000; employed in Weber and Stewart, 2009).


(d) Questionnaire Design

The current draft survey is also uploaded to the Federal Register docket (note that the page numbers are out of sequence in the electronic file; they are sequenced so that they will print correctly double-sided). Below is a description of the sections and questions.

PART 1
Background. The cover photos show various states of the Santa Cruz River in both the North and the South, representing both perennial flow and downstream of where perennial flow ends. There are 4 photos in all, which are repeated with further detail on page 7. Page 2 shows the Santa Cruz River within the landscape of the broader river network in Arizona, for perspective. Page 3 shows where the treated wastewater flows in the South and North reaches are in southern Arizona, gives background on how they came to exist, and introduces why these resources might be relevant to the respondent. Page 4 describes the partitioning of treated wastewater releases and the relatively small fraction that is consumptively used by plants or that evaporates in the perennial river ecosystem, with the rest adding to groundwater supplies. Page 5 describes the type of riparian forest that will be a topic in the survey, and some of the wildlife found in that type of habitat. Pages 6, 7, 8, and 9 describe the different attributes that will appear in the later choice questions. The "Expected Future" with no intervention is described, as opposed to potential management changes. The first management-change attribute covers options that would retain more of the current-condition flow miles and riparian forest acres in the North or South. Representative photos are shown of the riparian area with and without perennial water, along with the specific numeric changes of flow mileage and forest acres that the survey will consider. Representative photos are shown for both South and North since the vegetation character differs; this difference is also described in the text and numerically in terms of acres of forest. Page 9 describes the second management-change attribute, the safety of direct contact with the water in the South and North. 
The term Full Body Contact is equivalent to "swimmable" water quality, but because the river reaches are typically not deep enough to swim in, the more familiar "swimmable" term could be misleading and was not used. Page 10 is an example vote showing the format of the choice question, and page 11 describes how the attribute levels will be described.

PART 2
Question 1: The first question on page 12 is designed to initiate the respondent's thought process of weighing the relative importance of the different attributes within the choice experiment in questions 2 - 5.

Questions 2 through 5: These questions comprise the choice experiment portion of the survey, where respondents choose between different cost levels and different marginal changes in river-related attributes. There is always an opt-out zero-cost option. Following standard choice experiment techniques, the options (also known as profiles) participants choose between will be a fraction of the theoretically possible combinations of attributes. The questions are designed to be difficult in order to efficiently yield preference information. There will be different survey versions, with 4 questions per survey (also known as "replications"), as an efficient method of allowing a sufficient number of tradeoffs for model estimation. Different survey versions allow "blocking" the still-large number of tradeoff questions into different groups. These practices save expense and also reduce the sample size and associated public burden.

Choice Experiment Design
All possible choice profile tradeoffs could be presented to respondents, but this would be an inefficient way to gauge preferences. Instead, "fractional factorial" designs are a standard approach (Louviere et al., 2000). The statistical software package SAS will be used to develop the most efficient choice experiment design, given a total number of design choice sets to manipulate as well as a provisional beta vector (Kuhfeld, 2010). Essentially, the software searches for the questions, given the constraint on choice sets, likely to yield the most preference information. The computer-generated design will be manually checked for any potentially dominating choices, or scenarios that may seem implausible to respondents. The total number of choice sets must be at least as large as the number of parameters to be estimated and is typically much larger. A number of choice sets that allows each level to occur an equal number of times is also desirable, for balance (Kuhfeld, 2010; pg. 78).

The proposed design is attached as Appendix 2. The choice experiment design balances the number of factors to be estimated and the funds available for the survey. To summarize relevant factors from Part B 1(b) above, there are 2 attributes with 3 levels each (North and South flow & forest), 2 attributes with 2 levels each (North and South full body contact), and 1 attribute with 6 levels (cost). An experiment of 72 choice profiles, plus one constant, no-cost, opt-out alternative (the Expected Future), was selected as the smallest number of profiles allowing both orthogonality and balance among the main effects to be estimated. These profiles were then optimally organized into choice sets (individual choice questions) and blocked into 9 survey versions with SAS software (v 9.3). The design was checked to ensure important effects would be estimable, and was manually inspected to ensure no dominating alternatives per question or per block, which resulted in modification of cost levels in a small number of choice profiles. Note that the choice experiment design is subject to update based on pilot testing (section 3 below).
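For context on the size of the fraction, the full factorial implied by the attribute levels above can be counted directly:

```python
import math

# Full factorial implied by the attributes summarized above: two 3-level
# attributes (North and South flow & forest), two 2-level attributes
# (North and South full body contact), and one 6-level cost attribute.
levels = [3, 3, 2, 2, 6]
full_factorial = math.prod(levels)
print(full_factorial)  # 216 possible profiles; the design selects a 72-profile fraction
```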

PART 3
Questions 6 through 13: These questions shed light on motivations for respondents' answers, and also test for inconsistencies in their responses. Question 8 helps identify "protest bids", that is, occasions when people choose not to pay for philosophical reasons rather than because the price is too high. Recreation behavior questions 11, 12, and 13 allow insight into recreational preferences related to the resources being considered. 

PART 4
Questions 14 through 23: These are sociodemographic questions that allow comparing the received sample with the sample population as a gauge of representativeness, as well as some potentially useful predictors of survey responses. 

3. PRETESTS AND PILOT TESTS
Pretests

The survey content and format have undergone extensive pretesting. Several phases of qualitative survey development occurred under a different ICR (ICR # 2090-0028). The earliest phases of survey development did not present study participants with a survey; instead, the intention of those research sessions was to identify the most relevant variables upon which to pursue follow-up quantitative valuation survey research. The first phase of qualitative cognitive interviews occurred in October 2011. Interviews were held with a convenience sample of 12 neighborhood presidents in Tucson. This stage was used to prepare ideas and moderation techniques for the upcoming phase of focus groups. In the spring of 2012, there were 10 focus groups in southern Arizona, with 8 in Tucson, 1 in Rio Rico, and 1 in Tubac. Focus group participants were recruited from the general public by a market research contractor using standard market research methods, including paying participants an incentive fee as compensation for the opportunity cost of their time. At this stage participants were not asked to take or react to a survey, but instead were asked to help ORD identify attributes important to them about southern Arizona rivers and streams, as well as about the Santa Cruz River in particular. 

The information from the above qualitative research was used to develop a first draft of a survey focusing on the North Santa Cruz, the reach nearest to southern Arizona population centers. This version was pretested in the fall of 2012. Cognitive interviews were conducted with 17 persons, all recruited from the general population of Tucson by a market research contractor, again paying incentive fees. The survey draft included the key variables identified by the earlier qualitative research, but the list of attributes seemed to be near the limit of what respondents could effectively consider. In addition, there were two attributes that both dealt with forest acreage and were easily confused: one based on whatever forest acreage would establish naturally along the nearby river, and a second attribute that increased forest acreage with the aid of drip irrigation. At the conclusion of the pretest, participants were asked to comment on the relative appeal of preserving the North Santa Cruz River versus the South Santa Cruz River, based on a map of their respective locations and representative photos of the two areas (the same photos represented on page 7 of the proposed survey). Some participants preferred maintaining the South Santa Cruz location despite it being farther from Tucson. Based on these pretests it was decided to narrow the scope of attribute types, and to feature both the North and South so as to include two different effluent-dominated sections of the same river. 

For the revised survey, the principal investigator requested and received updated natural science modeling of the relationship between surface water and forest acreage for the North and South Santa Cruz River (personal communication, J. Stromberg, May 2013). In the spring of 2013, there were an additional 2 focus groups and 9 cognitive interviews. One of the focus groups and 3 of the interviews were with people living in the Phoenix area, with one focus group and the remaining interviews being with persons living in the Tucson area. The survey was reformatted into a booklet format, adapted from a prior EPA survey (OMB # 2020-0283). The draft survey instrument has been uploaded to the Federal Register docket (note that the page numbers are out of sequence in the electronic file; they are sequenced so that they will print correctly double-sided). 

The final round of pre-tests verified the higher comprehension achieved by narrowing the attributes into the two categories of flow and forest, and safety of water contact. The current draft reflects several further edits made based on insights gained during the final round of pretests. A primary change was revising the description of full contact vs. partial contact recreation. It was originally posed as "Safe" for contact recreation or "Unsafe" for contact recreation. In the early sessions it was found that people frequently felt compelled to vote for "Safe", presuming there to be a significant public health hazard if the water was left "Unsafe". The labeling in the choice experiment matrix was thus changed to the current "Yes" vs. "No" language, along with a fuller description of the attribute, which helped participants gain a more accurate grasp of the issue. Another change was the inclusion of more background describing the "Expected Future" and the competing options. In particular, the revisions emphasized the difference between North and South forest, which complements the photo visual aids. In addition, color coding for attributes that remain at "Expected Future" levels was used to make differences between choice questions clear at a glance; previously, many participants described confusion that the series of choice questions appeared to be the same. 

The final round of pre-tests continued to show, as did previous pre-tests, strong public interest in preserving surface flow of rivers, even if the source is treated wastewater rather than natural flow. Different preferences for the various attributes were noted: some respondents looked for the greatest forest acreage change per dollar; others strongly preferred preserving river and forest resources closer to a city. In addition, there seemed to be a split between those who preferred to have the water be safe enough to swim in and those who were indifferent to this attribute. Furthermore, some respondents indicated that it would be difficult for them to trust contact with the water knowing that it was treated wastewater; for others this was not an issue.


Pilot Test 

A pilot survey will be mailed to a subset of the Phoenix and Tucson samples. This will not represent an additional burden; rather, it will be a designated fraction of the total mailing. The pilot allows the survey to be adjusted for any problems that surface in the initial wave of returns before committing to the full mailing. The beta parameter vector (see the econometric specification section below) may be revised based on analysis of the pilot results; in particular, the cost levels may need to be adjusted to efficiently bracket values.

4. COLLECTION METHODS AND FOLLOW-UP.
(a) Collection Methods

A mail survey collection method is selected due to its frequent and successful use in the choice experiment literature and its relatively low cost.

(b) Survey Response And Follow-up

Multiple contact methods (Dillman et al., 2009, p. 242) will be used. Those who have already responded will be tracked in a spreadsheet to ensure that follow-up mailings are sent only to those who have not yet responded.

5. ANALYZING AND REPORTING SURVEY RESULTS
(a) Data Preparation

All data entry will be conducted by the Principal Investigator. Debriefing question responses or other hand-written responses that indicate confusion about the voting questions will be flagged, and these data will not be used to estimate the choice model. Responses that indicate a protest of the payment vehicle used in this survey, i.e., a philosophical objection to increased taxes, will not be used. Responses from persons younger than 18 years of age, as indicated by the "what year were you born" question, will not be used.
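As an illustration only, the screening rules above can be expressed as a simple filter. The sketch below assumes a hypothetical list-of-dicts representation of coded survey returns; the field names ("flagged_confused", "protest_vote", "birth_year") and the survey year are illustrative placeholders, not actual codebook variables.

```python
SURVEY_YEAR = 2014  # assumption: the year the survey is fielded

def usable(resp, survey_year=SURVEY_YEAR):
    """Return True if a response passes all screening rules described above."""
    if resp.get("flagged_confused"):   # confusion noted in debriefing/written responses
        return False
    if resp.get("protest_vote"):       # philosophical objection to increased taxes
        return False
    birth_year = resp.get("birth_year")
    if birth_year is None or survey_year - birth_year < 18:
        return False                   # respondent younger than 18, or age unknown
    return True

# Three hypothetical returns: one usable, one flagged as confused, one under 18.
responses = [
    {"flagged_confused": False, "protest_vote": False, "birth_year": 1980},
    {"flagged_confused": True,  "protest_vote": False, "birth_year": 1975},
    {"flagged_confused": False, "protest_vote": False, "birth_year": 2000},
]
kept = [r for r in responses if usable(r)]
print(len(kept))  # only the unflagged adult respondent remains
```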

(b) Analysis

A standard multinomial logit model, as described by Ben-Akiva and Lerman (1985), will be fit to the data. Let U denote household utility (well-being). Consider U to be a function of a vector z_in of attributes for alternative i, as perceived by household respondent n. Variation in preferences between individuals is partially explained by a vector S_n of sociodemographic characteristics for person n.

U_in = V(z_in, S_n) + ε(z_in, S_n) = V_in + ε_in

The "V" term is known as indirect utility and "ε" is an error term treated as a random variable (McFadden 1974), making utility itself a random variable. An individual is assumed to choose the option that maximizes their utility. The choice probability of any particular option (Expected Future, Option A, or Option B) is the probability that the utility of that option is greatest across the choice set C_n:

P(i│C_n) = Pr[V_in + ε_in >= V_jn + ε_jn, for all j ∈ C_n, j ≠ i]

If the error terms are assumed to be independently and identically distributed, and if this distribution can be assumed to be Gumbel, the above can be expressed in terms of the logistic distribution:

P_n(i) = e^(μV_in) / ∑_(j ∈ C_n) e^(μV_jn)

The summation occurs over all options j in the choice set C_n. The assumption of independent and identically distributed error terms implies independence of irrelevant alternatives, meaning the ratio of choice probabilities for any two alternatives is unchanged by the addition or removal of other unchosen alternatives (Blamey et al. 2000). The "μ" term is a scale parameter; a convenient value may be chosen for it without affecting valuation results if the marginal utility of income is assumed to be linear. The analyst must specify the deterministic portion of the utility equation, V, with subvectors z and S. The vector z comes from the choice experiment attributes, and the vector S comes from attitudinal, recreational, and sociodemographic questions in the survey. Econometric software will be used to estimate the regression coefficients for z and S under a linear-in-parameters model specification. These coefficients are then used to estimate the average household value of a change from one level of a particular attribute to another. The welfare measure for a change is given by (Holmes & Adamowicz 2003):

Welfare = (1/β_c)[V_0 - V_1]

where β_c is the coefficient on cost, V_0 is the indirect utility of the initial scenario, and V_1 is that of the change scenario.
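As a numeric illustration of the two formulas above, the sketch below computes logit choice probabilities for one choice set and the welfare measure for a scenario change. All utilities and coefficients are hypothetical placeholders, not survey estimates.

```python
import math

# Logit choice probabilities, P_n(i) = e^(mu*V_in) / sum over j of e^(mu*V_jn),
# for one respondent facing Expected Future, Option A, and Option B.
mu = 1.0                 # scale parameter, conveniently set to 1
V = [0.0, 0.8, 0.5]      # illustrative deterministic utilities V_in
exp_v = [math.exp(mu * v) for v in V]
probs = [e / sum(exp_v) for e in exp_v]
print([round(p, 3) for p in probs])  # probabilities sum to 1; Option A most likely

# Welfare = (1/beta_c)[V_0 - V_1]. The cost coefficient is negative (a higher
# tax lowers utility), so a utility-improving change (V_1 > V_0) yields a
# positive household dollar value per year.
beta_cost = -0.02        # hypothetical coefficient on the yearly household tax
beta_forest = 0.004      # hypothetical coefficient per acre of riparian forest
V0 = beta_forest * 300   # initial scenario: 300 forest acres (illustrative)
V1 = beta_forest * 500   # change scenario: forest maintained at 500 acres
welfare = (1.0 / beta_cost) * (V0 - V1)
print(round(welfare, 2))  # positive yearly dollar value for the improvement
```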

Econometric Specification

A main effects utility function is hypothesized for each of the Phoenix and Tucson metro areas. A generic format of the indirect utility function to be modeled is:

V = β_0 + β_1(North Flow & Forest Change) + β_2(North Full Contact Recreation Change) + β_3(South Flow & Forest Change) + β_4(South Full Contact Recreation Change) + β_5(Cost)
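Given coefficient estimates for this specification, the implicit price (marginal willingness to pay) of each attribute follows from the welfare formula as the negative of the attribute coefficient divided by the cost coefficient, a standard result for linear-in-parameters models. A minimal sketch, with hypothetical placeholder coefficients rather than estimates:

```python
# Implicit price of each attribute: -beta_attribute / beta_cost, the standard
# marginal willingness-to-pay result for a linear-in-parameters utility model.
# All coefficient values below are hypothetical placeholders.
betas = {
    "north_flow_forest": 0.6,
    "north_full_contact": 0.3,
    "south_flow_forest": 0.4,
    "south_full_contact": 0.2,
    "cost": -0.02,
}
wtp = {k: -b / betas["cost"] for k, b in betas.items() if k != "cost"}
print(wtp)  # yearly household dollar values per unit of each attribute
```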

(c) Reporting Results
The results will be written up and submitted to a peer-reviewed environmental journal. 



References

Arizona Department of Environmental Quality. 2010. Draft 2010 Status of Water Quality in Arizona 305(b) Assessment and 303(d) Listing Report. http://www.azdeq.gov/environ/water/assessment/assess.html. Retrieved March, 2013.

Arrow, K., Solow, R., Leamer, E., Portney, P., Radner, R., Schuman, H., 1993. Report of the NOAA panel on contingent valuation. Federal Register 58(10): 4602-14. 

Bateman, I.J., R.T. Carson, B. Day, M. Hanemann, N. Hanley, T. Hett, M. Jones-Lee, G. Loomes, S. Mourato, E. Ozdemiroglu, D.W. Pearce, R. Sugden, and J. Swanson. 2002. Economic Valuation with Stated Preference Surveys: A Manual. Northampton, MA: Edward Elgar.

Ben-Akiva, M., and S. R. Lerman. 1985. Discrete choice analysis. MIT Press, Cambridge, Massachusetts.

Berrens, R. P., A. K. Bohara, C. L. Silva, D. Brookshire, and M. McKee. 2000. Contingent values for New Mexico instream flows with test of scope, group-size reminder and temporal reliability. Journal of Environmental Management 58:73 - 90.

Blamey, R. K., J. W. Bennett, J. J. Louviere, M. D. Morrison, and J. Rolfe. 2000. A test of policy labels in environmental choice modelling studies. Ecological Economics 32:269 - 286.

Boyd, J., Banzhaf, S., 2007. What are ecosystem services? The need for standardized environmental accounting units. Ecological Economics 63 (2 - 3), 616 - 626.

Boyd, J., Krupnick, A., September, 2009. The Definition and Choice of Environmental
Commodities for Nonmarket Valuation Resources For the Future Discussion Paper 09-35. 60p.

Brouwer, R. 2000. Environmental value transfer: state of the art and future prospects. Ecological Economics 32(1): 137-152.

Bunch, D.S., and Batsell, R.R. 1989. A Monte Carlo comparison of estimators for the multinomial logit model. Journal of Marketing Research 26: 56-68.

Bureau of Labor Statistics. 2011. http://www.bls.gov/oes/. Retrieved September, 2013.

Carson, R.T., and R.C. Mitchell. 1993. The Value of Clean Water: The Public's Willingness to Pay for Boatable, Fishable, and Swimmable Quality Water. Water Resources Research 29(7): 2445-2454. July.

Desvousges, W.H., Naughton, M.C., and G.R. Parsons. 1992. Benefit transfer: conceptual problems in estimating water quality benefits using existing studies. Water Resources Research 28 (3), 675 - 683.

Dillman, D.A., J.D. Smyth, and L.M. Christian. 2009. Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method. Third Edition. John Wiley & Sons, Inc., Hoboken, N.J.

Frisvold, G., and T.W. Sprouse. 2006. Willingness to Pay for Binational Effluent. Water Sustainability Program. http://wsp.arizona.edu/node/277. Retrieved May, 2012. 

Hoehn, J.P., Lupi, F., Kaplowitz, M.D., July, 2003. Untying a Lancastrian bundle: valuing ecosystems and ecosystem services for wetland mitigation. Journal of Environmental Management 68(3): 263-272. 

Holmes, T. P., and W. L. Adamowicz. 2003. Attribute-based methods. Pages 171 - 220 in P. A. Champ, K. J. Boyle, and T. C. Brown, editors. A primer on nonmarket valuation. Chap. 6. Kluwer Academic Publishers, The Netherlands.

Johnston, R.J., Weaver, T.F., Smith, L.A., Swallow, S.K., April, 1995. Contingent Valuation Focus Groups: Insights from Ethnographic Interview Techniques. Agricultural and Resource Economics Review, 56-68.

Kaplowitz, M.D., Hoehn, J.P., February, 2001. Do focus groups and individual interviews reveal the same information for natural resource valuation? Ecological Economics 36(2): 237-247. 

Kuhfeld, W.F. 2010. Marketing Research Methods in SAS. SAS 9.2 Edition, MR-2010. Available for download at: http://support.sas.com/techsup/technote/mr2010.pdf. 

McFadden, D. 1974. Conditional logit analysis of qualitative choice behavior. Pages 105 - 142 in P. Zarembka, editor. Frontiers in econometrics. Chap. 4. Academic Press, New York.

Morrison, M. 2000. Aggregation biases in stated preference studies. Australian Economic Papers 39:215 - 230.

Louviere, J.J., D.A. Hensher, and J.D. Swait. 2000. Stated Choice Methods: Analysis and Application. Cambridge University Press. 402 p. 

Morgan, D.L., and R.A. Krueger. 1998. Focus Group Kit (6 volumes). Sage Publications, Thousand Oaks, CA.

NOAA Office of Habitat Conservation and Office of Response and Restoration, and Stratus Consulting. 2012. Ecosystem Valuation Workshop (binder prepared for workshop participants). Dates: June 6-7, 2012. Location: Asheville, N.C. 

Norman, L.M.; N. Tallent-Halsell, W. Labiosa, M. Weber, A. McCoy, K. Hirschboeck, J. Callegary, C. van Riper III, and F. Gray. 2010. Developing an Ecosystem Services Online Decision Support Tool to Assess the Impacts of Climate Change and Urban Growth in the Santa Cruz Watershed; Where We Live, Work, and Play. Sustainability 2(7):2044-2069. 

NSF. 2010. Press Release 10-182: NSF Awards Grants for Study of Water Sustainability and Climate. http://www.nsf.gov/news/news_summ.jsp?cntn_id=117819. Retrieved April, 2013.

Orme, B. 1998. Sample Size Issues for Conjoint Analysis Studies. Sawtooth Software Research Paper Series, Sawtooth Software, Inc.

Ringold, P.L., Boyd, J.W., Landers, D., Weber, M., Meeting Date: July 13 to 16, 2009. Report from the Workshop on Indicators of Final Ecosystem Services for Streams. EPA/600/R-09/137. 56 p. http://www.epa.gov/nheerl/arm/streameco/index.html

Ringold, P.L., J. Boyd, D. Landers, and M. Weber. 2013. What data should we collect? A framework for identifying indicators of ecosystem contributions to human well-being. Frontiers in Ecology and the Environment 11: 98 - 105.

Rubin, H.J., Rubin, I.S., 2005. Qualitative Interviewing. 2nd Edition. Sage Publications, Thousand Oaks, CA.

USEPA Science Advisory Board. 2009. Valuing the Protection of Ecological Systems and Services. http://yosemite.epa.gov/sab/sabproduct.nsf/WebBOARD/SAB-09-012/$File/SAB%20Advisory%20Report%20full%20web.pdf. Retrieved September, 2013.

USEPA. 2013a. Research Programs: Science for a Sustainable Future. http://www.epa.gov/ord/research-programs.htm. Retrieved April, 2013.

USEPA. 2013b. Sustainability. http://www.epa.gov/sustainability/. Retrieved April, 2013.

Weber, M., Stewart, S., 2009. Public Valuation of River Restoration Options on the Middle Rio Grande. Restoration Ecology 17(6):762-771.

White, M.S. 2011. Effluent-Dominated Waterways in the Southwestern United States:
Advancing Water Policy through Ecological Analysis. Ph.D. Dissertation, Arizona State University. 244p.



