        Supporting Statement Part B for Information Collection Request
                 EPA ICR No. 2660.01, OMB Control No. 2050-New
            Survey of State Emergency Response Commissions (SERCs)
                                       
1.	SURVEY OBJECTIVES, KEY VARIABLES, AND OTHER PRELIMINARIES	
	(a)	Survey Objectives
      The Agency is proposing to administer a voluntary survey to State Emergency Response Commissions (SERCs). The information collected is intended to assist EPA in identifying how EPCRA is being implemented, along with best practices, challenges, and gaps in meeting requirements, and in identifying areas where states need assistance in implementing SERC requirements under EPCRA. 
      (b)	Key Variables
      Key information for this survey includes:
 A series of questions that ask the SERCs to better characterize their state program (e.g., number of LEPCs)
 A series of yes/no style questions (including check-all-that-apply questions) that ask SERCs to identify which aspects of EPCRA are being implemented
 A series of questions that ask the SERCs to identify how the program is administered at the state and local levels
      (c)	Statistical Approach	
      The potential respondent universe consists of the 56 SERCs (50 states, five territories, and the District of Columbia). EPA plans to solicit responses from all 56 SERCs. Response to the survey is voluntary; however, EPA plans to actively pursue responses from each SERC. EPA has determined that a complete, or near-complete, set of responses is the most valuable set of information for identifying issues and gaps. 
      EPA contacted five SERCs and asked them to review the draft survey questions and provide an estimate of the burden to complete the survey. All five SERCs responded to EPA's request (Colorado, Maine, Ohio, Oklahoma, and Oregon). Four of the five SERCs provided estimates of the time required to complete the survey:
 Colorado indicated it would take 3-6 hours to complete
 Maine indicated it would take one hour to complete
 Ohio indicated it would take 3-4 hours to complete
 Oklahoma indicated it would take 2 hours to complete
 Oregon did not provide an estimate
      Based on these estimates, EPA assumes that each response will take approximately four hours (i.e., one half-day) to complete. This is an upper-bound estimate based on the feedback obtained during the consultations. 
      The Agency expects to use contract support for data collection, tabulation and preliminary analyses of the data and information collected pursuant to this request.  Eastern Research Group, Inc. (ERG; 110 Hartwell Ave., Lexington, MA, 02421) will provide this support. ERG has significant expertise in developing and implementing surveys for EPA and other federal agencies.
      (d)	Feasibility
      During the consultations with SERCs, EPA discussed potential obstacles related to responding to the information collection and impacts on estimated burden. The following two key issues were identified:
            i. Needing to assemble and/or look up the information needed. SERCs indicated that much of the information being asked for was not something a SERC Coordinator would readily remember. Thus, time would be needed to search files and assemble responses. EPA added time to the burden estimate to account for these actions and, based on the consultations, used an estimate from the upper end of the range provided. 
            ii. The time needed to complete the survey. The survey asks 90 questions, some requiring an open-ended response and others asking for detailed listings of information. To account for this, EPA will allow 30 days to complete the survey. 
2. 	SURVEY DESIGN 
	(a)	Target Population and Coverage
      The target population for this ICR is the 56 SERCs as defined above. The survey is intended to be completed by the SERC coordinator in each state or territory and is designed as a census of the respondent population.
      (b)	Sample Design  
      Not applicable, since the survey will be distributed to 100% of the potential respondent population.
      (c)	Precision Requirements
      Not applicable, since a statistically based sampling design was not used.
      (d)	Questionnaire Design
      The objective of the survey is to collect information to assist EPA in identifying how EPCRA is being implemented and best practices, challenges, and gaps in meeting requirements. EPA will use this information to identify areas where states need assistance in implementing SERC requirements under EPCRA. Thus, the questionnaire covers all relevant areas of EPCRA for which SERCs are responsible and asks for details on how SERCs are implementing those requirements. The instrument asks 90 detailed questions whose responses, once compiled, will provide a comprehensive picture of how SERCs meet their requirements. 
      To design the instrument, EPA developed an initial list of topic areas under EPCRA based on the requirements for SERCs under the Act. EPA then detailed all of the required sub-elements and the relevant implementation details associated with those sub-elements. These topics were then formulated into a set of initial survey questions. EPA then asked ERG, its subcontractor, to review the questions and refine them based on ERG's survey design expertise. 
      The survey is designed to be administered via an online data collection portal. For this reason, the questionnaire is formatted in a way that is user-friendly. The layout is clear and the skip patterns are logical and function appropriately. A screenshot of the survey instrument is included in Appendix C of this ICR.
      (e) 	Addressing Potential Non-response Bias
      EPA plans to follow up with each of the 56 SERC Coordinators directly to ensure a response from each one. EPA maintains contact information for all SERC Coordinators. Thus, EPA expects little to no non-response. 
3.	PRETESTS AND PILOT TESTS
      To pilot test the survey, EPA asked five SERCs to review the survey instrument and to provide feedback on the survey questions. This pilot was done as part of the consultation process. The SERCs that participated in the consultation process reviewed the instrument and provided some feedback to EPA. Notably, each SERC indicated that a SERC Coordinator should be able to provide the requested information, although the survey is long. 
4.	COLLECTION METHODS AND FOLLOW-UP	
	(a)	Collection Methods
      EPA will collect responses electronically using an online data collection portal (Qualtrics) maintained by its subcontractor, ERG. EPA will review each submittal for completeness and understandability. If items are missing, EPA will follow up with the SERC to obtain the missing information. The information collected pursuant to this ICR will be maintained electronically in ERG's secure Qualtrics account.  
      EPA will notify the public via the Federal Register of the ICR and survey.  Additionally, EPA will provide the SERCs with a pre-notification alerting them to this upcoming data collection effort at least a week ahead of the survey request. This pre-notification will give each SERC Coordinator time to assemble the necessary information and to prepare as needed.  
	(b)	Survey Response and Follow-up	
      The default target response rate for this survey is 100% because EPA is requesting responses from all potential respondents (56). The actual estimated response rate (54 respondents; 96%) is based on an assumption that at most two SERCs will be unable to respond for various logistical reasons. EPA plans to conduct multiple follow-up requests with the SERCs to obtain the information, including scheduling meetings with the SERC Coordinators to assist them in starting the response process. 
5.	ANALYZING AND REPORTING SURVEY RESULTS
	(a)	Data Preparation
      The data will be collected using ERG's web-based Qualtrics account. EPA will have ERG review each entry from each SERC to ensure the data are complete and understandable. 
	(b)	Analysis
      There are four types of data being collected in the survey: fixed-response data (both multiple-response and single-response), quantitative values (e.g., percentages), open-ended lists of items, and open-ended responses. For the fixed-response data, EPA will develop tabulations and/or cross-tabulations as needed; each tabulation will provide the count and percentage of total responses for each response option. For the quantitative data, EPA will tabulate means, medians, ranges, and other summary statistics to characterize the distribution of the data across the SERCs. For the open-ended lists, EPA will have ERG develop a coding scheme to group similar list items together. ERG will provide EPA with a coding scheme that indicates which items were grouped into which category. EPA will then tabulate the percentage of SERCs that listed each grouped item. Finally, for the open-ended responses, EPA will review each one individually and will also have ERG conduct a qualitative data analysis to identify trends and other themes within those data.
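      The fixed-response tabulation described above (count and percentage of total responses for each response option) can be sketched as follows. This is an illustrative example only; the question wording, response options, and data shown are hypothetical and are not drawn from the survey instrument or from any SERC responses.

```python
from collections import Counter

def tabulate(responses):
    """Tabulate counts and percentages for one fixed-response question.

    `responses` is a list containing the option each respondent
    selected. Returns a dict mapping each option to its count and
    its percentage of total responses.
    """
    counts = Counter(responses)
    total = len(responses)
    return {
        option: {"count": n, "percent": round(100.0 * n / total, 1)}
        for option, n in counts.items()
    }

# Hypothetical data: 10 respondents answering a yes/no question.
example = ["Yes"] * 7 + ["No"] * 3
table = tabulate(example)
```

      Cross-tabulations would extend this by grouping responses to one question by the responses to another (e.g., by state program characteristics) before counting.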
	(c)	Reporting Results
      EPA will produce a summary of the information collection and develop a report that is made available to the SERCs. The report will summarize the collected data and the conclusions that EPA draws from the data. 
