The SunWise Program

ICR # 1904.06

November 19, 2010

U.S. Environmental Protection Agency

Office of Air and Radiation

Part A of the Supporting Statement

1.	Identification of the Information Collection

1(a)	Title of Information Collection Request

The title of this Information Collection Request (ICR) is The SunWise
Program (ICR# 1904.06).

1(b)	Short Characterization/Abstract

The SunWise Program was initiated in 1998 through a statutory mandate
under Title IV of the Clean Air Act.  The long-term objective of the
SunWise Program is to reduce the incidence of, and morbidity and
mortality from, skin cancer, cataracts, and other UV-related health
effects in the United States.  Short-term objectives include: 1)
reducing the risk of childhood overexposure to the sun by changing the
knowledge, attitudes, and behaviors of elementary school children and
their caregivers; and 2) improving the availability of accurate,
timely, and useful UV data directly to schools and communities across
the United States.

The SunWise Program builds on traditional health education practices
through the use of existing curricula, learning standards, scientific
strategies, and evaluation mechanisms.  The Program is a collaborative
effort of schools, communities, health professionals, educators,
environmental organizations, meteorologists, local governments, federal
agencies, and others.  Participating schools sponsor classroom and
school activities to raise children’s awareness of stratospheric ozone
depletion, UV radiation, and the largely preventable health risks from
overexposure to the sun, as well as simple sun safety practices.  All
educators interested in participating in this partnership program are
asked to register using the online form
(www.epa.gov/sunwise/becoming.html and
www.epa.gov/sunwise/becoming_partner.html) or a hard copy version
distributed by EPA.  EPA will use the information provided through this
registration to maintain a database of participating schools and
organizations and a mailing list for information distribution purposes.
Participating schools and organizations receive a variety of materials,
including a classroom “Tool Kit” of games, songs, puzzles, storybooks,
videos, access to internet-based UV intensity mapping/graphing tools,
and more. The Tool Kit also includes sample sun safety policies and
guidelines to help expand the sun safety message beyond the classroom.

Teachers who sign up for SunWise are asked to complete a survey at the
end of program implementation.  Results of these surveys are used to
fine-tune existing SunWise materials and develop new ones that better
meet our participants’ needs.  Teachers are also asked if they are
interested in administering a brief survey to their students before and
after program implementation.  The surveys will be made available on the
SunWise website for all teachers who express interest in the survey
process.  Student survey responses are voluntary and anonymous. The
results from the student surveys are used to gauge program effectiveness
and also help guide materials development.

SunWise is also seeking additional qualitative information regarding
barriers to the promotion of SunWise and adoption of school policies, as
well as teachers’ receptiveness to a new recognition program that
SunWise is considering developing.  The program would recognize levels
of SunWise teaching for interested educators.  The more SunWise teaching
and sun safety policy changes implemented by an educator, the higher
his or her level of SunWise recognition (similar to bronze, silver, and
gold levels). This new recognition/incentives program would allow EPA to
get a better grasp on the level of engagement with SunWise, and would
allow for future effectiveness studies comparing impacts on student
knowledge, attitude and behavior with an educator’s level of
engagement.  To gather qualitative information on these issues, the
SunWise Program will conduct voluntary, individual, semi-structured
telephone interviews with willing educators.

In addition, EPA has teamed with other members of the National Council
on Skin Cancer Prevention, which include the American Cancer Society,
the American Academy of Pediatrics, and the Centers for Disease Control
and Prevention, to support the Don’t Fry Day (DFD) campaign.  As part
of this campaign, educators are asked to pledge to incorporate sun
safety into their spring and summer activities.  The DFD pledge is
available online at http://www.epa.gov/sunwise/dfdpledge.html.

Further, SunWise has developed an online interactive SunWise Sun Safety
Certification Program that enables students, adults, organizations, and
employers to develop credentials on sun safety awareness and behaviors.
In order to gauge the certification program’s effectiveness, EPA will
be collecting information on demographics, knowledge, attitudes,
intended behavior, and behavior of the tutorial users. User types
include outdoor recreation staff at camps, parks, recreation programs,
and sports organizations; lifeguards; and others.  The certification program is
available online at http://www.epa.gov/sunwise/tutorial.html.

Finally, EPA will pretest a new survey for SunWise non-school partners. 
While schools are the primary programmatic component of SunWise, SunWise
is also promoted through registered 501(c)(3) organizations such as
science centers and camps, children's museums, and scouting groups, as
well as other not-for-profit organizations like local, county and state
health, recreation and education departments. The new partner survey
will aim to understand how the SunWise Program is being implemented by
non-school partners, and how it can be improved. 

PREVIOUS TERMS OF CLEARANCE:

On November 2, 2001, OMB approved ICR #1904.01 with Terms of Clearance. 
EPA has addressed OMB’s terms in the following manner.

This collection is approved in part and disapproved in part. EPA is
approved to collect registration information and to conduct the
requested student survey and teacher survey. These portions of the
collection are approved for two years, until November 2003. OMB
understands that EPA no longer intends to collect information under the
Parent Survey and the School Administrator Survey. These two surveys are
not approved. OMB has adjusted the burden of the collection accordingly
from EPA's request. 

EPA Response: We will not be surveying parents or school administrators
under this ICR renewal.

As discussed in Part B of the revised supporting statement, EPA plans to
assess two sampling issues in Fall 2002 based on results from the first
year of the student survey: (1) whether it is necessary to include in
the sampling protocol an approach to stratify between warm and cool
states, and (2) whether it is necessary to include an additional
sampling stage to sample classrooms within a school that has been
selected for participation. EPA should report to OMB its findings with
regard to these issues before beginning the second year of sampling, and
advise OMB whether it intends in the second year of the collection to
(1) stratify between warm and cool states in the first stage of
sampling, 

EPA Response (sent to OMB on 8/29/02): We have conducted this analysis
and found no differences between the students in the 37 cooler states
versus the 13 warm states, as categorized by the UV Index values. We
measured change in scores from pretest to posttest on children's
knowledge of sun protection, attitudes toward the sun, and current sun
protection practices and found minimal differences in the change.
Therefore, with no differences between the two groups, there would be no
reason to stratify our findings.

and/or (2) add a sampling stage to sample classrooms within a selected
school. 

EPA Response (sent to OMB on 8/29/02): We analyzed the composition of
registrants in the SunWise database by region of the country and the
number of schools that contained more than one classroom. It appears
that 75 percent of schools in warm and cool climates have no more than 1
classroom per school with negligible difference by region. There is
little evidence of clustering or differences in clustering between
regions of the country. Had such clustering been present, we would have
needed to account for its effect by performing tests of intraclass
correlation.

OMB also notes that EPA must include the OMB number, expiration date,
and Paperwork Reduction Act notice on the teacher survey before using
that instrument to collect information.

EPA Response: All information is now included on the teacher survey. 
See attachment #1a and 1b for verification.

2.	Need for and Use of the Collection

2(a)	Need/Authority For The Collection

This collection will be used for program material distribution and to
determine program effectiveness and participant satisfaction.  

Educators will be asked to fill out a simple registration form, which we
use to mail out the program materials and keep track of the Program’s:

	geographic reach (Which states/regions have SunWise schools?);

	grade-level and subject-matter distribution (How many 1st grade
teachers are using SunWise? How many math teachers are using SunWise?
etc.); and

	student participation level (How many students is SunWise potentially
reaching?).

Surveys to be administered include:

	Teacher online survey for measuring their receptivity to the
educational component of the Program and experience with the SunWise
Tool Kit and educational resources;

	Student survey to identify sun safety knowledge, attitudes, and
behaviors among students before and after participation in the Program; 

   One-on-one semi-structured telephone interviews with teachers to
complement the information collected through the self-reported Teacher
Survey instrument by providing richer, qualitative information regarding
barriers to the promotion of SunWise and adoption of school policies, as
well as teachers’ receptiveness to a new recognition program that
SunWise is considering developing;  

   Embedded questions within the SunWise Don’t Fry Day pledge collect
demographic and mailing information, as well as intent to incorporate
sun safety activities into spring and summer teaching;

   Embedded questions within the SunWise Sun Safety Certification
Program measure receptivity to sun protection, demographic information,
and current practices, attitudes, and knowledge; and

   Pretest of the partner survey, to develop a survey that measures and
helps EPA understand partners’ receptivity to and use of the SunWise
Tool Kit and other educational resources.

The data will be analyzed and results will indicate the Program’s
effect on participants’ sun safety knowledge, attitudes, and
behaviors, as well as help SunWise understand how the Program is being
implemented and how it can be improved.  Responses to the collection of
information are voluntary.  Responses to the collection of information
remain confidential, and responses to the student survey are anonymous.

The SunWise Program recognizes the challenge of measuring the progress
and evaluating the effectiveness of an environmental and public health
education program where the ultimate goal is to reduce risk and improve
public health.  Therefore, the continual and careful evaluation of
program effectiveness through a variety of means, including data from
pre- and post-intervention surveys, tracking and monitoring of classroom
activities and school policies, and expert input, is necessary to
monitor progress and refine the program.

2(b)	Practical Utility/Users of the Data

EPA/SunWise will use the survey results to review process and impact
results—including cost effectiveness—and adapt as appropriate its
messages, approaches, and materials.  Survey results will enable
EPA/SunWise to better meet the needs of its educator and student
participants, with the long-range goal of reducing the incidence and
effects of skin cancer and other UV-related health problems among
children and adults. 

Past collections have resulted in numerous publications and
presentations at health-related conferences.  Examples can be found at
the links below:

http://apha.confex.com/apha/138am/webprogram/Paper216559.html

http://www.cdc.gov/nccdphp/conference/pdf/program_book_2009.pdf

http://pediatrics.aappublications.org/cgi/content/abstract/121/5/e1074

http://epa.gov/sunwise/evaluation.html

3.  Nonduplication, Consultations, and Other Collection Criteria


3(a)	Nonduplication

The information required to complete the survey instruments and
interviews for the SunWise Program is not duplicative of information
otherwise available to EPA.  In the early stages of the SunWise
Program’s development in 1997, several searches for information were
completed in consultation with external stakeholders, including
representatives from the following organizations:

American Academy of Dermatology

American Cancer Society

Boston University Medical Center - Skin Oncology, Cancer Prevention &
Control Center

Centers for Disease Control and Prevention

National Association of Physicians for the Environment

National Safety Council

The Skin Cancer Foundation

Results from these consultations indicated that no other formal,
student-focused, sun safety programs were being implemented in the
United States, nor were surveys being conducted on attitudes and
practices of children relating to sun exposure. 

In addition, to EPA/SunWise’s knowledge, there is no other sun safety
certification available to outdoor recreation staff in the U.S.;
therefore, it is essential that accurate information on the users and
use of the certification program be collected for program refinement.

Conducting timely and useful process evaluation is also of importance if
the SunWise Program is to continue providing high-grade and pertinent
resources for educators across the United States.

3(b)	Public Notice Required Prior to ICR Submission to OMB

Official notice of this proposed collection appeared in the Federal
Register on September 3, 2010 (75 FR 54143).  One comment was received.
Because EPA coordinates with other federal and non-federal entities
working to prevent skin cancer and other health effects from
overexposure to the sun, and both Congress and EPA allocate funding for
the program, no further justification in response to the comment is
needed.

3(c)	Consultations

The following professionals were consulted during the development of the
three survey instruments:

	Alan Geller, Harvard School of Public Health, Division of Public
Health Practice, (617) 495-4000

	Dave Buller, PhD, AMC Cancer Research Center, (303) 239-3511

	Dr. Barbara Gilchrest, Chair, Department of Dermatology, Boston
University School of Medicine, (617) 638-5538

	Dr. Donald Miller, Assistant Professor of Epidemiology and Health
Policy, Boston University School of Medicine, (781) 687-2865

	Dr. Amy Mack, Psy.D., ICF International, (703) 219-4311

3(d)	Effects of Less Frequent Collection

SunWise depends on registration information to

	maintain an accurate list of participants; and

	ensure timely distribution of program materials and program updates to
participants.

SunWise depends on survey responses to 

	help guide program development; 

	measure participant satisfaction with the program; 

	test new ideas for recognition and incentives; and

	elicit basic information on attitudes and practices of children and
their caregivers relating to sun exposure.

SunWise depends on certification program information to

	determine the current knowledge, attitudes, behavioral intentions and
behaviors of individuals taking the tutorial; 

	determine which tutorial to provide to the user;

	measure how many and what type of users are becoming certified;

	ensure the tutorial does not take too long to complete;

	ensure the tutorial remains rigorous and engaging; and

	determine whether or not the tutorial is delivering the information in
an easy-to-understand manner.

Conducting the surveys and information collection less frequently may
slow down the Program’s ability to institute participants’ desired
changes. 

3(e)	General Guidelines

The EPA SunWise Program will adhere to all OMB guidelines.

3(f)	Confidentiality


Names of participating schools and organizations may be made public. 
All names of registered educators and other participating individuals
will remain confidential.  All responses to the collection of survey
information will remain confidential.  All student surveys are also
completed on an anonymous basis (no identifying information is included
on the survey form). Certification program users will be asked to
provide their first and last name so they can be given a certificate of
completion with their name on it; however, the information will not be
collected by EPA. 

EPA and a contractor will analyze survey results and proceed
accordingly.  

3(g) 	Sensitive Questions

The survey instruments of this ICR contain no sensitive questions.

4.	The Respondents and the Information Requested

4(a)	Respondent/SIC and NAICS Codes

Entities potentially affected by this action are elementary, middle, and
high school students and educators (SIC Div. I: Group 8211; NAICS code:
61111), as well as recreation workers (NAICS code: 813400), health
educators (NAICS code: 999300), and preschool teachers (NAICS code:
624400).


4(b)	Information Requested

The registration form (Attachment 1a and 1b, also available at
www.epa.gov/sunwise/becoming.html and
www.epa.gov/sunwise/becoming_partner.html) is a simple, 10-minute
questionnaire that asks teachers to provide: the name and contact
information of the participating school; school composition (e.g., grade
levels); and information specific to the interest areas of the
registering teacher.  The purpose of this form is to ensure that EPA
distributes the most relevant education materials to all SunWise
participants.

The survey instruments covered under this ICR are as follows:

		Teacher Survey (Attachment 2):  Educators will be asked to evaluate
their own and their students’ receptivity to sun safety resources provided
by EPA.  Additionally, educators will be asked about how they
implemented the SunWise program in their school/classroom and how many
students they reached. Finally, educators will be asked about areas for
program growth, including their receptivity to new resources and an
updated recognition program.  Educator feedback about the usefulness of
classroom and school materials will be vital to the refinement of
program materials.  This information can be submitted online. Part B(i)
of the Supporting Statement provides additional information on the
teacher survey design.

		Student Survey (Attachment 3a and 3b): This survey will be
administered by volunteer teachers to participating students before and
after implementation of SunWise activities.  Pre-test and post-test
surveys are nearly identical in content, with the exception of one
question in the post-test that verifies that the student
has participated in SunWise.  This simple, 10-minute questionnaire
elicits basic information on knowledge, attitudes, and practices of
children relating to sun exposure. The survey is identical to that
previously approved by OMB (Control No. 2060-0439). Part B(i) of the
Supporting Statement provides additional information on the student
survey design.

   		Teacher Telephone Interviews (Attachment 4): To complement the
information collected through the self-reported Teacher Survey
instrument, some teachers will be asked to participate in one-on-one,
semi-structured telephone interviews to provide qualitative information
regarding barriers to the promotion of SunWise and adoption of school
policies, as well as teachers’ receptiveness to a new recognition
program that SunWise is considering developing.  An interview guide with
topics for discussion is provided in Attachment 4.  Part B(ii) of the
Supporting Statement provides additional information on the teacher
interview design.

  	SunWise Don’t Fry Day pledge: Embedded questions within the pledge
collect information related to demographic information and intent for
incorporating sun safety activities into spring and summer teaching.
This information can be submitted online, and the pledge is available
at http://www.epa.gov/sunwise/dfdpledge.html.

         SunWise Sun Safety Certification Tutorial Questions: 
Certification program users will be asked to provide their first and
last name so they can be given a certificate of completion with their
name on it. The information will not be collected by EPA. Additionally,
users will be asked a series of questions to determine their current sun
protection knowledge, attitudes and behaviors, and their receptivity to
sun protection generally. The questions also help educate users by
prompting them to compare their own behavior with the desired behavior
(practicing sun safety).  Part B(iii) of the
Supporting Statement provides additional information on the
certification program survey design.

         Pretest of the Partner Survey (Attachment 5): Selected partners
will be asked to pretest a survey to measure partners’ receptivity to
the SunWise Tool Kit and other educational resources. Part B(iv) of the
Supporting Statement provides additional information on the pretest
partner survey design.

Registration forms can be submitted electronically or in hard copy form
using envelopes provided by EPA.  The teacher survey is available
electronically.  Teachers will be given the option to return student
surveys by email, by fax, or in postage-paid envelopes provided by EPA.
Neither the registration nor the surveys require that respondents keep
records or maintain files.


5.  The Information Collected

5(a)	Agency Activities

The Agency activities associated with registration of participants done
through the SunWise Program consist of the following:

		Maintain participant database;

		Maintain mailing list for information distribution purposes.

The Agency activities associated with surveying done through the SunWise
Program consist of the following:

		Develop collection instruments;

		Answer respondent questions;

		Conduct individual telephone interviews with teachers;

		Audit and/or review data submissions;

		Reformat the data;

 		Analyze the data and make program adjustments as needed;

		Store the data.

The Agency activities associated with the certification program done
through the SunWise Program consist of the following:

		Store and consolidate data, none of which is sensitive or personal;
and

		Review consolidated data and make adjustments as needed.

5(b)	Collection Methodology and Management

In collecting and analyzing the information associated with this ICR,
EPA will use electronic and hard-copy registration forms, electronic and
hard-copy surveys, and telephone interviews.  

Further details on the collection methodology and management for the
surveying done through the SunWise Program are provided below.

Registration

EPA routinely promotes the SunWise Program through presentations and
exhibits at meetings of nurses, teachers, and other educators.
Registrants provide their name and contact information, including the
name of their school, and state whether they are a classroom teacher,
health teacher, gym teacher, or school nurse on paper copy registration
forms. This information is then entered into a registration and tracking
system housed on EPA servers. In addition to the paper copy
registrations, EPA also registers educators and partners through an
online registration page housed on EPA’s SunWise program website. All
information collected is protected by adequate security and the system
is registered with the Automated System Security Evaluation and
Remediation Tracking (ASSERT) program to meet reporting requirements
under the Federal Information Security Management Act (FISMA). 

The data is used to send registrants SunWise resources and alert
registrants of sun safety-related opportunities and new resources. No
personally identifiable information is shared outside of EPA and its
contractors and grantees.

Teacher and Student Surveys

Teacher Surveys are conducted to determine: 

		Students’ satisfaction with SunWise activities and resources;

		Teachers’ satisfaction with SunWise activities and resources;

		How and how often teachers are using the SunWise materials, resources
and programming;

		How many students are receiving SunWise education;

		If teachers are sharing resources with other teachers;

		If school policies are being changed as a result of SunWise;

		If teachers are changing their own behavior;

		If students are changing their behavior; and

		If teachers have suggestions for improving or creating new SunWise
resources.

EPA will send a recruitment email in the spring/summer timeframe each
year encouraging all registered participants to take the SunWise Teacher
Survey (Attachment 2) hosted online. Participants may also be recruited
through additional avenues, such as recruitment letters distributed
through the SunWise Tool Kit, educator conferences, or direct mailings.
Since this survey will be voluntary and self-selecting, it will not be
generalizable to the entire pool of registered SunWise teachers.
However, it will be informative and provide insight into how some
teachers utilize SunWise materials and how EPA can encourage higher use
rates.

Part of the Teacher Survey will be an optional student pre-test and
post-test survey using the validated student survey used in the previous
ICR period (Attachment 3a and 3b). There will be no control group for
this portion of the survey, and no generalizations will be made from
this data. It will serve as a useful way to see whether students are
still getting the same benefit from the SunWise Tool Kit as was found in
earlier quasi-experimental study designs testing the same concepts (see
the previous ICR supporting statement for more details). Again, because
this portion of the survey will be voluntary and self-selecting, the
results will not be generalizable to every student who has received a
SunWise education.

 

Teachers who choose to participate will provide children with a
double-sided, one-page anonymous survey instrument.
After students complete the pre-test in the spring, teachers will lead
the SunWise lessons. SunWise will recommend that participating teachers
administer the post-test survey at least one month after teaching the
SunWise lessons, and will ask teachers to report what the time gap was
between SunWise lessons and administration of the survey. 

All student surveys are anonymous. Since surveys are anonymous, no
specific information on the child can be reported to parents or school
staff.  Surveys are done in the classroom setting, and conducted by the
teachers; thus, it would not be feasible for the teachers to obtain
consent from the parents and assent from the children for a classroom
teaching tool.

Teachers will be instructed to return completed student surveys to EPA
by one of several ways, including scanning and emailing the surveys,
faxing the surveys, or by requesting a self-addressed, stamped envelope
from EPA.

EPA will ensure the accuracy and completeness of collected information
by having all surveys reviewed by a contractor, grantee, or EPA staff. 
The data collected from the surveys will serve to provide information
internally to help improve the SunWise Program.  Since the results are
not intended to be generalizable, no statistical approach is needed.

Part B(i) of the Supporting Statement presents more detailed information
on the data collection, management, and analysis methods for the teacher
and student surveys.

Teacher Individual Interviews

To complement the self-reported Teacher Survey instrument, individual
interviews will be conducted with selected teachers to gather richer,
qualitative information regarding: 

		Teachers’ involvement in the SunWise Program;

		If school policies are being changed as a result of SunWise and how
to overcome the barriers to those changes;

		Which SunWise activities and resources teachers feel are the most
effective;

		How teachers’ approach to teaching SunWise activities has changed
over time;

		How the SunWise Program can more effectively disseminate its
materials and recruit more teachers;

		How the SunWise Program can encourage teachers to increase their
involvement in SunWise and their promotion of sun safety in schools; and

		Receptiveness to an incentives or “Levels of SunWise” educator
recognition program, and ideas for making such a program successful.

Interview participants will be recruited via a screening email to all
registered SunWise educators asking (a) whether they have taught SunWise
in the past two years; (b) how many years they have been teaching
SunWise; and (c) whether they are willing to both complete an online
teacher survey and participate in a one-on-one telephone interview.
Teachers responding positively to both screening questions (a) and (c)
will be grouped by region and length of participation, and across these
groups, 50 teachers will be randomly selected to participate in the
interview process. To the extent possible, the selected teachers will
represent the geographical and participation range of SunWise, though
the sample will not necessarily be representative in a statistical
sense. Teachers who are not selected will still be encouraged to
take the online survey, but will not be part of the group that will be
individually interviewed. After the selected participants have taken the
online survey, EPA will set up a convenient time to interview each of
the selected teachers.  

Teachers will participate in one online survey and one telephone
interview per year over a three-year period, with slightly different
informational goals for each year. In the first year, the interview will
include discussion about the development of an educator recognition or
incentives program, while interviews in later years may focus on other
areas of interest, such as parental involvement. 

Since participation is both voluntary and self-selecting, the results of
this qualitative study will not be generalizable to the entire pool of
registered SunWise teachers.  The data collected from the interviews
will serve only to provide information internally to help improve the
SunWise Program and the development of a new recognition program.  

Part B(ii) of the Supporting Statement presents more detailed
information on the data collection, management, and analysis methods for
the teacher interviews.

SunWise Don’t Fry Day Pledge 

EPA will collect information as educators complete the pledge online. 
Prior to Don’t Fry Day each year (the Friday before Memorial Day), EPA
will review the information collected and summarize participation for
promotional efforts. Additionally, EPA will mail a poster and stickers
to all educators taking the pledge. The pledge is available at
http://www.epa.gov/sunwise/dfdpledge.html.

Certification Tutorial

EPA will collect information as participants take the tutorial. Many of
the questions will help reinforce the information participants are
learning through the tutorial. Part B(iii) of the Supporting Statement presents
more detailed information on the data collection, management, and
analysis methods for the tutorial/certification program.

EPA plans to periodically review data collected from the certification
program/tutorial and make refinements to the program as necessary. The
knowledge gained through this information collection will inform
programmatic decisions, and allow EPA to gain a better understanding of
the target audience to determine if additional intervention is needed in
the outdoor recreation setting. While results cannot be generalized to
the broader outdoor recreation staff population because of
self-selection, the information will nonetheless be informative and will
be shared with partners and the public to improve the tailoring of
interventions to the outdoor recreation audience.

Pretest Partner Survey

EPA will undertake pretesting of a survey for non-school partners
participating in the SunWise Program.  These partners may include state
and local health departments, childcare centers, museums, camps, and
science centers.  The general purpose of the survey is to better
understand how non-school partners are interacting with the SunWise
Program, as well as to determine:

	How and how often partners are using the SunWise materials, resources, and programming;

	How many children are receiving SunWise education through non-school partners;

	Children’s satisfaction with SunWise activities and resources;

	Partners’ satisfaction with SunWise activities and resources;

	If partners are sharing resources with other partners;

	If partner organizations’ sun safety policies are being changed as a result of SunWise;

	If partners have suggestions for improving or creating new SunWise resources.

The pretesting is intended to determine the validity and effectiveness
of the survey questions—e.g., whether questions measure what they are
supposed to measure, whether partners understand what the questions are
asking, and whether the questions are the right ones to gain a better
understanding of how partners are interacting with the SunWise Program.

EPA will send a recruitment email in the Spring/Summer timeframe
encouraging registered partners to participate in the pretesting of the
partner survey (Attachment 5).  Participants may also be recruited
through additional avenues, such as recruitment letters distributed
through the SunWise Tool Kit, educator conferences, or direct mailings. 
From those partners indicating their willingness to participate, EPA
will sort the partners into types of partners (e.g., health departments,
childcare centers, camps, and educational centers such as museums or
science centers) and randomly select participants from each group for a
total of 30 participants.  

Depending on available resources and other constraints, the survey may
be self-administered with feedback gathered from each participant over
the telephone, or the survey may be administered in-person either in an
individual or group setting, with feedback gathered through in-person
interviews.  In either case, participants will be asked if they
understood all questions and whether there were questions they would
suggest removing or adding to better reflect the participation of
partners in the Program.  Based on this feedback, EPA will revise the
partner survey.  

Part B(iv) of the Supporting Statement presents more detailed
information on the data collection, management, and analysis methods for
pretesting the partner survey.


5(c)	Small Entity Flexibility

Not applicable.

5(d)	Collection Schedule

Registration: All teachers are required to register for the Program if
they wish to receive the SunWise Tool Kit
(www.epa.gov/sunwise/becoming.html and
www.epa.gov/sunwise/becoming_partner.html) and regular program updates. 


Teacher and Student Surveys: All program participants are invited to
take the Teacher Survey at any time during the year. As noted above,
recruitment emails will be sent in the Spring/Summer to all registrants
encouraging them to take the Teacher Survey, but it is always optional. 


Teachers who opt for their classrooms to participate in the student
pre-test and post-test surveys will be asked to administer the pre-tests
before teaching the SunWise lessons, and then to administer the
post-test survey at least one month after teaching the SunWise lessons.

Teacher Interviews: Participants (as selected using the method
described in Part B(ii) of the Supporting Statement) will complete the
online survey and one telephone interview per year over the three-year
period.

Don’t Fry Day Pledge:  All educators are invited to take the DFD
pledge at any time during the year.  A recruitment email will be sent in
the Spring to all registrants encouraging them to take the pledge, but
participation is always optional. Participants may also be recruited
through additional avenues, such as educator conferences.

Certification Tutorial:  All outdoor educators are invited to take the
certification tutorial at any time during the year.  

Partner Survey Pretesting: The partner survey will be pretested once in
the three-year ICR period.

6.  Estimating the Burden and Cost of the Collection

6(a)	Estimating Respondent Burden

Registration: EPA developed the SunWise Program Registration Form with
the Agency’s Internet Support Team in Research Triangle Park, North
Carolina. Input from a five-person focus group was used to determine
average completion time.  Teachers are asked to complete the
registration form only once during their participation in the program
for a total registrant burden of 10 minutes.

Annual estimated respondent burden: 

Annual Respondent Burden- Registration

Registrant Group	Hour Burden

Educator	0.17



Teacher and Student Surveys: During the development of the teacher
survey, EPA, in consultation with a contractor and fewer than nine
educators, reviewed the teacher survey to determine appropriate content
and survey completion time. The teacher survey is administered once each
year and takes approximately 20 minutes to complete.  If a teacher
decides to conduct the student pre- and post-test surveys, the teacher
will incur additional time burden to administer the student surveys
(noted below).

During the development phase of the student surveys, EPA, in
consultation with a contractor, pretested the survey with 9 children to determine
appropriate content and survey completion time.  The student survey will
be administered once in years 1 and 3 (i.e., pre-test for Group A and
post-test for Group B) and twice in year 2 (i.e., post-test from Group A
and pre-test for Group B).  Each survey will take approximately 10
minutes to complete, for an annual per student burden of 10 minutes.  

Annual estimated respondent burden: 

Annual Respondent Burden- Surveys

Survey Group	Hour Burden

Student	0.17

Educator – No student survey	0.33

Educator – Yes student survey	0.5

	

Teacher Interviews: Some selected teachers will complete the online
teacher survey (estimated at 20 minutes, as discussed above) and
participate in a 30-minute interview with EPA and/or a contractor, for a
total annual per-teacher burden of 50 minutes.

Annual estimated respondent burden: 

Annual Respondent Burden- Interviews/Surveys

Survey Group	Hour Burden

Educator	0.83



Don’t Fry Day Pledge:  Educators will be invited to take the Don’t
Fry Day pledge at any time throughout the year. The pledge can be
completed online and requires participants to fill out their name,
address, school, and commitment to sun protection.  The total respondent
burden is estimated to be 5 minutes per year.

Annual estimated respondent burden: 

Annual Respondent Burden- Pledge

Survey Group	Hour Burden

Educator	0.08



Certification Tutorial:  Users will be asked a series of questions about
sun protection to determine their demographics (no personal identifiers
will be captured), knowledge, attitudes, behavior, perception of others
they work with, and the environmental conditions in the place where they
work. They will also be asked to enter their first and last names so
the names can appear on a certificate that they can print. This information will
not be saved by EPA. The total respondent burden is estimated to be 7
minutes.

Annual estimated respondent burden: 

Annual Respondent Burden –Tutorial/Certificate Questions

Survey Group	Hour Burden

Student	0.12

Outdoor Educator	0.12



Partner Survey Pretesting: To pretest a new survey for SunWise
non-school partners, some selected partners will complete the survey
(estimated at 20 minutes, based on the survey’s similarity to the
teacher survey described above) and participate in a 30-minute interview
with EPA and/or a contractor to discuss the survey and ways to improve
it.  The total respondent burden is thus estimated at 50 minutes.

	 

Annual estimated respondent burden: 

Annual Respondent Burden- Partner Survey

Survey Group	Hour Burden

Non-school Partner	0.83



6(b)	Estimating Respondent Costs

Figures from the Bureau of Labor Statistics were used to determine labor
costs for these tables.  To account for benefits and overhead, the
average hourly wage rates of $36.45 for a teacher and $12.04 for a
recreation worker (i.e., outdoor educator) were increased by 110%, for
labor costs of $76.55 per hour for teachers and $25.28 per hour for
outdoor educators.

For partners, the average of the average hourly wage of $23.59 for a
health educator and $13.20 for a preschool teacher was calculated to
determine the hourly labor cost, since these occupations are considered
typical of SunWise’s non-school partners.  This averaged hourly labor
cost of $18.40 was increased by 110% for a labor cost of $38.63 per hour
for partners to account for benefits and overhead.
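
The rate arithmetic above can be sketched as follows. This is an
illustrative calculation only, on the assumption that "increased by
110%" means multiplying the base wage by 2.10 (base plus 110%):

```python
from decimal import Decimal, ROUND_HALF_UP

def loaded_rate(base_hourly: str, factor: str = "2.10") -> Decimal:
    """Hourly labor cost including benefits and overhead (base wage + 110%)."""
    cost = Decimal(base_hourly) * Decimal(factor)
    return cost.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

teacher = loaded_rate("36.45")           # $76.55 per hour
outdoor_educator = loaded_rate("12.04")  # $25.28 per hour

# Partners: average of the health educator and preschool teacher wages
partner_base = (Decimal("23.59") + Decimal("13.20")) / 2  # $18.395 -> $18.40
partner = loaded_rate(str(partner_base))                  # $38.63 per hour
```

Each result matches the loaded rates used in the cost tables below.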

	Annual Respondent Burden and Cost- Registration

Registrant Group	# of responses per participant	Hour Burden	Labor Cost

Educator	1	0.17	1 * 0.17 * $76.55 = $13.01



Annual Respondent Burden and Cost – Teacher and Student Surveys

Survey Group	# of responses per participant	Hour Burden	Labor Cost

Student	1	0.17	1 * 0.17 * $0 = 0

Educator – No student survey	1	0.33	1 * 0.33 * $76.55 = $25.26

Educator – Yes student survey	1	0.5	1 * 0.5 * $76.55 = $38.27



Annual Respondent Burden and Cost – Teacher Interviews/Surveys

Survey Group	# of responses per participant	Hour Burden	Labor Cost

Educator 	1	0.83	1 * 0.83 * $76.55 = $63.79



Annual Respondent Burden and Cost – Don’t Fry Day Pledge

Survey Group	# of responses per participant	Hour Burden	Labor Cost

Educator 	1	0.08	1 * 0.08 * $76.55 = $6.38



Annual Respondent Burden and Cost – Tutorial/Certificate Questions

Survey Group	# of responses per participant	Hour Burden	Labor Cost

Student	1	0.12	1 * 0.12 * $0 = $0

Outdoor Educator	1	0.12	1 * 0.12 * $25.28 = $3.03



Annual Respondent Burden and Cost – Partner Survey Pretesting

Survey Group	# of responses per participant	Hour Burden	Labor Cost

Non-school Partner	1	0.83	1 * 0.83 * $38.63 = $32.19



The respondents will have no capital/startup or O&M costs.

6(c)	Estimating Agency Burden and Cost

Registration: Registration information collection is done primarily
through a website database feature.  The start-up cost of designing the
registration web page was approximately $25,000, but that cost was
incurred under previous ICRs.  Maintenance of the website is estimated
to involve three types of staff: EPA personnel, grantees through the
Senior Environmental Employee (SEE) Program, and contractor staff
costing $130 per hour. EPA staff will spend 4 hours/month, or 48 hours
per year. The cost of this labor is calculated based on a GS 13 Step 1
pay level in Washington, DC ($68.26/hour, using the salary associated
with this grade and step multiplied by a benefits factor of 1.6), making
the total annual cost $3,276.29.  The contractor will spend 240 hours
per year on the maintenance and enhancement of the registration and
tracking system, at an annual cost of $31,200.

Finally, EPA will manually enter all information received via hard-copy
registration forms into the database. This labor is estimated at 2,000
hours per year at a SEE Program pay level of $40,000 annually.

Agency Burden and Costs - Registration

	Burden Hours	Total Costs ($)

EPA (Annual)	2,288	$74,476.29 

EPA (3-Year ICR)	6,864	$223,428.86 



Teacher and Student Surveys: The contractor assists EPA in data
collection and analysis.  The contractor also provided technical support
in the development of the surveys.  To perform these functions, EPA will
contract for a total of 150 professional hours per year.  At an average
rate of $130.00 per hour, the total cost for the contractor is about
$19,500 annually.  Agency burden to manage this contract is estimated at
4 hours/month or 48 hours annually.  The cost of this labor will be
calculated based on a GS 13 Step 1 pay level ($68.26/hour using the
salary associated with this grade and step, multiplied by a benefits
factor of 1.6).  Total hours (48) amount to a total agency labor cost
of $3,276.29 per annum.
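
The agency labor cost follows the same hours-times-rate pattern
throughout this section. A minimal sketch using the stated loaded GS 13
rate (the document's own totals appear to use a slightly less-rounded
rate, hence $3,276.29 rather than $3,276.48):

```python
def agency_labor_cost(hours: float, loaded_hourly: float = 68.26) -> float:
    """Annual agency labor cost at the loaded GS 13 Step 1 rate.

    The $68.26/hour figure already includes the 1.6 benefits factor
    applied to the base hourly salary, per the supporting statement.
    """
    return round(hours * loaded_hourly, 2)

# 4 hours/month of contract management = 48 hours/year
contract_management = agency_labor_cost(48)  # 3276.48 (document reports $3,276.29)
```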

	Agency Burden and Costs- Teacher and Student Surveying

	Burden Hours	Total Costs ($)

EPA (Annual)	198	$22,776.29 

EPA (3-Year ICR)	594	$68,328.86 

												

Teacher Interviews: The contractor assists EPA in data collection and
analysis.  The contractor also provides technical support in the
development and deployment of the surveys and interview questions.  To
perform these functions, EPA will contract for a total of 150
professional hours per year.  At an average rate of $130.00 per hour,
the total cost for the contractor is about $19,500 annually.  Agency
burden to manage this contract is estimated at 4 hours/month or 48 hours
annually.  Agency burden associated with the coordination and
participation in interviews is estimated at 50 hours per year.  The cost
of this labor will be calculated based on a GS 13 Step 1 pay level
($68.26/hour using the salary associated with this grade and step,
multiplied by a benefits factor of 1.6).  Total hours (98) amount to a
total agency labor cost of $6,689.09 per annum.

	Agency Burden and Costs- Teacher Interviewing/Surveying

	Burden Hours	Total Costs ($)

EPA (Annual)	248	$26,189.09 

EPA (3-Year ICR)	744	$78,567.26 



Don’t Fry Day Pledge:  To perform the data collection and analysis
function, agency burden is estimated at 2 hours/month or 24 hours
annually.  The cost of this labor will be calculated based on a GS 13
Step 1 pay level ($68.26/hour using the salary associated with this
grade and step, multiplied by a benefits factor of 1.6).  Total hours
(24) amount to a total agency labor cost of $1,638.14 per annum.

	Agency Burden and Costs- Don’t Fry Day Pledge

	Burden Hours	Total Costs ($)

EPA (Annual)	24	$1,638.14 

EPA (3-Year ICR)	72	$4,914.43 



Certification Tutorial: The contractor will maintain the tutorial,
including the data collection component.  The contractor will also
analyze the data every other year (i.e., during year 1 and year 3 of the
ICR).  To perform this task, EPA has contracted for a total of 200
professional hours, 100 hours for each year of analysis.  In addition,
EPA has contracted a total of 12 hours each year for maintenance.  At an
average rate of $130.00 per hour, the total cost for the contractor is
$13,000 per year for data collection in year 1 and 3, and $1,560 per
year for maintenance.  Agency burden to manage this contract is
estimated at 4 hours/month or 48 hours annually.  The cost of this
labor will be calculated based on a GS 13 Step 1 pay level ($68.26/hour
using the salary associated with this grade and step, multiplied by a
benefits factor of 1.6).  

	Agency Burden and Costs – Tutorial/Certification

	Burden Hours	Total Costs ($)

EPA (Annual) – Year 1 and 3	160	$17,836.29 

EPA (Annual) – Year 2	60	$4,836.29 

EPA (3-Year ICR)	380	$40,508.86



Partner Survey Pretesting: The contractor will assist EPA in the
development and deployment of the partner surveys and interviews in the
pretesting process.  To perform these functions, EPA will contract for a
total of 90 professional hours in year 3.  At an average rate of $130.00
per hour, the total cost for the contractor is about $11,700.  Agency
burden to manage this contract is estimated at 2 hours/month or 24 hours
in year 3.  Agency burden associated with the coordination and
participation in pretesting interviews is estimated at 30 hours in year
3.  The cost of this labor will be calculated based on a GS 13 Step 1
pay level ($68.26/hour using the salary associated with this grade and
step, multiplied by a benefits factor of 1.6).  Total hours (54) amount
to a total agency labor cost of $3,685.82 in year 3.

	Agency Burden and Costs- Partner Survey

	Burden Hours	Total Costs ($)

EPA (Annual) – Year 3	144	$15,385.82 

EPA (3-Year ICR)	144	$15,385.82 

	

6(d)	Estimating the Respondent Universe and Total Burden Costs

Registration

(A) Number to register	(B) Total Hours	(C) Rate per hour ($)	(D) # of responses	(E) Total Cost (E = B * C)

3,500 Educators	595	$76.55	3,500	$45,544.28 

Total (Annual)                     	595	 	3,500	$45,544.28 

ICR Total (3 years)             	1,785	 	10,500	$136,632.83 
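
The column arithmetic above (E = B * C) can be reproduced as a sketch.
Note that the stated totals appear to use the unrounded teacher rate of
$76.545/hour rather than the rounded $76.55; that rate is an inference
from the figures, not stated in the document:

```python
from decimal import Decimal, ROUND_HALF_UP

def respondent_cost(total_hours: str, rate_per_hour: str) -> Decimal:
    """Total respondent cost: column E = B (burden hours) * C (hourly rate)."""
    cost = Decimal(total_hours) * Decimal(rate_per_hour)
    return cost.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

# 3,500 educators * 0.17 hours (10 minutes) each = 595 burden hours
registration = respondent_cost("595", "76.545")  # 45544.28, matching the table
```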



Student and Teacher Surveys

(A) Number to be surveyed	(B) Total Hours	(C) Rate per hour ($)	(D) # of responses	(E) Total Cost (E = B * C)

1,000 Students per year	170	 $0.00	1,000	$0.00 

1,000  Educators – No student survey	333	 $76.55 	1,000	$25,515.00 

300  Educators – Yes student survey	150	 $76.55 	300	$11,481.75 

Average Total (Annual)                     	653	 	2,300	$36,996.75 



ICR Total (3 years)             	1,960	 	6,900	$110,990.25 



Teacher Interviews and Surveys

(A) Number to register	(B) Total Hours	(C) Rate per hour ($)	(D) # of responses	(E) Total Cost (E = B * C)

50 Educators	42	 $76.55 	50	$3,189.38 

Total (Annual)                     	42	 	50	 $3,189.38 

ICR Total (3 years)             	125	 	150	$9,568.13 



Don’t Fry Day Pledge

(A) Number to register	(B) Total Hours	(C) Rate per hour ($)	(D) # of responses	(E) Total Cost (E = B * C)

1,500 Educators	125	$76.55	1,500	$9,568.13 

Total (Annual)                     	125	 	1,500	$9,568.13

ICR Total (3 years)             	375	 	4,500	$28,704.38 



Tutorial/Certificate

(A) Number to be surveyed	(B) Total Hours	(C) Rate per hour ($)	(D) # of responses	(E) Total Cost (E = B * C)

100 Students per year	12	$0.00	100	$0.00 

1,500  Outdoor Educators 	180	$25.28	1,500	$4,551.12 

Average Total (Annual)                     	192	 	1,600	$4,551.12

ICR Total (3 years)             	576	 	4,800	$13,653.36 



Partner Surveys

(A) Number to register	(B) Total Hours	(C) Rate per hour ($)	(D) # of responses	(E) Total Cost (E = B * C)

30 Partners	25	$38.63	30	$965.74 

Total (Annual)  - Year 3                    	25	 	30	$965.74 

ICR Total (3 years)             	25	 	30	$965.74 



Total

ICR Total-Registration + Surveys + Tutorial + Interviews + Pledge
(average annual)*             	1,615	 	8,960	$100,171.56

ICR Total- Registration + Surveys + Tutorial + Interviews + Pledge (3
years)             	4,846	 	26,880	$300,514.67

*Represents average annual cost; however, not all activities will occur
during all three years of the ICR, as described above.

6(e) Bottom Line Burden Hours and Cost Tables

Bottom Line Burden and Costs (3-Year ICR)

	Burden Hours	Total Costs ($)

Students	546	$0.00

Educators	3,735	$285,895.58

Outdoor Educators	540	$28,704.38

Non-school Partners	25	$965.74

EPA	8,798	$431,134.11

Subtotal (respondents)	4,846	$315,565.69

Subtotal (government)	8,798	$431,134.11

Total	13,644	$746,699.80



Bottom Line Burden and Costs (Average Annual)*

	Burden Hours	Total Costs ($)

Students	182	$0.00

Educators	1,245	$95,298.53

Outdoor Educators	180	$9,568.13

Partners	8	$321.91

EPA	2933	$143,711.37

Subtotal (respondents)	1,615	$105,188.56

Subtotal (government)	2,933	$143,711.37

Total	4,548	$248,899.93

*Represents average annual cost; however, not all activities will occur
during all three years of the ICR, as described above.

6(f)	Reasons for Change in Burden						

There is an increase of 2,738 hours annually in the total estimated
burden over that currently identified in the OMB Inventory of Approved
ICR Burdens.  Fewer EPA hours are anticipated for the survey work due to
a decreased level of sophistication in the analysis and decreased effort
being needed to solicit survey responses. More hours were added for
teachers participating in an individual interview.  Hours were
subtracted for teachers administering the student survey.   Hours were
added for teachers participating in the Don’t Fry Day pledge. Finally,
registering such a large number of teachers has resulted in more hours
needed at EPA.  Hours and burden for educators are about the same, and
hours for students have decreased significantly. The bottom-line burden
hours increased along with the total cost.

6(g)	Burden Statement 

The annual public reporting and record keeping burden for this
collection of information is estimated to average 10 minutes per
response for the registration, 10 minutes per response for the student
survey, 20 minutes per response for the educator survey without the
student survey, 30 minutes per response for the educator survey with the
student survey, 50 minutes per response for the teacher interview with
survey, 5 minutes per response for the Don’t Fry Day pledge, 7 minutes
per response for the certification tutorial program, and 50 minutes per
response for pretesting the partner survey.  Burden means the total
time, effort, or financial resources expended by persons to generate,
maintain, retain, disclose, or provide information to or for a Federal
agency.  This includes the time needed to review instructions; develop,
acquire, install, and utilize technology and systems for the purposes of
collecting, validating, and verifying information, processing and
maintaining information, and disclosing and providing information;
adjust the existing ways to comply with any previously applicable
instructions and requirements; train personnel to be able to respond to
a collection of information; search data sources; complete and review
the collection of information; and transmit or otherwise disclose the
information.  An agency may not conduct or sponsor, and a person is not
required to respond to, a collection of information unless it displays a
currently valid OMB control number.  The OMB control numbers for EPA’s
regulations are listed in 40 CFR Part 9 and 48 CFR Chapter 15.			

Part B(i) of the Supporting Statement – Teacher and Student Surveys

SECTION I – SURVEY OBJECTIVES, KEY VARIABLES, AND OTHER

PRELIMINARIES

1(a) Survey Objectives

EPA’s SunWise Program provides sun protection education via a
standardized curriculum to school children in grades K-8 in public,
parochial, and charter schools. More than 25,000 schools and 3,000,000
children have received SunWise education since the 1999-2000 school
year. EPA proposes to conduct customer satisfaction and process-related
evaluative surveys with the teachers using the program.  The Teacher
Survey and Individual Interview will determine:

	Students’ satisfaction with SunWise activities and resources;

	Teachers’ satisfaction with SunWise activities and resources;

	How and how often teachers are using the SunWise materials, resources, and programming;

	How many students are receiving SunWise education;

	If teachers are sharing resources with other teachers;

	If school policies are being changed as a result of SunWise;

	If teachers are changing their own behavior;

	If students are changing their behavior;

	If teachers have suggestions for improving or creating new SunWise resources.

The data will be analyzed, and results, although not generalizable, will
indicate how the Program is being implemented and how it can improve.

The primary objective of the optional student surveys is to determine
whether students are still getting the same benefit from the SunWise
Tool Kit as was demonstrated in earlier quasi-experimental studies
testing the same concepts (see the previous ICR supporting statement for
more details). Because this portion of the survey will be voluntary and
self-selecting, the results will not be generalizable to every student
who has received a SunWise education.

1(b) Key Variables

Satisfaction; frequency of use; number of students participating; number
and types of activities taught; school policy change; student and
teacher knowledge, attitudes and behavior; ways to improve the program.

1(c) Statistical Approach

The primary objective in conducting the SunWise Teacher Survey is to
understand how the SunWise program is being implemented, and how it can
be improved.  It is not practical, however, to survey every teacher who
participates in the SunWise Program. Since the results are not intended
to be generalizable to the complete pool of SunWise teachers, no
statistical approach is needed.

The student surveys will serve as a useful way to see if students are
still getting the same benefit from the SunWise Tool Kit as in previous
years.  However, because these surveys are optional and self-selecting,
and the results are not intended to be generalizable, no statistical
approach is needed.

1(d) Feasibility

EPA has reviewed the administrative procedures necessary to conduct the
SunWise teacher and student surveys and has determined that it is
feasible to continue with the surveys.  The Teacher Survey was reviewed
by educators and survey specialists to ensure that the questions asked
will reveal sufficient information to evaluate the implementation of the
SunWise Program and how it could be improved, especially by adding an
incentives or “Levels of SunWise” recognition program. The student
survey was previously pretested as described in Section III below.

In addition, EPA has funding to conduct the survey and provide the
necessary analysis of the resulting data.

SECTION II – SURVEY DESIGN

2(a) Target Population and Coverage

A self-selected sample from all participating SunWise teachers will be
used. SunWise teachers are very diverse, with some in schools and others
in recreation programs and other organizations. 

2(b) Sample Design

School faculty and other educators register for the SunWise program
through EPA.  Registrants provide their name and contact information,
including the name of their school/organization, and state whether they
are a classroom teacher, health teacher, gym teacher, school nurse, or
other.  Recruitment emails will be sent to all registered SunWise
schools and partners.  Participants may also be recruited through
additional avenues, such as recruitment letters distributed through the
SunWise Tool Kit, educator conferences, or direct mailings. However,
many will not participate in the survey. 

2(b)ii Sample Size

EPA anticipates sending recruitment emails to more than 35,000 formal
and informal educators; however, only 1,300 are expected to actually
participate in the survey.  This number is based on previous survey
participation.

2(b)iii Stratification Variables

None.

2(b)iv Sampling Method

As noted above, recruitment emails will be sent to all teachers who have
registered for the SunWise Program since the program began in 1999. 
Because participation in the teacher survey is voluntary, the sampling
method is voluntary self-selection. 

Inclusion criteria: Signed up with the SunWise program.

Exclusion criteria:  Incomplete Teacher Survey.

2(b)v Multi-Stage Sampling

None.

2(c) Precision Requirements

2(c)i Precision Targets

N/A

2(c)ii Nonsampling Error

N/A

2(d) Questionnaire Design

The Teacher Survey was derived from SunWise instruments previously
approved by OMB on November 2, 2001 and April 15, 2008 (ICR #1904.01 and
#1904.04), and is based on the instrument approved in the most recent
ICR (#1904.04). The Teacher Survey was updated based on pretesting with
nine teachers.

The student survey is derived from a SunWise instrument previously
approved by OMB on November 2, 2001, and most recently approved on
February 28, 2010.  

SECTION III – PRETESTS AND PILOT TESTS 

To pretest the revised SunWise Teacher Survey, fewer than nine teachers
attending conferences at which SunWise staff were present completed the
survey and then participated in an interview with EPA staff.  The
pretesting focused on the readability and understandability of the
Teacher Survey.  Teachers had no suggestions for revisions to the
Teacher Survey.  The survey was also time-tested to ensure completion in
20 minutes or less.

The pretesting of the student survey was conducted under the previous
ICR (1904.04). It focused on the readability and understandability of
the questions and possible responses; following the pretest, the survey
was revised to: (1) include instructions for students to turn over the
two-page, double-sided survey; (2) increase the font of multiple choice
instructions; (3) put all questions referring to “last summer”
together in a box at the end of the survey; (4) delete one question that
students found difficult; (5) revise the wording of several questions to
clarify question meaning; (6) add a new response choice for why students
do not wear sunscreen; and (7) increase the response scale for several
questions from a three-point to a five-point scale.

SECTION IV – COLLECTION METHODS AND FOLLOW-UP

4(a) Collection Methods

Teacher Surveys are not anonymous and are administered online. 

All student surveys will be anonymous and are administered in the
classroom setting by the teachers; thus, it would not be feasible for
the teachers to obtain consent from the parents and assent from the
children for a classroom teaching tool.  Because the surveys are
anonymous, no specific information on a child can be reported to parents
or school staff.  Student surveys will be returned to EPA by one of
several methods: scanned and emailed; faxed; or sent through the U.S.
Postal Service using a self-addressed, postage-paid envelope supplied by
EPA.

4(b) Survey Response and Follow-Up

The target response rate is approximately 3 to 4 percent among all
teachers registered for the SunWise Program, although some of those
teachers may no longer be teaching the Program. Actual response rate
will be measured based on the number of teachers that submit surveys
divided by the number of total teachers signed up for the program.  No
additional follow-up will occur unless there are questions with the
survey, or if additional clarification is necessary on suggested
improvements.

SECTION V – ANALYZING AND REPORTING SURVEY RESULTS

5(a) Data Preparation

All Teacher Survey data will automatically be entered into a database
hosted on the EPA server.

All student survey data will be entered into a database, including
surveys with questions that have not been completed.  A double-entry
protocol will be observed throughout data entry to ensure accuracy.

5(b) Analysis

The data obtained through this survey will be reviewed and analyzed
using descriptive statistical methods in the aggregate for the purpose
of determining satisfaction; frequency of use; number of students
participating; number and types of activities taught; school policy
change; student and teacher knowledge, attitudes and behavior; ways to
improve the program. All of this information will give EPA insight into
how best to improve the program, and how the program is being used. The
results will not be generalizable to the total pool of SunWise teachers
or students, but will nonetheless be informative.

5(c) Reporting Results 

The results of the survey will not be written up formally; rather, they
will be used internally by the SunWise Program to understand how the
program is being implemented and how it can be improved.  The raw survey
data will be maintained by EPA. EPA will share the information with a
contractor, but it will remain unavailable to the public.

Part B(ii) of the Supporting Statement – Teacher Interviews

SECTION I – SURVEY OBJECTIVES, KEY VARIABLES, AND OTHER PRELIMINARIES

1(a) Survey Objectives

EPA’s SunWise Program provides sun protection education via a
standardized curriculum to school children in grades K-8 in public,
parochial, and charter schools. More than 25,000 schools and 3,000,000
children have received SunWise education since the 1999-2000 school
year. EPA proposes to conduct customer satisfaction and process-related
evaluative surveys with the teachers using the program.  The teacher
telephone interviews will gather qualitative information regarding:

	Teachers’ involvement in the SunWise Program;

	Whether school policies are being changed as a result of SunWise,
and how to overcome the barriers to those changes;

	Which SunWise activities and resources teachers feel are the most
effective;

	How teachers’ approach to teaching SunWise activities has changed
over time;

	How the SunWise Program can more effectively disseminate its
materials and recruit more teachers;

	How the SunWise Program can encourage teachers to increase their
involvement in SunWise and their promotion of sun safety in schools; and

	Receptiveness to an incentives or “Levels of SunWise” educator
recognition program, and ideas for making such a program successful.

The information will be analyzed and results, although not
generalizable, will indicate how the Program is being implemented and
how it can improve.

1(b) Key Variables

Involvement; school policy change; frequency of use; number and types of
activities taught; most effective activities and resources; interest in
a recognition program; motivation for increased participation; effort
involved in recognition program.

1(c) Statistical Approach

The primary objective in conducting the SunWise teacher interviews is to
gather qualitative information on how the SunWise program is being
implemented, and how it can be improved.  Since the results are not
intended to be generalizable to the complete pool of SunWise teachers,
no statistical approach is needed.

1(d) Feasibility

EPA has reviewed the administrative procedures necessary to conduct the
SunWise teacher interviews and has determined that it is feasible to
continue with the interviews.  In addition, EPA has funding to conduct
the interviews and provide the necessary analysis of the resulting data.

SECTION II – SURVEY DESIGN

2(a) Target Population and Coverage

The target population consists of teachers that have taught the SunWise
Program in the past two years; these teachers span the United States.

2(b) Sample Design

2(b)i	Sampling Frame

The sampling frame consists of all teachers that have registered for the
SunWise Program. Interview participants will be recruited via a
screening email to all registered SunWise educators asking (a) whether
they have taught SunWise in the past two years; (b) how many years they
have been teaching SunWise; and (c) whether they are willing to both
complete an online teacher survey and participate in a one-on-one
telephone interview.  

2(b)ii Sample Size

EPA anticipates 200 teachers will respond positively to both screening
questions (a) and (c) described in section 2(b)i above; however, only 50
will be selected to participate in the interview process.

2(b)iii Stratification Variables

None.

2(b)iv Sampling Method

As noted, interview participants will be recruited via a screening email
to all registered SunWise educators asking (a) whether they have taught
SunWise in the past two years; (b) how many years they have been
teaching SunWise; and (c) whether they are willing to both complete an
online teacher survey and participate in a one-on-one telephone
interview. Teachers responding positively to both screening questions
(a) and (c) will be grouped by region and length of participation, and
across these groups 50 teachers will be randomly selected to participate
in the interview process. To the extent possible, the selected teachers
will represent the geographical and participation range of those
teachers responding positively to the screening questions (i.e., a
sample of a sample), though the sample will not necessarily be
representative in a statistical sense. The teachers that are not
selected will still be encouraged to take the online survey, but will
not be part of the group that will be individually interviewed. 
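For illustration only, the grouped random selection described above could be implemented along the following lines. This is a sketch, not part of the survey design itself: the `region` and `years` grouping fields and the proportional allocation rule are assumptions.

```python
import random
from collections import defaultdict

def select_participants(candidates, total=50, seed=0):
    """Randomly select `total` teachers across region/tenure groups,
    allocating slots roughly in proportion to each group's size.
    `candidates` is a list of dicts with hypothetical keys
    'region' and 'years'."""
    rng = random.Random(seed)
    groups = defaultdict(list)
    for c in candidates:
        groups[(c["region"], c["years"])].append(c)

    n = len(candidates)
    selected = []
    # Proportional allocation, rounding down; leftover slots are filled
    # from the largest groups so the totals sum to exactly `total`.
    keys = sorted(groups, key=lambda k: len(groups[k]), reverse=True)
    quotas = {k: (len(groups[k]) * total) // n for k in keys}
    leftover = total - sum(quotas.values())
    for k in keys[:leftover]:
        quotas[k] += 1
    for k in keys:
        take = min(quotas[k], len(groups[k]))
        selected.extend(rng.sample(groups[k], take))
    return selected
```

As the text notes, selecting from within groups in this way spreads the sample across regions and participation lengths without making it statistically representative.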

Inclusion criteria: Registered with the SunWise Program; has taught
SunWise in the past two years; indicated willingness to participate in
an interview through the screening email.

Exclusion criteria:  Incomplete Teacher Survey.

2(b)v Multi-Stage Sampling

None.

2(c) Precision Requirements

2(c)i Precision Targets

N/A

2(c)ii Nonsampling Error

N/A

2(d) Questionnaire Design

An interview guide with topics of discussion (Attachment 4) was
developed by a team of contractors, a grantee, and EPA staff, and
reviewed by educational and survey experts. 

SECTION III – PRETESTS AND PILOT TESTS 

Given the semi-structured approach to the teacher interviews and the
qualitative nature of the study, pilot testing is not needed.  The
interview guide was reviewed by educational and survey experts to ensure
that the questions asked will reveal sufficient information to evaluate
the implementation of the SunWise Program and how it could be improved,
especially by adding an incentives or “Levels of SunWise”
recognition program.

SECTION IV – COLLECTION METHODS AND FOLLOW-UP

4(a) Collection Methods

Selected participants will be asked to complete the online teacher
survey, as provided in Attachment 2 and described in Part B(i) of the
Supporting Statement.  After the participants have completed the online
survey, EPA will arrange a convenient time to interview each of the
selected teachers via the telephone.  

Teachers will participate in one online survey and one telephone
interview per year over a three year period, with slightly different
informational goals for each year. In the first year, the interview will
include discussion about the development of an educator recognition or
incentives program, while interviews in later years may focus on other
areas of interest, such as parental involvement.  Teachers may receive
an annual incentive for their participation in the survey and interview,
such as a $25 gift certificate to a bookstore of their choosing to
purchase classroom resources.

4(b) Survey Response and Follow-Up

The target response rate is approximately 80 percent, with a total of 50
teachers participating in the interviews. The actual response rate will
be calculated as the number of teachers that participate in the
interviews divided by the number of teachers that indicated interest and
were selected to participate.  Follow-up emails and telephone calls will
be made to all teachers who were selected to participate but have not
responded to an initial interview invitation.  These follow-ups will
explain the importance of the interviews and strongly encourage teachers
to participate. 

In addition, if fewer than 50 positive responses to the initial
screening questions are received, EPA will follow up with an additional
recruitment email encouraging participation and may also recruit
participants through conferences, direct mailings, and other means.

SECTION V – ANALYZING AND REPORTING SURVEY RESULTS

5(a) Data Preparation

The interviewer and/or assistant will take notes during the interview.

5(b) Analysis

The information obtained through these interviews will be analyzed
qualitatively through a systematic content analysis to identify key
findings based on response frequency and emphasis.  While these findings
will give EPA insight into how best to improve the program, and how the
program is being used, the results will not be generalizable to the
total pool of SunWise teachers.

5(c) Reporting Results

The results of the qualitative content analysis will be written up in a
summary report, which may also be shared on the EPA website and with
partners and interested parties.  The individual interview notes will be
maintained by EPA and/or an EPA contractor and will remain confidential.

Part B(iii) of the Supporting Statement – Certification Program

SECTION I – SURVEY OBJECTIVES, KEY VARIABLES, AND OTHER PRELIMINARIES

1(a)	Survey Objectives

To expand the SunWise Program beyond the formal classroom, EPA has
developed an online tutorial and certification questions to educate
outdoor recreation staff who supervise teens and pre-teens, including
camp counselors, swim instructors, and parks and recreation
staff/educators, about the importance of sun safety.  By way of this
voluntary tutorial/certification program, EPA seeks to educate these
staff about the importance of sun safety for themselves and the youth in
their care. Sun safety can be promoted in a number of ways, including
smart scheduling, positive role-modeling, and other policy changes at
individual camps, pools, or centers.

Upon completion, each user will be able to email their certificate of
accomplishment to their supervisor. Whole camps will then be able to
state that their staff has taken the EPA Sun Safety Certification,
meaning they understand the importance of sun safety, and know how to
prevent sun damage in children.

A series of survey questions is included throughout the tutorial and
must be answered to advance through and complete the
tutorial/certification program. EPA intends to meet two main objectives
by asking these questions.  The primary objective is to
determine the current sun protection knowledge, attitudes, practices,
intended practices, and teaching habits of outdoor educators, as well as
basic demographic information (e.g. age, gender, education).  Responses
to the tutorial/certification questions will also determine the
environmental conditions (i.e., policies) already in place to minimize
UV damage to the staff and visitors, and the perceived cultural norms of
camp staff.  This information will help inform SunWise program decisions
such as framing a sun safety message and developing materials for
pre-service teachers and outdoor educators to promote sun protection
outside of the formal classroom.

The secondary objective is to gather feedback on the usefulness of the
certification program.  This feedback will help the Agency determine if
online media is an effective method of teaching sun safety lessons, and
whether additional resources are needed for outdoor recreation
facilities.

1(b)	Key Variables

Key variables considered in this tutorial/certification program are
demographic information about the educator (e.g., age, gender,
education), the age range of children being supervised by the educator,
the educator’s sun safety practices and intended practices, and
policies related to sun safety implemented by the educator’s program
or organization. 

1(c)	Statistical Approach

The primary objective of the tutorial/certification program questions is
to measure the sun protection attitudes, practices, intended practices,
and teaching habits of outdoor educators.  Every outdoor educator that
participates in the tutorial/certification will be asked to respond to
the questions that are built into the online program.  It is not
practical, however, to require all outdoor educators to participate in
the tutorial/certification program, so participation will be voluntary
and self-selecting.  From this group, the Agency will be able to draw
general conclusions about participating outdoor educators.  Anecdotal information
is not sufficient for this purpose, and thus EPA has chosen a
standardized questionnaire-based approach for this survey.

The tutorial/certification program asks a series of questions about
outdoor educators’ sun protection attitudes, practices, intended
practices, and teaching habits.  EPA intends to have each educator only
complete the survey once during the period for which this ICR is in
effect.  An analysis of these results will give a snapshot of outdoor
educators’ behaviors, and will help the Agency to better understand
the challenges associated with promoting sun safety in outdoor education
programs.

1(d)	Feasibility

EPA has determined that an online, voluntary, self-selecting
tutorial/certification program is a feasible way to gather information
regarding sun protection in an outdoor education setting.  In addition,
EPA has developed the online tutorial/certification program, and has
funding to collect and review responses and perform necessary updates to
the program.

SECTION II – SURVEY DESIGN

2(a)	Target Population and Coverage

The target population consists of outdoor recreation staff members that
supervise teens and pre-teens. Because the tutorial/certification
program is self-selecting and voluntary, the coverage of this population
is not known.  

2(b)	Sample Design

2(b)i	Sampling Frame

The sampling frame is all U.S. outdoor recreation staff members at
programs included in the mailing lists from the American Camp
Association (ACA) and the National Recreation and Park Association
(NRPA), which are updated regularly. The sampling frame will be
assembled by purchasing these e-mail lists.

Survey participants will be recruited via a targeted email to all
programs in the ACA and NRPA lists, which will direct participants to
the EPA website.  Advertisements for the tutorial/certification program
will also be placed on the EPA SunWise Web site and ACA and NRPA Web
sites.  In addition, participants will be recruited at conferences
related to sun safety and education, and EPA may also reach out to
sports coaching associations for participant recruiting.

2(b)ii Sample Size

Because the tutorial/certification program is self-selecting and
voluntary, it is not possible to pre-determine a specific sample size.
However, EPA anticipates about 1,000 outdoor recreation staff members
will complete the tutorial/certification program.

2(b)iii Stratification Variables

None.

2(b)iv Sampling Method

Because participation in the tutorial/certification program is
voluntary, the sampling method is voluntary self-selection. Because the
survey questions must be answered to advance through the tutorial, all
outdoor educators that elect to participate in the certification program
will necessarily complete the survey. There is sampling bias associated
with self-selection at both the individual and organizational levels,
since camps or programs may require all of their educators to take the
tutorial; however, because participation in the certification program is
voluntary, self-selection is the only feasible sampling method available
to the Agency.

2(c) Precision Requirements

2(c)i Precision Targets 

Based on an assumed sample size of 1,000 participants, the Agency’s
precision target is ±3.1 percentage points at a 95 percent confidence
level for aggregated question responses (i.e., the estimated percentage
of all potential respondents that give a certain answer to a given
survey question).  For question responses for sample subsets (e.g., the
percentage of male respondents that give a certain answer to a given
survey question), the uncertainty in the estimated percentages will be
greater.
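As a check on the stated target, the ±3.1-point figure follows from the standard large-sample margin of error for a proportion evaluated at its most conservative value (p = 0.5). A minimal sketch, assuming simple random sampling:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of a 95% confidence interval for a proportion,
    evaluated by default at the most conservative value p = 0.5."""
    return z * math.sqrt(p * (1 - p) / n)

# For the anticipated 1,000 respondents:
moe = margin_of_error(1000)  # ~0.031, i.e., +/-3.1 percentage points
```

For sample subsets, n shrinks and the margin widens; a subset of 250 respondents, for example, yields roughly ±6.2 percentage points, consistent with the greater uncertainty noted above.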

2(c)ii Non-sampling Error

It is expected that the largest non-sampling error will be the result of
non-response.  

2(d) Questionnaire Design

The tutorial/certification questions were derived from those developed
in a 2008 study by Glanz et al., in which a group of investigators
evaluated available questionnaire measures of sun exposure and
protection in order to propose a core set of standardized survey items
for a range of age groups (adults, adolescents aged 11 to 17, and
children 10 years or younger).  The investigators used these core
questions in cognitive testing and found that they had good clarity and
applicability for measuring sun exposure and sun protection behaviors
across a broad range of populations.  In addition, it was determined
that these methods are appropriate for studies tracking morbidity and/or
mortality and evaluating prevention program effects. 

Based on this study, participants are asked a series of questions
throughout the tutorial about themselves, their sun protection-related
behavior and their experience with the tutorial. These questions are
meant as a self-assessment tool and must be answered in order to
complete the tutorial.  Participants are encouraged to think about how
they can improve their current behavior while answering.  The questions
are incorporated into each of the five sections of the on-line
tutorial/certification program: 

Introduction: provides a brief overview of the tutorial. 

UV Basics: emphasizes the importance of knowing the facts about UV
radiation.

Louder than Words: describes sun safe actions and provides information
on sun protection attire and SPF sunscreen.

Do As I Do: provides insight into participants’ ability to influence
young people, and offers suggestions for modeling sun-safe behaviors.

Before You Go: summarizes information from the tutorial, provides links
to additional resources, and presents questions about participants’
intentions to follow suggestions presented in the tutorial.

The training is expected to take approximately 45 minutes to complete,
of which approximately 7 minutes will be for answering the survey
questions. 

SECTION III – PRETESTS AND PILOT TESTS

No pilot testing of the tutorial/certification questions is planned.

SECTION IV – COLLECTION METHODS AND FOLLOW-UP

4(a) Collection Methods

Tutorial/certification program participants will complete the
tutorial/certification online through the EPA SunWise Program website,
and responses will be automatically submitted electronically upon
completion. All tutorials/certifications are anonymous, and no personal
information will be stored.  

4(b) Survey Response and Follow-Up

Because outdoor educators self-select to complete the tutorial and
certification questions, no response rate will be specifically measured,
although EPA will keep count of the number of respondents. No follow-up
will be performed with outdoor educators that complete the
tutorial/certification program.

SECTION V – ANALYZING AND REPORTING SURVEY RESULTS

5(a) Data Preparation

No personal information will be stored.  Responses to questions
presented in the tutorial will be collected and stored automatically
upon completion, and will be reviewed by EPA to help determine necessary
changes to make to the tutorial and program.  

5(b) Analysis

Data obtained through the tutorial/certification will help EPA gain a
better understanding of adjustments to make to the SunWise
tutorial/certification program to better educate outdoor recreation
staff.  In addition, responses will provide EPA with an idea of the
types of sun safety policies implemented in outdoor education programs,
policy effectiveness, as well as possible areas of improvement to
increase sun protection.

Basic descriptive statistical analyses will be used to summarize the
responses collected and organize them in a logical manner.  Data will be
aggregated to calculate, for example, the
percentage of outdoor educators that practice a certain sun safety
behavior, or the mean number of times outdoor educators remind teens and
pre-teens in their care about sun protection, along with 95% confidence
intervals.  Data will also be parsed for comparison by different
demographic or other categorical variables in order to compare, for
example, the percentages of male versus female outdoor educators that
remind children to protect themselves from the sun.  Differences in
percentages will be calculated together with 95 percent confidence
intervals and statistical tests to determine whether the differences are
statistically significant. 
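The subgroup comparisons described above can be sketched as follows. This is an illustrative implementation assuming a standard two-proportion z-test (unpooled standard error for the confidence interval, pooled standard error for the test); it is not the Agency's specified procedure.

```python
import math

def compare_proportions(x1, n1, x2, n2, z=1.96):
    """Difference between two sample proportions, with a 95% confidence
    interval (unpooled SE) and a two-sided pooled z-test p-value."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    # Unpooled standard error for the confidence interval
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    ci = (diff - z * se, diff + z * se)
    # Pooled standard error for the test of H0: p1 == p2
    p = (x1 + x2) / (n1 + n2)
    se0 = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    zstat = diff / se0 if se0 else 0.0
    # Two-sided p-value from the standard normal CDF
    pval = 2 * (1 - 0.5 * (1 + math.erf(abs(zstat) / math.sqrt(2))))
    return diff, ci, pval
```

Using hypothetical counts, comparing 60 of 100 male respondents against 50 of 100 female respondents gives a 10-point difference with a 95 percent confidence interval of roughly −3.7 to +23.7 percentage points, so that difference would not be statistically significant.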

5(c) Reporting Results

Information from tutorial/certification responses will serve to inform
EPA of program areas needing improvement to better reach the target
audience, as well as increase sun safety awareness and practices in
informal and outdoor education settings.  

Raw survey data will be kept confidential.  EPA plans to share
aggregated findings with partners and the public to improve sun safety
education.

Part B(iv) of the Supporting Statement – Pretesting the Partner
Survey

SECTION I – SURVEY OBJECTIVES, KEY VARIABLES, AND OTHER PRELIMINARIES

1(a) Survey Objectives

EPA’s SunWise Program provides sun protection education via a
standardized curriculum and other resources (i.e., Sun Safety Tutorial)
to children in grades K-8 in communities at registered 501(c)(3)
organizations such as science centers and camps, children's museums, and
scouting groups, as well as other not-for-profit organizations like
local, county and state health, recreation and education departments.
About 4,600 informal education centers have registered with SunWise
since the 1999-2000 school year. 

EPA will undertake pretesting of a survey for non-school partners
participating in the SunWise Program.  These partners may include state
and local health departments, childcare centers, museums, camps, and
science centers.  The purpose of the survey will be generally to better
understand how non-school partners are interacting with the SunWise
Program, as well as to determine:

	How and how often partners are using the SunWise materials,
resources, and programming;

	How many children are receiving SunWise education through
non-school partners;

	Children’s satisfaction with SunWise activities and resources;

	Partners’ satisfaction with SunWise activities and resources;

	Whether partners are sharing resources with other partners;

	Whether partner organizations’ sun safety policies are being
changed as a result of SunWise; and

	Whether partners have suggestions for improving or creating new
SunWise resources.

The pretesting is intended to determine the validity, reliability, and
effectiveness of the survey questions (e.g., whether questions measure
what they are supposed to measure, whether partners understand what the
questions are asking, and whether they are the right questions to gain a
better understanding of how partners are interacting with the SunWise
Program).

1(b) Key Variables

Understandability, validity, reliability, and effectiveness of
questions; right questions for understanding partner participation in
SunWise.

1(c) Statistical Approach

The primary objective of the pretesting is to gather information that
will help EPA develop a new questionnaire for SunWise non-school
partners, and to determine the validity, reliability, and effectiveness
of initial draft questions. No statistical approach is needed.

1(d) Feasibility

EPA has reviewed the administrative procedures necessary to pretest the
SunWise partner survey and has determined that it is feasible to
continue with the pretesting. EPA has funding to conduct the pretesting
and use the resulting input to revise the partner survey.

SECTION II – SURVEY DESIGN

2(a) Target Population and Coverage

The target population includes all non-school partners registered with
the SunWise Program, or approximately 4,600 organizations.  These
organizations include 501(c)(3) organizations such as science centers
and camps, children's museums, and scouting groups, as well as other
not-for-profit organizations like local, county and state health,
recreation and education departments.

2(b) Sample Design

2(b)i	Sampling Frame

Partners register for the SunWise program through EPA.  Registrants
provide their name and contact information, including the name of their
organization, and state what their role is.  EPA will send a recruitment
email to all registered SunWise partners in the Spring/Summer timeframe
encouraging them to participate in the pilot testing of the partner
survey (Attachment 5).  Participants may also be recruited through
additional avenues, such as recruitment letters distributed through the
SunWise Tool Kit, educator conferences, or direct mailings.  However,
many may choose not to participate.

2(b)ii Sample Size

EPA anticipates sending a recruitment email to about 4,600 non-school
partners; however, only 30 will be selected for participation in the
survey pretesting.

2(b)iii Stratification Variables

None.

2(b)iv Sampling Method

From those partners responding positively to the recruitment email, EPA
will sort the willing partners into types of organizations (e.g.,
health, recreation, and educational departments; childcare centers;
camps and scouting groups; and educational centers such as museums or
science centers) and randomly select participants from each group for a
total of 30 participants.  The number of participants selected from each
group will be weighted based on the overall composition of SunWise
registered partners.

Inclusion criteria: Registered with the SunWise program.

Exclusion criteria:  Non-response.

2(b)v Multi-Stage Sampling

None.

2(c) Precision Requirements

2(c)i Precision Targets

N/A

2(c)ii Nonsampling Error

N/A

2(d) Questionnaire Design

The partner survey for pretesting was derived from a SunWise teacher
instrument previously approved by OMB on November 2, 2001 and April 15,
2008 (ICR #1904.01 and #1904.04).  Because this information collection
consists of pretesting with the goal of developing a new survey for
SunWise non-school partners, the design of the survey will necessarily
change through the pretesting process and additional or different
questions may be pretested with partners. The draft survey for
pretesting is provided in Attachment 5.

SECTION III – PRETESTS AND PILOT TESTS

This information collection consists of pretesting for the development
of a new survey for SunWise non-school partners.

SECTION IV – COLLECTION METHODS AND FOLLOW-UP

4(a) Collection Methods

Depending on available resources and other constraints, the survey may
be self-administered with feedback gathered from each participant over
the telephone, or the survey may be administered in-person either in an
individual or group setting, with feedback gathered through in-person
interviews.  In either case, participants will be asked if they
understood all questions, whether the survey was easy to complete, and
whether there were questions they would suggest removing or adding to
better reflect the participation of partners in the Program.  Based on
this feedback, EPA will revise the partner survey.  EPA will also
measure the time it takes for each respondent to complete the survey
(estimated at less than 20 minutes).

4(b) Survey Response and Follow-Up

The target response rate is approximately 80 percent. The actual
response rate will be calculated as the number of partners that
participate in the pretesting divided by the number of partners that
indicated interest and were selected to participate.  Follow-up emails and
telephone calls will be made to all partners who were selected to
participate but have not responded to an initial invitation.  These
follow-ups will explain the importance of the pretesting and strongly
encourage partners to participate. 

SECTION V – ANALYZING AND REPORTING SURVEY RESULTS

5(a) Data Preparation

All survey data will be entered into a database.  A double-entry
protocol will be observed throughout data entry to ensure accuracy.

During interviews with partners following pretesting of the survey, the
interviewer and/or assistant will take detailed notes.

5(b) Analysis

The survey data and qualitative input obtained through the pretesting
will be aggregated and analyzed for the purpose of determining the
validity, reliability, and effectiveness of survey questions.  All of
this information will be used by EPA to revise the survey questions.  

5(c) Reporting Results

The results of the pretesting will be used by EPA to revise the partner
survey. The raw survey and interview data will be maintained by EPA
and/or an EPA contractor and will remain unavailable to the public.  

http://www.bls.gov/oes/2009/may/oes_nat.htm and
http://stats.bls.gov/news.release/ecec.t02.htm

Glanz K, Yaroch AL, Dancel M, Saraiya M, Crane LA, Buller DB, Manne S,
O’Riordan DL, Heckman CJ, Hay J, Robinson JK.  Measures of Sun Exposure
and Sun Protection Practices for Behavioral and Epidemiologic Research.
Archives of Dermatology.  2008;144(2):217-222.


