SUPPORTING STATEMENT
INFORMATION COLLECTION REQUEST (EPA ICR Number 2016.01)

Drinking Water Customer Satisfaction Survey

U.S. Environmental Protection Agency
Office of Ground Water and Drinking Water
Drinking Water Protection Division
APPENDICES

Appendix A. Copy of the first Federal Register notice for the Drinking Water Customer Satisfaction Survey (66 FR 209)

Appendix B. Summary of comments received in response to the first Federal Register notice

Appendix C. Drinking Water Customer Satisfaction Survey and associated script
ACRONYMS

ASDWA  Association of State Drinking Water Administrators
AWWA   American Water Works Association
BLS    Bureau of Labor Statistics
CASRO  Council of American Survey Research Organizations
CATI   Computer-Assisted Telephone Interviewing
CCR    Consumer Confidence Report
EPA    Environmental Protection Agency
FR     Federal Register
ICR    Information Collection Request
OGWDW  Office of Ground Water and Drinking Water
OMB    Office of Management and Budget
NEETF  National Environmental Education and Training Foundation
NGWA   National Ground Water Association
RDD    Random Digit Dial
SDWA   Safe Drinking Water Act
SWA    Source Water Assessment
SSI    Survey Sampling Inc.

PART A
1. IDENTIFICATION OF THE INFORMATION COLLECTION

1(a) Title of the Information Collection

Drinking Water Customer Satisfaction Survey

1(b) Short Characterization

This Information Collection Request (ICR) calculates the burden and costs associated with a survey, prepared by the Office of Ground Water and Drinking Water (OGWDW), on the effectiveness of the Environmental Protection Agency's (EPA) right-to-know efforts.

The 1996 Amendments to the Safe Drinking Water Act (SDWA) require EPA to ensure drinking water information is made available to the general public. This survey will allow EPA to evaluate current public awareness initiatives for disseminating drinking water information to the public. Conducting the survey will help EPA assess general consumer perceptions and habits concerning drinking water. By gauging the effectiveness of current outreach activities, the Agency will measure whether its information efforts are meeting customer needs. The Agency will also gain insight into how to improve the way this information is disseminated in the future.

The information collection will involve 1,000 randomly selected adults from the general public. The survey will be conducted by the Gallup Organization under contract to EPA. The selected individuals will be asked specific questions concerning general consumer awareness issues, consumer confidence reports (annual water quality reports), source water assessments, and customer preferences with respect to receiving information. In addition, the survey asks demographic questions about factors that EPA suspects may be drivers of satisfaction. These factors include consumer perceptions of water quality, concerns about taste and odor, and whether consumers currently drink bottled water or filter their tap water.

The survey instrument is a voluntary telephone questionnaire, averaging 11 minutes, that covers 26 questions. EPA will conduct the survey only once during the period for which this ICR is in effect.

The total estimated cost to respondents is $3,016.96. The total estimated cost to EPA is $70,065.37.

2. NEED FOR AND USE OF THE COLLECTION

2(a) Need/Authority for the Collection

The 1996 Amendments to the Safe Drinking Water Act gave EPA new responsibilities to ensure information about drinking water is made available to the American people. Section 114(a) of the SDWA requires public water systems to provide annual water quality reports to their customers (consumer confidence reports), improved public notification of drinking water violations, and increased information about sources of and potential threats to drinking water. These requirements make EPA responsible for ensuring the American public receives and understands relevant information about its drinking water.

This section of the SDWA also directs EPA to engage in other public information efforts, such as maintaining a drinking water hotline and consulting with the public when proposing new drinking water regulations. In implementing these new public information requirements, EPA recognizes a responsibility to evaluate the Agency's effectiveness in this new role to make sure these provisions are serving customer needs. Hence, this survey has been designed to ask respondents specific questions to determine whether EPA's drinking water right-to-know efforts are adequate under the SDWA.

2(b) Practical Utility/Users of the Data

This survey will enable EPA to identify whether its drinking water information efforts are:

- Successfully reaching the intended audience
- Meeting customer information needs
- Meeting customer format and distribution preferences

While the information will not be used for regulatory development, EPA anticipates that the results of this survey may lead to a reallocation of resources, possible revisions to certain Agency processes and policies, and development of guidance related to EPA customer service products. Ultimately, these changes will result in improvements in the services the Agency provides to the public and, in turn, in the public's perception of the Agency and knowledge of its drinking water.

3. NONDUPLICATION, CONSULTATIONS, AND OTHER COLLECTION CRITERIA

3(a) Nonduplication

EPA has made an effort to ensure that the data collection efforts associated with this ICR are not duplicated. EPA has consulted state environmental programs, as well as interested non-governmental organizations (e.g., the National Environmental Education and Training Foundation (NEETF) and the American Water Works Association (AWWA)).

3(b) Public Notice

In compliance with the Paperwork Reduction Act (44 U.S.C. § 3501 et seq.), EPA solicited comments on this Information Collection Request. A Federal Register (FR) notice for the Drinking Water Customer Satisfaction Survey was published on October 29, 2001 (66 FR 209). A copy of this FR notice is included in Appendix A. EPA received and reviewed several comments in response to the FR notice. A summary of these comments is included in Appendix B.

3(c) Consultations

In addition to the comments received, EPA consulted with each of the branches involved in funding this survey effort. Each branch was asked to review the questionnaire to determine the need for requesting this information. EPA also contacted the Association of State Drinking Water Administrators, the National Defense Council, and the National Rural Water Association for comment on the survey questions, and contacted the American Water Works Association to discuss methodology. EPA also consulted with Janet Pawlukiewicz, Director of the Water Protection Task Force, to determine whether any sensitive issues might arise from the survey. Comments received as a result of these consultations were incorporated into the survey instrument before EPA finalized the survey with Gallup.

3(d) Effects of Less Frequent Collection

Since individuals will only be surveyed once during the period for which this ICR is in effect, it is not possible to collect this information less frequently. Therefore, a determination of the effects of less frequent information collection is not applicable to this request.

3(e) General Guidelines

This ICR complies with OMB's general guidelines for the collection of information. Under no circumstances will respondents be required to take any of the following actions:

- Report information to EPA more than quarterly
- Prepare a written response to a collection of information in fewer than 30 days after receipt of a request
- Submit more than an original and two copies of any document
- Retain records for more than three years
- Participate in a statistical study that is not designed to produce data that can be generalized to the universe of the study
- Use a statistical data classification that has not been reviewed and approved by OMB
- Receive a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies consistent with the pledge, or that unnecessarily impedes sharing of data with other agencies for compatible confidential use
- Submit proprietary, trade secret, or other confidential information, unless EPA can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law

3(f) Confidentiality

EPA does not expect to receive confidential information from the individuals voluntarily participating in the Drinking Water Customer Satisfaction Survey. In addition, neither EPA nor any other person or entity will have access to the raw survey data, which will remain in the survey database under the control of the contractor hired to perform the survey. The contractor (Gallup) will be required to keep any sensitive information confidential.

3(g) Sensitive Questions

Sensitive questions are defined in the ICR instructions as "questions concerning sexual behavior or attitudes, religious beliefs, or other matters usually considered private." The Drinking Water Customer Satisfaction Survey contains no sensitive questions.
4. THE RESPONDENTS AND THE INFORMATION REQUESTED

4(a) Respondents/SIC Codes

Possible respondents for the Drinking Water Customer Satisfaction Survey are members of the general public; their participation in the survey is not related to their profession. Therefore, the assignment of an SIC code is not appropriate.

4(b) Information Requested

4(b)(i) Data Items

The Drinking Water Customer Satisfaction Survey will be administered to members of the general public selected randomly from all households in the United States.

The survey will request the following information from the respondents:

- Participant Screening Information: These questions identify whether the respondent or anyone in the respondent's household is over 18 and whether the respondent is willing to participate in the survey. This data is important in establishing the eligibility of respondents to complete the survey.

- General Tap Water Questions: These questions ask respondents what they use tap water for (e.g., drinking, cooking), whether they treat their water (e.g., filter, boil), and their perceptions of their water quality (e.g., taste, odor). This data will allow EPA to determine the public's use and perception of its drinking water.

- CCR Questions: These questions ask whether respondents received, read, understood, and trusted their CCR and whether they took action as a result of what they learned. EPA will use this data as an indicator of whether CCRs are being received, understood, and used by respondents. These questions also provide an opportunity for respondents to suggest what EPA can do to make CCRs more effective.

- General Source Water Questions: These questions ask respondents about their knowledge of source water and whether they receive, and are satisfied with, information regarding source water issues. This data is important in determining whether the public is making the link between source water and drinking water. EPA will use this information to evaluate whether the general public understands this interconnection and to determine whether outreach efforts need to be reevaluated.

- Source Water Contamination Questions: These questions ask about respondents' knowledge of issues involving source water contamination, from whom they receive information regarding source water contamination, whether they trust that information, and what actions they have taken to prevent source water contamination. This data is important for EPA to determine the best way to inform the public about contamination issues and about what the public can do to protect its drinking water sources.

- Demographic Information: These questions categorize the respondent's race or ethnicity, level of education, and place of residence. This information is important for identifying variances in individual responses and within specific populations.

A copy of the survey and its associated script is attached to this ICR as Appendix C of the Supporting Statement. There are no recordkeeping items specifically required by this survey.

4(b)(ii) Respondent Activities

Individuals randomly selected to participate in EPA's Drinking Water Customer Satisfaction Survey could potentially perform each of the following tasks over the phone:

- Listen to introductory information
- Respond to screening questions
- Complete the survey

These activities represent a voluntary information collection for each respondent and are not customary practices of the respondent.

5. THE INFORMATION COLLECTED: AGENCY ACTIVITIES, COLLECTION METHODOLOGY, AND INFORMATION MANAGEMENT

5(a) Agency Activities

Agency activities associated with the collection of information include:

- Developing the questionnaire
- Pretesting the questionnaire
- Internal EPA review and approval of the questionnaire
- Reviewing data
- Analyzing results
- Preparing findings
- Reviewing findings
- Making results public via annual reports and the internet

5(b) Collection Methodology and Management

The survey instrument is a voluntary telephone questionnaire covering 26 questions and averaging 11 minutes. Gallup will use Random Digit Dial (RDD) procedures to generate a probability sample of households with telephone service (including those with unlisted and non-published numbers). In each household contacted, the number of residents over 18 will be collected from a knowledgeable adult. Using the "most recent birthday" method, one adult will be selected to represent each eligible household. Within-household selection probabilities will be calculated by a computer-assisted telephone interviewing (CATI) program and stored for use in constructing sampling weights, as described further below.

The RDD sample will be selected using the most current database of assigned area code-prefix combinations (the first 8 digits of a 10-digit telephone number) covering the entire nation, obtained from Bell Communications Research (Bellcore). The initial list will exclude any area code-prefix combinations known by Bellcore to contain only business listings, toll-free numbers, cellular numbers, and other non-residential lines.

Gallup will complete 1,000 interviews using a CATI methodology. At the 95 percent level of confidence, this sample will provide an error range of approximately plus/minus 3 percent at the national level. A ten-call-back design will be used for this study.

Upon survey close-out, the contractor will develop estimates for the target population by suitably weighting the sample data. The basic weights will be a function of selection probabilities. The calculation of weights will take into consideration the number of adult household members and the number of telephone lines (multiple telephones) in the selected household. Gallup will also use post-stratification weighting to make the sample reflect the population it is intended to represent. Weights will be calculated using 2001 estimates of the 2000 Census data. The sampling error will be approximated through the use of standard errors, at a confidence level of 95 percent.
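As a minimal sketch of how a basic weight reflecting the selection probabilities described above could be computed (an illustration under stated assumptions, not Gallup's actual procedure; the household selection probability here is a hypothetical input):

```python
def base_weight(p_household, n_adults, n_phone_lines):
    """Illustrative base sampling weight for one respondent.

    p_household: probability the household's number was drawn by RDD
                 (hypothetical value for this example).
    n_adults: adults (18+) in the household; the most-recent-birthday
              method picks one, so within-household probability is 1/n_adults.
    n_phone_lines: lines serving the household; each extra line gives the
                   household another chance of being dialed.
    """
    p_selection = p_household * n_phone_lines / n_adults
    return 1.0 / p_selection

# A two-adult, one-line household drawn with probability 0.0005
# has selection probability 0.00025 and weight 4000.0
w = base_weight(0.0005, 2, 1)
```

The post-stratification step mentioned above would then rescale these base weights so that weighted totals match Census population figures; that adjustment is omitted here.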
5(c) Small Entity Flexibility

EPA does not believe that this ICR will have a "significant economic impact on a substantial number of small entities." The Drinking Water Customer Satisfaction Survey is strictly voluntary and targeted to members of the general public, who fall outside the definition of a "small entity" provided in Section 601 of the Regulatory Flexibility Act.

5(d) Collection Schedule

Information collection will begin upon approval of this ICR and the assignment of an OMB control number to the survey instrument. The collection schedule for the survey is expected to follow the approximate timeline presented below, beginning upon OMB approval of the ICR.

Activity                      Timeline
Administer telephone survey   Within 10 days after OMB approval
Analyze survey results        Within 7 days after completion of survey
Report survey findings        Within 30 days after completion of analysis
6. ESTIMATING THE BURDEN AND THE COST OF THE COLLECTION

6(a) Estimating the Respondent Burden

The estimates of the time burden involved in responding to the survey are derived from a survey pretest. The respondent burden is presented by type. Two types of respondents have been identified:

- Screening-only respondents: These respondents are asked the screening questions but are either determined to be ineligible (not over 18) or decline to participate in the survey.
- Survey respondents: These respondents participate in the survey.

EPA estimates that there will be approximately 250 screening-only respondents. The estimated average burden for screening-only respondents is approximately one minute. The estimated average burden for survey respondents is 11 minutes per respondent.

Table 6.1 shows the individual burden for screening-only and survey respondents.

Table 6.1 Individual Burden Per Respondent

Respondent Type      Burden Minutes/Respondent   Total Burden Minutes
Screening-only       1                           250
Survey respondents   11                          11,000

6(b) Estimating Respondent Costs

6(b)(i) Estimating Labor Costs

This is a non-rule-related ICR. Therefore, the labor costs to perform functions related to the collection of information reflect the opportunity costs of labor (i.e., labor rates based on employer costs, including fringe benefits such as paid leave, insurance benefits, unemployment insurance, Social Security, and minimal overhead costs). The labor cost for responding is estimated at $16.07 per hour based on the "Employer Costs for Employee Compensation" series (Bureau of Labor Statistics (BLS), March 2001). The labor cost is based on BLS's estimate of average hourly wage rates for workers throughout the country.

Table 6.2 shows the cost per screening-only and survey respondent. For the purpose of calculating the cost, $16.07 per hour was rounded to $0.27 per minute.

Table 6.2 Cost Per Respondent

Respondent Type      Burden Minutes/Respondent   Cost/Respondent @ $0.27/minute   Total Cost
Screening-only       1                           $0.27                            $67.50
Survey respondents   11                          $2.97                            $2,970.00
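The arithmetic behind Table 6.2 can be reproduced directly from the figures above (a quick check, using the ICR's own rounding of the hourly rate to $0.27 per minute):

```python
HOURLY_RATE = 16.07                      # BLS employer cost per hour, March 2001
per_minute = round(HOURLY_RATE / 60, 2)  # rounds to $0.27/minute, as in the ICR

screening_cost = round(1 * per_minute, 2)   # 1 minute  -> $0.27 per respondent
survey_cost = round(11 * per_minute, 2)     # 11 minutes -> $2.97 per respondent

total_screening = round(250 * screening_cost, 2)  # 250 respondents  -> $67.50
total_survey = round(1000 * survey_cost, 2)       # 1,000 respondents -> $2,970.00
```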
6(b)(ii) Estimating Capital and Operations and Maintenance Costs

EPA does not expect respondents to the Drinking Water Customer Satisfaction Survey to incur any capital or operations and maintenance costs. This information collection is voluntary and does not require special equipment.

6(b)(iii) Capital/Start-up vs. Operating and Maintenance Costs

Not applicable.

6(b)(iv) Annualizing Capital Costs

Not applicable.

6(c) Estimating Agency Burden and Cost

Table 6.3 provides the estimates of EPA's burden and cost associated with the Drinking Water Customer Satisfaction Survey. Wage estimates for Agency personnel are divided into three general categories of labor: Management (GS-15), Technical (GS-13), and Clerical (GS-7). EPA personnel participating in this survey are assumed to be management and technical personnel. Civil Service wage estimates are based on the 2002 pay scale for Washington, DC employees and include a benefits multiplier of 1.6.

- Civil Service (Manager): $70.58/hour
- Civil Service (Technical): $50.77/hour

Table 6.3 Agency Burden/Cost for Drinking Water Customer Satisfaction Survey
(Manager hours at $70.58/hour; Technical hours at $50.77/hour)

Activity                                               Manager Hours   Technical Hours   Total Hours   Total Cost
Develop questionnaire                                  10              140               150           $7,813.60
Pretest questionnaire                                  0               2                 2             $101.54
Internal EPA review and approval of questionnaire      4               10                14            $790.02
Review data                                            2               20                22            $1,156.56
Analyze results                                        4               60                64            $3,328.52
Prepare findings                                       0               15                15            $761.55
Review findings                                        4               10                14            $790.02
Make results public via annual reports and internet    2               20                22            $1,156.56
Total hours                                            26              277               303
Total cost                                             $1,835.08       $14,063.29                      $15,898.37

In addition to the labor burden and cost shown in the table above, EPA has a fixed-cost contract with Gallup. Gallup will conduct the survey, enter data, and analyze findings. The contract with Gallup for these services is for $54,167.00.
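Table 6.3 and the Gallup contract together account for the Agency total quoted in section 1(b); the arithmetic checks out as follows:

```python
MANAGER_RATE, TECHNICAL_RATE = 70.58, 50.77  # $/hour, from Table 6.3
manager_hours, technical_hours = 26, 277     # total hours, from Table 6.3

labor_cost = round(manager_hours * MANAGER_RATE
                   + technical_hours * TECHNICAL_RATE, 2)   # $15,898.37

GALLUP_CONTRACT = 54167.00                   # fixed-cost contract
total_agency_cost = round(labor_cost + GALLUP_CONTRACT, 2)  # $70,065.37
```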

6(d) Estimating the Respondent Universe and Total Burden and Cost

The universe for this survey is the non-institutional adult (18 years of age or older) population residing in the United States. Table 6.4 shows the bottom-line burden hours and costs for the respondents.

Table 6.4 Bottom Line Burden Hours and Costs (Respondents)

Respondent Type              Number of Respondents   Total Hours   Total Burden Cost
Screening-Only Respondents   250                     4 1/6         $67.50
Survey Respondents           1,000                   183 1/3       $2,970.00
Total                        1,250                   187 1/2       $3,037.50
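The hour totals in Table 6.4 follow from the per-respondent burden minutes in Table 6.1:

```python
screening_minutes = 250 * 1   # 250 screening-only respondents at 1 minute each
survey_minutes = 1000 * 11    # 1,000 survey respondents at 11 minutes each

screening_hours = screening_minutes / 60      # 4 1/6 hours
survey_hours = survey_minutes / 60            # 183 1/3 hours
total_hours = screening_hours + survey_hours  # 187 1/2 hours
```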
See Table 6.3 for specific Agency hours and costs. The total cost for the Agency (including the contract with Gallup) is $70,065.37.

6(e)(i) Bottom Line Burden Hours and Cost Tables

Table 6.5 details the total bottom-line burden (respondent and EPA) associated with this survey effort. Table 6.3 in section 6(c) (estimating Agency burden and cost) and Table 6.4 in section 6(d) (estimating the respondent universe and total burden and cost) detail how these total figures were derived.

Table 6.5 Bottom-line Burden and Cost

Burden Category            Burden Hours   Burden Cost
Respondent                 187 1/2        $3,037.50
EPA Labor                  303            $15,898.37
EPA Contract with Gallup   NA             $54,167.00
Total Bottom-Line Burden   490 1/2        $73,102.87

6(e)(ii) The Agency Tally

See section 6(c), estimating Agency burden and cost, for information on how the bottom-line Agency costs were derived.

6(e)(iii) Variation in the Annual Bottom-Line

EPA does not anticipate significant variation in the annual respondent reporting/recordkeeping burden or cost over the course of the clearance period for this survey.

6(f) Reasons for Change in the Burden

This section is not applicable since this is a new ICR.
6(g) Burden Statement

The public reporting burden for this collection of information is estimated to range between 1 minute per response for the screener survey and 11 minutes per response for the full survey. This includes time to listen to survey instructions, respond to survey screening questions, and complete the survey. The average response cost is $0.27 for screening-only respondents and $2.97 for survey respondents.

The public reporting and recordkeeping burden for this collection of information is estimated to average between 1 and 11 minutes per response. Burden means the total time, effort, or financial resources expended by persons to generate, maintain, retain, or disclose or provide information to or for a Federal agency. This includes the time needed to review instructions; develop, acquire, install, and utilize technology and systems for the purposes of collecting, validating, and verifying information, processing and maintaining information, and disclosing and providing information; adjust the existing ways to comply with any previously applicable instructions and requirements; train personnel to be able to respond to a collection of information; search data sources; complete and review the collection of information; and transmit or otherwise disclose the information. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number.

Send comments on the Agency's need for this information, the accuracy of the provided burden estimates, and any suggested methods for minimizing respondent burden to the Director, Collection Strategies Division, Office of Environmental Information (OEI), U.S. Environmental Protection Agency, MC 2822T, 1200 Pennsylvania Avenue, NW, Washington, DC 20460, and to the Office of Information and Regulatory Affairs, Office of Management and Budget, 725 17th Street, NW, Washington, DC 20503, Attention: Desk Officer for EPA. Include EPA ICR Number 2016.01 in any correspondence.

PART B

1. SURVEY OBJECTIVES, KEY VARIABLES, AND OTHER PRELIMINARIES

1(a) Survey Objectives

EPA's Office of Ground Water and Drinking Water is proposing to conduct a survey of U.S. households to gain information on the public's knowledge and perception of certain drinking water issues, and on whether the public is receiving pertinent information on such issues, in order to evaluate the effectiveness of its current right-to-know efforts.

EPA expects the following issues to be addressed by the Drinking Water Customer Satisfaction Survey:

- The extent to which information on relevant drinking water topics is reaching the intended audience
- The extent to which the audience understands the information that it receives
- Whether the audience's information needs are met
- Whether information is being presented and distributed in a way that is satisfactory to its audience

1(b) Key Variables

The key variables associated with this survey effort include the level of knowledge of, and information received by, drinking water customers regarding:

- their tap water
- contamination of tap water
- source water
- contamination of source water

1(c) Statistical Approach

The primary objective in conducting the Drinking Water Customer Satisfaction Survey is to measure the extent to which the general public is aware of, and receiving information on, certain issues relating to their drinking water, and how EPA can improve this knowledge.

1(d) Feasibility

EPA has reviewed the administrative procedures needed to conduct the Drinking Water Customer Satisfaction Survey and has concluded that it is feasible to undertake the survey. EPA has sufficient funding to conduct the survey and has put a contract in place to provide the necessary logistical support. The survey was peer-reviewed by staff from all three branches of the Drinking Water Protection Division to ensure that the questions asked will reveal sufficient information to evaluate whether right-to-know efforts are adequately providing drinking water information to customers.

EPA estimates that it will take approximately three months to administer the Drinking Water Customer Satisfaction Survey, collect and analyze survey responses, and report its findings. EPA plans to initiate the survey within weeks of receiving OMB approval and, therefore, expects to complete the survey well within the period for which this ICR is in effect.

1 The sampling error is calculated as 1.96*SQRT{(P*(1-P)/n)}, where P is the unknown population proportion (assumed to be equal to .5) and n is the sample size.

2. SURVEY DESIGN

2(a) Target Population and Coverage

EPA will target the non-institutional adult (18 years of age or older) population residing in the United States, at the household level. Respondents will be selected randomly. The coverage for this survey is the non-institutional adult (18 years of age or older) population residing in the U.S., at the household level, with residential phone service.

2(b) Sample Design

2(b)(i) Sampling Frame

The sampling frame comes from the random digit dial (RDD) database of both listed and unlisted telephone numbers maintained by Survey Sampling Inc. (SSI). The frame is created by assembling a database of all valid area code-exchange combinations and appending all 10,000 four-digit suffixes (0000 to 9999) to each area code-prefix combination. The frame is prepared by a commercial sample vendor (SSI), and there are no potential legal or regulatory obstacles (e.g., confidentiality) to using it. The frame is current, complete, and nonduplicative.
2(b)(ii) Sample Size

The survey will complete 1,000 interviews, a number derived to achieve approximately plus/minus 3% sampling error¹ at the 95% level of confidence. Increasing the sample size by 500, to 1,500, would reduce the sampling error by roughly half a percentage point, from 3.1% to 2.53%, at the 95% level of confidence. On the other hand, reducing the sample size by half, to 500, would increase the sampling error from 3.1% to more than 4% (i.e., 4.38%) at the 95% confidence level.

Considering both precision and cost, the sample size of 1,000, with a 3.1% margin of error at the 95% level of confidence, will provide adequate precision for national estimates of key items representing measurable program objectives while also providing adequate numbers of cases for standard demographic breakdowns.
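The error figures quoted above follow directly from the formula in footnote 1; a short check (the 380-case value corresponds to the 5% demographic-breakout target in section 2(c)(i)):

```python
import math

def sampling_error(n, p=0.5, z=1.96):
    """95% margin of error, 1.96 * SQRT(P*(1-P)/n), per the ICR footnote."""
    return z * math.sqrt(p * (1 - p) / n)

print(round(sampling_error(1000), 3))  # 0.031  -> the 3.1% quoted for n = 1,000
print(round(sampling_error(1500), 4))  # 0.0253 -> 2.53% for n = 1,500
print(round(sampling_error(500), 4))   # 0.0438 -> 4.38% for n = 500
print(round(sampling_error(380), 3))   # 0.05   -> ~5% for demographic breakouts
```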
2 Casady, R., & Lepkowski, J. (1993). Stratified telephone sample designs. Survey Methodology, 19, 103-113.

3 Brick, J. M., Waksberg, J., Kulp, D. W., & Starer, A. W. (1995). Bias in list-assisted telephone samples. Public Opinion Quarterly, 59, 218-235. Giesbrecht, L. H. (1997). Coverage bias in various list-assisted RDD sample designs. Paper presented at the 52nd Annual Conference of the American Association for Public Opinion Research, Norfolk, Virginia, May 16, 1997.
2(b)(iii) Stratification Variables

There is no stratification variable for this survey.

2(b)(iv) Sampling Method

This survey will use a simple Random Digit Dial (RDD) sampling method. The RDD sample will be selected using the most current database of assigned area code-prefix combinations (the first 8 digits of a 10-digit telephone number) covering the entire nation (including Alaska and Hawaii), obtained from Bell Communications Research (Bellcore). The initial list will exclude any area code-prefix combinations known by Bellcore to contain only business listings, toll-free numbers, cellular numbers, and other non-residential lines. In choosing an RDD sample, the procedure implicitly chooses a random grouping of 100-banks. A bank is a group of 100 consecutive numbers that share their first eight digits, that is, their area code, exchange, and the first two digits of their four-digit suffixes. For example, all the possible telephone numbers beginning 301-515-33__ form a single bank. Prior to selection, the banks are classified according to the number of residential listings they contain. In the method developed by Casady and Lepkowski (1993),² numbers from the stratum of banks containing no residential listings ("zero" banks) are under-sampled to reduce the number of calls to unassigned or non-working numbers. However, it is more common simply to omit banks that include few residential listings. In this instance, a 3+ design will be used in choosing the banks, which, for a list-assisted design, would yield total undercoverage of the household population of about 8.4% (6% for non-telephone households and another 2.4% for households with phones that are missed by the frame). The omission of these banks sharply increases the proportion of working residential numbers in the sample. Evaluations of the bias associated with the omission of such banks indicate that the bias is small (Brick, Waksberg, Kulp, & Starer, 1995; Giesbrecht, 1997).³ As a result, both major vendors of telephone samples (SSI and Genesys) routinely offer samples based on all banks with three or more residential listings.
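A minimal sketch of the list-assisted "3+" selection described above (illustrative only; the bank-to-listing counts here are hypothetical inputs, and real vendors draw from the full Bellcore database):

```python
import random

def list_assisted_sample(bank_listings, n, min_listings=3, seed=1):
    """Draw n telephone numbers from 100-banks with at least min_listings
    residential listings (the '3+' design).

    bank_listings maps an 8-digit bank prefix (area code + exchange +
    first two suffix digits) to its count of residential listings.
    """
    rng = random.Random(seed)
    eligible = [bank for bank, count in bank_listings.items()
                if count >= min_listings]
    numbers = []
    for _ in range(n):
        bank = rng.choice(eligible)                         # pick an eligible bank
        numbers.append(bank + f"{rng.randrange(100):02d}")  # random last 2 digits
    return numbers

# Only the first bank qualifies under the 3+ rule, so every drawn
# number begins 301-515-33:
banks = {"30151533": 5, "30151534": 0, "30151535": 2}
sample = list_assisted_sample(banks, 3)
```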

2(b)(v) Multi-Stage Sampling

This survey will use two-stage sampling. The first stage of sampling involves the selection of households from a simple random RDD sample, as described above. The second stage is randomly selecting one adult from the household by asking for the adult with the most recent birthday.

4 The sampling error is calculated as 1.96*SQRT{(P*(1-P)/n)}, where P is the unknown population proportion (assumed to be equal to .5) and n is the sample size.
16
2(c) Precision Requirements

2(c)(i) Precision Targets

EPA's survey has been designed to ensure that, at the 95% level of confidence, it will provide an error range4 of approximately plus/minus 3% at the national level and 5% for demographic breakouts (with a sample size of 380 or more).
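The precision targets above follow directly from the footnoted sampling-error formula, assuming the worst-case P = 0.5. The national sample size of roughly 1,068 used below is not stated in this section; it is derived here purely to show that it reproduces the 3% target.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% sampling error: z * sqrt(p * (1 - p) / n)."""
    return z * math.sqrt(p * (1 - p) / n)

# A national sample of about 1,068 completes gives roughly +/- 3%.
print(round(margin_of_error(1068), 3))  # 0.03
# A demographic breakout of 380 respondents gives roughly +/- 5%.
print(round(margin_of_error(380), 3))   # 0.05
```

This is why 380 appears as the minimum breakout size: it is the smallest n for which the half-width of the 95% interval stays at about five percentage points.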

2(c)(ii) Nonsampling Error

To minimize potential nonsampling errors due to faulty measurement procedures or nonresponse, the following steps will be taken. To ensure the uniformity of interview procedures, the survey will use a computer-assisted telephone interviewing (CATI) methodology. All interviewers will be trained to identify and avert potential refusals and non-responses. The contractor will also have available a group of interviewer specialists whose task will be to convert nonrespondents. These individuals include experienced interviewers and supervisors who are skilled in averting and converting refusals. The combination of a well-designed and executed survey with a highly capable staff of interviewers and refusal converters should minimize the magnitude of non-response and ensure the reliability of the survey results.

In addition, standard survey techniques that have proven successful in other academic survey efforts will be employed to achieve a maximum response rate. These techniques include:

• Training of interviewers on refusal aversion and conversion techniques
• Frequent review of interviewer refusal rates, and close monitoring and re-training of interviewers who have rates above the norm
• Requiring interviewers to record information about refusals, which may facilitate subsequent interview attempts
• Supervisor review of reasons for refusals and efforts to re-contact respondents if refusal conversion is deemed possible
2(d) Questionnaire Design

A discussion of the data elements contained in the Drinking Water Customer Satisfaction Survey is included in section 4(b)(ii) of Part A of the Supporting Statement for this ICR.
EPA has designed a questionnaire in which respondents are asked to answer primarily yes/no and multiple-choice questions. By using yes/no and multiple-choice questions, the Agency has substantially reduced the amount of time necessary for the respondent to complete the survey and has ensured consistency in data response and interpretation.

The survey instrument was developed in consultation with staff and management of the three branches of the Drinking Water Protection Division, the Association of State Drinking Water Administrators, and the Gallup Organization to ensure that respondents will understand the questions asked and will provide the type of data necessary to measure the Agency's objectives. The survey was also designed with the help of a statistician to ensure the reliability of the data.

3. PRETESTS AND PILOT TESTS

The Drinking Water Customer Satisfaction Survey has gone through a rigorous review process in various stages. This review process included reviewing the overall design, skip patterns, proofing, CATI programming, and timing of the survey using specially designed computer software. Furthermore, Gallup has performed an informal pretest with 7 coworkers not involved with this survey, in order to discover any potential problems with the wording of the questions. Thus, potential problems associated with the questionnaire design are expected to be minimal. For this reason, the formal pretest of the survey will be conducted during the initial stage of the interviewing process. The pretest will be carried out by stopping the interviewing process after completing the first 20 interviews and checking for potential problems.

4. COLLECTION METHODS AND FOLLOW-UP

4(a) Collection Methods

Based on the statistical literature, EPA expects a higher response rate with a telephone survey than the Agency would otherwise realize with a mail survey. A higher response rate decreases the size of the sample frame needed, and the level of follow-up required, to achieve the Agency's target precision rates and confidence levels. Therefore, EPA chose to administer the Drinking Water Customer Satisfaction Survey using telephone interviews.

Data will be collected through the use of CATI. The CATI system allows a computer to perform a number of functions that are prone to error when done manually by interviewers, including:

• Providing the correct question sequence
• Automatically executing skip patterns based on prior answers
• Recalling answers to prior questions and displaying the information in the text of later questions
• Providing random rotation of specified questions to avoid bias
• Ensuring that questions cannot be skipped
• Rejecting invalid responses
• Carrying out random selection of respondents/questions
Only experienced interviewers will be assigned to the study; thus, no general interviewing training will be required. The Gallup Project Director, his or her staff, and the manager of the telephone interviewing center will provide a study-specific briefing. Training will have two components: classroom instruction on the sampling and interviewing methods and questionnaire content, followed by a session of mock interviewing and live practice in the telephone interviewing center using the CATI methodology.

4(b) Survey Response and Follow-Up

The target response rate for this survey is 80%. This means that 80% of the valid telephone numbers that were randomly selected will result in a completed interview. A response rate of this magnitude will be highly representative of the entire population, enabling EPA to make valid generalizations from the sample data.

The response rate will be measured and evaluated as defined by the Council of American Survey Research Organizations (CASRO). The CASRO response rate (CRR) is considered the industry-standard response rate formula for this type of RDD survey. It is defined as follows:

CRR = (number of completed interviews) / (estimated number of eligibles)
    = (number of completed interviews) / (known eligibles + presumed eligibles)

It is straightforward to find the number of completed interviews and the number of known eligibles. The number of 'presumed eligibles' is estimated in the following way. In terms of eligibility, all respondents (irrespective of whether any contact or interview was obtained) may be divided into three groups: (i) known eligibles, i.e., cases where the respondents, based on their responses to screening questions, were found eligible for the survey; (ii) known ineligibles, i.e., cases where the respondents, based on their responses to screening questions, were found ineligible for the survey; and (iii) eligibility unknown, i.e., cases where the screening questions could not be asked (for example, there was never any human contact) and hence eligibility is unknown. Based on cases where eligibility status is known (known eligible or known ineligible), the eligibility rate (ER) is computed as:

ER = (known eligibles) / (known eligibles + known ineligibles)

So, the ER is the proportion of eligibles found among the respondents for whom eligibility could be established. At the next step, the number of presumed eligibles is calculated as:

Presumed Eligibles = ER * (number of respondents in the eligibility-unknown group)

The basic assumption here is that the eligibility rate among cases where eligibility could not be established is the same as the eligibility rate among cases where eligibility status was known.
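Putting the three formulas above together, the CRR computation can be sketched as follows. The disposition counts are hypothetical, chosen only to illustrate the arithmetic.

```python
def casro_response_rate(completes, known_eligible, known_ineligible, unknown):
    """CASRO response rate: completes / (known eligibles + presumed eligibles)."""
    # Eligibility rate among cases whose status could be established.
    er = known_eligible / (known_eligible + known_ineligible)
    # Assume the same rate holds among cases of unknown eligibility.
    presumed_eligible = er * unknown
    return completes / (known_eligible + presumed_eligible)

# Hypothetical dispositions: 800 completed interviews, 1,000 known eligibles,
# 250 known ineligibles, and 300 numbers where eligibility was never established.
crr = casro_response_rate(800, 1000, 250, 300)
print(round(crr, 3))  # 0.645
```

Note that the denominator grows with the unknown-eligibility count, so unresolved numbers lower the reported rate even though no one in that group refused.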

Also, in order to achieve a maximum response rate, the survey will use the standard techniques that have proven successful in other academic surveys, as listed in section 2(c)(ii) above: training of interviewers on refusal aversion and conversion, frequent review of interviewer refusal rates with close monitoring and re-training of interviewers whose rates exceed the norm, recording of information about refusals to facilitate subsequent interview attempts, and supervisor review of reasons for refusals with re-contact efforts where refusal conversion is deemed possible.
If respondents wish to speak to someone regarding any aspect of the EPA survey, efforts will include the following:

• Interviewers will be prepared to refer the respondent to EPA by giving them the telephone number of the Safe Drinking Water Hotline
• Interviewers will also be prepared to give the respondent Gallup's 800 number to call for verification that the interviewers are representing EPA and The Gallup Organization
5. ANALYZING AND REPORTING SURVEY RESULTS

5(a) Data Preparation

The data entry procedure with the CATI system allows the interviewer to enter responses directly from his or her keyboard. The information is then automatically recorded in the computer's memory. The CATI system includes several safeguards to reduce interviewer error in direct key entry of survey responses.
First, the CATI system has a double-check method to eliminate key-entry errors that result from accidentally hitting the wrong key. Unlike some systems, when the interviewer enters the code for the respondent's reply, the code is not immediately accepted. Rather, the screen remains on the question and response categories for the item, and the code and category entered by the interviewer are displayed at the bottom of the screen. Second, the interviewer must confirm the initial entry before it is accepted by the computer as final. If, despite these safeguards, the wrong answer is entered or a respondent changes his or her reply, the interviewer can correct the entry before moving on to the next question.
The CATI system decreases the time required for each interview and, consequently, the overall burden on respondents. It also allows the computer to perform a number of critical quality-assurance routines that are monitored by survey supervisors, including:

• Tracking average interview length, refusal rate, and termination rate by interviewer
• Running consistency checks for inappropriate combinations of answers

The CATI system increases the efficiency and validity of the survey and decreases the burden on respondents.
Data from the CATI output file will be fully edited by the logic of the CATI program and will require no further post-survey machine editing. Gallup analysts and programming staff will prepare data file specifications, including variable names, variable labels, and format statements for all data elements collected for the survey. Variable-creation algorithms for all derived or composite constructed variables will also be developed. These specifications will be transmitted to EPA for review, and revisions will be made upon notification.
When interviewing is completed, the clean, raw questionnaire data from the CATI database will be extracted and converted into an ASCII file. Application programmers will then prepare the control statements to create the analysis system files in the format preferred by EPA, generate the files, and run raw frequencies for all variables in the data set in order to perform a final check on data integrity.

5(b) Analysis

Data Analysis and Reporting

Sample Weighting and Variance Calculation: Due to the unequal selection probabilities for designated respondents in the specified RDD design, the sample of completed cases will not be a self-weighting, simple random sample. Using data from the CATI sample control files, statistical programmers will calculate selection probabilities and weights for all completed cases in the sample. Base weights will be adjusted for non-response at the number-resolution, household-screening, and interviewing stages. Post-stratification adjustments will also be made to the weights, based upon population control totals from the most current U.S. Census data.
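The weighting steps just described (inverse selection probability, nonresponse adjustment, post-stratification) can be sketched as a single chain. This is a simplified illustration: all rates and factors are hypothetical, and real post-stratification would rake weights across several demographic cells rather than apply one factor.

```python
def final_weight(prob_selection, response_rate, poststrat_factor):
    """Base weight from inverse selection probability, adjusted for
    nonresponse, then scaled toward a Census control total."""
    base = 1.0 / prob_selection          # inverse-probability base weight
    nr_adjusted = base / response_rate   # redistribute nonrespondents' share
    return nr_adjusted * poststrat_factor

# Hypothetical case: a 1-in-50,000 chance of selection, an 80% response
# rate in this adjustment cell, and a post-stratification factor of 1.1
# to align the cell with Census totals for that demographic group.
w = final_weight(1 / 50_000, 0.80, 1.1)
print(round(w))  # 68750
```

Each completed case thus "stands for" that many members of the target population, which is what makes the completed sample usable despite its unequal selection probabilities.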

Because of the unequal selection probabilities in the sample design, statistical analysis packages such as SAS will not calculate correct sampling variances for the survey estimates. In general, the clustered design and unequal selection probabilities will tend to increase the sampling errors slightly compared to a simple random sample of the same size.

Using an advanced statistical package (SUDAAN), Gallup's sampling statistician will calculate correct sampling variances for approximately 20 variables. These calculations may be used to estimate a generalized design effect, an inflation factor used to approximate the actual standard errors of statistics of interest.
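The generalized design effect described above works as follows. The standard errors used here are illustrative values, not results from this survey.

```python
import math

def design_effect(se_complex, se_srs):
    """deff: ratio of the design-based variance to the SRS variance."""
    return (se_complex / se_srs) ** 2

def adjusted_se(se_srs, deff):
    """Approximate a statistic's true standard error by inflating its
    simple-random-sample standard error by sqrt(deff)."""
    return se_srs * math.sqrt(deff)

# Illustrative values: the design-based SE for one estimate is 0.018
# where the SRS formula gives 0.015, implying deff = 1.44.
deff = design_effect(0.018, 0.015)
print(round(deff, 2))  # 1.44
# Applying that generalized deff to another estimate's SRS-based SE of 0.02:
print(round(adjusted_se(0.02, deff), 3))  # 0.024
```

Averaging deff over the roughly 20 variables gives the single generalized inflation factor that can then be applied to any estimate's SRS standard error.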

Technical Documentation: Within ten days after completing the initial analyses, Gallup will submit a brief technical report of approximately 15 pages documenting the questionnaire, sample design, survey procedures, and survey results. This report will address all significant issues (e.g., unit and item nonresponse rates and sample weights) that might affect the analysis, interpretation of, or confidence in the survey results.

Summary of Findings: Gallup staff will prepare an analysis plan that includes analyses of all core survey items (customer experiences, perceptions, and satisfaction measures) for the general population and for subpopulations of interest. The plan will include an outline of the substantive report, a specific list of tables to be prepared, and an outline for a methodology section. Gallup's senior project staff will organize a telephone conference with EPA to critique, revise, and improve the plan to meet all current requirements.

Table specifications will be programmed and tested during data collection. When the weights are completed and checked, an initial set of weighted frequency distributions and crosstabulations will be run, illustrating the breakdowns of key indicators by common demographic variables. These will be sent to EPA for review. Any unexpected results will be examined in detail and corrective actions taken on a flow basis.

The draft report on the survey will be concise (up to 15 pages) yet substantively and technically comprehensive, and it will make liberal use of well-designed graphs that communicate the statistical results in a way that is usable for policy makers, the media, and the general public.

As described above, the analysis file will be generated from CATI data within one day after interviewing is completed. Gallup will then produce a draft report within seven to ten days from the time the file is available. The draft will be submitted to EPA for review. Upon receipt of feedback, Gallup will revise the report into final form.

5(c) Reporting Results

EPA will make the results of the Drinking Water Customer Satisfaction Survey available to its Federal partners, such as the Centers for Disease Control, U.S. Geological Survey, U.S. Department of Agriculture, and U.S. Department of Health and Human Services. EPA will make the results of the survey publicly available on the Internet. Furthermore, EPA will provide the results directly to its stakeholders, such as the American Water Works Association, National Rural Water Association, Association of State Drinking Water Administrators, and any other interested party that requests them. EPA will also disseminate the survey results at national conferences and stakeholder meetings. Raw survey data will be maintained by the contractor and will remain unavailable to the public or the Agency.
