DRAFT: February 23, 2004

SUPPORTING STATEMENT FOR EPA INFORMATION COLLECTION REQUEST NUMBER 1613.02

DATA REPORTING REQUIREMENTS FOR STATE AND LOCAL VEHICLE EMISSION INSPECTION AND MAINTENANCE (I/M) PROGRAMS

February 23, 2004

TABLE OF CONTENTS

[REMINDER: Need to update page numbers and titles for final draft of ICR]

1. IDENTIFICATION OF THE INFORMATION COLLECTION
   1(a) Title and Number of the Information Collection
   1(b) Characterization of the Information Collection
2. NEED FOR AND USE OF THE COLLECTION
   2(a) Need and Authority for the Collection
   2(b) Use and Users of the Data
3. NON-DUPLICATION, CONSULTATIONS, AND OTHER COLLECTION CRITERIA
   3(a) Non-Duplication
   3(b) Public Notice Required Prior to ICR Submission to OMB
   3(c) Consultations
   3(d) Effects of Less Frequent Collection
   3(e) General Guidelines
   3(f) Confidentiality
   3(g) Sensitive Questions
4. THE RESPONDENTS AND THE INFORMATION REQUESTED
   4(a) State and Local Respondents
   4(b) Information Requested
5. THE INFORMATION COLLECTED -- AGENCY ACTIVITIES, COLLECTION METHODOLOGY, AND INFORMATION MANAGEMENT
   5(a) Federal Agency Activities
   5(b) Collection Methodology and Management
   5(c) Small Entity Flexibility
   5(d) Collection Schedule
6. ESTIMATING THE BURDEN AND COST OF THE COLLECTION
   6(a) Estimating Respondent Burden
   6(b) Estimating Respondent Costs
   6(c) Estimated Agency Burden and Cost
   6(d) Bottom Line Burden Hours and Costs
   6(e) Reasons for Change in Burden
   6(f) Burden Statement
APPENDIX

1. IDENTIFICATION OF THE INFORMATION COLLECTION

1(a) Title and Number of the Information Collection

This information collection request (ICR) is entitled "Data Reporting Requirements for State and Local Vehicle Emission Inspection and Maintenance (I/M) Programs," ICR number 1613.02.

1(b) Characterization of the Information Collection

To provide general oversight and support to state and local I/M programs, the Transportation and Regional Programs Division (TRPD), Office of Transportation and Air Quality, Office of Air and Radiation, U.S. Environmental Protection Agency, requires that state or local program management for both basic and enhanced I/M programs submit two varieties of reports to EPA. The first is an annual report providing general program operating data and summary statistics, addressing the program's current design and coverage, a summary of testing data, enforcement program efforts, quality assurance and quality control efforts, and other miscellaneous information allowing for an assessment of the program's relative effectiveness. The second is a biennial report on any changes to the program over the previous two-year period and the impact of such changes, including any weaknesses discovered and corrections made or planned.

General program effectiveness is determined by the degree to which a program misses, meets, or exceeds the emission reductions committed to in the state's approved State Implementation Plan (SIP), which, in turn, must meet or exceed the minimum emission reductions expected from the relevant performance standard, as promulgated under EPA's revisions to 40 CFR, Part 51, in response to requirements established in section 182 of the Clean Air Act Amendments of 1990 (Act). This information will be used by EPA to determine a program's progress toward meeting requirements under 40 CFR, Part 51, as well as to assess national trends in the area of basic and enhanced I/M programs and to provide background information in support of periodic site visits and evaluations.

2. NEED FOR AND USE OF THE COLLECTION

2(a) Need and Authority for the Collection

The collection of a wide variety of program operating and summary data is essential to the assessment of an I/M program's overall effectiveness and the degree to which it complies with requirements established in response to sections 182(a)(2)(B)(ii), 182(b)(4), and 182(c)(3) of the Act, under which EPA is authorized to impose these collection and recordkeeping requirements. The specific program data to be collected are listed in EPA's I/M rule: 40 CFR Part 51, subpart S, section 366. A copy of the relevant sections from the I/M rule is contained in Attachment 1 to this ICR.

2(b) Use and Users of the Data

There are, in effect, three users of the information required by this collection: the primary user, represented by the state or local agency or department in charge of managing the I/M program itself (hereafter referred to generically as "the state" and/or "the respondent"); the secondary user, represented by EPA (including EPA Regional Offices); and, finally, the interested public, to which the gathered information will be made available upon request. We will address these three users separately.

The State

For the purpose of effectively managing its I/M program, the state must gather a wide range of program data, including data from the testing program, quality control and assurance efforts, and the enforcement program. For example, sufficient test data must be gathered to unambiguously link specific test results to a specific vehicle, I/M program registrant, test site, and inspector, to help determine whether or not the correct test parameters were observed for the specific vehicle in question. This programmatic need (which is distinct from any need and/or burden imposed by this ICR) is reflected in the fact that current analyzer specifications include extensive data capture requirements to serve just this purpose (see Attachment 2 for samples of the data collection portions from a variety of state analyzer specifications). In turn, the state can analyze this data and compare it to the registration database (in programs enforced through registration denial, per the Clean Air Act) or otherwise use it to establish a vehicle's compliance with program requirements. Owners of vehicles found to be out of compliance are not allowed to register said vehicle(s) (again, in programs enforced through registration denial) or must be otherwise prevented from operating the non-complying vehicle(s) in the program area. Penalties may also be assessed for non-compliance with program requirements.

Data collected as part of the testing program can also be used to target audits of inspection stations and inspectors, with irregularities such as unusually high pass or fail rates, mismatched vehicle information, etc., acting as flags to possible problems. In addition, the state must gather and analyze quality control data to ensure that motorists are given accurate and consistent measurements. In the interest of effectively managing its enforcement and quality assurance efforts, the program must keep records of such efforts, including the number of investigations conducted (including internal control reviews to detect weaknesses within the program itself, as well as investigations of testing sites and inspectors), the methodology used, and the results of investigations and other enforcement and quality assurance activities.

EPA

For the purposes of complying with this information collection, the state must summarize and report the above data to the respective Regional Offices of EPA, either electronically or via hardcopy text. To provide the state with maximum flexibility to use pre-existing, internal reporting mechanisms, EPA will allow the state to decide the format used in reporting this information, provided the relevant data points listed in section 4(b) of this collection request are addressed. EPA will use this information to assess specific state programs and their success in complying with the I/M rulemaking requirements. Periodically, this assessment will lead to follow-up site visits and audits of the programs in question. In addition, this information will be used to assess trends in I/M program development and implementation, and to determine the effectiveness of various program strategies for the sake of I/M policy development, assessment, and periodic revision in the most cost-efficient manner possible.

The Public

EPA will make the gathered information available to the public upon request. We foresee this information being of use to a wide range of interest groups, including environmental organizations, state and local governments, and industry groups, such as automotive vehicle and aftermarket parts manufacturers, and the inspection and repair industries.
3. NON-DUPLICATION, CONSULTATIONS, AND OTHER COLLECTION CRITERIA

3(a) Non-Duplication

EPA has made an effort to ensure that the data collection efforts associated with this ICR are not duplicated. EPA has consulted with state and local environmental programs and other federal agencies. To the best of EPA's knowledge, the data currently required by EPA (and its implementing regulations codified at 40 CFR Part 51) are not available from any other source.

3(b) Public Notice Required Prior to ICR Submission to OMB

(This section is reserved for the final version of the ICR Supporting Document)

3(c) Consultations

EPA requested burden hour information from our regional offices and some state and local programs to prepare this ICR. Specifically, we collected information on the number of hours required to complete the following under the current I/M rule data reporting requirement:

• Submit an annual test data report
• Submit an annual quality assurance report
• Submit an annual quality control report
• Submit an annual enforcement report
• Submit a biennial report on any changes to the program and the impact of such changes, including any weaknesses discovered and corrections made or planned

3(d) Effects of Less Frequent Collection

The I/M rule requires annual and biennial reporting by the states that wish to implement I/M programs in order to fulfill their SIP commitments. Submitting these reports less frequently than annually or biennially would severely limit EPA's capability to determine a program's progress toward meeting requirements under 40 CFR, Part 51, as well as to assess national trends in the area of basic and enhanced I/M programs. Less frequent reporting would also delay the identification of possible problems, undercutting the program's benefit.

[1] "United States Motor Vehicle Inspection and Maintenance (I/M) Programs," Sierra Research, Inc., November 2003.
3(e) General Guidelines

This ICR was prepared in accordance with the February 1999 version of EPA's Guide to Writing Information Collection Requests Under the Paperwork Reduction Act (PRA) of 1995 (or "ICR Handbook") prepared by EPA's Office of Environmental Information, Office of Information Collection, Collection Strategies Division. The ICR Handbook provides the most current instructions for ICR preparation to ensure compliance with the 1995 PRA amendments and the Office of Management and Budget's (OMB's) implementing guidelines.

3(f) Confidentiality

No confidential information will be collected as a result of this ICR.

3(g) Sensitive Questions

No information of a sensitive nature will be collected as a result of this ICR.

4. THE RESPONDENTS AND THE INFORMATION REQUESTED

4(a) State and Local Respondents

The respondents to this information collection are the state governmental agencies or departments responsible for oversight and operation of the I/M programs (SIC# 91). Thirty-three states plus the District of Columbia[1] will be affected by I/M program requirements. This category of respondent was selected because it represents the entities most comprehensively involved in gathering the information which must be summarized for this collection (i.e., those parties responsible for establishing, maintaining, and analyzing the program's central database, or overseeing contractor personnel responsible for such activities). Although I/M programs can and do vary by type (i.e., basic I/M programs versus enhanced I/M programs), the data elements to be addressed by this information collection remain consistent across program types, and hence the burden does not vary by program type.

4(b) Information Requested

Under current I/M program practice, various internal analyses and reports are routinely generated using the data collected on vehicle tests, as well as quality control, quality assurance, and enforcement efforts (see Attachment 3 for examples of internal state data reports). The information requested in this ICR is, in fact, based upon the data items currently collected in, and the reports currently generated in, many of the better-run I/M programs. These reports are used primarily as management tools for internal monitoring and evaluation of the program. The purpose of this ICR is to formalize the process through which this data is reported to EPA, as well as to standardize the reporting schedule.

(i) Data items

A. Recordkeeping Requirements

In fulfilling the requirements of this information collection, respondents will need to gather and maintain records on the following data items per vehicle inspected as part of the I/M program. As stated previously, similar information is currently collected by existing I/M programs and is written into the data recording requirements of their analyzer specifications (see Attachment 2). As stated in the introduction to this section, these records represent information which a program needs to gather and maintain as part of the day-to-day administration and enforcement of the program, and, as such, do not constitute an additional burden triggered by this information collection.

1) Test record number
2) Inspection station and inspector number
3) Test system number
4) Date of the test
5) Emission test start time and the time final emission scores are determined
6) Vehicle Identification Number (VIN)
7) License plate number
8) Test certificate number
9) Gross Vehicle Weight Rating (GVWR)
10) Vehicle model year, make, and type
11) Number of cylinders or displacement
12) Transmission type
13) Odometer reading
14) Type of test performed (i.e., initial test, first retest, or subsequent retest)
15) Fuel type of the vehicle (i.e., gas, diesel, or other fuel)
16) Type of vehicle preconditioning performed (if any)
17) Emission test sequence(s) used
18) Hydrocarbon emission scores and standards for each applicable test mode
19) Carbon monoxide emission scores and standards for each applicable test mode
20) Carbon dioxide emission scores (CO+CO2) and standards for each applicable test mode
21) Nitrogen oxides emission scores and standards for each applicable test mode
22) Results (Pass/Fail/Not Applicable) of the applicable visual inspections for the catalytic converter, air system, gas cap, evaporative system, positive crankcase ventilation (PCV) valve, and fuel inlet restrictor
23) Results of the evaporative pressure test expressed as a pass or fail
24) Results of the evaporative system purge test expressed as a pass or fail, along with the total purge flow in liters achieved during the test
25) Results of the on-board diagnostic check expressed as pass or fail, along with the diagnostic trouble codes revealed (where applicable)
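For illustration only, the per-vehicle data items above could be held in a record structure along the following lines. This is a sketch, not a format prescribed by the rule; the field names are hypothetical, and only a subset of the 25 items is shown.

```python
from dataclasses import dataclass

# Hypothetical per-vehicle inspection record covering a subset of the
# data items listed above; field names are illustrative, not mandated.
@dataclass
class InspectionRecord:
    test_record_number: str   # item 1
    station_number: str       # item 2: inspection station
    inspector_number: str     # item 2: inspector
    test_date: str            # item 4, e.g. "2004-02-23"
    vin: str                  # item 6: Vehicle Identification Number
    model_year: int           # item 10
    odometer: int             # item 13
    test_type: str            # item 14: "initial", "first retest", ...
    hc_score: float           # item 18: hydrocarbon emission score
    hc_standard: float        # item 18: applicable standard
    overall_result: str       # "pass" or "fail"

# Example record (values are invented for illustration).
record = InspectionRecord("000001", "ST42", "I-17", "2004-02-23",
                          "1HGCM82633A004352", 1998, 87450,
                          "initial", 1.8, 2.0, "pass")
```

A structure like this makes it straightforward to link each test result back to a specific vehicle, station, and inspector, as the recordkeeping requirement intends.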

In addition, the program shall gather and maintain records on the results of all quality control checks conducted in response to 40 CFR Part 51, subpart S, section 359, identifying each check by station number, system number, date, and start time. The record shall also contain the concentration values of the calibration gases used to perform the gas characterization portion of the quality control checks. Raw test data (both vehicle inspections and quality control checks) shall be saved for a minimum of two complete inspection cycles (i.e., two years in annual programs, and four years in biennial programs), and submitted to EPA electronically upon request.

B. Reporting Requirements

(1) Annual Report

Internal data analysis and reporting are currently employed on a voluntary basis in better-run I/M programs as management tools to facilitate the monitoring and evaluation of the program by program management. This ICR requests that EPA be formally included as a recipient of this information. To be considered complete, these reports shall include information regarding the types of program activities performed and their final outcomes, including summary statistics and effectiveness evaluations of the enforcement mechanism, the quality assurance system, the quality control program, and the testing element. Under this ICR, respondents will be required to provide EPA with the following data annually. Again, as previously stated, these reporting requirements are based upon the current reporting practices of better-run I/M programs, and, as such, do not constitute an additional respondent burden (see Attachment 3 for examples of internal program reports currently generated as part of the California program, and which address information similar to that required below).

Test Data Summary

The program shall submit to EPA by July of each year a report providing basic statistics on the testing program for January through December of the previous year, including:

1) The number of vehicles tested by model year and vehicle type
2) By model year and vehicle type, the number and percentage of vehicles:
   i) Failing the emissions test initially, per test type
   ii) Failing the first retest, per test type
   iii) Passing the first retest, per test type
   iv) Initially failed vehicles passing the second or subsequent retest, per test type
   v) Initially failed vehicles receiving a waiver
   vi) Vehicles with no known final outcome (regardless of reason)
   vii) Passing the on-board diagnostic check
   viii) Failing the on-board diagnostic check
   ix) Failing the on-board diagnostic check and passing the tailpipe test (if applicable)
   x) Failing the on-board diagnostic check and failing the tailpipe test (if applicable)
   xi) Passing the on-board diagnostic check and failing the I/M gas cap evaporative system test (if applicable)
   xii) Failing the on-board diagnostic check and passing the I/M gas cap evaporative system test (if applicable)
   xiii) Passing both the on-board diagnostic check and the I/M gas cap evaporative system test (if applicable)
   xiv) Failing both the on-board diagnostic check and the I/M gas cap evaporative system test (if applicable)
   xv) MIL is commanded on and no codes are stored
   xvi) MIL is not commanded on and codes are stored
   xvii) MIL is commanded on and codes are stored
   xviii) MIL is not commanded on and codes are not stored
   xix) Readiness status indicates that the evaluation is not complete for any models supported by on-board diagnostic systems
3) The initial test volume by model year and test station
4) The initial test failure rate by model year and test station
5) The average increase or decrease in vehicle emission levels after repairs, by model year and vehicle type, for vehicles receiving a mass emissions test
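As a rough illustration of how summary statistics such as items 1) and 2)(i) could be derived from per-vehicle test records, a program might aggregate its raw data along these lines. The record fields and values below are assumptions for the sketch, not a prescribed format.

```python
from collections import Counter

# Minimal raw test records; fields and values are illustrative.
tests = [
    {"model_year": 1995, "vehicle_type": "LDV", "test_type": "initial", "result": "fail"},
    {"model_year": 1995, "vehicle_type": "LDV", "test_type": "initial", "result": "pass"},
    {"model_year": 2000, "vehicle_type": "LDT", "test_type": "initial", "result": "pass"},
]

# Item 1: number of vehicles tested, by model year and vehicle type.
tested = Counter((t["model_year"], t["vehicle_type"]) for t in tests)

# Item 2(i): number and rate of initial-test failures,
# by model year and vehicle type.
failed = Counter((t["model_year"], t["vehicle_type"])
                 for t in tests
                 if t["test_type"] == "initial" and t["result"] == "fail")
fail_rate = {key: failed[key] / count for key, count in tested.items()}
```

The same grouping approach extends to the retest, waiver, and on-board diagnostic categories listed above.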
Quality Assurance Summary

The program shall submit to EPA by July of each year a report providing basic statistics on the quality assurance program for January through December of the previous year, including:

1) Number of inspection stations and lanes:
   i) Operating throughout the year
   ii) Operating for only part of the year
2) The number of inspection stations and lanes operating throughout the year:
   i) Receiving overt performance audits in the year
   ii) Not receiving overt performance audits in the year
   iii) Receiving covert performance audits in the year
   iv) Not receiving covert performance audits in the year
   v) That have been shut down as a result of overt performance audits
3) The number of covert audits:
   i) Conducted with the vehicle set to fail per test type
   ii) Conducted with the vehicle set to fail any combination of two or more test types
   iii) Resulting in a false pass per test type
   iv) Resulting in a false pass for any component checks of two or more test types
4) The number of inspectors and stations:
   i) That were suspended, fired, or otherwise prohibited from testing as a result of covert audits
   ii) That were suspended, fired, or otherwise prohibited from testing for other causes
   iii) That received fines
5) The number of inspectors licensed or certified to conduct testing
6) The number of hearings:
   i) Held to consider adverse actions against inspectors and stations
   ii) Resulting in adverse actions against inspectors and stations
7) The total amount collected in fines from inspectors and stations by type of violation
8) The total number of covert vehicles available for undercover audits over the year
9) The number of covert auditors available for undercover audits
Quality Control Summary

The program shall submit to EPA by July of each year a report providing basic statistics on the quality control program for January through December of the previous year, including:

1) The number of emission testing sites and lanes in use in the program
2) The number of equipment audits by station and lane
3) The number and percentage of stations that have failed equipment audits
4) The number and percentage of stations and lanes shut down as a result of equipment audits
Enforcement Summary

1) All varieties of enforcement programs shall, at a minimum, submit to EPA by July of each year a report providing basic statistics on the enforcement program for January through December of the previous year, including:
   i) An estimate of the number of vehicles subject to the inspection program, including the results of an analysis of the registration database
   ii) The percentage of motorist compliance based upon a comparison of the number of valid final tests with the number of subject vehicles
   iii) The total number of compliance documents issued to inspection stations
   iv) The number of missing compliance documents
   v) The number of time extensions and other exemptions granted to motorists
   vi) The number of compliance surveys conducted, the number of vehicles surveyed in each, and the compliance rates found
2) Registration-denial-based enforcement programs shall provide the following additional information:
   i) A report of the program's efforts and actions to prevent motorists from falsely registering vehicles out of the program area or falsely changing fuel type or weight class on the vehicle registration, and the results of special studies to investigate the frequency of such activity
   ii) The number of registration file audits, the number of registrations reviewed, and the compliance rates found in such audits
3) Computer-matching-based enforcement programs shall provide the following additional information:
   i) The number and percentage of subject vehicles that were tested by the initial deadline, and by other milestones in the cycle
   ii) A report on the program's efforts to detect and enforce against motorists falsely changing vehicle classifications to circumvent program requirements, and the frequency of this type of activity
   iii) The number of enforcement system audits, and the error rate found during those audits
4) Sticker-based enforcement systems shall provide the following information in addition to the general requirements:
   i) A report on the program's efforts to prevent, detect, and enforce against sticker theft and counterfeiting, and the frequency of this type of activity
   ii) A report on the program's efforts to detect and enforce against motorists falsely changing vehicle classifications to circumvent program requirements, and the frequency of this type of activity
   iii) The number of parking lot sticker audits conducted, the number of vehicles surveyed in each, and the noncompliance rate found during those audits
(2) Biennial Report

In addition to the above annual reports, programs shall submit to EPA by July of every other year biennial reports addressing:

1) Changes made in program design, funding, personnel levels, procedures, regulations, and legal authority, with detailed discussion and evaluation of the impact on the program of all such changes
2) Any weaknesses or problems identified in the program within the two-year reporting period, what steps have already been taken to correct those problems, the results of those steps, and any future efforts planned
(
ii)
Respondent
Activities
In
preparing
to
submit
the
first
round
of
the
above­
listed
summary
data
for
this
information
collection,
the
respondent
program
must
pursue
the
following
activities.
Several
of
these
activities
are
essentially
one­
time
efforts
(
such
as
pursuing
legal
authority
and
constructing
testing
sites)
required
to
comply
with
the
Act's
mandate
that
such
programs
be
implemented
in
the
first
place,
while
others
are
activities
that
are
currently,
voluntarily
conducted
in
better­
run
I/
M
programs
for
the
sake
of
program
implementation,
management,
and
enforcement,
and
would
therefore
be
pursued
regardless
of
this
information
collection
(
see
Attachment
3).
Such
activities
have
been
identified
here
as
common
business
practice
(
CBP),
even
though,
properly
speaking,
the
respondent
entities
are
representatives
of
state
government
agencies
or
departments.
Respondent
activities
have
been
separated
for
the
annual
and
biennial
reports,
and
separate
burden
estimates
are
provided
for
each
in
section
6
of
this
submittal.

Annual Report

• Read the I/M regulation (CBP)
• Review the regulatory provisions addressing the annual reporting requirement and assess respondent responsibility
• Gather test and quality control information and review for accuracy (CBP)
• Analyze the test and quality control data (CBP)
• Based upon analysis of data, begin enforcement efforts against motorists, stations, and inspectors (CBP)
• Complete written or electronic "paperwork" associated with enforcement and program oversight efforts (CBP)
• Store, file, and maintain all relevant program records and information (CBP)
• Assemble existing quarterly and other relevant reports in preparation for summarization
• Prepare annual summaries of program operating statistics for the enforcement mechanism, the quality assurance system, the quality control program, and the testing element based upon existing, internal quarterly reports
• Review summary information for accuracy
• Prepare and submit annual report to EPA
Biennial Report

In addition to the above activities associated with the submittal of the annual information collection, under the current I/M rule, all I/M programs, both basic and enhanced, must also submit a biennial report assessing any program change, including weaknesses identified and improvements made since the previous report. Again, many of these activities, given their importance in the areas of effective program management and helping to ensure that program resources are not wasted or abused, will need to be pursued internally regardless of this external reporting requirement. As such, many of these activities are designated as CBP.

• Track and record all changes made in program design, funding, personnel levels, procedures, regulations, and legal authority
• Conduct an evaluation of the impact on the program of all such changes (CBP)
• Conduct periodic internal investigations to discover and correct weaknesses (CBP)
• Track and record all such weaknesses or problems identified in the program within the two-year reporting period, and the steps taken to correct those problems
• Evaluate the results of those steps (CBP)
• Assemble and report the above required information, including any future efforts planned
5. THE INFORMATION COLLECTED -- AGENCY ACTIVITIES, COLLECTION METHODOLOGY, AND INFORMATION MANAGEMENT

5(a) Federal Agency Activities

In reinstating this information collection, it is estimated that EPA will need to:

• Prepare the ICR reinstatement (one-time activity)
• Answer respondent questions
• Audit or review data submissions
• Store data
5(b) Collection Methodology and Management

A portion of this information has been assembled in one form or another since the inception of I/M programs as a result of the Clean Air Act Amendments of 1977, which required urban areas failing to meet the National Ambient Air Quality Standards to implement in-use vehicle I/M programs. Historically, the sources of this information have included, among other things, on-site audits by EPA personnel, internal program data reports courtesy-copied to EPA (see Attachment 2), and raw program data submitted to EPA for analysis.

Respondents will have the option to supply their data either in hardcopy or electronically, and are free to adopt whatever reporting format results in the least burden for the respondent, while also addressing the data elements listed in this ICR.

5(c) Small Entity Flexibility

This section is not applicable. Our respondents are not small business entities but state governments and their representatives.

5(d) Collection Schedule

The first annual report due under this reinstated ICR shall be submitted during the first July following approval of reinstatement, and shall be submitted annually thereafter. The first biennial report shall be due one year following the first annual report, and shall be submitted biennially thereafter.
>>
DRAFT:
February
23,
2004
<<

14
6.
ESTIMATING
THE
BURDEN
AND
COST
OF
THE
COLLECTION
In
simple
terms,
EPA
calculated
the
additional
burden
and
cost
estimates
for
this
ICR
based
on
the
following
equation:

Number of states implementing I/M programs x number of burden hours for each state to comply with the I/M ICR x cost per hour
The following paragraphs in Section 6(a) describe the information relied upon for the first two variables of this equation, the state and local burden hour estimates.
Section
6(
b)
describes
how
state
and
local
costs
were
estimated.
Section
6(
c)
describes
how
the
federal
burden
hours
and
costs
were
estimated
for
this
ICR.

6(a) Estimating Respondent Burden
The
burden
estimates
for
both
the
recordkeeping
and
annual
reporting
requirements
were
made
using
professional judgment.
Recordkeeping
activities
are
assumed
to
be
routine,
automated,
and
conducted
primarily
for
the
effective
management
of
the
program.
Nonetheless,
an
hour
of
burden
has
been
assumed
for
this
category.
The
estimate
of
the
burden
for
information
gathering
by
technical
staff
for
the
annual
report
is
a
conservative
one
based
upon
an
informal
interview
with
a
representative
of
the
Louisville, Kentucky I/M program.
Responses
were
based
upon
the
representative's
experience
with
compiling
similar,
internal
quarterly
and
annual
reports,
as
well
as
pre-audit
data
submissions
for
EPA.

Estimates
of
respondent
burden
for
the
biennial
report
are
based
upon
OTAQ
program
management
staff
experience
with
the
assemblage
and
compilation
of
materials
for
the
office's
annual
Assurance
Letter
under
the
Federal Managers' Financial Integrity Act (FMFIA).
In
many
ways,
the
biennial
report
most
resembles
the
Assurance
Letter
in
its
requirement
that
program
changes
and
internal
control
activities
be
tracked,
evaluated,
and
reported.

6(b) Estimating Respondent Costs
Given that the respondents to this ICR will be state government employees in managerial, technical, and clerical positions, the General Schedule (GS) Federal employee pay system was adopted as a yardstick for estimating hourly labor rates.
To
ensure
that
our
estimates
are
conservative,
only
the
higher
ranges
of
these
categories
were
used.
For the clerical category, GS-8 was used, while GS-13 was selected to represent the technical position, and GS-15 was used as the base for calculating management labor rates. First, EPA is assuming that state and local burden hours would be completed by an experienced staff person paid hourly rates at the GS-8, step 3; GS-13, step 3; and GS-15, step 3 levels of the Federal employee salary schedule.2

[2 U.S. Office of Personnel Management, Salary Table 2004-GS, 2004 General Schedule, January 2004, http://www.opm.gov/oca/04tables/html/gs_h.asp]
Second,
EPA
multiplied
these
hourly
rates
by
the
standard
government
overhead
factor
of
1.6.
This calculation results in state and local costs of $26.69, $50.83, and $70.67 per burden hour, respectively.
Calendar
year
2004
was
chosen
as
the
base
year.
The
resulting
labor
rates
are
shown
in
the
table
below:

Table 1. Hourly Labor Rates, by Category, Adjusted for Overhead Factor

Hourly Rates                          Clerical (GS-8.3)   Technical (GS-13.3)   Managerial (GS-15.3)
2004 base year                        16.68               31.77                 44.17
Adjusted by the overhead factor 1.6   26.69               50.83                 70.67
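As an arithmetic cross-check of Table 1, the adjusted rates follow directly from applying the 1.6 overhead factor to the 2004 base-year rates. The sketch below is illustrative only; the category names are informal labels, not official GS terminology.

```python
# Cross-check of Table 1: base GS hourly rates times the 1.6 overhead factor.
base_rates = {"clerical": 16.68, "technical": 31.77, "managerial": 44.17}
OVERHEAD_FACTOR = 1.6

adjusted = {category: round(rate * OVERHEAD_FACTOR, 2)
            for category, rate in base_rates.items()}
print(adjusted)  # {'clerical': 26.69, 'technical': 50.83, 'managerial': 70.67}
```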
Table 2. Respondent Burden Hours & Cost for Annual Report

Collection Activity                   Managerial Hours   Technical Hours   Clerical Hours   Burden Hours   Annual Costs
1. Read regulatory provisions         1                  1                 -                2              $121.50
2. Assess data requirements           -                  8                 -                8              $406.64
3. Assemble reports & data            -                  16                -                16             $813.28
4. Review information for accuracy    -                  8                 -                8              $406.64
5. Summarize information              -                  4                 -                4              $203.32
6. Prepare and submit report          -                  4                 1                5              $230.01
7. Record, store, & maintain files    -                  1                 -                1              $50.83
Total                                 1                  42                1                43             $2,232
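The activity costs in Tables 2 and 3 can be reproduced by multiplying each activity's hours by the adjusted rates from Table 1. The helper function below is illustrative (it is not part of the ICR); it recomputes two representative rows of Table 2.

```python
# Recompute Table 2 activity costs from burden hours and the Table 1 adjusted rates.
RATES = {"managerial": 70.67, "technical": 50.83, "clerical": 26.69}

def activity_cost(managerial=0, technical=0, clerical=0):
    """Cost of one collection activity, given hours by labor category."""
    return round(managerial * RATES["managerial"]
                 + technical * RATES["technical"]
                 + clerical * RATES["clerical"], 2)

print(activity_cost(managerial=1, technical=1))  # 121.5  -> activity 1, $121.50
print(activity_cost(technical=4, clerical=1))    # 230.01 -> activity 6, $230.01
```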
Table 3. Respondent Burden Hours & Cost for Biennial Report

Collection Activity                         Managerial Hours   Technical Hours   Clerical Hours   Burden Hours   Annual Costs
1. Track/record program change              -                  16                -                16             $813.28
2. Track/record weakness/correction         -                  16                -                16             $813.28
3. Assemble/report findings/future plans    16                 32                4                52             $2,864.04
Total                                       16                 64                4                84             $4,491
Annual burden hours per respondent = (hours for annual report) + (hours for biennial report)/2 = 43 + 84/2 = 85
Annual burden cost per respondent = $2,232 + ($4,491)/2 = $4,478

Annual total burden hours for all respondents = (annual burden hours per respondent) x (total number of respondents3) = 85 x 34 = 2,890

[3 Page 1-1, "United States Motor Vehicle Inspection and Maintenance (I/M) Programs," Sierra Research, Inc., November 2003.]
Annual total burden costs for all respondents = (annual burden cost per respondent) x (total number of respondents) = $4,478 x 34 = $152,252
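The annualization arithmetic above reduces to a few lines (values taken from Tables 2 and 3; the biennial figures are halved because that report is filed every other year):

```python
# Annualized respondent burden: annual report each year, biennial report every 2 years.
annual_hours, biennial_hours = 43, 84     # Tables 2 and 3
annual_cost, biennial_cost = 2232, 4491   # Tables 2 and 3
respondents = 34                          # 33 states plus D.C.

hours_per_respondent = annual_hours + biennial_hours / 2   # 85.0
cost_per_respondent = annual_cost + biennial_cost / 2      # 4477.5, reported as $4,478
total_hours = int(hours_per_respondent * respondents)      # 2890
total_cost = round(cost_per_respondent) * respondents      # 152252
```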
6(c) Estimating Agency Burden and Cost
The
estimates
of
each
class
of
hours
needed
to
handle
the
burdens
associated
with
this
ICR
are
based
upon
the
experience
EPA
has
had
in
handling
similar
tasks
such
as
assembling
and
processing
pre-audit
information
from
individual
states
on
an
as-needed
basis.
The
labor
categories
and
hourly
labor
rates
are
the
same
as
those
used
for
the
respondent
burden
estimates,
and
are
decidedly
conservative.

Table 4. Agency Burden for One-Time ICR Costs

One-Time Activity    Managerial Hours   Technical Hours   Clerical Hours   Burden Hours   Annual Costs
ICR reinstatement    2                  80                1                83             $4,234.43
Total                2                  80                1                83             $4,234.43
Table 5. Agency Burden Hours & Cost per Annual & Biennial Report

Collection Activity          Managerial Hours   Technical Hours   Clerical Hours   Burden Hours   Annual Costs
1. Answer questions          4                  34                -                38             $2,010.90
2. Review data submissions   -                  34                -                34             $1,728.22
3. Record data submissions   -                  -                 34               34             $907.46
4. Store data                -                  34                -                34             $1,728.22
Total                        4                  102               34               140            $6,374.80
Annual total burden hours for agency = (annual agency burden hours) + (average start-up hours over 3 years) = 140 + 83/3 = 168

Annual total burden cost for agency = (annual agency burden costs) + (average start-up cost over 3 years) = $6,375 + $4,235/3 = $7,787
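The agency-side annualization follows the same pattern, amortizing the one-time ICR reinstatement effort (Table 4) over a three-year period:

```python
# Annualized agency burden: recurring activities plus one-time start-up spread over 3 years.
recurring_hours, startup_hours = 140, 83    # Tables 5 and 4
recurring_cost, startup_cost = 6375, 4235   # Tables 5 and 4

agency_hours = recurring_hours + startup_hours / 3   # 167.67, reported as 168
agency_cost = recurring_cost + startup_cost / 3      # 7786.67, reported as $7,787
print(round(agency_hours), round(agency_cost))       # 168 7787
```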
6(d) Bottom Line Burden Hours and Costs
The
bottom
line
burden
hours
and
costs
appear
in
Table
6.
Total annual respondent burden associated with this ICR is estimated to be approximately 2,890 burden hours.
The
corresponding
total
annual
respondent
costs
are
estimated
to
be
$
152,252.
Total
national
burden,
including
respondent
burden
and
EPA
burden,
is
estimated
to
be
3,058
hours
annually.
The
total
national
cost,
for
respondents
and
EPA,
is
estimated
to
be
$
160,039
annually.

Table 6. Bottom Line Annual Burden and Cost

Number of respondents                        34          33 states plus D.C. (2003)
Total annual responses                       34          one response per respondent
Hours per respondent                         85          Tables 2 and 3
Cost per respondent                          $4,478      Tables 2 and 3
Total respondent hours                       2,890       hours per respondent x 34
Total respondent cost                        $152,252    cost per respondent x 34
Total agency hours                           168         Tables 4 and 5
Total agency cost                            $7,787      Tables 4 and 5
Total burden hours (respondent + agency)     3,058       total respondent hours + total EPA hours
Total burden cost (respondent + agency)      $160,039    total respondent cost + total EPA cost
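The bottom-line figures in Table 6 are simply the respondent and agency subtotals combined, as the quick check below confirms:

```python
# Combine respondent and agency subtotals into the Table 6 bottom line.
respondent_hours, agency_hours = 2890, 168
respondent_cost, agency_cost = 152_252, 7_787

print(respondent_hours + agency_hours)  # 3058 total burden hours
print(respondent_cost + agency_cost)    # 160039 total burden cost, i.e. $160,039
```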
6(e) Reasons for Change in Burden
The total annual burden estimated in the 1992 I/M Program Proposed Rule ICR, as approved by OMB, was 2,282 hours, with an annual burden of 85 hours per respondent. As indicated in Table 6, the annual burden per respondent is still estimated to be 85 hours in this ICR, but the estimate of total respondent burden has changed to 2,890 hours. This change reflects the fact that the original ICR was based on phased-in program implementation, while the current reinstatement reflects full program implementation.

6(f) Burden Statement
Public
reporting
burden
for
collections
included
in
this
ICR
is
detailed
above.
The
annual
burden
per
respondent
is
estimated
to
be
85
hours,
and
the
total
annual
respondent
burden
imposed
by
these
collections
is
estimated
to
be
2,890
hours
(
34
respondents).
These
estimates
include
time
for
summarizing
data
as
well
as
reporting
summaries.
Burden
means
the
total
time,
effort,
or
financial
resources
expended
by
persons
to
generate,
maintain,
retain,
or
disclose
or
provide
information
to
or
for
a
Federal
agency.
This
includes
the
time
needed
to
review
instructions;
develop,
acquire,
install,
and
utilize
technology
and
systems
for
the
purposes
of
collecting,
validating,
and
verifying
information,
processing
and
maintaining
information,
and
disclosing
and
providing
information;
adjust
the
existing
ways
to
comply
with
any
previously
applicable
instructions
and
requirements;
train
personnel
to
be
able
to
respond
to
a
collection
of
information;
search
data
sources;
complete
and
review
the
collection
of
information;
and
transmit
or
otherwise
disclose
the
information.
An
agency
may
not
conduct
or
sponsor,
and
a
person
is
not
required
to
respond
to,
a
collection
of
information
unless
it
displays
a
currently
valid
OMB
control
number.
The
OMB
control
numbers
for
EPA's
regulations
are
listed
in
40
CFR
part
9
and
48
CFR
chapter
15.

To
comment
on
the
Agency's
need
for
this
information,
the
accuracy
of
the
provided
burden
estimates,
and
any
suggested
methods
for
minimizing
respondent
burden,
including
the
use
of
automated
collection
techniques,
EPA
has
established
a
public
docket
for
this
ICR
under
Docket
ID
No.
OAR-2004-0012,
which
is
available
for
public
viewing
at
the
Air
and
Radiation
Docket
and
Information
Center,
in
the
EPA
Docket
Center
(
EPA/
DC),
EPA
West,
Room
B102,
1301
Constitution
Ave.,
NW,
Washington,
DC.
The
EPA
Docket
Center
Public
Reading
Room
is
open
from
8:
30
a.
m.
to
4:
30
p.
m.,
Monday
through
Friday,
excluding
legal
holidays.
The
telephone
number
for
the
Reading
Room
is
(
202)
566-1744,
and
the
telephone
number
for
the
Air
Docket
is
(
202)
566-1742.
An
electronic
version
of
the
public
docket
is
available
through
EPA
Dockets
(
EDOCKET)
at
http://
www.
epa.
gov/
edocket.
Use
EDOCKET
to
submit
or
view
public
comments,
access
the
index
listing
of
the
contents
of
the
public
docket,
and
to
access
those
documents
in
the
public
docket
that
are
available
electronically.
Once
in
the
system,
select
"
search,"
then
key
in
the
docket
ID
number
identified
above.
Also,
you
can
send
comments
to
the
Office
of
Information
and
Regulatory
Affairs,
Office
of
Management
and
Budget,
725
17th
Street,
NW,
Washington,
DC
20503,
Attention:
Desk
Office
for
EPA.
Please
include
the
EPA
Docket
ID
No.
(
OAR-2004-0012)
and
OMB
control
number
(
2060-0252)
in
any
correspondence.
