Title: chattanoogaRESULTS
Slide 1: chattanoogaRESULTS
- Kickoff Meeting
- September 30, 2003
Slide 2: chattanoogaRESULTS Kickoff Meeting, September 30, 2003
- I. Why are we doing this?
- II. Models and Caveats
- III. Regular Reporting Schedule and Meetings
- IV. Format of Reports
- V. Format of Meetings
- VI. What Next?
Slide 3: Why are we doing this?
- Information
- Defining Goals and Performance Targets
- Measuring Success/Failure of Strategies
- Accountability
Slide 6: Models and Caveats
- What other cities are doing: COMPSTAT and CITISTAT
- Lessons to Learn:
  - Importance of Measurement: What to Count and What Gets Discounted
  - Focusing on Output vs. Input
  - Intended vs. Unintended Consequences
  - Timely and Accurate Data
Slide 7: What Performance Measurement Really Means
- Figure out what counts: What are the priorities for City government? What are we promising to our citizens?
- Count it: Develop a series of measurements that can answer policymakers (Mayor, Council) and citizens when they ask, "How are we doing?"
- Hold people accountable for results: Use performance measurements to hold policymakers, department heads, supervisors, and staff accountable for results.
Slide 8: COMPSTAT
- "Compstat has emerged as the NYPD's most permanent, far-reaching, and widely imitated innovation. It relentlessly boosts crime fighting while remolding the department. The innovative process radiates throughout the NYPD as the energizer of strikingly creative decision making." - Eli Silverman, NYPD Battles Crime (1999)
Slide 9: How CompStat Works
Slide 10: CompStat Gets Results
Slide 12: Lessons to Be Learned
- What Counts / What Matters
  - The problem is that what doesn't get counted gets discounted.
  - Performance measures will change over time.
- Inputs vs. Outputs
  - Mayor Goldsmith and potholes: potholes filled vs. smoothness of streets
  - But inputs help to measure efficiency
- Intended vs. Unintended Consequences
  - A corollary to what doesn't get counted
  - The need to be constantly aware of when to add and subtract measures
  - E.g., reducing overtime leads to increased hiring
- Timely and Accurate Data
  - Baltimore and 311 Service Requests
  - Dallas and Performance Goals
Slide 13: Regular Reporting Schedule and Meetings
- Indicators
  - Eventually more than 100 indicators reported on a monthly basis
  - All City departments and major outside agencies
  - Departments reviewed indicators
  - Phase-in of indicators, starting with those where data is already available (e.g., PD monthly reports, 311, budget, OT)
- All data will be reported on a monthly basis, even for those departments and agencies that have only quarterly reporting meetings.
- Data collection is in most cases primarily the responsibility of the department. Information Services will extract data from databases.
- Concerns about data accuracy will be resolved by the Management and Budget staff in consultation with departments and IS.
- Management and Budget analysts will be responsible for analyzing data, but not for explaining trends.
Slide 14: Format of Reports
- Goal will be to use timely data (prior month).
- Departments will have the opportunity to review both data and analysis.
- Reports will focus on month-to-month and YTD trends (a minimal calculation sketch follows this slide). Again, where possible, the focus will be on outputs, not inputs.
- Department-specific reports (e.g., response to service requests) vs. citywide reports (OT and telecommunications)
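As a rough illustration of the month-to-month and YTD trend figures these reports would track, the following minimal Python sketch runs the arithmetic for a single indicator. The indicator name and monthly counts are hypothetical, not actual Chattanooga data.

```python
# Minimal sketch: month-to-month change and year-to-date (YTD) total
# for one monthly indicator. Counts below are made-up example values.
monthly_closed = {  # e.g., 311 service requests closed per month
    "Jan": 410, "Feb": 395, "Mar": 440, "Apr": 460,
    "May": 455, "Jun": 470, "Jul": 480, "Aug": 475,
}

ytd_total = 0
prev_value = None
for month, value in monthly_closed.items():
    ytd_total += value
    change = value - prev_value if prev_value is not None else 0
    print(f"{month}: {value:4d}  vs. prior month: {change:+4d}  YTD: {ytd_total:5d}")
    prev_value = value
```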
Slide 15: Reporting Models
- Biweekly Overtime Reports
- Monthly Crime Strategy Report
- 311 Service Request Resolution Report
- Report will consist of:
  - Analysis by MBA staff
  - Summary charts
  - Backup data
Slide 16: RESULTS Meetings
- Weekly two-hour meetings, with two departments or agencies presenting
- Departments will drive the meeting: presentation of data and indicators
- Questions and answers by the Mayor, Chief of Staff, and OPR Director
- MBA will develop a follow-up agenda for the next meeting
Slide 17: What Next?
- Setting performance targets and measuring success/failure
- Publication of data on a regular basis
- Development of more output-oriented measurements
- Focusing on quality of performance, not just quantity
  - Success of SNI
  - Quality of parks