CES 2013 - Doing Developmental Evaluation at the System Level
Selecting Performance Indicators for Community Settings
Reed Early, Credentialed Evaluator, rearly@telus.net 250 748 0550
Performance Indicators Within and Across Community Settings – P1
Reed Early, MA CE
CES National Conference
Toronto, 2013
Agenda
3:15pm criteria for good performance indicators, and examples
3:30 styles of indicator selection
3:45 lessons learned, and traps to avoid when selecting indicators
Questions welcome anytime
Mon 3:15 – 4:45pm
Main Mezzanine, Tudor 8 Room
Handout and Powerpoint at http://www3.telus.net/reedspace/shared/
Performance Indicators (PI) Within and Across Community Settings
Participants: Community agency? Federal? Provincial? Municipal? Academic?
..the organizational dilemma is: at the top one works on the right problems but with the wrong
information, and at the bottom one works with the right information but on the wrong problems...
Arnold J. Meltsner
Question - think of an example of the above
1) Common fallacies of selecting performance indicators
a) Leave it to management - just ask the people at the top
b) It's easy – anyone can do it (or you're not a good manager)
c) It's obvious - just go find some (that will make us look good)
d) It's quick - form a subcommittee (and tell us tomorrow)
Activity 1 - Obstacles to selecting indicators (3-4 min) (Chart)
> Close your eyes and write your name on a card (no visual feedback)
> Turn it over and write your place of birth using your non-dominant hand (new task)
> Swap cards with the person sitting next to you and verbally instruct them to write your age in
months without telling them what it actually is (unfamiliar units, indirect communication).
> Optional - Provide lengthy written instructions via a Policy and Procedure Manual on how to
measure your success on the card (awkward obfuscated instructions)
2) Beliefs and values that help get buy in. Convince people that:
Information is a good thing – accurate information is even better
Performance information will empower them to be more efficient and effective
Information and experience combined make knowledge
Knowledge is to be shared - knowledge is not “power over”
Knowledge is to drive action - or it is meaningless
3) Performance Indicators – Some everyday examples:
a) Light bulb – i.e. brightness compared to candles or watts
b) Paint and shingles - i.e. durability over time
c) Business – i.e. 3rd quarter earnings compared to last year
d) Behavior change class – i.e. did they quit smoking
e) Grade 12 graduates – i.e. employed or in university
4) PI should (in aggregate):
a) Measure results, short range (outcomes) as well as long range (impacts)
Obsession with outcomes… if all you have is a hammer… every problem becomes a nail
b) Measure outputs, process, delivery (to know what caused the indicator to shift)
c) Measure elements of a “theory of change” i.e. the determinants of success
d) Assess agency resources (capacity is necessary but not sufficient for success)
5) Logic Models (Chart)
a) The best tool to start with
b) Include at least activities, outputs, and outcomes
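Since the handout is language-neutral, here is a minimal Python sketch (all names and entries are hypothetical, not from the handout) of the idea in sections 4-5: each indicator hangs off a specific activity, output, or outcome in the logic model, so elements that lack any indicator stand out as gaps.

```python
# Logic-model sketch: every proposed indicator is attached to one element
# of the activities -> outputs -> outcomes chain (all entries hypothetical).
logic_model = {
    "activities": [
        {"name": "deliver smoking-cessation classes",
         "indicators": ["classes held per quarter"]},
    ],
    "outputs": [
        {"name": "participants completing the program",
         "indicators": ["completion rate (%)"]},
    ],
    "outcomes": [
        {"name": "participants quit smoking",
         "indicators": ["self-reported quit rate at 6 months",
                        "validated quit rate"]},
    ],
}

def orphan_indicators(model):
    """Return logic-model elements that still lack any indicator."""
    return [item["name"]
            for level in model.values()
            for item in level
            if not item["indicators"]]

print(orphan_indicators(logic_model))  # -> [] (no gaps in this sketch)
```

The point of the structure is the linkage, not the code: an indicator that cannot be attached to any logic-model element is a candidate for deletion, and an element with an empty indicator list is a measurement gap.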
6) Success Measures Proximity - (Chart)
a) Field of influence – include measures outside the walls of the organization, but within
the limits of measurability
b) Timing – include measures during and immediately after the program
c) Consider long term measures
7) Styles of Indicator Selection (Chart)
a) Management (goals and objectives)
b) Strategic (strategies and targets)
c) Consultative (focus group or survey data collection and analysis)
d) Best Practice (borrow from the best, adapt, evaluate and choose)
8) Management (Chart)
a) Indicators reflect goals and objectives – i.e. related to logic model outcomes
b) Useful in service or prevention oriented agencies
c) Often mandated via legislation i.e. service plans
d) This style may be used by senior managers to independently choose the indicators
9) Strategic (Chart)
a) Indicators reflect strategies and targets – i.e. activities, outputs and near outcomes
b) Useful for units that produce quantifiable or tangible products
c) Often adopted as a result of accountability or accreditation initiative
d) May be used by performance team in a larger agency
10) Consultative (Chart)
a) Indicators based on focus group or survey of employees (or clients)
b) Useful for decentralized, flat organizations
c) Requires time and effort and original data collection
d) May be used by smaller agencies practicing more democratic management
11) Best Practice (Chart)
a) Selected from literature/Internet, from similar leading agencies, and adapted
b) Useful to any agency with resources to search and research
c) Requires time, effort, and openness to experimentation
d) May provide benchmark comparability
Our best effort should be spent on finding out what funders, clients and other
stakeholders define as success.
Guy Leclerc
The priest, the cabbie and Saint Peter
12) Health examples (Chart)
a) Immunization rate
b) Infant mortality rate
c) Hospital acquired infections
d) Inpatient mortality rate
e) Cost per bed/day
13) Social Services examples (Chart)
a) Child Behavior Checklist
b) Early unmarried childbearing
c) School attendance and graduation rate
d) Children in families below poverty line
e) Youth unemployment
f) Cost per child in care
14) Industry Strategic Indicators (Chart and example)
a) Sales
b) Sales of new products
c) Value added to raw material consumed
d) Cost savings to industry i.e. reduced training and down time
e) Increased market share %
f) Increased geographic penetration
15) Continuing Education (Montague) (Chart)
a) Resources (staff, funding)
b) Reach (target market, consumers)
c) Relevance (meaningful)
d) Results - Educational (meets the mandate)
e) Results - Financial (affordable)
16) Performance Measurement Systems
a) a human and computer based system to measure indicators, i.e. HOMES
b) client based indicators i.e. user satisfaction surveys, exit interviews
c) community results based indicators, i.e. Tools for Action Series: A Resource Guide for
Designing a Community Indicator Project, a report by SPARC BC, April 2008
(the above should not be confused with forms of accreditation and audit – such as ISO 9000, CARF, COA, etc.)
d) Community Objective and Potential Indicators, United Way of Greater Milwaukee,
Planning, Allocation & Monitoring Division
17) How do YOU develop Performance Indicators?
18) Conference performance indicators of success
a) Participant satisfaction
b) Citations to conference
c) Publications from conference
d) Networking and outside connections
e) Sleeper effects (year later citations?)
f) Diffusion of benefits (client benefits?)
g) ……
19) Criteria for a performance indicator:
A. Authoritative - commonly agreed to be true (i.e. speedometer)
B. Economical – only what’s needed (not all the instruments of a 747)
C. Ethical - (i.e. urgent child safety issues cannot be “monitored”)
D. Feasible – possible to measure (i.e. difficulty of assessing safe sex practices)
E. Logical - outputs vs outcomes (i.e. #pamphlets given do not equal #pamphlets read)
F. Manageable - suggest 10 at a time and no more than 40 overall
G. Measurable – qualitative or quantitative (i.e. trust level on a scale of 1-10)
H. Reliable - accurate (i.e. don’t ask literacy of homeless) (use split half, test retest)
I. Specific – at the right level of precision (i.e. don’t ask satisfaction on 100 point scale)
J. Visible/accessible like a car dashboard - (i.e. original cars had gas gauge on the tank)
K. Timely - (i.e. instant readout of gas economy versus computed mpg, conversions etc)
L. True - measure of success (i.e. logically measures end goal - face validity)
M. Valid – exact or close proxy (i.e. not # rings to answer phone) (construct validity)
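Criterion H mentions split-half and test-retest checks. As an illustrative sketch only (the response data are made up, and Python is used just for brevity), split-half reliability can be estimated by correlating odd-item and even-item half-scores of a multi-item indicator, then applying the Spearman-Brown correction to estimate full-scale reliability:

```python
# Split-half reliability sketch for a multi-item survey indicator.
from statistics import mean, stdev

# Five respondents x six items on a 1-5 agreement scale (hypothetical data)
responses = [
    [4, 5, 4, 4, 5, 4],
    [2, 2, 3, 2, 2, 3],
    [5, 4, 5, 5, 4, 5],
    [3, 3, 2, 3, 3, 3],
    [1, 2, 1, 2, 1, 2],
]

def pearson(x, y):
    """Pearson correlation using sample statistics."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

odd  = [sum(r[0::2]) for r in responses]   # half-scores from items 1, 3, 5
even = [sum(r[1::2]) for r in responses]   # half-scores from items 2, 4, 6
r_half = pearson(odd, even)
r_full = 2 * r_half / (1 + r_half)         # Spearman-Brown correction

print(round(r_full, 2))
```

A low split-half correlation on real data would suggest the items are not measuring one underlying construct, which is exactly the reliability concern criterion H raises.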
20) Types of indicators (Charts)
a) Analog and continuous
b) Digital and logical
c) Qualitative and spatial
d) Ratio and rates
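Ratio- and rate-type indicators normalize raw counts to a standard denominator so that settings of different sizes can be compared. A small sketch with hypothetical counts (the hospital figures are invented for illustration):

```python
# Rate-type indicator sketch: express raw counts per standard
# denominator so settings of different sizes compare fairly.
def rate_per(events, population, base=1_000):
    """Events per `base` units of population or exposure."""
    return events / population * base

# Two hypothetical hospitals; raw infection counts alone would mislead.
infections_a, bed_days_a = 12, 8_000   # larger site, more raw infections
infections_b, bed_days_b = 6, 2_500    # smaller site

print(round(rate_per(infections_a, bed_days_a), 2))  # 1.5 per 1,000 bed-days
print(round(rate_per(infections_b, bed_days_b), 2))  # 2.4 per 1,000 bed-days
```

Note that the smaller site has fewer infections in absolute terms but the higher rate, which is why rate-type indicators are the ones that travel across community settings.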
21) When NOT to do Performance Measures
a) Low dosage - program too weak
b) Immature - program continuously evolving
c) Amorphous - no explicit or credible logic/theory
d) The good cause - program with no goals
e) Impact is already well known
f) Poor delivery model
g) Unethical
h) Nothing to compare to
i) A negative finding cannot be accepted
j) A ridiculous waste of time
22) Lessons learned
a) Try to get good indicators
b) Make sure the indicator is not driving performance
c) Don't sacrifice Meaning for Measurability (or Relevance for Rigor)
d) Remember - indicators may need revision
e) Use 2+ indicators for anything important
f) Enrich indicators by discussing limitations
23) Traps to avoid in Success Indicators (Scriven)
a) Teaching to the test (indicator abuse)
b) Indicator becomes the mission
c) Indicator displaces the problem
d) Watch for letting IT drive the choice of indicators (putting the cart before the horse)
e) Doing it because it's popular
f) Quick and dirty evaluation
24) Take Away Notes
a) Performance indicators relate to logic models
b) Indicator selection requires care, time and conscious thought
c) Different styles are: management, strategic, consultative, best practice
d) Indicators should be selected and evaluated against criteria
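The last take-away, that indicators should be selected and evaluated against criteria, can be made concrete with a simple scoring matrix. The criteria subset and the 0/1 judgments below are purely illustrative, not scores from the handout:

```python
# Score hypothetical candidate indicators against a criteria checklist
# (a subset of the A-M criteria list; 1 = meets criterion, 0 = does not).
criteria = ["measurable", "valid", "reliable", "timely", "economical"]

candidates = {
    "self-reported quit rate":
        {"measurable": 1, "valid": 0, "reliable": 0, "timely": 1, "economical": 1},
    "validated quit rate":
        {"measurable": 1, "valid": 1, "reliable": 1, "timely": 0, "economical": 0},
    "pamphlets distributed":
        {"measurable": 1, "valid": 0, "reliable": 1, "timely": 1, "economical": 1},
}

# Rank candidates by how many criteria they satisfy.
ranked = sorted(candidates.items(),
                key=lambda kv: sum(kv[1][c] for c in criteria),
                reverse=True)
for name, scores in ranked:
    print(f"{name}: {sum(scores.values())}/{len(criteria)}")
```

Note that the raw output count ("pamphlets distributed") scores highest here, which echoes the lessons-learned warning about sacrificing Meaning for Measurability: an unweighted checklist can favor what is easy to count, and weighting validity more heavily would change the ranking.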
In government - Performance indicators cover the range of activities:
financial performance: appropriation mechanism, source and application of funds, prudence,
diligence, probity, integrity and financial accounting and reporting;
legal compliance: fairness, equity and probity; the extent to which the agency has met its
legislative requirements and its standards of conduct (such as human rights, employment equity, and
conflict of interest guidelines);
operational performance: achievement of output targets; delivery systems for the goods and
services produced in an economical, efficient and cost-effective manner;
organizational performance: overall capability of the organization and the interactions among
strategic planning, management structures and processes, human, material and financial resources, all
in relation to the mission and goal and the demands of the external environment. Management
direction, working environment, appropriate control systems, monitoring and reporting systems (on
inputs, and outputs);
program performance: information on policy intent, on the continued relevance,
appropriateness and responsiveness of programs to the policy (clear objectives, clear goals, outputs,
acceptance, intended and unintended outcomes, results, impacts); cost-effectiveness;
institutional performance: the ability of the organization to have reached its purposes, to
have fulfilled its mission, to have succeeded, in effect;
Separate from these is the performance of individuals in the organization, including employee performance,
board performance, executive committee performance, management performance, administrative performance,
and team performance; this should not be confused with program/service performance indicators.
Guy Leclerc: Accountability, Performance Reporting, Comprehensive Audit