Improving healthcare in an organization requires individuals with the capability to design, test and implement improved processes in an organization with the capacity to support the scale and spread of improvement. If improvement capability is not widespread in the workforce then an intervention is needed to create the capability. In response to this challenge, Cincinnati Children’s designed and implemented a comprehensive Improvement Science curriculum to build capability. The program has achieved measurable improvements in both process and outcome measures of patient care and business processes. Incorporating unique design principles, this intervention served as a catalyst for quality transformation.
In this workshop we will share our perspective and provide examples with data that illustrate:
• Building support and buy-in through the design of participant selection.
• Creating an intervention to build capability that includes training but involves more than training.
• A comprehensive model based on competencies.
• Expanding the four-level Kirkpatrick evaluation model with additional levels that encompass economic impact and network impact.
• Using self-assessment to evaluate learning outcomes.
1. Academy for Healthcare Improvement
Advancing the Methods of Evaluation of Quality and
Safety Practice and Education Workshop
Cincinnati Children’s Hospital
Improvement Science Education
Gerry Kaminski, MS, DA
Dan McLinden, EdD
May 29, 2014
2. Workshop Objectives
1. Building Improvement Capability:
a) A unique implementation model: Leaders First
b) Critical instructional design components
c) Achieving measurable process and outcome results
d) A comprehensive model based on competencies
2. Program evaluation design and implementation
a) Starting with theory
b) Collecting six types of data
3. Who Is Cincinnati Children’s?
Established in 1883
Patient volumes, FY11:
• Admissions: 30,951
• Patient encounters: 1,087,260
• ED visits: 121,875
• ~13,000 employees
• >$100 million in NIH-funded research
• >$1.3 billion in revenue
• Ranked third-best children’s hospital in the country by U.S. News & World Report
4. Who Is Cincinnati Children’s?
Main Campus, located in the center of the city:
• Full-service, not-for-profit pediatric academic medical center
• 511 beds (475 licensed, 36 residential)
College Hill Campus: psychiatric hospital for children and adolescents
Liberty Township (August 2008):
• 24/7 ED
• 12 short-stay beds
• Pediatric medical & surgical clinics
• Imaging & lab services
• Surgical services
13 neighborhood locations
5. Who Is Cincinnati Children’s?
Our Vision: to be the leader in improving child health.
Our Mission: Cincinnati Children’s will improve child health and transform delivery of care through fully integrated, globally recognized research, education and innovation. For patients from the community, the nation and the world, the care we provide will achieve the best:
• medical and quality of life outcomes,
• patient and family experiences, and
• value,
today and in the future.
6. Our Quality Journey, 1994-2012
• 1994: Evidence-based guidelines developed
• 1998: National family-centered care conference
• 1999: IOM report To Err Is Human; strategic planning process launched
• 2001: Strategic plan called for complete transformation; IOM report Crossing the Quality Chasm; focus on the 6 dimensions of quality; Robert Wood Johnson Foundation Pursuing Perfection (P2) grant for acute evidence-based care and cystic fibrosis (CF)
• 2002: Business units incorporated the IOM dimensions into dashboards
• 2004: Strategic planning focus on integration of all 3 missions; CSI teams launched; application of reliability science
• 2006: Intermediate Improvement Science Series (I2S2) launched; AHA McKesson Quest for Quality Award
• 2007: Academic Collaborative launched
• 2008: Serious safety events reduced from 14 to 7, a 50% reduction from 2007; CHCA Race for Results Award for reduction in PICU mortality due to reduction in hospital-acquired infections; Picker Institute Award for family-centered care; Codman Award for SSI reduction
• 2010: James M. Anderson Center launched; new strategic plan focused on safety, chronic disease and population health; RCIC launched
• 2011: AILS launched
• 2012: AC External Advisory Council convened
8. Capability vs Capacity
• Improvement Capability: an individual’s knowledge and skill to design improvement initiatives that achieve measurable results, and the ability to execute improvement efforts (i.e., develop, test, measure and implement changes) and sustain results.
• Improvement Capacity: an organization’s resources that enable it to initiate and sustain a transformation effort. This includes capable individuals, but also structures, processes, and infrastructure, including quality experts and measurement experts.
9. Exercise #1
1. Are you satisfied with the speed and depth of integration of quality improvement and achievement of results in your organization?
2. If you are satisfied, list what you think are the key drivers of this success.
3. If you are not satisfied, list the key barriers.
4. Take 2 minutes to list items and be prepared to share
your ideas.
11. Implementation Approach: Interprofessional Leaders First
Advantages
• Builds a supportive network of peers
• High impact on culture change
• Develops QI coaching skills
• Aligns with strategic goals
• Accelerates the mental model shift from research alone to QI and research
• Participants come to see the organization as a system of interdependent parts
Disadvantages
• Project leaders struggle to teach team members as they work
• Can create a perception of exclusivity
• Requires a significant time commitment from busy leaders, but dosage is important
12. Core Course: Intermediate Improvement Science Series (I2S2)
– An Improvement Science course based loosely on Brent James, MD’s Advanced Training Program (ATP)
– Intentional class participant selection, with a goal of one-third each: nurses and allied health professionals, physicians, and nonclinical leaders
– Designed to develop QI leaders
– Assumed participants would have a basic understanding of improvement science from working on previous QI projects
15. Deming’s System of Profound Knowledge
Understanding Variation
• Special and common cause variation
• Run charts and control charts
• Segmentation; process, outcome and balancing measures
Psychology / Change Management
• Engaging stakeholders, including parents
• Dealing with resistance
• Intrinsic and extrinsic motivation
• Developing and leading teams
Appreciation of the System
• CCHMC as an interdependent system of processes
• Process management
• Customer-supplier relationships
Theory of Knowledge / Action Learning
• Rapid cycle testing
• PDSAs and PDSA ramps
• The Model for Improvement (The Improvement Guide: A Practical Approach to Enhancing Organizational Performance, Langley, Moen, Nolan et al., 2nd ed., 2009)
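To make the "understanding variation" topics above concrete, here is a minimal, hypothetical sketch (not from the course materials) of the individuals (XmR) control chart calculation that separates common cause from special cause variation; the data values are invented:

```python
import statistics

def xmr_limits(measurements):
    """Center line and 3-sigma limits for an individuals (XmR) chart,
    estimating sigma from the average moving range (d2 = 1.128 for n = 2)."""
    center = statistics.mean(measurements)
    moving_ranges = [abs(a - b) for a, b in zip(measurements, measurements[1:])]
    sigma = statistics.mean(moving_ranges) / 1.128
    return center, center - 3 * sigma, center + 3 * sigma

# Hypothetical minutes-to-antibiotic measurements for one ED process
data = [62, 58, 61, 59, 60, 63, 57, 30, 61, 60]
center, lcl, ucl = xmr_limits(data)
special_cause = [x for x in data if x < lcl or x > ucl]
print(f"center={center:.1f}, limits=({lcl:.1f}, {ucl:.1f}), special cause: {special_cause}")
```

Points inside the limits reflect common cause variation; the flagged point would prompt investigation of a special cause before changing the process.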
16. Leadership Topics
• Business Case for Quality
• Transformational Leadership
• Chronic Care Improvement
• Managing a Portfolio of Projects
• Implementation and Sustaining
• Patient Safety
• Research and Improvement
18. I²S² Instructional Design/Session Structure
• An intact multidisciplinary cohort of 25 – 30 students
– Stimulates interaction and learning
– Reinforces cultural change
– Allows students to see CCHMC as an interdependent
system
• Table seating arrangements changed at each session
• Project presentations and feedback in each session
• Six 2-day sessions over a 6-month period, allowing time for reflection and abstract conceptualization
• All sessions are held off-site and require 100% attendance
20. I²S² Project Examples
(Overall focus and team leader, aim, and results by end of program)

• ED care of immunocompromised oncology and bone marrow transplant patients with fever (led by physician faculty)
Aim: increase the percent of immunocompromised patients with fever who receive their first antibiotic within 90 minutes of arrival from 28% to 90%.
Result: increased to 90% the patients receiving antibiotics within 90 minutes.

• Improve ED infusion stop-time documentation (led by business director)
Aim: increase infusion stop-time documentation from 20% to 95% of the time by 10/16/2010.
Result: increased infusion stop-time documentation from 20% to 58%.

• Rapid strep turnaround time in the ED (led by advanced practice nurse)
Aim: decrease lab turnaround time for rapid strep tests from 60 minutes to 30 minutes.
Result: decreased from a median of 60 minutes to a median of 20 minutes.

• Transport team direct admissions (led by ED faculty)
Aim: reduce the transport team’s rate of failure to directly admit eligible patients from 22% to less than 10% by 10/1/2008.
Result: reduced the failure rate to 8%.

• Cardiac monitor alarm compliance in the CBDI (led by CBDI faculty)
Aim: increase compliance with the “A5N cardiac monitor alarm care process” from 40% to 90% by June 13, 2013.
Result: achieved 93% and sustained; also reduced the overall number of cardiac monitor alarms per day from 175 to 80.

• Mitigation plans for critical safety risk patients in the CICU (led by RN clinical manager)
Aim: increase the percent of critical safety risk patients appropriately identified and communicated with a mitigation plan from 24.8% to 95% by 6/7/13.
Result: met the goal of 95%, with several points exceeding the goal; implementing a sustainability plan.
21. Competency Model
(Leverage point, target audience, and competencies)

Macrosystem (entire hospital)
Target audience: senior leaders: chief executive officer, senior vice presidents, vice presidents
• Lead the entire healthcare system based on Deming’s System of Profound Knowledge

Mesosystem (clinical system improvement site-of-care teams; medical and surgical divisions; nursing and allied health leadership)
Target audience: clinical system improvement team leaders; physician division heads; assistant vice presidents; directors; strategic improvement project team leaders
• Lead strategic improvement teams and complex, cross-functional projects and achieve improvement
• Articulate the role of the department/unit/division as a sub-system that is an interdependent part of the larger CCHMC system
• Lead an interprofessional leadership team to achieve desired clinical outcomes in populations of patients
• Coach others to do improvement
• Disseminate results via external presentations and publications

Microsystem (department inpatient units, clinics, operating rooms, etc.)
Target audience: clinical managers; physician leaders
• Lead small teams and narrow-scoped projects in a small microsystem and achieve improvement
• Lead microsystem efforts to remove defects and waste from processes of daily work
• Effectively participate in cross-functional and strategic improvement teams

Individual contributors (frontline improvers)
Target audience: all frontline non-management staff
• Engage in improvement in daily work
• Effectively participate in improvement teams
22. Competency Model - Current Programs
• Basic
− Online modules
− Rapid Cycle Improvement Collaborative (RCIC)
• Intermediate
− Intermediate Improvement Science Series (I2S2)
• Advanced and Post-Doctoral
− Advanced Improvement Methods (AIM)
− Advanced Improvement Leadership Systems (AILS)
− Quality Scholars Program (QSP)
23. Current Quality Improvement Educational Programs: Basic and Intermediate

Basic

Online Modules
• Aim: to familiarize people with the basic concepts of quality improvement and enable them to be more effective quality improvement team members
• Structure: online modules (Introduction to Quality; Quality Improvement Measurement Basics; Introduction to Reliability, under development)
• Target audience: all employees

Rapid Cycle Improvement Collaborative (RCIC)
• Aim: to achieve measurable improvement in a focused, narrow-scoped project in 120 days; to build capability using the Model for Improvement and basic quality improvement skills
• Structure: whole-day leader orientation; six half-day sessions for small teams; sessions include instruction, application, and work on the project
• Target audience: staff who are leading, or who are members of, small teams sent to the collaborative by the department/division leader or business director

Intermediate

Intermediate Improvement Science Series (I2S2)
• Aim: to develop an intermediate level of knowledge and skill to do improvement and to lead improvement; to get results on a specific project
• Structure: six two-day off-site sessions over a six-month period; work on a project between sessions with a coach; extensive readings
• Target audience: organizational leaders, including physicians, nurses, allied health and non-clinical leaders, generally at the director level or above
24. Current Quality Improvement Educational Programs: Advanced

Advanced

Advanced Improvement Methods (AIM)
• Aim: to enhance knowledge and skills to apply the science of improvement to the design, implementation, and study of quality improvement initiatives in clinical settings, and to apply improvement theory and methods to the leadership of projects involving research, clinical care and operations
• Structure: four two-day sessions; four 90-minute conference calls; work on a project and readings between sessions
• Target audience: faculty

Advanced Improvement Leadership Systems (AILS)
• Aim: to enable multi-disciplinary care delivery system leadership teams to effectively lead a system of care to achieve outstanding outcomes at a competitive cost, manage a portfolio of projects to achieve goals in multiple strategic areas, and deliver on the CCHMC strategic plan goals through effective alignment
• Structure: seven half-day sessions; support from a QI account manager; extensive work between sessions with the leadership team
• Target audience: inter-professional clinical leadership teams, including physician, nursing or other clinical leaders and business leaders

Post-Doctoral

Quality Scholars Program (QSP)
• Aim: to build extraordinary improvement capability in faculty who will transform health and the health care delivery system for children, and to develop faculty leaders who will advance the scholarship of health care improvement. The specific aims are to enable inter-professional, post-doctoral trainees to develop the conceptual, methodological, practical, and leadership skills to: 1) design, develop, test, implement and spread innovations in health care delivery using a variety of methods in real-world practice settings; 2) accurately measure health and health care quality, cost and value; and 3) undertake research that creates new knowledge and translates evidence into clinical and community practice settings
• Structure: three-year curriculum; career development; didactic training resulting in a Master’s Degree in Clinical and Translational Research, completion of I2S2 and AIM, and experiential QI research activities supervised by mentoring teams; training tracks: independent improvement investigator, system-wide improvement leader
• Target audience: post-doctoral scholars in children’s health care
25. Exercise #3 − 10 minutes
Goal: Assess the readiness of your organization to build
improvement capability and plan next steps to move forward.
A. Use the handout to assess your readiness in the following areas:
1. Multidisciplinary senior leadership support.
2. Basic QI support structure.
3. Small cadre of early adopters trained.
4. Identification of desirable strategically aligned QI projects.
5. Identification of critical improvement leaders at the macro, meso and
microsystem levels.
6. Identification of desired QI competencies at each level.
7. Conceptual framework chosen.
8. Core QI methodology selected.
B. Rate the ease of change & strategic priority of each condition.
C. Retain these handouts to work on a plan at the end of the session.
26. Exercise #3 – Handout B
Setting Priorities: Ease of Change and Strategic Importance Diagram
[Grid: Ease of Change on the vertical axis, from 1 (Hard) to 6 (Easy); Strategic Importance on the horizontal axis, from 0 to 5]
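As a small, hypothetical sketch of how the handout’s two ratings could be combined to order the readiness conditions (the condition names and scores below are invented, not from the workshop):

```python
# (ease_of_change 1-6, strategic_importance 0-5) per readiness condition
conditions = {
    "senior leadership support": (2, 5),
    "basic QI support structure": (4, 4),
    "early adopters trained": (5, 3),
}

# Highest strategic importance first; ties broken by ease of change
ranked = sorted(conditions.items(), key=lambda kv: (-kv[1][1], -kv[1][0]))
for name, (ease, importance) in ranked:
    print(f"{name}: importance {importance}, ease {ease}")
```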
28. Exercise #4: Evaluation Planning
Evaluation planning matrix: for each phase (Program theory, Participation, Reaction, Learning, Application, Impact, Economics), define the Goal, Method, Credibility, and Use. The matrix is filled in phase by phase on the slides that follow.
29. Evaluation Models
Belfield, C., Thomas, H., Bullock, A., Eynon, R., & Wall, D. (2001). Measuring effectiveness for best evidence medical education: a discussion. Medical Teacher, 23(2), 164-70.
Kirkpatrick, D. L. (1994). Evaluating training programs: The four levels. San Francisco: Berrett-Koehler.
Phillips, J. J. (2003). Return on investment in training and performance improvement programs. Boston: Butterworth-Heinemann.
30. Program Theory
1. Parry, G. J., et al. (2013). Recommendations for evaluation of health care improvement initiatives. Academic Pediatrics, 13(6), S23-S30.
2. Cooksy, L. J., Gill, P., & Kelly, P. A. (2001). The program logic model as an integrative framework for a multi-method evaluation. Evaluation and Program Planning, 119-128.
Dixon-Woods, M., Bosk, C. L., Aveling, E. L., Goeschel, C. A., & Pronovost, P. J. (2011). Explaining Michigan: Developing an ex post theory of a quality improvement program. Milbank Quarterly, 89(2), 167-205.
Coryn, C. L. S., Noakes, L. A., Westine, C. D., & Schroter, D. C. (2011). A systematic review of theory-driven evaluation practice from 1990 to 2009. American Journal of Evaluation, 32(2), 199-226.
31. Program Theory – Logic Models
McLaughlin, J. A., & Jordan, G. B. (2004). Using logic models. In J. S. Wholey, H. P. Hatry & K. E. Newcomer (Eds.), Handbook of practical program evaluation. San Francisco, CA: John Wiley & Sons.
W. K. Kellogg Foundation. Using logic models to bring together planning, evaluation and action: Logic model development guide. Battle Creek, Michigan. http://www.wkkf.org/resource-directory/resource/2006/02/wk-kellogg-foundation-logic-model-development-guide
Centers for Disease Control & Prevention. Logic models. http://www.cdc.gov/oralhealth/state_programs/pdf/logic_models.pdf

Generic logic model: Inputs (resources) → Outputs (activities, participation) → Outcomes (learning; application by participants; organizational outcomes), unfolding over the short term, intermediate term, and long term.
32. Evaluation Logic Model

The logic model traces inputs through the training interventions (Intermediate QI Training Program, T1, and Advanced QI Training Program, T2) to short-term outcomes, long-term outcomes, and impact.

Inputs
• Participant: culture; experience; tenure; knowledge of and experience with QI; motivation and expectations; leadership; alignment
• Meso- and microsystems:
– Perceptions of QI training: benefit, compatibility, complexity/simplicity, trialability, observability
– Perceptions of QI culture: top management commitment and involvement; accountability; support of learning; emphasis on teamwork and collaboration; emphasis on innovation
– Characteristics: orientation toward change; collaborative; QI as part of the everyday job; fostering of social/knowledge exchange; family/patient-centered care
– Norms: participation, family/patient-centered care, affiliation, risk-taking; accountability and psychological safety
– Engagement: MDs, other clinical staff, administrative staff
– Personal support and resources available for QI work and QI training; goals and evaluation that support QI
• Other staff: knowledge, skills and experience with QI; engagement in “the problem” and the QI initiative; early involvement; personal support; accountability; functioning well as a team
• Macrosystem (CCHMC): senior leadership emphasis on and support for quality

Short-term outcomes
• Satisfaction/reaction; learning of new knowledge and skills; planned action
• Application: doing and leading QI during the training program
• During-training, project-specific outcomes: 1) patient/family, 2) hospital operations, 3) initial spread of QI, 4) other

Long-term outcomes
• Application: doing and leading QI after the training program
• Post-training, project-specific outcomes: 1) patient/family, 2) hospital operations

Impact
• Value to the individual, the organization and society
• Organizational culture (e.g., sustained use of QI; spread of QI)
• Other participant and stakeholder perceptions of impact

Evaluation questions:
1. What components comprise and are essential to a “quality” quality improvement training program?
2. What short-term outcomes (e.g., satisfaction, evidence of learning, and participant plans to apply learning) result from training?
3. To what degree are training participants successful in their application of learning from training? Are they successful at both the doing and leading of QI work during and post-training, and, if so, to what extent?
4. What impact, both intended and unintended, does QI training have on outcomes for patients, their families, and healthcare operations?
5. What contextual and individual factors (i.e., inputs) facilitate the success of a quality improvement (QI) training initiative and its outcomes?
33. Exercise #4: Program Theory
Phase: Program theory
• Goal: shared understanding (mental model) of the program and its evaluation
• Method: logic model
• Credibility: iterative development with evaluator(s) and stakeholders
• Use: guide inquiry by the evaluation team and set expectations with stakeholders
(The remaining phases are filled in on the slides that follow.)
36. Exercise #4: Participation
(Program theory row as on slide 33.)
Phase: Participation
• Goal: verify the interprofessional assertion
• Method: attendance information by role
• Credibility: high
• Use: the information is a basis for future inquiry to assess the value of this approach
38. First, a word about measurement

Measurement resources:
Bezruczko, N. (2005). Rasch measurement in health sciences. Maple Grove, MN: JAM Press.
Bond, T. G., & Fox, C. M. (2007). Applying the Rasch model. Mahwah, NJ: Lawrence Erlbaum Associates.
Boone, W. J., Staver, J. R., & Yale, M. S. (2014). Rasch analysis in the human sciences. New York, NY: Springer.
Smith, E. V., & Smith, R. M. (2004). Introduction to Rasch measurement. Maple Grove, MN: JAM Press.

The debate:
Jamieson, S. (2004). Likert scales: how to (ab)use them. Medical Education, 38(12), 1217-1218.
Pell, G. (2005). Uses and misuses of Likert scales. Medical Education, 39, 970.
Jamieson, S. (2005). Author’s reply. Medical Education, 39, 970.
Carifio, J., & Perla, R. (2007). Ten common misunderstandings, misconceptions, persistent myths and urban legends about Likert scales and Likert response formats and their antidotes. Journal of Social Sciences, 3(3), 106-116.
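As a minimal sketch of the Rasch approach discussed in these resources (the ability and difficulty values below are hypothetical, expressed in logits):

```python
import math

def rasch_p(ability, difficulty):
    """Dichotomous Rasch model: the probability of endorsing/succeeding on an
    item is a logistic function of (person ability - item difficulty)."""
    return 1 / (1 + math.exp(difficulty - ability))

# An average person (ability = 0) against an easy, a targeted, and a hard item
for difficulty in (-2.0, 0.0, 2.0):
    print(f"difficulty {difficulty:+.1f}: P = {rasch_p(0.0, difficulty):.2f}")
# -> 0.88, 0.50, 0.12: items far below respondents' positions ("too easy")
#    pile ratings at the top of the scale and carry little information.
```

On this view, the ceiling effect illustrated on the next slide is an item-targeting problem rather than a respondent problem.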
39. The problem: questions are often too easy
On a 1-5 strongly disagree / strongly agree scale, an item such as “I was satisfied with this program.” draws most of the ratings at the top of the scale.
40. Ask harder questions at multiple levels
Satisfaction:
• I was satisfied with the instructor’s performance.
• I was satisfied with the value of this educational program.
• The physical environment was conducive to learning.
Learning:
• I learned new knowledge and skills from this training.
Application:
• I will be able to apply the knowledge and skills learned in this class to my job.
Impact:
• This training will play a substantial role in improving medical and quality of life outcomes.
• This training will play a substantial role in improving the experience of the patient, the family or the providers.
• This training will play a substantial role in improving the value of services provided by the hospital.
Value:
• This training was a worthwhile investment for the hospital.
• This training was a worthwhile investment in my career development.

Kirkpatrick, D. L. (1994). Evaluating training programs: The four levels. San Francisco: Berrett-Koehler.
Phillips, J. J. (2003). Return on investment in training and performance improvement programs. Boston: Butterworth-Heinemann.
41. Evaluate the evaluation instrument
[Figure: item measures from the post-course questionnaire plotted on a common scale (approximately 30-50) for the items below]
• I learned new knowledge and skills from this training.
• This training program will play a substantial role in dramatically improving medical and quality of life outcomes.
• This training program will play a substantial role in dramatically improving the experience of the patient, the family or providers in our health care delivery system.
• This training program will play a substantial role in dramatically improving the value of services delivered by the hospital.
• This training was a worthwhile investment in my professional development.
• I will be able to apply the knowledge and skills learned in this class to my job.
• Overall, I was satisfied with the quality of this educational program.

McLinden, D., & Boone, W. (2009). More than smile sheets: Rasch analysis of training reactions in a medical center. Performance Improvement Quarterly, 22(3), 7-21.
42. Short-term outcomes: reactions to the event
[Figure: reaction ratings on a 1-7 scale for I2S2 (n ≈ 350) and RCIC (n ≈ 1,900)]
43. Exercise #4: Reaction
(Program theory and Participation rows as on the preceding slides.)
Phase: Reaction
• Goal: assess program quality and fidelity
• Method: post-course questionnaires with standard items; observations
• Credibility: useful for monitoring
• Use: establish program quality from the standpoint of student participants; determine fidelity of delivery to design
45. What is being measured?
Quality Improvement
knowledge and ability
46. Instrument Development
Step 1: Articulate purpose
Step 2: Identify critical components
Step 3: Create a response scale
Step 4: Stakeholder review
Step 5: Deploy and analyze data

Response scale (rating value, level, description):
1. No knowledge: I cannot tell you what this skill, tool or method is.
2. Knowledgeable: I can tell you what this skill, tool, or method is AND give you facts about it.
3. Basic application: I can tell you what this skill, tool or method is AND, given a defined situation, I can apply it with assistance.
4. Analysis & application: I have knowledge of the skill, tool, or method AND I can analyze a situation and determine if it is needed AND then independently and accurately apply it.
5. Highly experienced: I have knowledge of this skill, tool, or method AND I have a high degree of experience correctly applying and adapting it in various situations AND I can explain my decisions for doing so.
6. Expert: I have knowledge of this skill, tool, or method AND I have a high degree of experience correctly applying and adapting it AND I can teach others the theory behind it and coach them in its use.
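A hedged sketch of Step 5 (deploy and analyze): summarizing pre/post self-ratings on the 1-6 scale per skill. The skill names and ratings below are invented for illustration:

```python
# Hypothetical pre/post self-assessments (1-6 scale) for a small cohort
pre  = {"run charts": [2, 1, 3, 2], "PDSA ramps": [2, 2, 2, 3], "key driver diagrams": [1, 2, 2, 1]}
post = {"run charts": [4, 3, 4, 4], "PDSA ramps": [4, 4, 3, 5], "key driver diagrams": [3, 4, 3, 3]}

for skill in pre:
    mean_pre = sum(pre[skill]) / len(pre[skill])
    mean_post = sum(post[skill]) / len(post[skill])
    print(f"{skill}: {mean_pre:.1f} -> {mean_post:.1f} (gain {mean_post - mean_pre:+.1f})")
```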
47. McLinden, D., Farber, S., & Kaminski, G. What did they learn: A learning outcomes assessment for quality improvement training. Unpublished manuscript.
48. Exercise #4: Learning
(Program theory, Participation, and Reaction rows as on the preceding slides.)
Phase: Learning
• Goal: assess change in participants attributable to the intervention
• Method: self-assessment, pre and post
• Credibility: high, given the rigor of the design process and that the psychometric properties are known
• Use: establish whether learning has or has not occurred as a result of the intervention; identify areas for intervention with participants
50. What have participants achieved on projects?
Project progress rating scale (rating, status, definition):
1. Forming team: team has been formed, target population identified, aim determined, baseline measurement initiated.
2. Planning for the project has begun: team is meeting, discussion is occurring, key drivers identified, plans for the project have been made.
3. Activity, but no changes: team actively engaged in development, research, discussion, but no changes have been tested.
4. Changes tested, but no improvement: components of the model being tested but no improvement in measures; data on key measures are reported.
5. Modest improvement: initial test cycles have been completed; evidence of moderate improvement in process measures and/or a reduction in variation.
6. Improvement: some improvement in measures (3 consecutive data points); PDSA test cycles in ramps.
7. Significant improvement: most components are implemented for the population of focus; evidence of improvement in measures (4-5 consecutive data points) with at least some data at goal; plans for spreading the improvement are in place if appropriate.
8. Sustainable improvement: sustained improvement in most measures as evidenced by data meeting special cause rules; at least some data at goal; spread to a larger population has begun if appropriate.
9. Outstanding sustainable results: all components implemented; all goals of the aim have been accomplished; outcome measures at national benchmark levels.
53. Exercise #4: Application
(Program theory, Participation, Reaction, and Learning rows as on the preceding slides.)
Phase: Application
• Goal: assess transfer of learning
• Method: assessment of in-class projects
• Credibility: projects represent guided application
• Use: determine the level of application achieved as a predictor of future application
54. Impact: if people apply what they know, what difference does it make?
55. 6-Month Follow-up

[Chart: percent of respondents by number of formal QI projects (0, 1-3, 4-6, 7-9, 10 or more) they have participated in, led, and/or sponsored]

Observed impact from successful QI projects (n = 254)¹:
• ↑ Staff interest in QI: 3.57
• ↑ Productivity: 3.41
• ↑ Patient experience: 3.40
• ↑ Medical outcomes: 3.29
• ↓ Errors: 3.26
• ↓ Costs: 2.99
• ↑ Revenue: 2.38

Knowledge/experience coaching individuals and leading teams in (n = 270)²:
• Applying the Model for Improvement: 3.97
• Use of a systems approach: 3.83
• Applying principles of change management: 3.81
• Use of techniques for analyzing variation: 3.61

¹ Arrow indicates desired direction; 1-5 scale (2 = low impact, 3 = moderate impact, 4 = high impact)
² 1-6 scale (3 = basic application, 4 = analysis & application)
56. Exercise #4: Impact
(Program theory, Participation, Reaction, Learning, and Application rows as on the preceding slides.)
Phase: Impact
• Goal: assess ongoing application and impact on outcomes
• Method: follow-up questionnaires
• Credibility: the participant’s perspective on the influence of QI on outcomes
• Use: determine whether graduates continue to lead and/or coach others
58. Methods for Economic Evaluation

“…empirically explore the entire system of linkages among specific resources consumed…”¹

The training intervention consumes resources (cost of faculty time, cost of student time, facility and materials cost) and produces a population of competent individuals (the value of a trained person), whose work yields outcomes, both monetized and non-monetized, along the chain from satisfaction to learning to application to impact. The economic methods applied to these linkages are cost analysis, cost-effectiveness, cost-benefit (ROI), utility analysis, and sensitivity analysis.

¹ Yates, B. T. (1994). Toward the incorporation of costs, cost-effectiveness analysis, and cost-benefit analysis into clinical research. Journal of Consulting and Clinical Psychology, 62, 729-736.
59. Utility Analysis

U = (N)(T)(SDy)(d) - C

where N = the number of people trained, T = the duration of the effect, SDy = the standard deviation of the variation in job value (in dollars), d = the magnitude of the effect of the program, and C = the cost of the program.

Boudreau, J. W. (1983). Economic considerations in estimating the utility of human resource productivity improvement programs. Personnel Psychology, 36(3), 551-576.
Cascio, W. F. (1989). Using utility analysis to assess training outcomes. In I. L. Goldstein (Ed.), Training and development in organizations (pp. 63-88).
Cascio, W. F., & Boudreau, J. W. (2008). Investing in people: Financial impact of human resource initiatives. Upper Saddle River, NJ: Pearson Education.
Schmidt, F. L., Hunter, J. E., & Pearlman, K. (1982). Assessing the economic impact of personnel programs on workforce productivity. Personnel Psychology, 35(2), 333-347.
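A minimal sketch of the utility formula as code; the parameter names mirror the slide’s notation, and this is an illustration rather than the authors’ implementation:

```python
def training_utility(n, t, sdy, d, cost_per_person):
    """U = (N)(T)(SDy)(d) - C: dollar gain from training n people whose
    performance improves by d standard deviations of job value (sdy)
    for t years, net of total program cost."""
    total_cost = n * cost_per_person
    return n * t * sdy * d - total_cost
```

The next slide applies the same arithmetic to one (illustrative) cohort.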
60. Utility Analysis for One Cohort*

Inputs (description, variable, value):
• Number of employees trained and realizing the value in the effect size: N = 27
• Expected duration of benefits (years): T = 2.0
• True difference in performance between trained and untrained, in SD units (effect size): d(t) = 0.50
• Standard deviation of dollar-valued job performance: SDy = $10,000
• Per-person cost of training: C = $8,000
• Persons per program: Np = 27
• Program cost: Cp = $216,000

Calculation of U, the net value:
• (N)(T)(d) = 27 × 2.0 × 0.50 = 27
• Value created = 27 × SDy = $270,000
• Cost = (N)(per-person cost) = $216,000
• Utility U = $270,000 - $216,000 = $54,000
• ROI (not annualized) = $54,000 / $216,000 = 25%

Break-even value for SDy, where U = 0: SDy = C / [(N)(T)(d)] = $216,000 / 27 = $8,000

*Values are for illustrative purposes only and have been altered for this presentation in order to maintain the confidentiality of proprietary information.
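A quick sketch verifying the cohort arithmetic above, using the slide’s illustrative values:

```python
n, t, sdy, d, cost_pp = 27, 2.0, 10_000, 0.50, 8_000  # slide's illustrative values

value_created = n * t * sdy * d                # 27 * 2.0 * 10,000 * 0.50 = $270,000
program_cost  = n * cost_pp                    # $216,000
utility       = value_created - program_cost   # $54,000
roi           = utility / program_cost         # 0.25 -> 25%, not annualized
sdy_breakeven = program_cost / (n * t * d)     # $8,000: the SDy at which U = 0
print(utility, roi, sdy_breakeven)
```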
61. Exercise #4: Economics
(Program theory, Participation, Reaction, Learning, Application, and Impact rows as on the preceding slides.)
Phase: Economics
• Goal: determine the economics of the journey
• Method: utility analysis
• Credibility: attributes value to the trained person; does not quantify project outcomes
• Use: determine the cost of the journey and the economic benefit (or loss) from that journey
63. Visualizing and Analyzing: Networks
http://www.ted.com/talks/nicholas_christakis_the_hidden_influence_of_social_networks.html
Christakis, N. A., & Fowler, J. H. (2009). Connected: The amazing power of social networks and how they shape our lives. New York, NY: Little, Brown & Co.
Christakis, N. A., & Fowler, J. H. (2007). The spread of obesity in a large social network over 32 years. The New England Journal of Medicine, 357(4), 370-379.
Durland, M. M., & Fredericks, K. A. (Eds.). (2005). New directions for evaluation: Social network analysis in program evaluation. San Francisco, CA: Jossey-Bass.
Cross, R., & Parker, A. (2004). The hidden power of social networks: Understanding how work really gets done in organizations. Boston, MA: Harvard Business School Press.
Fowler, J. H., & Christakis, N. A. (2008). Dynamic spread of happiness in a large social network: Longitudinal analysis over 20 years in the Framingham Heart Study. British Medical Journal.
66. After training, some participants become faculty and then connect with (influence) other participants.
67. Core faculty and graduates form a dense network, connected through a shared experience and through connected faculty.
68. Each participant is, in turn, leading (influencing) individuals on a project team.
69. Influence extends from core faculty through project teams to the next degree of connection: how far out does influence propagate?
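A hedged sketch of how that propagation question might be examined with social network analysis; it assumes the networkx package and uses an invented toy graph (core faculty → participants → project-team members):

```python
import networkx as nx

# Toy influence network (hypothetical): one core faculty member trains three
# participants, each of whom leads a project team of four.
G = nx.DiGraph()
for i in range(1, 4):
    G.add_edge("core_faculty", f"participant_{i}")
    for j in range(1, 5):
        G.add_edge(f"participant_{i}", f"team_member_{i}_{j}")

# Count people reached at each degree of separation from core faculty
reach = nx.single_source_shortest_path_length(G, "core_faculty")
for degree in range(1, max(reach.values()) + 1):
    print(f"degree {degree}: {sum(1 for v in reach.values() if v == degree)} people")
# -> degree 1: 3 participants; degree 2: 12 team members
```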
70. Exercise #4: Network Analysis
(Program theory, Participation, Reaction, Learning, Application, Impact, and Economics rows as on the preceding slides.)
Phase: Participation (revisited)
• Goal: assess the effect of the design of participation on achieving a tipping point
• Method: network analysis
• Credibility: to be determined
• Use: possible model for the design of other learning interventions
72. Exercise #5 - Next Steps
• What actions will you take next week?
• Who will need to be involved?
• What will be your biggest challenges with taking each action?
• What will you do to mitigate each challenge?
• USE THE HANDOUT TO PLAN YOUR NEXT STEPS.
73. Contact Information
Daniel McLinden, Ed.D.
Senior Director, Learning & Development Department
Associate Professor, Department of Pediatrics
Cincinnati Children's Hospital Medical Center
Office: (513) 636-8933
Mobile: (513) 739-9087
Email: daniel.mclinden@cchmc.org

Gerry Kaminski, MS, DA
Senior Director, Improvement Science Education (retired)
Anderson Center for Health Systems Excellence
Cincinnati Children's Hospital Medical Center
Email: kaminskigerry@gmail.com
Office: (513) 706-3245