Success Metrics
Presenters:
Robert Patton, Director, Implementation Services, Qualifacts
Brandi S. Sanders, Manager, Implementation Specialists, Qualifacts
Webinar - Tuesday, April 30, 2013 1:00pm – 2:30pm CST
Success Metrics – Agenda
• Success Metrics Detailed Review (current)
–What they are and how to use them
• Success Metrics Summary
–What the data has shown so far
• Success Metrics (future)
–C3 Insights
–Open Minds Input
–Benchmarking User Groups
2
Success Metrics Detailed
Review (current)
Success Metrics Detailed Review (current)
• There are 4 different “types” of
metrics currently available
–Go Live/Post go live success metrics
–Agency year end with community
comparison
–Agency year end with peer comparison
–Community year end with monthly
breakdown
4
Post Go Live Success Metrics
• Background
– Most common reasons for failed implementations:
• Leadership issues
– Wrong people driving the project
– Lack of good bi-directional communication between leadership and staff
• Workflow issues
– Underestimating importance of workflows
– Lack of a full workflow walk-through to identify gaps and bottlenecks
• Provider issues
– Absence of strong champions
– Failure of staff to understand their role in using the EHR
• Training issues
– Underestimating the amount/type of training required
– Failure to assure that providers complete training
• Eliminate the emotion of fear with a metric-driven EHR implementation
- Source: White Paper - http://www.healthit.gov/sites/default/files/ehr_implementation_white_paper.pdf
- EHR Implementation with Minimal Practice Disruption In Primary Care Settings
- Authored By: Jeff Hummel, MD, MPH & Peggy Evans, PhD, CPHIT – Qualis Health, Seattle,
Washington (WIREC consultant team contributing)
- November 2012
5
Post Go Live Success Metrics
• Purpose
– Designed to measure immediate customer success
post go live
– Key indicators for refresher training, effective use of
the system, system configuration adjustments, etc.
• How Generated
– Via the CareLogic system (and Excel) by QSI staff*
• How/When Communicated
– 5, 10, 15, 45 days post go live via status calls
(implementation)
– 60, 90, 180, 365 days post go live via customers or
support as needed*
6
Post Go Live Success Metrics List (27*)
• Active Clients
• Active Organizations
• Active Programs and Activities
• Active Payer Plans
• Active Staff
• Activity to Service Document signed
• First Contact Date to First
Appointment
• Appointments by Type and Status
• Services Not Authorized
• Clients Seen
• Failed Activities
• Failed Claims
• Real Time Productivity
• Awaiting Claim Approval
• All Claims / Approved Claims
• Claim Batches
• Overall Claiming Lifecycle
• Activity to Claim Date
• Average Daily Billable Services
• Total Deposits
• Total Deposits Amount
• Total Approved Applied Amount
• Total Unapproved Applied Amount
• Total Unapplied Amount
• Total Unposted Amount
• Refund Amount
• Recoup Amount
• Month over Month Number of Claims
Generated and Revenue
7
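Several of the claiming metrics in this list reduce to simple date arithmetic and counts over service/claim records. A minimal sketch of "Activity to Claim Date," "Failed Claims," and "All Claims / Approved Claims," using illustrative record fields (not the actual CareLogic schema):

```python
from datetime import date

# Illustrative service/claim records; the field names are hypothetical,
# not the actual CareLogic schema.
claims = [
    {"activity_date": date(2013, 4, 1), "claim_date": date(2013, 4, 5), "status": "approved"},
    {"activity_date": date(2013, 4, 2), "claim_date": date(2013, 4, 4), "status": "failed"},
    {"activity_date": date(2013, 4, 3), "claim_date": date(2013, 4, 10), "status": "approved"},
]

# "Activity to Claim Date": average days from the service activity
# to claim generation.
lags = [(c["claim_date"] - c["activity_date"]).days for c in claims]
avg_activity_to_claim = sum(lags) / len(lags)

# "Failed Claims" and "All Claims / Approved Claims" counts.
failed_claims = sum(1 for c in claims if c["status"] == "failed")
approved_claims = sum(1 for c in claims if c["status"] == "approved")
all_claims = len(claims)

print(avg_activity_to_claim, failed_claims, approved_claims, all_claims)
```

In practice these would be computed over a date window (e.g. the first 30 days post go live) rather than the whole claim history.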
Post Go Live Success Metrics (cont’d)
• Value:
– They infuse accountability and visibility (both
QSI/customer)
– They allow issues to be identified sooner
rather than later
– They remove the emotion from implementation and
provide the data necessary for executive sponsorship
(both QSI and customer)
– They set the baseline for “what to expect” post go live
(by month) so that customers can measure their
performance and so that QSI can know where to jump
in and make the customer as successful as possible
– They allow for the celebration of project successes
8
Example - Go Live – Appointments by Type
and Status
9
Example - Go Live – Active Counts, Failed
Activities, Failed Claims, Unstatused Appointments
10
Example - Go Live – Claims Pending Approval, Claim Batches, Total
Deposits, Payments Applied/Unapplied/Unposted
11
Example - Go Live – Executive (Project Health
Review)
12
Post Go Live Success Metrics (cont’d)
• Note:
– These are available to all customers post go
live
– They are generated by the IS (and put into
Excel for graphing), but have also been
made available for customers to run
– The PMs create the post go live health
report
– We are exploring what the longer term
solution is for generating/presenting this
data
13
Success Metrics
–Go Live/Post go live success metrics
–Agency year end with community
comparison
–Agency year end with peer
comparison
–Community year end with monthly
breakdown
14
Agency Year End with Community & Peer
Comparison Metrics
• Purpose
– Extension of the post go live metrics with additional
measurements and percentile/ranking of customer in
comparison to the CareLogic community (all eligible
QSI customers)
• “Peer” is currently based on total revenue billed through CareLogic (Under 1M, 1M to
5M, 5M to 10M, 10M to 20M, and 20M+)
• How Generated
– Via QSI technical resource during off-business hours
– Compiled and ranked manually via QSI resource (Excel)
• How/When Communicated
– Annually (currently)
15
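The peer banding and percentile ranking described above can be sketched as follows. The revenue bands come from the slide; the boundary handling (which band a value of exactly 1M falls into) and the percentile convention (share of community values at or below the agency's) are assumptions, since the deck does not specify them:

```python
# Peer band assignment using the revenue bands named on the slide;
# which band a boundary value falls into is an assumption.
def peer_band(annual_revenue):
    if annual_revenue < 1_000_000:
        return "Under 1M"
    if annual_revenue < 5_000_000:
        return "1M to 5M"
    if annual_revenue < 10_000_000:
        return "5M to 10M"
    if annual_revenue < 20_000_000:
        return "10M to 20M"
    return "20M +"

# Percentile rank of one agency's metric within its community: the share
# of community values at or below the agency's (one common convention;
# the deck does not specify which definition QSI uses).
def percentile_rank(value, community_values):
    at_or_below = sum(1 for v in community_values if v <= value)
    return 100.0 * at_or_below / len(community_values)

print(peer_band(3_200_000))                    # "1M to 5M"
print(percentile_rank(42, [10, 20, 42, 50]))   # 75.0
```

Ranking within a peer band is then the same computation restricted to agencies with the same `peer_band` result.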
Agency Year End with Community & Peer
Comparison Metrics List
• Rank
• Community/Peer Average
• Community/Peer Median
• Active Clients - On 01/01/12
• Active Clients - On 12/31/12
• Clients Seen
• Active Staff - On 01/01/12
• Active Staff - On 12/31/12
• % of Service Providers (of
Active Staff)
• Total Approved Claims
• Total Activities with Payment
(Paid)
• Total Expected Revenue
• Avg. Expected Rev by
Approved Claim
• Claim Lifecycle
• Avg Activity to Claim Days
• Avg Activity to Batch Days
• Avg Batch to Payment Applied
Days
• Medicare - Revenue
• Medicare - Claims
• Medicaid - Revenue
• Medicaid - Claims
• Other Payer – Revenue
• Other Payer - Claims
• Total Billable Activities
• Billable Activities/Day (&
Percentage Billable)
• Billable Activities/Service
Providers
• Non Billable Activities/Day (&
Percentage Non Billable)
• Appointments Kept
• Appointments CBC
• Appointments CBT
• Appointments DNS
• Avg Document Signature Time
(Hours)
• Avg Scheduled to Kept Days
• # of Customer Pentaho Reports
• Services Not Authorized
• # of Claims by Program
• # of Non Billable Activities by
Staff Credential
16
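The appointment metrics in this list (Kept, CBC, CBT, DNS) boil down to status counts and ratios. A sketch with illustrative numbers; the expansions of the status codes (cancelled by client, cancelled by therapist, did not show) are assumptions:

```python
# Appointment status counts; the numbers are illustrative, and the
# expansions of the codes (CBC = cancelled by client, CBT = cancelled
# by therapist, DNS = did not show) are assumptions.
counts = {"Kept": 180, "CBC": 12, "CBT": 3, "DNS": 25}

total_appointments = sum(counts.values())
kept_rate = counts["Kept"] / total_appointments
no_show_rate = counts["DNS"] / total_appointments

print(f"kept {kept_rate:.1%}, no-show {no_show_rate:.1%}")
```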
Agency Year End with Community & Peer
Comparison Metrics (cont’d)
• Value:
– They go beyond go live success and allow for
more/better data to drive agency success
– Upon root cause determination, their focus is
more on workflow/process efficiency and
system adoption (vs. the immediate issue/training
focus of the post go live metrics)
– They empower high performers and low
performers to reach out (and for QSI to reach
out to them) for best practices sharing
(especially with customers who have “like”
business/operational size)
17
Example - Agency Year End with
Community & Peer Comparisons
18
(chart legend: QSI, Community, Peer)
Example - Agency Year End with
Community & Peer Comparisons
Agency, Community,
& Peer
19
Example - Agency Year End with
Community & Peer Comparisons
Agency, Community,
& Peer
20
Example - Agency Year End with
Community
21
Agency Year End with Community & Peer
Comparison Metrics (cont’d)
• Note:
– These metrics include customers live during 2012 (but
if they went live late in 2012, there most likely was not
enough data to provide peer rankings)
– They were distributed at the conference and were sent
to all other eligible customers prior to 4/26
– There are some customers for whom we don’t have
the data (they do not use the full system, e.g. billing)
– Once we have finished incorporating customer and
Open Minds feedback on how to expand the data (e.g.
by line of service, further narrowed by payer, etc.), we
will publish the updated metrics
22
Success Metrics
–Go Live/Post go live success metrics
–Agency year end with community
comparison
–Agency year end with peer
comparison
–Community year end with monthly
breakdown
23
Community Year End with Monthly
Breakdown Metrics
• Purpose
– Designed (6 metrics currently) to measure the overall use of
CareLogic by the Qualifacts community, providing
visibility into the industry so that customers can ask the
right questions of their own business practices
• How Generated
– Via QSI technical resource during off-business hours
– Compiled and graphed manually via QSI resource
(Excel)
• How/When Communicated
– Annually (currently)
– Quarterly (for participants in the Open Minds User Group)
24
Community Year End with Monthly
Breakdown Metrics List
• Expected Revenue by Funding Source
• Expected Revenue % by Funding Source
• Claim Lifecycle for 2012
• Claim Lifecycle January 2011 – December 2012
• Clinical Efficiency
• Appointment Status Comparison
25
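The first two metrics in this list are a group-and-sum over claims by funding source. A minimal sketch with illustrative records (not the CareLogic schema):

```python
from collections import defaultdict

# Illustrative claim records; field names and amounts are hypothetical.
claims = [
    {"funding_source": "Medicaid", "expected_revenue": 120.0},
    {"funding_source": "Medicare", "expected_revenue": 80.0},
    {"funding_source": "Medicaid", "expected_revenue": 100.0},
    {"funding_source": "Other Payer", "expected_revenue": 50.0},
]

# "Expected Revenue by Funding Source": sum per source.
revenue_by_source = defaultdict(float)
for c in claims:
    revenue_by_source[c["funding_source"]] += c["expected_revenue"]

# "Expected Revenue % by Funding Source": each source's share of the total.
total = sum(revenue_by_source.values())
pct_by_source = {s: 100.0 * amt / total for s, amt in revenue_by_source.items()}

print(dict(revenue_by_source))
print(pct_by_source)
```

The monthly breakdown is the same computation bucketed by claim month.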
Community Year End with Monthly
Breakdown Metrics (cont’d)
• Value
– They highlight general trends to provide proof of
improvement (and/or areas for improvement) over
time (demonstrating the ROI of the EHR investment)
– They are the baseline from which the Open Minds
user groups will be conducted (and upon which
additional benchmarking measurements will be
developed)
– For customers participating in the Open Minds User
Group, we will include (graph) their individual
performance against the benchmark data
26
Example - Community Year End with
Monthly Breakdown Metrics
27
Example - Community Year End with
Monthly Breakdown Metrics
28
Example - Community Year End with
Monthly Breakdown Metrics
29
Community Year End with Monthly
Breakdown Metrics (cont’d)
• Note:
– These metrics include customers live as of 12/31/11
(*claim lifecycle includes those live as of 2009).
– They measure usage over the course of 2012, with the
exception of one (Claim Lifecycle) which measures
across two years (2011 – 2012)
– They were distributed at the conference and were sent
to all other eligible customers prior to 4/26
– There are some customers whose data was either
incomplete or was a significant outlier and therefore
were excluded from the benchmark measurement
30
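The deck does not say how a "significant outlier" is identified before exclusion from the benchmark; one common choice is the 1.5 × IQR rule, sketched here as an illustration rather than the method QSI necessarily uses:

```python
import statistics

# Drop significant outliers before computing a benchmark. The 1.5 × IQR
# rule used here is a common convention, not necessarily the one QSI applies.
def without_outliers(values):
    q1, _, q3 = statistics.quantiles(values, n=4)  # quartile cut points
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [v for v in values if lo <= v <= hi]

claim_lifecycle_days = [12, 14, 15, 13, 16, 95]  # 95 would skew the benchmark
print(without_outliers(claim_lifecycle_days))    # [12, 14, 15, 13, 16]
```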
Success Metrics Overview – What’s
Useful/Relevant When?
• Stage = Existing (Prospect) - this is where/how we can show customers “what to
expect” post go live
– All example metrics
– Percentile summary
• Stage = Building (Implementation) – this is where customers can determine critical
success factors and what measurements will prove their success
– All example metrics
– Percentile summary
• Stage = Emerging (Go Live, 1 – 3 months post) – this is where we are focused on the
“must haves” to ensure successful go live
– Post go live success metrics
– Percentile summary
• Stage = Rising (Live less than 1 year, 3 – 12 months) – this is where we are focused
more on workflow/process and other system usage efficiencies
– Post go live success metrics
– Agency year end with community comparison
– Agency year end with peer comparison
– Community year end with monthly breakdown
• Stage = Thriving (Live greater than 1 year) – this is where we are focused more on big
picture and overall agency success (growth of business/adoption of new
functionality, etc)
– Agency year end with community comparison
– Agency year end with peer comparison
– Community year end with monthly breakdown
31
Success Metrics Summary
(What the data has shown so far)
Success Metrics Summary – What we Know
• In General:
– More than we did before
• Immensely helpful to implementations
– Positive feedback on the direction we’re heading
(data provided beyond the post go live metrics is
helpful, but customers need more specificity)
– Customers find they are doing better than they
believed they were
– We believe that, longer term, overlay analytics
software is the right way to go for dashboards
33
Success Metrics Summary – What we Know
(cont’d)
• From the Data:
–Customers continue to improve over
time (claim lifecycle & clinical
efficiency)
–No-show rates are lower than the
industry average*
–Revenues do increase*
34
Success Metrics (future)
Success Metrics (future)
– C3 Insights
• Impact of configurability of CareLogic
• Need further breakdowns
• Need more often (with more trending)
• Need “executive” views (dashboards/drill down)
– Open Minds Input
• Determination of metrics
• Design/development of dashboards
– Benchmarking User Groups
– Open Minds led reviews of metrics and how to use for
performance improvement and organizational readiness for
healthcare reform
– Original 5/22 start pushed in order to refine metrics further
36
Success Metrics – Supporting Materials
• Distributed at C3, Emailed (if not at C3), and to be posted to QSI Connect by 5/3
(http://connect.qualifacts.com/page/success-metrics-system-audit-queries)
– Qualifacts Success Metrics (Overview)
– Post Go Live Success Metrics
• Quick Reference Guide
• Full Guide
• Example Spreadsheet
• Example Post Go Live Health Report
– Agency Year End with Community & Peer Comparison Metrics
• Cover Sheet
• Quick Reference Guide
• Examples
• Percentile Summary
– Community Year End with Monthly Breakdown Metrics
• Cover Sheet
• Example
37
Q & A?
38

Weitere ähnliche Inhalte

Was ist angesagt?

Melissa Carter Resume 2015
Melissa Carter Resume 2015Melissa Carter Resume 2015
Melissa Carter Resume 2015Melissa Carter
 
Developing an Acquisition Centre of Excellence for Effective Sourcing and Sup...
Developing an Acquisition Centre of Excellence for Effective Sourcing and Sup...Developing an Acquisition Centre of Excellence for Effective Sourcing and Sup...
Developing an Acquisition Centre of Excellence for Effective Sourcing and Sup...Alan McSweeney
 
Bpm Implementation Success Criteria And Best Practice
Bpm Implementation   Success Criteria And Best PracticeBpm Implementation   Success Criteria And Best Practice
Bpm Implementation Success Criteria And Best PracticeAlan McSweeney
 
3 Steps to Optimize Project Reporting using SharePoint
3 Steps to Optimize Project Reporting using SharePoint3 Steps to Optimize Project Reporting using SharePoint
3 Steps to Optimize Project Reporting using SharePointBrightWork
 
Resume of Cathy A Harris
Resume of Cathy A HarrisResume of Cathy A Harris
Resume of Cathy A HarrisCathy A Harris
 
7 datta-well point-selectingmetrics
7 datta-well point-selectingmetrics7 datta-well point-selectingmetrics
7 datta-well point-selectingmetricsMedical_Optima
 
Royal Borough of Windsor and Maidenhead - Implementing InPhase: The first six...
Royal Borough of Windsor and Maidenhead - Implementing InPhase: The first six...Royal Borough of Windsor and Maidenhead - Implementing InPhase: The first six...
Royal Borough of Windsor and Maidenhead - Implementing InPhase: The first six...Aden Maine
 
Results based planning and management
Results based planning and managementResults based planning and management
Results based planning and managementBilal Naqeeb
 
Professional and Motivated
Professional and MotivatedProfessional and Motivated
Professional and MotivatedNykki Humphrey
 
Monitoring, evaluation and accountability staff presentation
Monitoring, evaluation and accountability staff presentationMonitoring, evaluation and accountability staff presentation
Monitoring, evaluation and accountability staff presentationkltpollock
 
Tyler Moser Resume_Final
Tyler Moser Resume_FinalTyler Moser Resume_Final
Tyler Moser Resume_FinalTyler Moser
 
The Need For Effective Early Engagement In Solution Architecture And Design
The Need For Effective Early Engagement In Solution Architecture And DesignThe Need For Effective Early Engagement In Solution Architecture And Design
The Need For Effective Early Engagement In Solution Architecture And DesignAlan McSweeney
 
Sage Grant Management Webinar
Sage Grant Management Webinar Sage Grant Management Webinar
Sage Grant Management Webinar Abila
 
2011 APA Measurable Outcomes in Planning - Minneapolis
2011 APA Measurable Outcomes in Planning - Minneapolis2011 APA Measurable Outcomes in Planning - Minneapolis
2011 APA Measurable Outcomes in Planning - MinneapolisJoseph Horwedel
 
Mark Davies cv Dec 16th 2015
Mark Davies cv Dec 16th 2015Mark Davies cv Dec 16th 2015
Mark Davies cv Dec 16th 2015mark davies
 

Was ist angesagt? (19)

Melissa Carter Resume 2015
Melissa Carter Resume 2015Melissa Carter Resume 2015
Melissa Carter Resume 2015
 
Developing an Acquisition Centre of Excellence for Effective Sourcing and Sup...
Developing an Acquisition Centre of Excellence for Effective Sourcing and Sup...Developing an Acquisition Centre of Excellence for Effective Sourcing and Sup...
Developing an Acquisition Centre of Excellence for Effective Sourcing and Sup...
 
Cognos Business Intelligence
Cognos Business IntelligenceCognos Business Intelligence
Cognos Business Intelligence
 
Bpm Implementation Success Criteria And Best Practice
Bpm Implementation   Success Criteria And Best PracticeBpm Implementation   Success Criteria And Best Practice
Bpm Implementation Success Criteria And Best Practice
 
Hm 418 harris ch13 ppt
Hm 418 harris ch13 pptHm 418 harris ch13 ppt
Hm 418 harris ch13 ppt
 
3 Steps to Optimize Project Reporting using SharePoint
3 Steps to Optimize Project Reporting using SharePoint3 Steps to Optimize Project Reporting using SharePoint
3 Steps to Optimize Project Reporting using SharePoint
 
Resume of Cathy A Harris
Resume of Cathy A HarrisResume of Cathy A Harris
Resume of Cathy A Harris
 
7 datta-well point-selectingmetrics
7 datta-well point-selectingmetrics7 datta-well point-selectingmetrics
7 datta-well point-selectingmetrics
 
Royal Borough of Windsor and Maidenhead - Implementing InPhase: The first six...
Royal Borough of Windsor and Maidenhead - Implementing InPhase: The first six...Royal Borough of Windsor and Maidenhead - Implementing InPhase: The first six...
Royal Borough of Windsor and Maidenhead - Implementing InPhase: The first six...
 
Results based planning and management
Results based planning and managementResults based planning and management
Results based planning and management
 
Professional and Motivated
Professional and MotivatedProfessional and Motivated
Professional and Motivated
 
Monitoring, evaluation and accountability staff presentation
Monitoring, evaluation and accountability staff presentationMonitoring, evaluation and accountability staff presentation
Monitoring, evaluation and accountability staff presentation
 
Tyler Moser Resume_Final
Tyler Moser Resume_FinalTyler Moser Resume_Final
Tyler Moser Resume_Final
 
The Need For Effective Early Engagement In Solution Architecture And Design
The Need For Effective Early Engagement In Solution Architecture And DesignThe Need For Effective Early Engagement In Solution Architecture And Design
The Need For Effective Early Engagement In Solution Architecture And Design
 
Result based management
Result based management Result based management
Result based management
 
SSealey Resume_SHORT
SSealey Resume_SHORTSSealey Resume_SHORT
SSealey Resume_SHORT
 
Sage Grant Management Webinar
Sage Grant Management Webinar Sage Grant Management Webinar
Sage Grant Management Webinar
 
2011 APA Measurable Outcomes in Planning - Minneapolis
2011 APA Measurable Outcomes in Planning - Minneapolis2011 APA Measurable Outcomes in Planning - Minneapolis
2011 APA Measurable Outcomes in Planning - Minneapolis
 
Mark Davies cv Dec 16th 2015
Mark Davies cv Dec 16th 2015Mark Davies cv Dec 16th 2015
Mark Davies cv Dec 16th 2015
 

Ähnlich wie Customer Success Metrics

Client Highlight- At Joint Commission: The Progression of a Planning & Foreca...
Client Highlight- At Joint Commission: The Progression of a Planning & Foreca...Client Highlight- At Joint Commission: The Progression of a Planning & Foreca...
Client Highlight- At Joint Commission: The Progression of a Planning & Foreca...Emtec Inc.
 
Financial Metrics That Mean to Nonprofits
Financial Metrics That Mean to NonprofitsFinancial Metrics That Mean to Nonprofits
Financial Metrics That Mean to NonprofitsIntacct Corporation
 
Measuring Business Analyst Impact
Measuring Business Analyst ImpactMeasuring Business Analyst Impact
Measuring Business Analyst ImpactASPE, Inc.
 
Wellpoint: Selecting, Managing and Deploying Metrics
Wellpoint: Selecting, Managing and Deploying MetricsWellpoint: Selecting, Managing and Deploying Metrics
Wellpoint: Selecting, Managing and Deploying MetricsMedical Optima
 
Consulting Brochure / The Performance Institute
Consulting Brochure / The Performance InstituteConsulting Brochure / The Performance Institute
Consulting Brochure / The Performance InstituteNicole Cathcart
 
Costing Centre of Expertise
Costing Centre of ExpertiseCosting Centre of Expertise
Costing Centre of Expertisefmi_igf
 
Annual Results and Impact Evaluation Workshop for RBF - Day Seven - Measureme...
Annual Results and Impact Evaluation Workshop for RBF - Day Seven - Measureme...Annual Results and Impact Evaluation Workshop for RBF - Day Seven - Measureme...
Annual Results and Impact Evaluation Workshop for RBF - Day Seven - Measureme...RBFHealth
 
Annual Results and Impact Evaluation Workshop for RBF - Day Three - Measureme...
Annual Results and Impact Evaluation Workshop for RBF - Day Three - Measureme...Annual Results and Impact Evaluation Workshop for RBF - Day Three - Measureme...
Annual Results and Impact Evaluation Workshop for RBF - Day Three - Measureme...RBFHealth
 
INTRODUCTION TO PROGRAMME DEVELOPMENT..ppt
INTRODUCTION TO PROGRAMME DEVELOPMENT..pptINTRODUCTION TO PROGRAMME DEVELOPMENT..ppt
INTRODUCTION TO PROGRAMME DEVELOPMENT..pptmodestuseveline
 
What ISO Management Systems can learn from Balanced Scorecard?
What ISO Management Systems can learn from Balanced Scorecard?What ISO Management Systems can learn from Balanced Scorecard?
What ISO Management Systems can learn from Balanced Scorecard?PECB
 
Module 4.2 - Performance management
Module 4.2 - Performance managementModule 4.2 - Performance management
Module 4.2 - Performance managementszpinter
 
Metrics for Charities & Non-Profits
Metrics for Charities & Non-ProfitsMetrics for Charities & Non-Profits
Metrics for Charities & Non-ProfitsChief Innovation
 
Philips SCS CxP Satmetrix Conf Presentation Jan 23
Philips SCS CxP Satmetrix Conf Presentation Jan 23Philips SCS CxP Satmetrix Conf Presentation Jan 23
Philips SCS CxP Satmetrix Conf Presentation Jan 23Kimberly Simpson
 
Developing a Customer Centric Research Program
Developing a Customer Centric Research ProgramDeveloping a Customer Centric Research Program
Developing a Customer Centric Research ProgramBusiness Over Broadway
 
Scottish comms network paul njoku - 14 and 15 may 2014
Scottish comms network   paul njoku - 14 and 15 may 2014Scottish comms network   paul njoku - 14 and 15 may 2014
Scottish comms network paul njoku - 14 and 15 may 2014Jane Robson
 
Customer Satisfaction in Shared Services
Customer Satisfaction in Shared ServicesCustomer Satisfaction in Shared Services
Customer Satisfaction in Shared ServicesScottMadden, Inc.
 
Targets That Work (for the Service Desk), Susan Storey
Targets That Work (for the Service Desk), Susan StoreyTargets That Work (for the Service Desk), Susan Storey
Targets That Work (for the Service Desk), Susan StoreyService Desk Institute
 

Ähnlich wie Customer Success Metrics (20)

Client Highlight- At Joint Commission: The Progression of a Planning & Foreca...
Client Highlight- At Joint Commission: The Progression of a Planning & Foreca...Client Highlight- At Joint Commission: The Progression of a Planning & Foreca...
Client Highlight- At Joint Commission: The Progression of a Planning & Foreca...
 
Financial Metrics That Mean to Nonprofits
Financial Metrics That Mean to NonprofitsFinancial Metrics That Mean to Nonprofits
Financial Metrics That Mean to Nonprofits
 
Measuring Business Analyst Impact
Measuring Business Analyst ImpactMeasuring Business Analyst Impact
Measuring Business Analyst Impact
 
Wellpoint: Selecting, Managing and Deploying Metrics
Wellpoint: Selecting, Managing and Deploying MetricsWellpoint: Selecting, Managing and Deploying Metrics
Wellpoint: Selecting, Managing and Deploying Metrics
 
Using online tools to help us assess our public legal education work
Using online tools to help us assess our public legal education work Using online tools to help us assess our public legal education work
Using online tools to help us assess our public legal education work
 
Consulting Brochure / The Performance Institute
Consulting Brochure / The Performance InstituteConsulting Brochure / The Performance Institute
Consulting Brochure / The Performance Institute
 
Costing Centre of Expertise
Costing Centre of ExpertiseCosting Centre of Expertise
Costing Centre of Expertise
 
Annual Results and Impact Evaluation Workshop for RBF - Day Seven - Measureme...
Annual Results and Impact Evaluation Workshop for RBF - Day Seven - Measureme...Annual Results and Impact Evaluation Workshop for RBF - Day Seven - Measureme...
Annual Results and Impact Evaluation Workshop for RBF - Day Seven - Measureme...
 
Annual Results and Impact Evaluation Workshop for RBF - Day Three - Measureme...
Annual Results and Impact Evaluation Workshop for RBF - Day Three - Measureme...Annual Results and Impact Evaluation Workshop for RBF - Day Three - Measureme...
Annual Results and Impact Evaluation Workshop for RBF - Day Three - Measureme...
 
INTRODUCTION TO PROGRAMME DEVELOPMENT..ppt
INTRODUCTION TO PROGRAMME DEVELOPMENT..pptINTRODUCTION TO PROGRAMME DEVELOPMENT..ppt
INTRODUCTION TO PROGRAMME DEVELOPMENT..ppt
 
What ISO Management Systems can learn from Balanced Scorecard?
What ISO Management Systems can learn from Balanced Scorecard?What ISO Management Systems can learn from Balanced Scorecard?
What ISO Management Systems can learn from Balanced Scorecard?
 
Module 4.2 - Performance management
Module 4.2 - Performance managementModule 4.2 - Performance management
Module 4.2 - Performance management
 
Benchmark webinar presentation
Benchmark webinar presentationBenchmark webinar presentation
Benchmark webinar presentation
 
Metrics for Charities & Non-Profits
Metrics for Charities & Non-ProfitsMetrics for Charities & Non-Profits
Metrics for Charities & Non-Profits
 
Philips SCS CxP Satmetrix Conf Presentation Jan 23
Philips SCS CxP Satmetrix Conf Presentation Jan 23Philips SCS CxP Satmetrix Conf Presentation Jan 23
Philips SCS CxP Satmetrix Conf Presentation Jan 23
 
Developing a Customer Centric Research Program
Developing a Customer Centric Research ProgramDeveloping a Customer Centric Research Program
Developing a Customer Centric Research Program
 
Scottish comms network paul njoku - 14 and 15 may 2014
Scottish comms network   paul njoku - 14 and 15 may 2014Scottish comms network   paul njoku - 14 and 15 may 2014
Scottish comms network paul njoku - 14 and 15 may 2014
 
Customer Satisfaction in Shared Services
Customer Satisfaction in Shared ServicesCustomer Satisfaction in Shared Services
Customer Satisfaction in Shared Services
 
Targets That Work (for the Service Desk), Susan Storey
Targets That Work (for the Service Desk), Susan StoreyTargets That Work (for the Service Desk), Susan Storey
Targets That Work (for the Service Desk), Susan Storey
 
LFA & Indicator.pdf
LFA & Indicator.pdfLFA & Indicator.pdf
LFA & Indicator.pdf
 

Mehr von Qualifacts

Meaningful Use Survivor: 4 Steps to a Successful Audit
Meaningful Use Survivor: 4 Steps to a Successful AuditMeaningful Use Survivor: 4 Steps to a Successful Audit
Meaningful Use Survivor: 4 Steps to a Successful AuditQualifacts
 
All EHR's Are Not Created Equal
All EHR's Are Not Created EqualAll EHR's Are Not Created Equal
All EHR's Are Not Created EqualQualifacts
 
Vermont EHR Incentive Program
Vermont EHR Incentive ProgramVermont EHR Incentive Program
Vermont EHR Incentive ProgramQualifacts
 
Meaningful Use and Electronic Health Records: What You Need to Know
Meaningful Use and Electronic Health Records: What You Need to KnowMeaningful Use and Electronic Health Records: What You Need to Know
Meaningful Use and Electronic Health Records: What You Need to KnowQualifacts
 
Connecticut EHR Program: MUforBH.com
Connecticut EHR Program: MUforBH.comConnecticut EHR Program: MUforBH.com
Connecticut EHR Program: MUforBH.comQualifacts
 
MUforBH & Qualifacts Presents: Understanding A/I/U
MUforBH & Qualifacts Presents: Understanding A/I/UMUforBH & Qualifacts Presents: Understanding A/I/U
MUforBH & Qualifacts Presents: Understanding A/I/UQualifacts
 
EHR EP Decision Tool
EHR EP Decision ToolEHR EP Decision Tool
EHR EP Decision ToolQualifacts
 

Mehr von Qualifacts (7)

Meaningful Use Survivor: 4 Steps to a Successful Audit
Meaningful Use Survivor: 4 Steps to a Successful AuditMeaningful Use Survivor: 4 Steps to a Successful Audit
Meaningful Use Survivor: 4 Steps to a Successful Audit
 
All EHR's Are Not Created Equal
All EHR's Are Not Created EqualAll EHR's Are Not Created Equal
All EHR's Are Not Created Equal
 
Vermont EHR Incentive Program
Vermont EHR Incentive ProgramVermont EHR Incentive Program
Vermont EHR Incentive Program
 
Meaningful Use and Electronic Health Records: What You Need to Know
Meaningful Use and Electronic Health Records: What You Need to KnowMeaningful Use and Electronic Health Records: What You Need to Know
Meaningful Use and Electronic Health Records: What You Need to Know
 
Connecticut EHR Program: MUforBH.com
Connecticut EHR Program: MUforBH.comConnecticut EHR Program: MUforBH.com
Connecticut EHR Program: MUforBH.com
 
MUforBH & Qualifacts Presents: Understanding A/I/U
MUforBH & Qualifacts Presents: Understanding A/I/UMUforBH & Qualifacts Presents: Understanding A/I/U
MUforBH & Qualifacts Presents: Understanding A/I/U
 
EHR EP Decision Tool
EHR EP Decision ToolEHR EP Decision Tool
EHR EP Decision Tool
 

Customer Success Metrics

  • 1. Success Metrics Presenters: Robert Patton, Director, Implementation Services, Qualifacts Brandi S. Sanders, Manager, Implementation Specialists, Qualifacts Webinar - Tuesday, April 30, 2013 1:00pm – 2:30pm CST
  • 2. Success Metrics – Agenda • Success Metrics Detailed Review (current) –What they are and how to use them • Success Metrics Summary –What the data has shown so far • Success Metrics (future) –C3 Insights –Open Minds Input –Benchmarking User Groups 2
  • 4. Success Metrics Detailed Review (current) • There are 4 different “types” of metrics currently available –Go Live/Post go live success metrics –Agency year end with community comparison –Agency year end with peer comparison –Community year end with monthly breakdown 4
  • 5. Post Go Live Success Metrics • Background – Most common reasons for failed implementations: • Leadership issues – Wrong people driving the project – Lack of good bi-directional communication between leadership and staff • Workflow issues – Underestimating importance of workflows – Lack of a full workflow walk-thru to identify gaps and bottlenecks • Provider issues – Absence of strong champions – Failure of staff to understand their role in using the EHR • Training issues – Underestimating the amount/type of training required – Failure to assure that providers complete training • Eliminate the emotion of fear with a metric-driven EHR implementation - Source: White Paper - http://www.healthit.gov/sites/default/files/ehr_implementation_white_paper.pdf - EHR Implementation with Minimal Practice Disruption In Primary Care Settings - Authored By: Jeff Hummel, MD, MPH & Peggy Evans, PhD, CPHIT – Qualis Health, Seattle, Washington (WIREC consultant team contributing) - November 2012 5
  • 6. Post Go Live Success Metrics • Purpose – Designed to measure immediate customer success post go live – Key indicators for refresher training, effective use of the system, system configuration adjustments, etc • How Generated – Via the CareLogic system (and Excel) by QSI staff* • How/When Communicated – 5, 10, 15, 45 days post go live via status calls (implementation) – 60, 90, 180, 365 days post go live via customers or support as needed* 6
  • 7. Post Go Live Success Metrics List (27*) • Active Clients • Active Organizations • Active Programs and Activities • Active Payer Plans • Active Staff • Activity to Service Document signed • First Contact Date to First Appointment • Appointments by Type and Status • Services Not Authorized • Clients Seen • Failed Activities • Failed Claims • Real Time Productivity • Awaiting Claim Approval • All Claims / Approved Claims • Claim Batches • Overall Claiming Lifecycle • Activity to Claim Date • Average Daily Billable Services • Total Deposits • Total Deposits Amount • Total Approved Applied Amount • Total Unapproved Applied Amount • Total Unapplied Amount • Total Unposted Amount • Refund Amount • Recoup Amount • Month over Month Number of Claims Generated and Revenue 7
Post Go Live Success Metrics (cont’d)
• Value:
–They infuse accountability and visibility (for both QSI and the customer)
–They allow issues to be identified sooner rather than later
–They remove the emotion from implementation and provide the data necessary for executive sponsorship (both QSI and customer)
–They set the baseline for “what to expect” post go live (by month), so that customers can measure their performance and QSI knows where to jump in to make the customer as successful as possible
–They allow for the celebration of project successes
8
Example - Go Live – Appointments by Type and Status
9
Example - Go Live – Active Counts, Failed Activities, Failed Claims, Unstatused Appointments
10
Example - Go Live – Claims Pending Approval, Claim Batches, Total Deposits, Payments Applied/Unapplied/Unposted
11
Example - Go Live – Executive (Project Health Review)
12
Post Go Live Success Metrics (cont’d)
• Note:
–These are available to all customers post go live
–They are generated by the Implementation Specialists (and put into Excel for graphing), but have also been made available for customers to run
–The PMs create the post go live health report
–We are exploring a longer term solution for generating/presenting this data
13
Success Metrics
–Go Live/Post go live success metrics
–Agency year end with community comparison
–Agency year end with peer comparison
–Community year end with monthly breakdown
14
Agency Year End with Community & Peer Comparison Metrics
• Purpose
–Extension of the post go live metrics, with additional measurements and a percentile/ranking of the customer in comparison to the CareLogic community (all eligible QSI customers)
• “Peer” currently is based on total revenue billed through CareLogic (Under 1M, 1M to 5M, 5M to 10M, 10M to 20M, and 20M+)
• How Generated
–Via a QSI technical resource during off-business hours
–Compiled and ranked manually by a QSI resource (Excel)
• How/When Communicated
–Annually (currently)
15
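The peer-band and percentile logic above can be sketched in a few lines. This is an illustrative sketch only — the band edges come from the slide, but the function names, the agency data, and the at-or-below percentile convention are assumptions, not CareLogic's actual implementation.

```python
# Hypothetical sketch of peer grouping by total revenue billed through the
# EHR, plus a simple percentile rank against the peer band. Data is invented.

def peer_group(total_revenue):
    """Return the peer band (from the slide) for total billed revenue in USD."""
    if total_revenue < 1_000_000:
        return "Under 1M"
    elif total_revenue < 5_000_000:
        return "1M to 5M"
    elif total_revenue < 10_000_000:
        return "5M to 10M"
    elif total_revenue < 20_000_000:
        return "10M to 20M"
    return "20M+"

def percentile_rank(value, community_values):
    """Percent of agencies in the comparison set at or below this value."""
    at_or_below = sum(1 for v in community_values if v <= value)
    return 100.0 * at_or_below / len(community_values)

# Example: rank one agency's billed revenue against its peer band.
agencies = [
    ("A", 750_000), ("B", 3_200_000), ("C", 4_100_000),
    ("D", 12_500_000), ("E", 2_000_000),
]
peers = [rev for _, rev in agencies if peer_group(rev) == "1M to 5M"]
print(peer_group(3_200_000))             # prints "1M to 5M"
print(percentile_rank(3_200_000, peers))
```

The same ranking function works unchanged for the community comparison — just pass all eligible agencies instead of the peer band.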
Agency Year End with Community & Peer Comparison Metrics List
• Rank
• Community/Peer Average
• Community/Peer Median
• Active Clients - On 01/01/12
• Active Clients - On 12/31/12
• Clients Seen
• Active Staff - On 01/01/12
• Active Staff - On 12/31/12
• % of Service Providers (of Active Staff)
• Total Approved Claims
• Total Activities with Payment (Paid)
• Total Expected Revenue
• Avg. Expected Rev by Approved Claim
• Claim Lifecycle
• Avg Activity to Claim Days
• Avg Activity to Batch Days
• Avg Batch to Payment Applied Days
• Medicare - Revenue
• Medicare - Claims
• Medicaid - Revenue
• Medicaid - Claims
• Other Payer - Revenue
• Other Payer - Claims
• Total Billable Activities
• Billable Activities/Day (& Percentage Billable)
• Billable Activities/Service Providers
• Non Billable Activities/Day (& Percentage Non Billable)
• Appointments Kept
• Appointments CBC
• Appointments CBT
• Appointments DNS
• Avg Document Signature Time (Hours)
• Avg Scheduled to Kept Days
• # of Customer Pentaho Reports
• Services Not Authorized
• # of Claims by Program
• # of Non Billable Activities by Staff Credential
16
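The claim-lifecycle averages in the list above (Avg Activity to Claim Days, Avg Activity to Batch Days, Avg Batch to Payment Applied Days) are simple averages of day gaps between event dates. A minimal sketch, assuming an invented record layout — the field names and sample dates are illustrative, not CareLogic's schema:

```python
# Hypothetical sketch of the claim-lifecycle stage averages: each record
# carries the dates a claim passed through activity -> claim -> batch ->
# payment applied. All data below is invented for illustration.
from datetime import date
from statistics import mean

claims = [
    {"activity": date(2012, 3, 1), "claim": date(2012, 3, 4),
     "batch": date(2012, 3, 6), "payment_applied": date(2012, 3, 28)},
    {"activity": date(2012, 3, 2), "claim": date(2012, 3, 3),
     "batch": date(2012, 3, 8), "payment_applied": date(2012, 4, 2)},
]

def avg_days(records, start, end):
    """Average number of days between two lifecycle events."""
    return mean((r[end] - r[start]).days for r in records)

print("Avg Activity to Claim Days:", avg_days(claims, "activity", "claim"))
print("Avg Activity to Batch Days:", avg_days(claims, "activity", "batch"))
print("Avg Batch to Payment Applied Days:",
      avg_days(claims, "batch", "payment_applied"))
```

Summing the three stage averages approximates the overall claim lifecycle, which is why shrinking any one stage shows up directly in the year-over-year trend.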
Agency Year End with Community & Peer Comparison Metrics (cont’d)
• Value:
–They go beyond go live success and provide more/better data to drive agency success
–Upon root cause determination, their focus is more on workflow/process efficiency and system adoption (vs. the immediate issue/training focus of the post go live metrics)
–They encourage high and low performers to reach out to one another (and allow QSI to reach out to them) for best practices sharing, especially among customers of “like” business/operational size
17
Example - Agency Year End with Community & Peer Comparisons
18
Example - Agency Year End with Community & Peer Comparisons (Agency, Community, & Peer)
19
Example - Agency Year End with Community & Peer Comparisons (Agency, Community, & Peer)
20
Example - Agency Year End with Community
21
Agency Year End with Community & Peer Comparison Metrics (cont’d)
• Note:
–These metrics include customers live during 2012 (though for those who went live late in 2012, there most likely was not enough data to provide peer rankings)
–They were distributed at the conference and were sent to all other eligible customers prior to 4/26
–There are some customers for whom we don’t have the data (they do not use the full system – e.g. billing)
–Once we have finished incorporating customer and Open Minds feedback on how to expand the data (e.g. by line of service, further narrowed by payer, etc), we will publish the updated metrics
22
Success Metrics
–Go Live/Post go live success metrics
–Agency year end with community comparison
–Agency year end with peer comparison
–Community year end with monthly breakdown
23
Community Year End with Monthly Breakdown Metrics
• Purpose
–Six metrics (currently) designed to measure the overall use of CareLogic by the Qualifacts community, providing visibility into the industry so customers can ask the right questions of their own business practices
• How Generated
–Via a QSI technical resource during off-business hours
–Compiled and graphed manually by a QSI resource (Excel)
• How/When Communicated
–Annually (currently)
–Quarterly (for participants in the Open Minds User Group)
24
Community Year End with Monthly Breakdown Metrics List
• Expected Revenue by Funding Source
• Expected Revenue % by Funding Source
• Claim Lifecycle for 2012
• Claim Lifecycle January 2011 – December 2012
• Clinical Efficiency
• Appointment Status Comparison
25
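The Appointment Status Comparison above reduces to status percentages over all appointments, from which a no-show rate falls out. A minimal sketch using the status labels from the metrics list (Kept, CBC, CBT, DNS); reading CBC/CBT as client/therapist cancellations is an assumption here, and the sample counts are invented:

```python
# Hypothetical sketch of an appointment-status comparison: share of
# appointments by status, plus the no-show (DNS) rate. Data is invented.
from collections import Counter

# Status codes from the slide; CBC/CBT are assumed to mean cancelled by
# client / cancelled by therapist, and DNS did-not-show.
statuses = ["Kept"] * 85 + ["CBC"] * 6 + ["CBT"] * 3 + ["DNS"] * 6

counts = Counter(statuses)
total = sum(counts.values())
for status in ("Kept", "CBC", "CBT", "DNS"):
    pct = 100.0 * counts[status] / total
    print(f"{status}: {pct:.1f}%")

no_show_rate = 100.0 * counts["DNS"] / total
print(f"No-show rate: {no_show_rate:.1f}%")  # prints "No-show rate: 6.0%"
```

Computed monthly, the same percentages give the month-over-month trend that the community benchmark graphs show.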
Community Year End with Monthly Breakdown Metrics (cont’d)
• Value
–They highlight general trends to provide proof of improvement (and/or areas for improvement) over time, demonstrating the ROI of the EHR investment
–They are the baseline from which the Open Minds user groups will be conducted (and upon which additional benchmarking measurements will be developed)
–For customers participating in the Open Minds User Group, we will graph their individual performance against the benchmark data
26
Example - Community Year End with Monthly Breakdown Metrics
27
Example - Community Year End with Monthly Breakdown Metrics
28
Example - Community Year End with Monthly Breakdown Metrics
29
Community Year End with Monthly Breakdown Metrics (cont’d)
• Note:
–These metrics include customers live as of 12/31/11 (*Claim Lifecycle includes those live as of 2009)
–They measure usage over the course of 2012, with the exception of one (Claim Lifecycle), which measures across two years (2011 – 2012)
–They were distributed at the conference and were sent to all other eligible customers prior to 4/26
–Some customers whose data was either incomplete or a significant outlier were excluded from the benchmark measurement
30
Success Metrics Overview – What’s Useful/Relevant When?
• Stage = Existing (Prospect) – this is where/how we can show customers “what to expect” post go live
–All example metrics
–Percentile summary
• Stage = Building (Implementation) – this is where customers can determine critical success factors and which measurements will prove their success
–All example metrics
–Percentile summary
• Stage = Emerging (Go Live, 1 – 3 months post) – this is where we focus on the “must haves” to ensure a successful go live
–Post go live success metrics
–Percentile summary
• Stage = Rising (Live less than 1 year, 3 – 12 months) – this is where we focus more on workflow/process and other system usage efficiencies
–Post go live success metrics
–Agency year end with community comparison
–Agency year end with peer comparison
–Community year end with monthly breakdown
• Stage = Thriving (Live greater than 1 year) – this is where we focus more on the big picture and overall agency success (growth of business, adoption of new functionality, etc)
–Agency year end with community comparison
–Agency year end with peer comparison
–Community year end with monthly breakdown
31
Success Metrics Summary
(What the data has shown so far)
Success Metrics Summary – What we Know
• In General:
–More than we did before
• Immensely helpful to implementations
–Positive feedback on the direction we’re heading (the data provided beyond the post go live metrics is helpful, but customers need more specificity)
–Customers often find they are doing better than they feel they are
–Longer term, we believe overlay analytics software is the right way to deliver dashboards
33
Success Metrics Summary – What we Know (cont’d)
• From the Data:
–Customers continue to improve over time (claim lifecycle & clinical efficiency)
–No show rates are lower than industry*
–Revenues do increase*
34
Success Metrics (future)
–C3 Insights
• Impact of the configurability of CareLogic
• Need further breakdowns
• Need more often (with more trending)
• Need “executive” views (dashboards/drill down)
–Open Minds Input
• Determination of metrics
• Design/development of dashboards
–Benchmarking User Groups
• Open Minds-led reviews of metrics and how to use them for performance improvement and organizational readiness for healthcare reform
• Original 5/22 start pushed back in order to refine metrics further
36
Success Metrics – Supporting Materials
• Distributed at C3, emailed (if not at C3), and to be posted to QSI Connect by 5/3 (http://connect.qualifacts.com/page/success-metrics-system-audit-queries)
–Qualifacts Success Metrics (Overview)
–Post Go Live Success Metrics
• Quick Reference Guide
• Full Guide
• Example Spreadsheet
• Example Post Go Live Health Report
–Agency Year End with Community & Peer Comparison Metrics
• Cover Sheet
• Quick Reference Guide
• Examples
• Percentile Summary
–Community Year End with Monthly Breakdown Metrics
• Cover Sheet
• Example
37
