This document presents a framework for evaluating health IT projects. It consists of several components: [1] A project structure template to guide planning, preparation, evaluation, and dissemination of results. [2] Multidimensional evaluation methods that assess both qualitative and quantitative outcomes across technical, clinical, and organizational areas. [3] Criteria pools for selecting evaluation measures. [4] Guidelines for confidentiality, analysis, and reporting of results. The goal is to provide consistent, high-quality evaluation that identifies benefits and areas for improvement to inform future health IT implementations.
A Framework for Health IT Evaluation
1. A Framework for Health IT Evaluation
Jim Warren, Malcolm Pollock,
Sue White, Karen Day, Yulong Gu
The National Institute for Health Innovation (NIHI)
2. Motivation
• We need to learn from our health IT experiences
– The literature [only] indicates that health IT works sometimes (when reported), and in specific areas and for specific outcomes*
* Lau F, Kuziemsky C, Price M, Gardner J. A review on systematic reviews of health information system studies. J Am Med Inform Assoc. 2010;17(6):637-45.
3. National Health IT Plan
Creates a host of innovative health IT projects specifically in need of evaluation
4. Need for an evaluation framework
• Evaluation of health IT is complex
– Evaluation has many aspects: technology, procedure / workflow, culture
– Every aspect changes over time
• A framework provides
– Quality, consistency (and efficiency)
– Identification of key elements for planning, conduct, reporting and dissemination
5. Methodology
• Literature review
– Including Lau (Canada), Westbrook (Australia), Greenhalgh (UK)
• Application to 4 NZ electronic referral (eReferral) projects
• Iterative feedback
– Please consider yourself part of the process!
6. The Framework
• Intended use recommendations
• Project structure template and guidelines
• Moderation & support
• Confidentiality & analysis guidelines
• Ethics guidelines
• Criteria pool (with examples)
• Multi-dimensional evaluation methods
• Results reporting template
• Dissemination guidelines
7. Project plan template
Phase 1: Planning
Vision, Timeline, Staffing, Budgeting, Monitoring
Phase 2: Preparation
Governance, Local Commitment, Approach Definition, Formal Approvals
Phase 3: Evaluation
Iterative, Qualitative & Quantitative, Synthesis and Analysis
Phase 4: Dissemination
Interim (formative) and Summative
8. Multidimensional evaluation
• Not narrowly focused on a single outcome
– Draw from criteria pool
– Thematic analysis of semi-structured interviews
– Analysis of transactional data: this framework is for systems seeing real health delivery use, so there are records
• Want to know what's good, and what could be better
• Action Research (AR) orientation
– Probe for solutions to problems identified
– Disseminate early and often
9. Criteria Pool (1)
• Impact
– Work and communication patterns, e.g. relationship between types of providers
– Organisational culture, e.g. patient engagement
– Safety & quality
– Clinical effectiveness (generally process measures only)
10. Criteria Pool (2)
• Product
– IT system integrity, including standards compliance
– Usability: efficiency, learnability, satisfaction
– Vendor factors
• Process
– Project management
– Participant experience
– Leadership and governance
11. What can the 'transactional' data tell us?
• Quantitative
– Use / uptake: substantial? sustained?
– Cycle times
– Message sizes and authorship distribution
• Qualitative
– Message content themes
12. What can interviews and focus groups tell us?
• Key benefits / value-add
• Perceived threats
• Areas for improvement, e.g. usability barriers
• Changes in culture and networks
• Experience of the process of system adoption
• How might we do it better in the future?
• And it builds stakeholder engagement – esp. when people find out they've been listened to!
13. Dissemination of Findings
• Must put the results to work
• Interim (formative)
– Inform and encourage stakeholders, evoke their feedback
• Summative
– Multi-modal approach: stakeholder voices, as well as stats
– Integral to project plan
– Create and sustain communities of interest
– Learning continues
15. Further Dissemination Pathways
• Conventional written report
– Web presence (intro page, HTML exec summary, PDF full report)
– Clear authorship; invite citation and feedback
• Seminars
• Media
• Organisational channels (e.g. newsletters)
• Academic publication
• Promote systemic usage
16. What have we learned?
• Take a pragmatic approach to evaluation, using both qualitative and quantitative data
• Iterate!
• Action Research (AR) orientation
• Dissemination of the findings is integral and should reach all stakeholders considering uptake of similar technology
17. Questions?
Thank you!
jim@cs.auckland.ac.nz
(CD with full reports from NIHI stand)