1
2
3
Introduction:
 Since 1995, healthcare applications have appeared in the hope of increasing efficiency, reducing costs, and improving patient care.
Medical information systems involve:
• Computer-stored databases: patient information to support medical order entry, results reporting, decision support systems, and clinical reminders.
• Comprehensive systems: coordinate patient care activities by linking computer terminals in patient care areas to all departments.
• Smaller separate systems: link patient care areas to only one department (laboratory, radiology, and pharmacy systems, expert systems, computerized databases).
CPOE
• Concerns about patient safety and medical error
• Need for evaluation
 Unfortunately, reports of system failures have continued.
1. Anderson, J.G. and C. Aydin, Evaluating the Organizational Impact of Healthcare Information Systems. Springer, 2005, pp. 5-8.
1995: healthcare applications appeared.
Hope: increasing efficiency, reducing costs, and improving patient care.
4
Evaluation Definition:
 Evaluation can be defined as "the act of measuring or exploring properties of a health information system (in planning, development, implementation, or operation), the result of which informs a decision to be made concerning that system in a specific context."
 Evaluation should address not only technology assessment but also the social and behavioral processes involved.
1.Ammenwerth, E., et al., Evaluation of health information systems—problems and challenges.
International Journal of Medical Informatics, 2003. 71(2-3): p. 125-135.
2. Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press.
During IT evaluation, not only the technology itself but also the interaction between IT and human players in their information processing must be considered.
5
Evaluation is the means to assess the quality, value, effects, and impacts of IT in the health care environment.
6
Aim of evaluation:
To provide the basis for a decision about the IT system investigated.
The decision-making context is also the context of the evaluation. [2]
1- Yusof, M. M., et al. (2008). "Investigating evaluation frameworks for health information systems." Int J Med Inform
77(6): 377-385.
2-Brender, J., Handbook of Evaluation Methods for Health Informatics ,2006: Academic Press.p 9-11.
Evaluation aims to improve performance, outcomes, safety, and effectiveness.
7
Questions to be answered:
 Evaluation seeks to answer the:
Why (objective of the evaluation),
Who (which stakeholders' perspective is going to be evaluated),
When (which phase in the system development life cycle),
What (aspects or focus of the evaluation; what is it going to be used for?), and
How (methods of evaluation) questions. [1],[2]
1- Maryati Mohd. Yusof, R.J.P., Lampros K. Stergioulas. Towards a Framework for Health Information Systems
Evaluation. in Proceedings of the 39th Hawaii International Conference on System Sciences. 2006. Hawaii IEEE.
2-SYMONS, V.J., A review of information systems evaluation-content ,context and process. Eur J Inf Syst, 1991. 1(3): p.
205-212.
8
1. Friedman, C. and Wyatt, J., Evaluation methods in biomedical informatics. Springer, 2006, p. 6.
9
SDLC: Planning, Analysis, Design, Testing, Implementation, Maintenance
10
Pre-implementation evaluation, as well as post-implementation evaluation, informs difficult decisions prior to starting any program.
1.Nykanen, P., et al., Guideline for good evaluation practice in health informatics
(GEP-HI). Int J Med Inform, 2011. 80(12): p. 815-27.
11
Assessment:
There are two basic types of assessment:
I. Summative
II. Formative [1]
 Measurement: putting the results into a metric context, either
 qualitatively and subjectively, or
 quantitatively and objectively, e.g. with the aid of a questionnaire study. [2]
1. Brender, J., Handbook of Evaluation Methods for Health Informatics. Academic Press, 2006, pp. 9-14.
2. Brender, J., Handbook of Evaluation Methods for Health Informatics. Academic Press, 2006, pp. 20-29.
12
Formative and Summative evaluation
Formative evaluation
Throughout the systems
lifecycle
It provides information for
improving the system under
development
Summative evaluation
Focused on assessing the effect or
outcome of the evaluation object
At a certain point of time after
implementation
1.Nykanen, P., et al., Guideline for good evaluation practice in health informatics
(GEP-HI). Int J Med Inform, 2011. 80(12): p. 815-27.
13
Quantitative & qualitative methods:
• Task analysis
• Interface design
• Time motion analysis
• Software log
• Questionnaire
Quantitative methods
• Think aloud protocol
• Unstructured interviews
Qualitative methods
To assess:
• user satisfaction
• user-perceived usefulness
• usability of this framework.
Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press.
14
Categories of most evaluation studies:
I. CPR evaluation studies,
II. Telemedicine evaluation studies,
III. DSS evaluation studies.
 Evaluation can be conveniently classified into:
Objectivist and Subjectivist approaches . [2]
1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press.
2- C.P. Friedman, J. Wyatt, Evaluation methods in biomedical informatics, 2nd ed., Springer Science
& Business,Media, New York, US, 2006.
15
Methodology and Methods:
A method is based on a well-defined theory and includes a
consistent set of techniques, tools and principles to
organize it.
Hierarchy: Methodology → Methods → Metrics → Measures
Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press. p 13.
16
Methodology
• Methodology is supposed to:
(1) provide the answer to what to do next, when to do what, and how to do it;
(2) describe the ideas behind such choices and the suppositions (for instance, the philosophical background) behind them.
 A methodology must comprise:
a) The basic philosophies and theories, so that a user of the methodology can
judge the validity of its use
b) Perspective
c) Assumptions
d) Areas of use
e) Applicable methods, tools, and techniques
Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press. PP14-15.
17
Complexity of evaluation
in biomedical informatics
1.Friedman C., W.J., Evaluation methods in biomedical informatics. 2006:
Springer .p 6.
Evaluation in medical informatics lies at the intersection of evaluation methodology, computer-based information systems, and medicine and healthcare delivery.
18
19
User Requirements Specification
• Analysis of Work Procedures
• Assessment of Bids
• Balanced Scorecard
• BIKVA
• Delphi: Qualitative assessment
• Field Study
• Focus Group Interview
• Future Workshop
• Grounded Theory
• Heuristic Evaluation
• Interview (nonstandardized)
• KUBI
• Logical Framework Approach
• Organizational Readiness
• Pardizipp
• Questionnaire (nonstandardized)
• Requirements Assessment
• Risk Assessment
• Social Network Analysis
• Stakeholder Analysis
• SWOT
• Usability
• Video recording
• WHO: Framework for Assessment of
Strategies
20
Technical Development Phase
Used to provide feedback for the technical development.
 Balanced Score card
 Clinical/Diagnostic Performance
 Cognitive Assessment
 Cognitive Walkthrough
 Heuristic Evaluation
 Risk Assessment
 SWOT
 Technical Verification
 Think Aloud
 Usability
21
Assessment Methods: Adaptation Phase
• Analysis of Work Procedures
• BIKVA
• Clinical/Diagnostic Performance
• Cognitive Assessment
• Cognitive Walkthrough
• Equity Implementation Model
• Field Study
• Focus Group Interview
• Functionality Assessment
• Grounded Theory
• Heuristic Evaluation
• Interview (nonstandardized)
• Prospective Time Series
• Questionnaire (nonstandardized)
• RCT, Randomized Controlled Trial
• Risk Assessment
• Root Causes Analysis
• Social Network Analysis
• SWOT
• Technical Verification
• Think Aloud
• Usability
• User Acceptance and Satisfaction
• Videorecording
Real operational assessment can take place.
22
Assessment Methods: Evaluation Phase
1. Analysis of Work Procedures
2. Balanced Score card
3. BIKVA
4. Clinical/Diagnostic Performance
5. Cognitive Assessment
6. Cognitive Walkthrough
7. Delphi
8. Equity Implementation Model
9. Field Study
10. Focus Group Interview
11. Functionality Assessment
12. Grounded Theory
13. Heuristic Evaluation
14. Impact Assessment
15. Interview (nonstandardized)
16. KUBI
17. Prospective Time Series
18. Questionnaire (nonstandardized)
19. RCT, Randomized Controlled Trial
20. Risk Assessment
21. Root Causes Analysis
22. Social Network Analysis
23. Stakeholder Analysis
24. SWOT
25. Technical Verification
26. Think Aloud
27. Usability
28. User Acceptance and Satisfaction
29. Videorecording
30. WHO: Framework for Assessment
of Strategies
23
• Usability methods, such as heuristic evaluation, cognitive walkthroughs, RCTs, and user testing, are increasingly used to evaluate and improve the design of clinical software applications.
• Evaluation studies do not focus solely on the structure and
function of information resources; they also address their
impact on persons who are customarily users of these
resources and on the outcomes of users’ interactions with
them to understand users’ actions.
1.Friedman C., W.J., Evaluation methods in biomedical informatics. 2006: Springer .p 6.
2- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic
Press.
24
1- Analysis of Work Procedures
• Assessing what actually happens compared to the expectations.
They may clearly have a role as part of an evaluation study.
• Some well-known options :
1. The Learning Organization
2. Enterprise modeling
3. Business Process Reengineering
4. Use Cases and scenarios
5. Total Quality Management
6. Health Technology Assessment (HTA)
7. Computer-Supported Cooperative Work (CSCW)
8. Cognitive Task Analysis
1- Brender, J., Handbook of Evaluation Methods for Health Informatics
2006: Academic Press.
25
Analysis of Work Procedures
*Note: The use of diagramming techniques and other forms of graphical modeling requires experience and an understanding of the principles and procedures involved.
For systems analyses in a health care environment, diagramming techniques can be used to provide a clear structure: a framework of potential views and levels of systems.
26
Enterprise Modeling
Enterprise Architecture (EA) is a strategic activity and
planning tool for an enterprise, which facilitates decision-
making by enabling a conceptual view of the enterprise.
The main objective of an EA approach is to define the
layout of organizational components and relationships
among them.
27
Business Process Modeling
28
Use Case and Scenario
29
Health Technology Assessme
nt (HTA):
• HTA is concerned with the systematic evaluation of the
consequences of the adoption and use of new health
technologies and improving the evidence on existing
technologies. [1,2]
• One of the basic lessons learned in the area of HCI is
that usability evaluation should start early in the design
process
• Goodman and Ahn (1999) provide a basic overview of
HTA principles and methods. [3]
1. O'Reilly, D., K. Campbell, and R. Goeree, Basics of health technology assessment.
Methods Mol Biol, 2009. 473: p. 263-83.
2. Stevens, A., R. Milne, and A. Burls, Health technology assessment: history and demand. J Public Health Med, 2003. 25(2): p. 98-101.
3. Brender, J., Handbook of Evaluation Methods for Health Informatics. Academic Press, 2006.
30
2- Balanced Scorecard
• Created in 1992 by Drs. Robert S. Kaplan and David P.
Norton, the Balanced Scorecard (BSC) is a revolutionary
way to handle strategy management.
• Balancing focus areas by means of a set of indicators for
a set of strategic objectives.
• The Balanced Scorecard is a tool used to measure an
organization’s activities and initiatives against its Vision,
Mission and Values as outlined in its Strategic Plan.
1. http://www.hrh.ca/balancedscorecard, Humber River Hospital, 10-20-2016
2. https://2gc.eu/, Balanced Scorecard reports, accessed 10-20-2016.
31
Balanced Scorecard is a Management System
Not just a
measurement system
You do not just measure
your heart beat
You use the
heart rate monitor
to manage your exercise
32
33
3- BIKVA: User Involvement in Quality Development
A tool for making critical, subjective decisions about an existing practice.
1. Identifying user satisfaction or dissatisfaction.
2. Summarizing the information in a group interview.
3. Users are encouraged to identify the reasons behind the incidents and interactions referred to.
4. Interview management in an iterative process, with the objective of clarifying all the disparities that relate to the issues of quality identified by the users.
5. The conclusions are then presented to the decision makers for an assessment.
1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic
Press.
34
BIKVA: User Involvement in Quality Development
 Developed in Denmark and it is an evaluation and quality
enhancement method .
 The evaluation method is interviewing. The evaluation
process starts from the clients, then moves to the front-line
staff (employees in direct contact with the clients) and
finally ends with managers and politicians. Clients are asked
to express and justify "why they are satisfied or dissatisfied
" with the services offered.
1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press.
(2016). "Linkki-hanke - yhteys työelämään." Retrieved 11/9/2016, 2016, from :
http://www.palmenia.helsinki.fi/linkki/.
35
3- Questionnaires
 Advantage:
• most people can manage it
 Different Types :
A. Open questions
B. Checklist questions
C. The Likert scale consisting of a bar with fields to tick on a scale
D. Multipoint scale
E. Semantic differential scale, which in tabular form uses columns with a value scale (for instance, "extremely", "very", ...)
F. Categorical scale
 Needs verification: thorough scientific validation of the questionnaire is needed in order to provide results leading to optimal action.
1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic
Press. p 164
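For illustration only (not from the cited handbook), a minimal Python sketch of how responses to a hypothetical set of five-point Likert items might be aggregated into a per-respondent score; the item names, respondents, and scoring rule are assumptions.

```python
# Minimal sketch (hypothetical data): aggregate five-point Likert responses
# into an unweighted mean score per respondent.
from statistics import mean

# 1 = "strongly disagree" ... 5 = "strongly agree"
responses = {
    "respondent_1": {"easy_to_use": 4, "fits_workflow": 5, "would_recommend": 4},
    "respondent_2": {"easy_to_use": 2, "fits_workflow": 3, "would_recommend": 2},
}

for person, items in responses.items():
    score = mean(items.values())          # unweighted mean across items
    print(f"{person}: mean Likert score = {score:.2f}")
```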
36
Questionnaires:
• Questionnaires contain items that are pre-determined by the investigators and consequently are of limited value in identifying new or emergent issues in the use of a system that the investigators have not previously thought of. [1]
1. Kushniruk, A.W. and V.L. Patel, Cognitive and usability engineering methods for the evaluation of clinical information systems. J Biomed Inform, 2004. 37(1): p. 56-76.
37
4- Clinical/Diagnostic Performance
 Measurement of the diagnostic performance, in measures of accuracy and precision, of IT-based expert systems and decision-support systems before the system is implemented.
 Measurement is done in the Technical Development phase, but it can continue during the operational phase (the Adaptation Phase and Evolution Phase).
 The clinical performance of the systems (for diagnostic, prognostic, screening tasks, etc.) is typically measured with measures from medicine, such as accuracy, precision, sensitivity, specificity, and predictive values.
1. Kaplan, B., Evaluating informatics applications—clinical decision systems. International Journal of Medical Informatics, 2001. 64: p. 15-37.
2. Brender, J., Handbook of Evaluation Methods for Health Informatics. Academic Press, 2006.
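As a worked illustration of the performance measures named above, a minimal Python sketch computing sensitivity, specificity, predictive values, and accuracy from a 2x2 confusion matrix; the counts are hypothetical, not data from the cited studies.

```python
# Minimal sketch: diagnostic performance from a 2x2 confusion matrix.
# The counts below are hypothetical, purely for illustration.
tp, fp, fn, tn = 45, 5, 10, 140   # true/false positives, false/true negatives

sensitivity = tp / (tp + fn)          # proportion of diseased cases detected
specificity = tn / (tn + fp)          # proportion of healthy cases correctly excluded
ppv = tp / (tp + fp)                  # positive predictive value
npv = tn / (tn + fn)                  # negative predictive value
accuracy = (tp + tn) / (tp + fp + fn + tn)

print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}, "
      f"PPV={ppv:.2f}, NPV={npv:.2f}, accuracy={accuracy:.2f}")
```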
38
39
*Point*
• Kaplan reviews the literature (only for the years 1997-1998) on the clinical performance of decision-support systems, including other review articles on this subject, and concludes that most studies use an experimental or randomized controlled clinical trial (RCT) design to assess system performance.
1.Kaplan, B., Evaluating informatics applications—clinical decision systems. International Journal of
Medical Informatics, 2001. 64: p. 15-37.
40
RCT is a trial in which subjects are randomly assigned to two groups:
one (the experimental group) receiving the intervention that is being
tested, and the other (the comparison group or controls) receiving an
alternative treatment.
The two groups are then followed up to see whether there are any differences in the results. This helps in assessing the effectiveness of the intervention.
40
41
Hierarchy of Study Types (increasing strength of evidence for causality between a risk factor and outcome):
• Descriptive: case report, case series, survey
• Analytic, observational: cross-sectional, case-control, cohort studies
• Analytic, experimental: randomized controlled trials
41
42
Types of studies (translated): descriptive vs. analytic. Analytic studies are either interventional (clinical trial, community trial, field trial) or observational (cross-sectional, case-control, cohort, ecological). Descriptive studies include case reports and case series.
Goal: to test hypotheses about the determinants of disease. In contrast, the goal of intervention studies is to test the efficacy of specific treatments or preventive measures by assigning individual subjects to one of two or more treatment or prevention options.
42
43
5- RCT, Randomized Controlled Trial
 Randomized controlled trials are the ideal study design to
evaluate the effectiveness of health-care interventions.
 RCT is used to identify marginal differences between two or more
types of treatment.
 Useful in assessment studies of IT-based solutions for decision-
support systems and expert systems, only to a limited degree for
other types of systems.
1. Navaneethan, S.D., et al., How to design a randomized controlled trial. Nephrology (Carlton), 2010. 15(8): p. 732-9.
2. Bothwell, L.E. and S.H. Podolsky, The Emergence of the Randomized, Controlled Trial. N Engl J Med, 2016. 375(6): p. 501-4.
3. Brender, J., Handbook of Evaluation Methods for Health Informatics. Academic Press, 2006, p. 172.
44
All RCTs are controlled clinical trials,
BUT
not all controlled trials are RCTs.
Randomization: simple, cluster, stratified.
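A minimal Python sketch of simple randomization, the first of the three schemes listed above; the allocation ratio and participant IDs are assumptions for illustration (cluster and stratified randomization would instead randomize whole units or randomize within strata).

```python
# Minimal sketch: simple randomization of participants to two arms.
import random

random.seed(42)                                       # fixed seed only for a reproducible example
participants = [f"P{i:03d}" for i in range(1, 11)]    # hypothetical participant IDs

allocation = {p: random.choice(["intervention", "control"]) for p in participants}
for p, arm in allocation.items():
    print(p, "->", arm)
```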
45
5- RCT, Randomized Controlled Trial
 An RCT is the most rigorous scientific method for
evaluating the effectiveness of health care
interventions.
 However, bias could arise when there are flaws in the
design and management of a trial.
1. Akobeng, A., Understanding randomized controlled trials. Archives of Disease in Childhood, 2005. 90(8): p. 840-844.
46
RCT: selecting the best prevention or treatment method for reaching the goal, by assessing:
• Effectiveness
• Side effects
47
Samples are divided into case and control groups by simple randomization. This is one of the strongest research methods in the medical sciences. It has three important characteristics:
• Random allocation of the samples to the study groups (Random Allocation)
• The presence of a control group
• The investigator's intervention in, and control of, the samples' exposure to the hypothesized cause.
Randomization
48
National Cancer Institute at the National Institutes of Health;
https://www.cancer.gov/about-cancer/treatment/clinical-trials/what-are-
trials/randomization; 11/2/2016.
49
RCT usage:
I. Some patients being treated under the old system and others under the new system, or equally with regard to staff;
II. Some departments keeping the old system and other departments getting the new system;
III. Comparing similar departments in different hospitals.
1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press.
50
For which cases is an RCT not suitable?
• Testing the etiology of disease
• rare outcome
• Low participation
• Exclusion
• Low cost
51
RCT is NOT suitable for:
• Etiology and clinical course (e.g., smoking and cancer)
• Rare and prolonged outcomes
52
CASP
RCTs - a checklist
• Good randomisation procedures
• Patients blind to treatment
• Clinicians blind to treatment
• All participants followed up
• All participants analysed in the groups to
which they were randomised (intention to
treat)
53
6- Delphi
• Delphi may be characterized as a method for structuring
a group communication process so that the process is
effective in allowing a group of individuals, as a whole,
to deal with a complex problem. [1,3]
• When judgment is necessary with controlled opinion
feedback: we use Delphi
• Qualitative assessment of an effect
• Method for the prediction of the future
• Key advantage : avoids direct confrontation of the
experts.
• Tool for expert problem solving [1,2]
1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press.
2.Okoli, C. and S.D. Pawlowski, The Delphi method as a research tool: an example, design considerations
and applications. Information & Management, 2004. 42(1): p. 15-29.
3. Ahmadi, F., Nasiriani, K., Abazari, P., The Delphi technique: a tool in research. Iranian Journal of Medical Education, 2008; 8(1): 185.
54
Delphi:
• Delphi survey, including guidelines for data collection, data analysis and reporting of results.
• Three types of Delphi method:
1- Classic Delphi, 2- Policy Delphi, 3- Decision Delphi
• Preparation of a questionnaire, including the perils and pitfalls associated with a questionnaire approach.
1. Okoli, C. and S.D. Pawlowski, The Delphi method as a research tool: an example, design considerations and applications. Information & Management, 2004. 42(1): p. 15-29.
2. Kushniruk, A.W. and V.L. Patel, Cognitive and usability engineering methods for the evaluation of clinical information systems. J Biomed Inform, 2004. 37(1): p. 56-76.
3. Ahmadi, F., Nasiriani, K., Abazari, P., The Delphi technique: a tool in research. Iranian Journal of Medical Education, 2008; 8(1): 185.
55
56
Experts selection:
57
Executive steps of the Delphi method (translated):
• The principle of the Delphi method is the use of collective wisdom and the pluralism of the participants' opinions.
• The number of participants varies according to the scope of the topic and the available resources.
• An effort is made to keep the participants anonymous to one another.
• The method involves several rounds of iteration, which aim to make maximum use of the group's opinions and to minimize disagreement and inconsistency; it usually ends after 2 or 3 rounds.
• The first-round checklist is designed as an open or semi-structured form (with an open field for proposing new items).
• The checklists of the subsequent rounds are designed in a structured form (without an open field).
• A 5-point Likert scale is typically used.
58
6- Delphi
1.Friedman C., W.J., Evaluation methods in biomedical informatics. 2006: Springer .p 182.
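To make the round-by-round aggregation concrete, a minimal Python sketch (an assumption-based illustration, not part of the cited Delphi guidelines) that scores one round of 5-point Likert ratings and flags items reaching a hypothetical consensus threshold of 75% of ratings at 4 or 5.

```python
# Minimal sketch: flag Delphi items that reach consensus in one round.
# Ratings are hypothetical 5-point Likert scores from an expert panel.
round_ratings = {
    "item_A": [5, 4, 4, 5, 3, 4, 5, 4],
    "item_B": [2, 3, 4, 2, 3, 3, 2, 4],
}
CONSENSUS = 0.75     # assumed threshold: 75% of experts rating 4 or 5

for item, ratings in round_ratings.items():
    agree = sum(r >= 4 for r in ratings) / len(ratings)
    status = "consensus reached" if agree >= CONSENSUS else "carry to next round"
    print(f"{item}: agreement={agree:.0%} -> {status}")
```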
59
7-Usability
• Usability evaluation is a method for identifying specific problems with IT products; it specifically focuses on the interaction between the user and the task in a defined environment. [1,4]
• The usability evaluation methods are so varied that choosing the appropriate one(s) is a difficult task for evaluators. Usability factors are the various features used to measure how easy systems are in supporting users' tasks. [2,3]
1. Brown III, W., et al., Assessment of the Health IT Usability Evaluation Model (Health-ITUEM) for evaluating mobile health (mHealth) technology. Journal of Biomedical Informatics, 2013. 46(6): p. 1080-1087.
2. Dhouib, A., et al., A classification and comparison of usability evaluation methods for interactive adaptive systems. 2016: p. 246-251.
3. Borycki, E., et al., Usability Methods for Ensuring Health Information Technology Safety: Evidence-Based Approaches. Contribution of the IMIA Working Group Health Informatics for Patient Safety. IMIA Yearbook of Medical Informatics, 2013: p. 20-27.
4. Brender, J., Handbook of Evaluation Methods for Health Informatics. Academic Press, 2006.
60
USABILITY EVALUATION METHODS FOR INTERACTIVE ADAPTIVE SYSTEMS:
• Cognitive Walkthrough
• Heuristic Evaluation
• Focus Group
• User-as-wizard
• Task experiment based
• Simulated users
1. Dhouib, A., et al., A classification and comparison of usability evaluation methods for interactive adaptive systems. 2016: p. 246-251.
61
Usability Evaluation Methods: Advantages and Disadvantages

Cognitive walkthrough
 Advantages: (+) Makes predictions about the design without involving users; (+) Can be used early in the preliminary evaluation phase.
 Disadvantages: (–) Certain adaptation aspects not covered; (–) Time consuming.

Heuristic evaluation
 Advantages: (+) Quick and low-cost evaluation method; (+) Can be done early in the IAS development process.
 Disadvantages: (–) Evaluators must be experts; (–) Need to choose the appropriate heuristics.

Focus group
 Advantages: (+) Fast to conduct; (+) Can provide large amounts of data in a short time.
 Disadvantages: (–) Supports only subjective opinions; (–) Depends on an experienced moderator.
1.Dhouib, A., et al., A classification and comparison of usability evaluation methods for interactive adaptive
systems. 2016: p. 246-251.
62
8- Cognitive Assessment
• Task-oriented
• Assessment of the cognitive aspects of the interaction
between an IT system and its users
• Cognitive aspects deal with the extent to which the system functions in accordance with the way the users think and work, with respect to user interaction with the IT system in general and with the user dialogue in particular.
1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press.
63
9- Cognitive Walkthrough
• The Cognitive Walkthrough is an analytical method designed to evaluate usability.
• Assessment of user 'friendliness' on the basis of a system design, from specification to mock-ups and early prototypes of the system, aimed at judging how well the system complies with the users' way of thinking.
• The cognitive walkthrough is a formalized way of imagining people's thoughts and actions when they use an interface for the first time.
1. Wharton, C., et al., The cognitive walkthrough method: a practitioner's guide, in Usability Inspection Methods, N. Jakob and L.M. Robert, Editors. 1994, John Wiley & Sons, Inc. p. 105-140.
2. Kushniruk, A.W. and Borycki, E.M., Human, Social, and Organizational Aspects of Health Information Systems. IGI Global, Premier Reference Source, 2008.
3. Xiao, T., Broxham, W., Stitzlein, C., Croll, J., & Sanderson, P., Two human-centered approaches to health informatics: Cognitive systems engineering and usability. in Proceedings of the WCC IFIP-IMIA International eHealth Joint Conference. 2010. Brisbane, Qld.
64
Cognitive Walkthrough
• CW is an inspection method for evaluating usability in a user
interface.
• The questions are aids to simulating the user’s cognitive process:
(1)Will the user be trying to achieve the right effect?
(2)Will the user discover that the correct action is available?
(3)Will the user associate the correct action with the desired effect?
(4)If the correct action is performed, will the user see that progress is being
made?
answered with YES or NO and reasons why the user will succeed or fail in
performing the action (“failure/success stories”). [1,2]
1. Bligård, L.-O. and A.-L. Osvalder, Enhanced cognitive walkthrough: development of the cognitive walkthrough method to better predict, identify, and present usability problems. Adv. in Hum.-Comp. Int., 2013. 2013: p. 9-17.
2. Wharton, C., et al., The cognitive walkthrough method: a practitioner's guide, in Usability Inspection Methods, N. Jakob and L.M. Robert, Editors. 1994, John Wiley & Sons, Inc. p. 105-140.
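A minimal Python sketch of how the four walkthrough questions above could be recorded per action step, with a failure story captured whenever an answer is NO; the task, the step, and the answers are hypothetical, not from the cited sources.

```python
# Minimal sketch: recording cognitive-walkthrough answers for one action step.
QUESTIONS = [
    "Will the user be trying to achieve the right effect?",
    "Will the user discover that the correct action is available?",
    "Will the user associate the correct action with the desired effect?",
    "If the correct action is performed, will the user see that progress is being made?",
]

# Hypothetical walkthrough of one step in an order-entry task.
step = {
    "action": "Select the medication from the order list",
    "answers": [True, False, True, True],          # YES/NO per question above
    "stories": {1: "The search field is hidden behind a collapsed panel."},
}

print("Step:", step["action"])
for i, (question, ok) in enumerate(zip(QUESTIONS, step["answers"])):
    verdict = "YES" if ok else "NO"
    print(f"Q{i + 1}: {verdict} - {question}")
    if not ok:
        print("   failure story:", step["stories"][i])
```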
65
CW
• The method focuses on simplicity in learning, especially
through exploratory learning. CW has been employed to
evaluate medical equipment, such as clinical information
systems, patient information systems, clinical order
systems , dialysis machines, and patient surveillance
systems .[1]
1. Bligård, L.-O. and A.-L. Osvalder, Enhanced cognitive walkthrough: development of the cognitive walkthrough method to better predict, identify, and present usability problems. Adv. in Hum.-Comp. Int., 2013. 2013: p. 9-17.
66
10-Focus Group Interview
• A focus group is a form of qualitative research in which
a group of people are asked about their perceptions,
opinions, beliefs, and attitudes towards a product.
• The moderator speaks very little, and encourages the
group to generate the information required by stimulating
discussion through terse provocative statements.
 Challenge: analyzing qualitative data
1.Rabiee, F., Focus-group interview and data analysis. Proc Nutr Soc, 2004. 63(4): p. 655-60.
2.Chapter 5: Personal Interviews. Marketing research and information systems. (Marketing and Agribusiness
Texts 2016 [cited 2016 10/31/2016]; Available from: http://www.fao.org/docrep/w3241e/w3241e06.htm.
67
Time of the focus group interview: during the early analysis phases or during operations.
68
 The focus group is used for eliciting descriptive data from population subgroups.
 Usually, a group of eight to twelve persons are gathered together for a
group interview or discussion on a focused topic.
 Focus groups are widely used in the investigation of applied-research
problems and are recognized as a distinct research method.
 The method enables researchers to generate new hypotheses;
 To explore intermediate variables as a means of explaining certain
relationships found in survey data.
1. Bender, D.E. and D. Ewbank, The focus group as a tool for health research: issues in design and analysis. Health Transit
Rev, 1994. 4(1): p. 63-80.
69
11- Equity Implementation Model (EIM)
• Examine users' reaction to the implementation of a new system
• It may be applied as a measuring instrument to provide the decision-making basis in relation to policy planning and priority setting, as well as for the examination of barriers and drivers in the implementation of change.
• The EIM helps provide a theory-based understanding for collecting
and reviewing users' reactions to, and acceptance or rejection of, a
new technology or system.
1. Lauer, T.W., K. Joshi, and T. Browdy, Use of the equity implementation model to review clinical system implementation efforts: a case report. J Am Med Inform Assoc, 2000. 7(1): p. 91-102.
2. Brender, J., Handbook of Evaluation Methods for Health Informatics. Academic Press, 2006.
70
• The method provides a means for explaining system implementation events, leading to an understanding of user resistance to, or acceptance of, the new technology.
• It may be applied as a measuring instrument to provide the decision-making basis in relation to policy planning and priority setting, as well as for the examination of barriers and drivers in the implementation of change.
71
12- Functionality Assessment
1. Validation of objectives fulfillment
2. Impact Assessment
3. Identification of problems
• It is best suited to qualitative studies
1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press.
72
13- Field Study
 Observation of an organization to identify its practices and to expose mechanisms that control change.
 This method is widely used in psychology, sociology, anthropology, and so on (that is, in professions that deal with different perspectives of human factors) to identify what goes on and how.
1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press.
73
14-Grounded Theory
• Grounded Theory is a supportive analytical method for
data acquisition methods that generate textual data, some
open Questionnaire methods, and Interviews.
• This method is subjective.
• In a social context, through an iterative and comparative process in four phases:
 Open coding
 Axial coding
 Selective coding
 Theoretical coding
1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press.
74
15-Heuristic Evaluation
• Used when no other realizable possibilities exist.
• This method can be used for nearly everything, but in practice it is most commonly used for the assessment of user interfaces.
• Heuristic evaluation, an established evaluation method in the field of Human-Computer Interaction, was originally proposed by Jakob Nielsen.
 Five phases:
1. Selection of appropriate heuristics,
2. An individual and independent inspection of the design using heuristics to identify features that conflict with some aspect of best practice,
3. Editing the joint material to identify and resolve duplicates and related findings,
4. Prioritization to identify severities,
5. Analysis and report writing.
1. Westbrook, J. and J.I. Westbrook, Information Technology in Health Care 2007: Proceedings of the 3rd International Conference on Information Technology in Health Care: Socio-Technical Approaches. IOS Press, 2007. pp 205-207.
2. Brender, J., Handbook of Evaluation Methods for Health Informatics. Academic Press, 2006.
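As an illustration of phases 3 and 4 above (merging duplicate findings and prioritizing by severity), a minimal Python sketch; the heuristic names, findings, and 0-4 severity scale are assumptions for the example.

```python
# Minimal sketch: merge duplicate heuristic findings and sort by severity (0-4).
from collections import defaultdict

# Hypothetical findings from several independent evaluators.
findings = [
    {"heuristic": "Visibility of system status", "issue": "No feedback after saving an order", "severity": 3},
    {"heuristic": "Visibility of system status", "issue": "No feedback after saving an order", "severity": 4},
    {"heuristic": "Error prevention", "issue": "Dose field accepts free text", "severity": 4},
]

merged = defaultdict(list)
for f in findings:
    merged[(f["heuristic"], f["issue"])].append(f["severity"])

report = [
    {"heuristic": h, "issue": i, "severity": max(sev)}   # keep the highest rated severity
    for (h, i), sev in merged.items()
]
for row in sorted(report, key=lambda r: r["severity"], reverse=True):
    print(f"[{row['severity']}] {row['heuristic']}: {row['issue']}")
```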
75
76
Zhang and colleagues incorporated Nielsen’s 10 heuristics ,
Shneiderman’s eight golden rules , and the results of their research to
formulate a list of 14 heuristics.
1. Allen, M., et al., Heuristic evaluation of paper-based Web pages: A simplified inspection usability methodology. Journal of Biomedical Informatics, 2006. 39(4): p. 412-423.
77
16- Impact Assessment
• Measurement of the effect plus assessment of side effects.
1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press.
78
17- Interview (Nonstandardized)
• For qualitative studies of subjective
as well as objective circumstances.
1- Brender, J., Handbook of Evaluation Methods for Health
Informatics 2006: Academic Press.
79
18- KUBI
• Translated: Quality development through user involvement
• It has many points in common with the Balanced Scorecard
method
1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press.
80
19- SWOT
• Allows the assessment of an organization from a neutral perspective through a detailed discussion of the organization's strengths, weaknesses, opportunities and threats.
81
SWOT
• The method is useful as a brainstorming method to elucidate and evaluate each of the aspects brought forward.
1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press.
82
20- Think Aloud
• Think Aloud is a method that requires users
to speak as they interact with an IT-based
system to solve a problem or perform a
task.
1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press.
83
• The thinking-aloud technique dates back to the work of experimental psychology and was first described by Karl Duncker (1945) while he studied productive thinking.
• TAP is popularly used by usability researchers today. But what do researchers think they get from TAP? Is it right to assume that there is a one-to-one mapping between verbal protocols and 'pure data'?
• TAP adds cognitive load on users and can hinder primary tasks (Preece, 1994). So how do the users experience it? What do users think of it?
Research Paper: Janni Nielsen, Torkil Clemmensen, and Carsten Yssing. 2002. Getting access to what goes
on in people's heads?: Reflections on the think-aloud technique. In Proceedings of the second Nordic
conference on Human-computer interaction (NordiCHI '02). ACM, New York, NY.
84
• Thinking is much more than what can be explicitly expressed in words.
•To get access to human cognitive processes, a way forward may be to
develop a practice of introspection; to expand our knowledge about the
reflective activity of the user in the expert-guided think aloud session.
•The authors argue that access to subjective experience is possible in
terms of introspection where user has to become a participant in the
analysis of his or her own cognitive processes.
• The paper suggests that use of think aloud should have, as a
prerequisite, explicit descriptions of design, test procedure and
framework for analysis.
85
21- Prospective Time Series
 Measurement of a development trend
 Time series
 Before-and-after studies
 Measurement of a number of measures over time shows how an activity or an outcome changes as a function of either time alone or of different initiatives.
 This method requires control throughout the whole time period.
 Carried out as a matched-pair design in:
 Controlled studies including RCTs
 Simple before-and-after study
 In cohort studies with the follow-up of cases over time
 In decision support system with before-after study
1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press. P 160.
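A minimal Python sketch of a simple before-and-after comparison of the kind listed above, using a paired t-test; the measurements are hypothetical and SciPy is assumed to be available.

```python
# Minimal sketch: paired before-and-after comparison of a process measure
# (e.g., minutes per order) for the same departments before and after go-live.
from statistics import mean
from scipy.stats import ttest_rel   # assumes SciPy is installed

before = [12.5, 10.8, 14.2, 11.9, 13.4, 12.1]   # hypothetical measurements
after  = [10.1,  9.7, 12.8, 10.5, 11.9, 10.8]

result = ttest_rel(before, after)
print(f"mean change = {mean(after) - mean(before):+.2f} minutes")
print(f"paired t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```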
86
22-Risk Assessment
• Monitoring of risk factors in a
development or assessment project to make it
possible to take preemptive action.
• Risk assessment, risk management, and risk
control can be handled either retrospectively
or prospectively.
1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press.
87
23-Root Causes Analysis
• Exploration of what, how, and why a given
incident occurred to identify the root causes of
undesirable events.
1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic
Press.
88
24-Social Network Analysis
• Social network analysis is carried out by means of diagramming techniques applied to relationships between elements within an organization (such as individuals, professions, departments or other organizations).
• It is most commonly applied to help improve the effectiveness and efficiency of decision-making processes in commercial organizations.
• The potential value of SNA to measure team function and to use the information to improve working processes was another finding of studies in different settings.
1. Brender, J., Handbook of Evaluation Methods for Health Informatics. Academic Press, 2006.
2. Chambers, D., Wilson, P., Thompson, C., & Harden, M. (2012). Social Network Analysis in Healthcare Settings: A Systematic Scoping Review. PLoS ONE, 7(8), e41911. http://doi.org/10.1371/journal.pone.0041911
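A minimal Python sketch of one social network analysis step using the NetworkX library (assumed to be available); the team members and ties are hypothetical, and degree centrality is only one of many possible measures of team function.

```python
# Minimal sketch: degree centrality in a small (hypothetical) care-team network.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("nurse_A", "physician_1"),
    ("nurse_A", "pharmacist"),
    ("nurse_B", "physician_1"),
    ("physician_1", "pharmacist"),
    ("physician_2", "pharmacist"),
])

centrality = nx.degree_centrality(G)      # fraction of other actors each actor is tied to
for actor, score in sorted(centrality.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{actor}: {score:.2f}")
```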
89
Example of social network: the case of the eye care programme in
the Brong Ahafo region, January 2010.
Karl Blanchet, and Philip James Health Policy Plan.
2011;heapol.czr055
Published by Oxford University Press in association with The London School of Hygiene and
Tropical Medicine © The Author 2011; all rights reserved.
90
25-Stakeholder Analysis
• Assessment of stakeholder features and their inner dynamics, aiming to identify participants for the completion of a given task, a problem-solving activity or a project.
Namazzi, G., et al., Stakeholder analysis for a maternal and newborn health project in Eastern Uganda. BMC Pregnancy Childbirth, 2013. 13: p. 58.
91
Stakeholder Analysis. Service quality, 2016 [cited 11/2/2016]; Available from: http://asq.org/service/body-of-knowledge/tools-stakeholder-analysis.
92
26-Technical Verification
• Functions are present, work correctly and are in
compliance with the agreement.
A. Completeness of the functionality
B. Correctness of the functionality
C. Coherence of the functionality
D. Consistency of the functionality
E. Interconnectivity of the functionality
F. Reliability
G. Performance
1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic
Press.
93
27- User Acceptance and Satisfaction
 Assessment of user opinions, attitudes, and perceptions of an IT system during daily operation.
 Most common methods:
 Interview techniques: Interviews or Focus-Group Interviews.
 Questionnaires: most commonly used when measuring user satisfaction.
 The Equity Implementation Model
*User-satisfaction questionnaires
 Aim: to verify and correct for aspects such as reliability, predictive value, accuracy, content validity and internal validity.
1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press.
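Reliability of a user-satisfaction questionnaire is often summarized with Cronbach's alpha; a minimal NumPy sketch follows (the response matrix is hypothetical, and reliability is only one of the aspects listed above).

```python
# Minimal sketch: Cronbach's alpha for a hypothetical satisfaction questionnaire.
# Rows = respondents, columns = questionnaire items (5-point Likert scores).
import numpy as np

scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 4, 5, 5],
    [2, 3, 2, 3],
])

k = scores.shape[1]                                   # number of items
item_variances = scores.var(axis=0, ddof=1).sum()     # sum of per-item variances
total_variance = scores.sum(axis=1).var(ddof=1)       # variance of respondents' total scores
alpha = (k / (k - 1)) * (1 - item_variances / total_variance)
print(f"Cronbach's alpha = {alpha:.2f}")
```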
94
28-Videorecording
Video recording is a capability for recording videos of
every new Automate test session that is requested on
BrowserStack. While your test is running, you can see a
live screencast that is recorded and saved with a proper
name that can be accessed and downloaded later.
https://www.browserstack.com/question/651
95
29- WHO: Framework for Assessment of Strategies
 The method may be applied for the assessment of different (development) strategies, either individually or as a comparative analysis.
 The method is particularly relevant early in the
explorative Phase for exploring the feasibility of basic
ideas and solutions.
1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press.p 222
96
30- WHO: Framework for Assessment of Strategies
97
Conclusion
Literature reviews
98
Literature reviews:
• Evaluation studies have been performed with different aims and objectives, and are concerned with different domains and areas of focus.
1. Rahimi, B. and V. Vimarlund, Methods to Evaluate Health Information Systems in Healthcare Settings: A Literature Review. Journal of Medical Systems, 2007. 31(5): p. 397-432.
99
Literature reviews about ICT Evaluation
 The most important study was conducted by Kaplan on CDSS evaluation [1].
• Conclusion of 27 reviews: the RCT is the standard evaluation approach for CDSS.
• Problems of RCTs: they do not answer questions such as why some systems tend to be used while others are not.
• They leave the reasons for the ineffectiveness of some CDSS unrecognized.
 1,035 articles were reviewed by Ammenwerth and de Keizer, covering 1982 to 2002.
• They concluded that the number of evaluation studies in the area of medical informatics is rising significantly.
• Evaluated aspects included appropriateness of patient care, efficiency of patient care, user satisfaction, and software quality.
• The quality of care processes and patient outcomes was found to have increased.
• They interpreted this shift as a sign of the maturation of evaluation research in medical informatics.
1. Rahimi, B. and V. Vimarlund, Methods to Evaluate Health Information Systems in Healthcare Settings: A Literature Review. Journal of Medical Systems, 2007. 31(5): p. 397-432.
 Delpierre et al. reviewed 26 articles about CBPRS (computer-based patient record systems), from January 2000 to March 2003.
• Increased satisfaction of users and patients.
• Could lead to significant changes in medical practice.
• Most of the studies did not include qualitative factors such as characteristics of the disease and the tool.
Rahimi, B. and V. Vimarlund, Methods to Evaluate Health Information Systems in Healthcare Settings: A Literature Review. Journal of Medical Systems, 2007. 31(5): p. 397-432.
• Some of the studies in this review evaluate the effects of the new system's implementation on the quality of work performance, such as user job performance, computer knowledge, and investigation of skills among users.
• Most of the studies included in this paper have used survey methodology as their research method.
• Some of the studies used a clinical trial or cohort study to research the systems' outputs.
• No studies have been conducted to explore the impact of IT on the system as a whole.
Globalny raport: „Prawdziwe piękno 2024" od Doveagatadrynko
 
Experience learning - lessons from 25 years of ATACC - Mark Forrest and Halde...
Experience learning - lessons from 25 years of ATACC - Mark Forrest and Halde...Experience learning - lessons from 25 years of ATACC - Mark Forrest and Halde...
Experience learning - lessons from 25 years of ATACC - Mark Forrest and Halde...scanFOAM
 
Call Girl Hyderabad Madhuri 9907093804 Independent Escort Service Hyderabad
Call Girl Hyderabad Madhuri 9907093804 Independent Escort Service HyderabadCall Girl Hyderabad Madhuri 9907093804 Independent Escort Service Hyderabad
Call Girl Hyderabad Madhuri 9907093804 Independent Escort Service Hyderabaddelhimodelshub1
 
Russian Call Girls in Raipur 9873940964 Book Hot And Sexy Girls
Russian Call Girls in Raipur 9873940964 Book Hot And Sexy GirlsRussian Call Girls in Raipur 9873940964 Book Hot And Sexy Girls
Russian Call Girls in Raipur 9873940964 Book Hot And Sexy Girlsddev2574
 
Gurgaon Sector 90 Call Girls ( 9873940964 ) Book Hot And Sexy Girls In A Few ...
Gurgaon Sector 90 Call Girls ( 9873940964 ) Book Hot And Sexy Girls In A Few ...Gurgaon Sector 90 Call Girls ( 9873940964 ) Book Hot And Sexy Girls In A Few ...
Gurgaon Sector 90 Call Girls ( 9873940964 ) Book Hot And Sexy Girls In A Few ...ggsonu500
 
Call Girl Gurgaon Saloni 9711199012 Independent Escort Service Gurgaon
Call Girl Gurgaon Saloni 9711199012 Independent Escort Service GurgaonCall Girl Gurgaon Saloni 9711199012 Independent Escort Service Gurgaon
Call Girl Gurgaon Saloni 9711199012 Independent Escort Service GurgaonCall Girls Service Gurgaon
 
Call Girl Chandigarh Mallika ❤️🍑 9907093804 👄🫦 Independent Escort Service Cha...
Call Girl Chandigarh Mallika ❤️🍑 9907093804 👄🫦 Independent Escort Service Cha...Call Girl Chandigarh Mallika ❤️🍑 9907093804 👄🫦 Independent Escort Service Cha...
Call Girl Chandigarh Mallika ❤️🍑 9907093804 👄🫦 Independent Escort Service Cha...High Profile Call Girls Chandigarh Aarushi
 
Gurgaon iffco chowk 🔝 Call Girls Service 🔝 ( 8264348440 ) unlimited hard sex ...
Gurgaon iffco chowk 🔝 Call Girls Service 🔝 ( 8264348440 ) unlimited hard sex ...Gurgaon iffco chowk 🔝 Call Girls Service 🔝 ( 8264348440 ) unlimited hard sex ...
Gurgaon iffco chowk 🔝 Call Girls Service 🔝 ( 8264348440 ) unlimited hard sex ...soniya singh
 
Call Girls Kukatpally 7001305949 all area service COD available Any Time
Call Girls Kukatpally 7001305949 all area service COD available Any TimeCall Girls Kukatpally 7001305949 all area service COD available Any Time
Call Girls Kukatpally 7001305949 all area service COD available Any Timedelhimodelshub1
 

Kürzlich hochgeladen (20)

Call Girls Hyderabad Krisha 9907093804 Independent Escort Service Hyderabad
Call Girls Hyderabad Krisha 9907093804 Independent Escort Service HyderabadCall Girls Hyderabad Krisha 9907093804 Independent Escort Service Hyderabad
Call Girls Hyderabad Krisha 9907093804 Independent Escort Service Hyderabad
 
Call Girls Service Bommasandra - Call 7001305949 Rs-3500 with A/C Room Cash o...
Call Girls Service Bommasandra - Call 7001305949 Rs-3500 with A/C Room Cash o...Call Girls Service Bommasandra - Call 7001305949 Rs-3500 with A/C Room Cash o...
Call Girls Service Bommasandra - Call 7001305949 Rs-3500 with A/C Room Cash o...
 
Hi,Fi Call Girl In Marathahalli - 7001305949 with real photos and phone numbers
Hi,Fi Call Girl In Marathahalli - 7001305949 with real photos and phone numbersHi,Fi Call Girl In Marathahalli - 7001305949 with real photos and phone numbers
Hi,Fi Call Girl In Marathahalli - 7001305949 with real photos and phone numbers
 
Call Girls Hsr Layout Whatsapp 7001305949 Independent Escort Service
Call Girls Hsr Layout Whatsapp 7001305949 Independent Escort ServiceCall Girls Hsr Layout Whatsapp 7001305949 Independent Escort Service
Call Girls Hsr Layout Whatsapp 7001305949 Independent Escort Service
 
Single Assessment Framework - What We Know So Far
Single Assessment Framework - What We Know So FarSingle Assessment Framework - What We Know So Far
Single Assessment Framework - What We Know So Far
 
Models Call Girls Electronic City | 7001305949 At Low Cost Cash Payment Booking
Models Call Girls Electronic City | 7001305949 At Low Cost Cash Payment BookingModels Call Girls Electronic City | 7001305949 At Low Cost Cash Payment Booking
Models Call Girls Electronic City | 7001305949 At Low Cost Cash Payment Booking
 
College Call Girls Mumbai Alia 9910780858 Independent Escort Service Mumbai
College Call Girls Mumbai Alia 9910780858 Independent Escort Service MumbaiCollege Call Girls Mumbai Alia 9910780858 Independent Escort Service Mumbai
College Call Girls Mumbai Alia 9910780858 Independent Escort Service Mumbai
 
Russian Call Girls South Delhi 9711199171 discount on your booking
Russian Call Girls South Delhi 9711199171 discount on your bookingRussian Call Girls South Delhi 9711199171 discount on your booking
Russian Call Girls South Delhi 9711199171 discount on your booking
 
Call Girl Guwahati Aashi 👉 7001305949 👈 🔝 Independent Escort Service Guwahati
Call Girl Guwahati Aashi 👉 7001305949 👈 🔝 Independent Escort Service GuwahatiCall Girl Guwahati Aashi 👉 7001305949 👈 🔝 Independent Escort Service Guwahati
Call Girl Guwahati Aashi 👉 7001305949 👈 🔝 Independent Escort Service Guwahati
 
Call Girl Lucknow Gauri 🔝 8923113531 🔝 🎶 Independent Escort Service Lucknow
Call Girl Lucknow Gauri 🔝 8923113531  🔝 🎶 Independent Escort Service LucknowCall Girl Lucknow Gauri 🔝 8923113531  🔝 🎶 Independent Escort Service Lucknow
Call Girl Lucknow Gauri 🔝 8923113531 🔝 🎶 Independent Escort Service Lucknow
 
Housewife Call Girls Nandini Layout - Phone No 7001305949 For Ultimate Sexual...
Housewife Call Girls Nandini Layout - Phone No 7001305949 For Ultimate Sexual...Housewife Call Girls Nandini Layout - Phone No 7001305949 For Ultimate Sexual...
Housewife Call Girls Nandini Layout - Phone No 7001305949 For Ultimate Sexual...
 
Globalny raport: „Prawdziwe piękno 2024" od Dove
Globalny raport: „Prawdziwe piękno 2024" od DoveGlobalny raport: „Prawdziwe piękno 2024" od Dove
Globalny raport: „Prawdziwe piękno 2024" od Dove
 
Experience learning - lessons from 25 years of ATACC - Mark Forrest and Halde...
Experience learning - lessons from 25 years of ATACC - Mark Forrest and Halde...Experience learning - lessons from 25 years of ATACC - Mark Forrest and Halde...
Experience learning - lessons from 25 years of ATACC - Mark Forrest and Halde...
 
Call Girl Hyderabad Madhuri 9907093804 Independent Escort Service Hyderabad
Call Girl Hyderabad Madhuri 9907093804 Independent Escort Service HyderabadCall Girl Hyderabad Madhuri 9907093804 Independent Escort Service Hyderabad
Call Girl Hyderabad Madhuri 9907093804 Independent Escort Service Hyderabad
 
Russian Call Girls in Raipur 9873940964 Book Hot And Sexy Girls
Russian Call Girls in Raipur 9873940964 Book Hot And Sexy GirlsRussian Call Girls in Raipur 9873940964 Book Hot And Sexy Girls
Russian Call Girls in Raipur 9873940964 Book Hot And Sexy Girls
 
Gurgaon Sector 90 Call Girls ( 9873940964 ) Book Hot And Sexy Girls In A Few ...
Gurgaon Sector 90 Call Girls ( 9873940964 ) Book Hot And Sexy Girls In A Few ...Gurgaon Sector 90 Call Girls ( 9873940964 ) Book Hot And Sexy Girls In A Few ...
Gurgaon Sector 90 Call Girls ( 9873940964 ) Book Hot And Sexy Girls In A Few ...
 
Call Girl Gurgaon Saloni 9711199012 Independent Escort Service Gurgaon
Call Girl Gurgaon Saloni 9711199012 Independent Escort Service GurgaonCall Girl Gurgaon Saloni 9711199012 Independent Escort Service Gurgaon
Call Girl Gurgaon Saloni 9711199012 Independent Escort Service Gurgaon
 
Call Girl Chandigarh Mallika ❤️🍑 9907093804 👄🫦 Independent Escort Service Cha...
Call Girl Chandigarh Mallika ❤️🍑 9907093804 👄🫦 Independent Escort Service Cha...Call Girl Chandigarh Mallika ❤️🍑 9907093804 👄🫦 Independent Escort Service Cha...
Call Girl Chandigarh Mallika ❤️🍑 9907093804 👄🫦 Independent Escort Service Cha...
 
Gurgaon iffco chowk 🔝 Call Girls Service 🔝 ( 8264348440 ) unlimited hard sex ...
Gurgaon iffco chowk 🔝 Call Girls Service 🔝 ( 8264348440 ) unlimited hard sex ...Gurgaon iffco chowk 🔝 Call Girls Service 🔝 ( 8264348440 ) unlimited hard sex ...
Gurgaon iffco chowk 🔝 Call Girls Service 🔝 ( 8264348440 ) unlimited hard sex ...
 
Call Girls Kukatpally 7001305949 all area service COD available Any Time
Call Girls Kukatpally 7001305949 all area service COD available Any TimeCall Girls Kukatpally 7001305949 all area service COD available Any Time
Call Girls Kukatpally 7001305949 all area service COD available Any Time
 

Evaluation methods in healthcare systems

  • 1. 1
  • 2. 2
  • 3. 3 Introduction:  From 1995s in the hope of : Medical information systems involve: Computer- stored databases • patient information to support medical order entry • Results reporting • Decision support systems • clinical reminders Comprehensive systems • coordinates patient care activities by linking computer terminals in patient care areas to all departments Smaller separate systems •Link patient care areas to only one department •Laboratory system •Radiology system •Pharmacy system •Expert systems •Computerized databases CPOE • Concerns about patient safety and medical error • Need to Evaluation  Unfortunately reports of system failures have continued 1.Anderson, J.G. and C. Aydin, Evaluating the Organizational Impact of Healthcare Information Systems. 2005: Springer.pp 5-8. Hope • Increasing efficiency, reducing costs, and improving patient care 1995 • Healthcare applications appeared
  • 4. 4 Evaluation Definition:  Evaluation can be defined as “the act of measuring or exploring properties of a health information system (in planning, development , implementation, or operation),the result of which informs a decision to be made concerning that system in a specific context.”  Evaluating should perform not only in technology assessment but also in the social and behavioral processes 1.Ammenwerth, E., et al., Evaluation of health information systems—problems and challenges. International Journal of Medical Informatics, 2003. 71(2-3): p. 125-135. 2. Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press. During IT Evaluation Not only the technology itself But also the interaction between IT and human players in their information processing Must be Considered
  • 5. 5 Evaluation is the means to assess: 5 Quality Value Impacts of IT in the health care environment Effects
  • 6. 6 Aim of evaluation : To provide the basis for a decision about the IT system investigated Decision-making context is also the context of the evaluation .[2] 1- Yusof, M. M., et al. (2008). "Investigating evaluation frameworks for health information systems." Int J Med Inform 77(6): 377-385. 2-Brender, J., Handbook of Evaluation Methods for Health Informatics ,2006: Academic Press.p 9-11. Improve performance Outcomes Safety Effectiveness
  • 7. 7 Questions should be answered:  Evaluation seeks to answer the : Why (objective of evaluation), Who (which stakeholders’ perspective is going to be evaluated) When (which phase in the system development life cycle), What (aspects or focus of evaluation)(What is it going to be used for ?) How (methods of evaluation) questions [1],[2] 1- Maryati Mohd. Yusof, R.J.P., Lampros K. Stergioulas. Towards a Framework for Health Information Systems Evaluation. in Proceedings of the 39th Hawaii International Conference on System Sciences. 2006. Hawaii IEEE. 2-SYMONS, V.J., A review of information systems evaluation-content ,context and process. Eur J Inf Syst, 1991. 1(3): p. 205-212.
  • 8. 8 1.Friedman C., W.J., Evaluation methods in biomedical informatics. 2006: Springer. p 6.
  • 10. 10 Pre-implementation evaluation Inform difficult decisions Prior to starting any program As well as post-implementation evaluation 1.Nykanen, P., et al., Guideline for good evaluation practice in health informatics (GEP-HI). Int J Med Inform, 2011. 80(12): p. 815-27.
  • 11. 11 Assessment: There are two basic types of assessment: I. Summative II. Formative [1] • Measurement can be qualitative or quantitative, i.e. putting the results into a metric context: • Qualitatively and subjectively • Quantitatively and objectively, for instance with the aid of a questionnaire study [2] 1-Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press. pp 9-14. 2-Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press. pp 20-29.
  • 12. 12 Formative and Summative evaluation. Formative evaluation: carried out throughout the system's lifecycle; it provides information for improving the system under development. Summative evaluation: focused on assessing the effect or outcome of the evaluation object at a certain point in time after implementation. 1.Nykanen, P., et al., Guideline for good evaluation practice in health informatics (GEP-HI). Int J Med Inform, 2011. 80(12): p. 815-27.
  • 13. 13 Quantitative & qualitative methods: Quantitative methods • Task analysis • Interface design • Time-motion analysis • Software logs • Questionnaires. Qualitative methods • Think-aloud protocol • Unstructured interviews. Used to assess • user satisfaction • user-perceived usefulness • usability of this framework. Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press.
  • 14. 14 Categories of most evaluation studies: I. CPR evaluation studies, II. Telemedicine evaluation studies, III. DSS evaluation studies. • Evaluation can be conveniently classified into objectivist and subjectivist approaches. [2] 1- Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press. 2- C.P. Friedman, J. Wyatt, Evaluation methods in biomedical informatics, 2nd ed., Springer Science & Business Media, New York, US, 2006.
  • 15. 15 Methodology and Methods: A method is based on a well-defined theory and includes a consistent set of techniques, tools and principles to organize it. (Diagram: a methodology comprises several methods; each method applies metrics, and each metric is obtained from measures.) Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press. p 13.
  • 16. 16 Methodology • Methodology is supposed to: (1) provide the answer to what to do next, when to do what, and how to do it; (2) describe the ideas behind such choices and the suppositions (for instance, the philosophical background) behind them. • A methodology must comprise: a) The basic philosophies and theories, so that a user of the methodology can judge the validity of its use b) Perspective c) Assumptions d) Areas of use e) Applicable methods, tools, and techniques. Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press. pp 14-15.
  • 17. 17 Complexity of evaluation in biomedical informatics. (Diagram: evaluation in medical informatics sits at the intersection of evaluation methodology, computer-based information systems, and medicine and healthcare delivery.) 1.Friedman C., W.J., Evaluation methods in biomedical informatics. 2006: Springer. p 6.
  • 18. 18
  • 19. 19 User Requirements Specification • Analysis of Work Procedures • Assessment of Bids • Balanced Scorecard • BIKVA • Delphi: Qualitative assessment • Field Study • Focus Group Interview • Future Workshop • Grounded Theory • Heuristic Evaluation • Interview (nonstandardized) • KUBI • Logical Framework Approach • Organizational Readiness • Pardizipp • Questionnaire (nonstandardized) • Requirements Assessment • Risk Assessment • Social Network Analysis • Stakeholder Analysis • SWOT • Usability • Video recording • WHO: Framework for Assessment of Strategies
  • 20. 20 Technical Development Phase: used to provide feedback for the technical development. • Balanced Scorecard • Clinical/Diagnostic Performance • Cognitive Assessment • Cognitive Walkthrough • Heuristic Evaluation • Risk Assessment • SWOT • Technical Verification • Think Aloud • Usability
  • 21. 21 Assessment Methods: Adaptation Phase. Real operational assessment can take place. • Analysis of Work Procedures • BIKVA • Clinical/Diagnostic Performance • Cognitive Assessment • Cognitive Walkthrough • Equity Implementation Model • Field Study • Focus Group Interview • Functionality Assessment • Grounded Theory • Heuristic Evaluation • Interview (nonstandardized) • Prospective Time Series • Questionnaire (nonstandardized) • RCT, Randomized Controlled Trial • Risk Assessment • Root Causes Analysis • Social Network Analysis • SWOT • Technical Verification • Think Aloud • Usability • User Acceptance and Satisfaction • Videorecording
  • 22. 22 Assessment Methods: Evaluation Phase 1. Analysis of Work Procedures 2. Balanced Scorecard 3. BIKVA 4. Clinical/Diagnostic Performance 5. Cognitive Assessment 6. Cognitive Walkthrough 7. Delphi 8. Equity Implementation Model 9. Field Study 10. Focus Group Interview 11. Functionality Assessment 12. Grounded Theory 13. Heuristic Evaluation 14. Impact Assessment 15. Interview (nonstandardized) 16. KUBI 17. Prospective Time Series 18. Questionnaire (nonstandardized) 19. RCT, Randomized Controlled Trial 20. Risk Assessment 21. Root Causes Analysis 22. Social Network Analysis 23. Stakeholder Analysis 24. SWOT 25. Technical Verification 26. Think Aloud 27. Usability 28. User Acceptance and Satisfaction 29. Videorecording 30. WHO: Framework for Assessment of Strategies
  • 23. 23 • Usability methods, such as heuristic evaluation, cognitive walkthroughs, RCTs and user testing, are increasingly used to evaluate and improve the design of clinical software applications. • Evaluation studies do not focus solely on the structure and function of information resources; they also address their impact on the persons who customarily use these resources and on the outcomes of users' interactions with them, in order to understand users' actions. 1.Friedman C., W.J., Evaluation methods in biomedical informatics. 2006: Springer. p 6. 2- Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press.
  • 24. 24 1- Analysis of Work Procedures • Assessing what actually happens compared with what was expected. These methods may clearly have a role as part of an evaluation study. • Some well-known options: 1. The Learning Organization 2. Enterprise modeling 3. Business Process Reengineering 4. Use Cases and scenarios 5. Total Quality Management 6. Health Technology Assessment (HTA) 7. Computer-Supported Cooperative Work (CSCW) 8. Cognitive Task Analysis 1- Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press.
  • 25. 25 Analysis of Work Procedures: for systems analyses in a health care environment we can use diagramming techniques. A framework applied in a health care setting provides a clear structure by describing the potential views and levels of systems. *Note: The use of diagramming techniques and other forms of graphical modeling requires experience and an understanding of the principles and procedures involved.
  • 26. 26 Enterprise Modeling. Enterprise Architecture (EA) is a strategic activity and planning tool for an enterprise, which facilitates decision-making by enabling a conceptual view of the enterprise. The main objective of an EA approach is to define the layout of organizational components and the relationships among them.
  • 28. 28 Use Case and Scenario
  • 29. 29 Health Technology Assessment (HTA): • HTA is concerned with the systematic evaluation of the consequences of the adoption and use of new health technologies and with improving the evidence on existing technologies. [1,2] • One of the basic lessons learned in the area of HCI is that usability evaluation should start early in the design process. • Goodman and Ahn (1999) provide a basic overview of HTA principles and methods. [3] 1. O'Reilly, D., K. Campbell, and R. Goeree, Basics of health technology assessment. Methods Mol Biol, 2009. 473: p. 263-83. 2. Stevens, A., R. Milne, and A. Burls, Health technology assessment: history and demand. J Public Health Med, 2003. 25(2): p. 98-101. 3. Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press.
  • 30. 30 2- Balanced Scorecard • Created in 1992 by Drs. Robert S. Kaplan and David P. Norton, the Balanced Scorecard (BSC) is a revolutionary way to handle strategy management. • It balances focus areas by means of a set of indicators for a set of strategic objectives. • The Balanced Scorecard is a tool used to measure an organization's activities and initiatives against its Vision, Mission and Values as outlined in its Strategic Plan. 1. http://www.hrh.ca/balancedscorecard, Humber River Hospital, 10-20-2016. 2. https://2gc.eu/ - Balanced Scorecard reports, 10-20-2016.
  • 31. 31 The Balanced Scorecard is a Management System, not just a measurement system: you do not just measure your heart rate, you use the heart-rate monitor to manage your exercise.
  • 32. 32
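To make the four-quadrant idea concrete, here is a minimal sketch of how a scorecard could be represented as data. The perspectives, objectives, indicators and targets are invented examples for a hypothetical hospital IT department, not taken from the presentation.

```python
# Minimal sketch (assumed, not from the slides): a balanced scorecard as
# four perspectives, each with an objective, an indicator and a target.
scorecard = {
    "Financial":         {"objective": "Contain IT running costs",
                          "indicator": "IT cost per admission", "target": "< 40 USD"},
    "Customer/Patient":  {"objective": "Improve clinician satisfaction",
                          "indicator": "Satisfaction survey score", "target": ">= 4.0 / 5"},
    "Internal process":  {"objective": "Reduce order-entry errors",
                          "indicator": "Errors per 1,000 orders", "target": "< 2"},
    "Learning & growth": {"objective": "Build informatics skills",
                          "indicator": "Share of staff trained per year", "target": ">= 80%"},
}

for perspective, entry in scorecard.items():
    # One line per perspective: objective, then how it is measured and its target.
    print(f"{perspective}: {entry['objective']} "
          f"[{entry['indicator']} -> {entry['target']}]")
```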
  • 33. 33 3-BIKVA: User Involvement in Quality Development. A tool for making critical, subjective decisions about an existing practice. 1. Identifying user satisfaction or dissatisfaction. 2. Summarizing the information in a group interview. 3. Users are encouraged to identify the reasons behind the incidents and interactions referred to. 4. Interview management in an iterative process, with the objective of clarifying all the disparities that relate to the issues of quality identified by the users. 5. The conclusion is then presented to the decision makers for assessment. 1- Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press.
  • 34. 34 BIKVA: User Involvement in Quality Development • Developed in Denmark; it is an evaluation and quality enhancement method. • The evaluation method is interviewing. The evaluation process starts from the clients, then moves to the front-line staff (employees in direct contact with the clients) and finally ends with managers and politicians. Clients are asked to express and justify why they are satisfied or dissatisfied with the services offered. 1- Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press. (2016). "Linkki-hanke - yhteys työelämään." Retrieved 11/9/2016, from: http://www.palmenia.helsinki.fi/linkki/.
  • 35. 35 3- Questionnaires • Advantage: most people can manage them. • Different types: A. Open questions B. Checklist questions C. The Likert scale, consisting of a bar with fields to tick on a scale D. Multipoint scale E. Semantic differential scale, which in tabular form uses columns with a value scale (for instance, "extremely", "very", ...) F. Categorical scale • Need verification: thorough scientific validation of the questionnaire is needed in order to provide results leading to optimal action. 1- Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press. p 164
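As an illustration of how Likert-scale questionnaire answers can be summarized quantitatively, here is a minimal Python sketch; the items, the 1-5 coding and the responses are invented, not taken from the slides.

```python
# Minimal sketch (assumed): summarizing Likert-scale questionnaire items.
from statistics import mean

# Responses coded 1 ("strongly disagree") .. 5 ("strongly agree").
responses = {
    "The system is easy to use":    [4, 5, 3, 4, 4],
    "The system speeds up my work": [2, 3, 3, 4, 2],
}

for item, scores in responses.items():
    agree = sum(s >= 4 for s in scores) / len(scores)  # share answering 4 or 5
    print(f"{item}: mean={mean(scores):.2f}, agreeing={agree:.0%}")
```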
  • 36. 36 Questionnaires: • Questionnaires contain items that are pre-determined by the investigators and consequently are of limited value in identifying new or emergent issues in the use of a system that the investigators have not previously thought of. [1] 1.Kushniruk, A.W. and V.L. Patel, Cognitive and usability engineering methods for the evaluation of clinical information systems. J Biomed Inform, 2004. 37(1): p. 56-76.
  • 37. 37 4- Clinical/Diagnostic Performance. Measurement of the diagnostic performance, in measures of accuracy and precision, of IT-based expert systems and decision-support systems before the system is implemented. • Measurement is done in the Technical Development phase but can continue during the operational phase (the Adaptation Phase and Evolution Phase). • The clinical performance of the systems (for diagnostic, prognostic, screening tasks, etc.) is typically measured with measures from medicine, such as accuracy, precision, sensitivity, specificity, and predictive values. 1.Kaplan, B., Evaluating informatics applications—clinical decision systems. International Journal of Medical Informatics, 2001. 64: p. 15-37. 2. Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press.
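The measures named above (accuracy, sensitivity, specificity, predictive values) can be computed directly from a 2x2 confusion matrix. A minimal Python sketch with made-up counts, purely for illustration:

```python
# Minimal sketch (assumed): diagnostic performance measures from a 2x2 table.

def diagnostic_metrics(tp, fp, fn, tn):
    """Return accuracy, sensitivity, specificity, PPV and NPV."""
    total = tp + fp + fn + tn
    return {
        "accuracy": (tp + tn) / total,
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

if __name__ == "__main__":
    # Invented example: a DSS flags 80 of 100 true cases (TP=80, FN=20)
    # and raises 30 false alarms among 900 non-cases (FP=30, TN=870).
    print(diagnostic_metrics(tp=80, fp=30, fn=20, tn=870))
```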
  • 38. 38
  • 39. 39 *Point* • Kaplan reviews the literature (only for the years 1997-1998) on the clinical performance of decision-support systems, includes other review articles on this subject, and concludes that most studies use an experimental or randomized controlled clinical trial (RCT) design to assess system performance. 1.Kaplan, B., Evaluating informatics applications—clinical decision systems. International Journal of Medical Informatics, 2001. 64: p. 15-37.
  • 40. 40 An RCT is a trial in which subjects are randomly assigned to two groups: one (the experimental group) receiving the intervention that is being tested, and the other (the comparison group, or controls) receiving an alternative treatment. The two groups are then followed up to see whether there are any differences in outcome between them. This helps in assessing the effectiveness of the intervention.
  • 41. 41 Hierarchy of Study Types. Descriptive: case report, case series, survey. Analytic (observational): cross-sectional, case-control, cohort studies. Analytic (experimental): randomized controlled trials. The strength of evidence for causality between a risk factor and an outcome increases up this hierarchy.
  • 42. 42 Types of studies: descriptive and analytic; interventional (clinical trial, community trial, field trial) and observational (cross-sectional, case-control, cohort, ecological, case report, case series). Goal: to test hypotheses about the determinants of disease. In contrast, the goal of intervention studies is to test the efficacy of specific treatments or preventive measures by assigning individual subjects to one of two or more treatment or prevention options.
  • 43. 43 5-RCT, Randomized Controlled Trial • Randomized controlled trials are the ideal study design to evaluate the effectiveness of health-care interventions. • RCTs are used to identify marginal differences between two or more types of treatment. • Useful in assessment studies of IT-based solutions for decision-support systems and expert systems; only to a limited degree for other types of systems. 1.Navaneethan, S.D., et al., How to design a randomized controlled trial. Nephrology (Carlton), 2010. 15(8): p. 732-9. 2. Bothwell, L.E. and S.H. Podolsky, The Emergence of the Randomized, Controlled Trial. N Engl J Med, 2016. 375(6): p. 501-4. 3- Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press. p 172.
  • 44. 44 All RCTs are controlled clinical trials, but not all controlled trials are RCTs. Randomization can be simple, cluster or stratified.
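A minimal Python sketch of simple versus stratified randomization, to illustrate the distinction named above; the participant identifiers, the strata and the 50/50 split within each stratum are invented assumptions, not part of the presentation.

```python
# Minimal sketch (assumed): simple vs. stratified randomization to two arms.
import random

def simple_randomize(participants, seed=42):
    """Assign each participant to 'intervention' or 'control' at random."""
    rng = random.Random(seed)
    return {p: rng.choice(["intervention", "control"]) for p in participants}

def stratified_randomize(participants_by_stratum, seed=42):
    """Randomize separately within each stratum (e.g. ward or age group)
    so the two arms stay balanced on the stratifying factor."""
    rng = random.Random(seed)
    allocation = {}
    for stratum, members in participants_by_stratum.items():
        members = list(members)
        rng.shuffle(members)
        half = len(members) // 2
        for p in members[:half]:
            allocation[p] = ("intervention", stratum)
        for p in members[half:]:
            allocation[p] = ("control", stratum)
    return allocation

if __name__ == "__main__":
    print(simple_randomize(["P1", "P2", "P3", "P4"]))
    print(stratified_randomize({"ward A": ["P1", "P2"], "ward B": ["P3", "P4"]}))
```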
  • 45. 45 5-RCT, Randomized Controlled Trial • An RCT is the most rigorous scientific method for evaluating the effectiveness of health care interventions. • However, bias can arise when there are flaws in the design and management of a trial. 1.Akobeng, A., Understanding randomized controlled trials. Archives of Disease in Childhood, 2005. 90(8): p. 840-844.
  • 46. 46 • Effectiveness • Adverse effects. RCT: selecting the best treatment or prevention method for reaching the intended goal.
  • 47. 47 Samples are divided into case (intervention) and control groups by simple random allocation. This is one of the strongest research designs in the medical sciences. It has three important characteristics: random allocation of the samples to the study groups (Random Allocation); the presence of a control group; and intervention and control by the researcher, i.e. exposure of the samples to the hypothesized cause. Randomization.
  • 48. 48 National Cancer Institute at the National Institutes of Health; https://www.cancer.gov/about-cancer/treatment/clinical-trials/what-are- trials/randomization; 11/2/2016.
  • 49. 49 RCT usage: I. Some patients are treated under the old system and others under the new system (or, equally, with regard to staff); II. Some departments keep the old system and other departments get the new system; III. Comparing similar departments in different hospitals. 1- Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press.
  • 50. 50 For which cases is an RCT not suitable? • Testing the etiology of disease • Rare outcomes • Low participation • Exclusion • Low cost
  • 51. 51 RCT is not suitable for: * Etiology and clinical course (e.g., smoking and cancer) * Rare and prolonged outcomes
  • 52. 52 CASP RCTs - a checklist • Good randomisation procedures • Patients blind to treatment • Clinicians blind to treatment • All participants followed up • All participants analysed in the groups to which they were randomised (intention to treat)
  • 53. 53 6- Delphi • Delphi may be characterized as a method for structuring a group communication process so that the process is effective in allowing a group of individuals, as a whole, to deal with a complex problem. [1,3] • When judgment is necessary with controlled opinion feedback, we use Delphi. • Qualitative assessment of an effect • Method for the prediction of the future • Key advantage: avoids direct confrontation of the experts. • Tool for expert problem solving [1,2] 1- Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press. 2.Okoli, C. and S.D. Pawlowski, The Delphi method as a research tool: an example, design considerations and applications. Information & Management, 2004. 42(1): p. 15-29. 3- The Delphi technique: a tool in research. Fazlollah Ahmadi, Khadijeh Nasiriani, Parvaneh Abazari. Iranian Journal of Medical Education, Spring and Summer 1387 [2008]; 8(1): 185.
  • 54. 54 Delphi: • Delphi survey, including guidelines for data collection, data analysis and reporting of results. • Three variants of the Delphi method: 1- Classic Delphi, 2- Policy Delphi, 3- Decision Delphi • Preparation of a questionnaire, including the perils and pitfalls associated with a questionnaire approach. 1.Okoli, C. and S.D. Pawlowski, The Delphi method as a research tool: an example, design considerations and applications. Information & Management, 2004. 42(1): p. 15-29. 2.Kushniruk, A.W. and V.L. Patel, Cognitive and usability engineering methods for the evaluation of clinical information systems. J Biomed Inform, 2004. 37(1): p. 56-76. 3- The Delphi technique: a tool in research. Fazlollah Ahmadi, Khadijeh Nasiriani, Parvaneh Abazari. Iranian Journal of Medical Education, Spring and Summer 1387 [2008]; 8(1): 185.
  • 55. 55
  • 57. 57 Steps of the Delphi method: The principle of the Delphi method is the use of collective wisdom and the plurality of participants' opinions. The number of participants varies with the scope of the topic and the available resources. The method tries to keep participants anonymous to one another. It comprises several iteration rounds, during which the aim is to make maximum use of the group's opinions while minimizing disagreement and inconsistency; it usually ends after 2 or 3 rounds. The first-round checklist is designed as an open or semi-structured form (with an open field for proposing new items). The checklists of the subsequent rounds are designed in a structured form (without an open field). A 5-point Likert scale is commonly used.
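As an illustration of how ratings from a Delphi round on a 5-point Likert scale might be checked for consensus, here is a minimal Python sketch. The items, the ratings and the rule "consensus if at least 75% of experts rate the item 4 or 5" are assumptions for illustration; the presentation does not prescribe a specific consensus rule.

```python
# Minimal sketch (assumed): consensus check for one Delphi round.
from statistics import median

def consensus(ratings, threshold=0.75):
    """Return (reached?, agreement share, median rating) for one item."""
    agreement = sum(r >= 4 for r in ratings) / len(ratings)
    return agreement >= threshold, agreement, median(ratings)

round_2 = {
    "Include 'user satisfaction' as an evaluation criterion": [5, 4, 4, 5, 3, 4],
    "Include 'hardware cost' as an evaluation criterion":     [2, 3, 4, 2, 3, 5],
}

for item, ratings in round_2.items():
    reached, agreement, med = consensus(ratings)
    status = "consensus reached" if reached else "carry to next round"
    print(f"{item}: {agreement:.0%} agreement, median {med} -> {status}")
```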
  • 58. 58 6- Delphi 1.Friedman C., W.J., Evaluation methods in biomedical informatics. 2006: Springer .p 182.
  • 59. 59 7-Usability • Usability evaluation is a method for identifying specific problems with IT products; it specifically focuses on the interaction between the user and the task in a defined environment. [1,4] • Usability evaluation methods are so varied that choosing the appropriate one(s) is a difficult task for evaluators. Usability factors are the various features used to measure how easy systems are in supporting users' tasks. [2,3] 1.Brown III, W., et al., Assessment of the Health IT Usability Evaluation Model (Health-ITUEM) for evaluating mobile health (mHealth) technology. Journal of Biomedical Informatics, 2013. 46(6): p. 1080-1087. 2.Dhouib, A., et al., A classification and comparison of usability evaluation methods for interactive adaptive systems. 2016: p. 246-251. 3.Borycki, E., et al., Usability Methods for Ensuring Health Information Technology Safety: Evidence-Based Approaches. Contribution of the IMIA Working Group Health Informatics for Patient Safety. IMIA Yearbook of Medical Informatics, 2013: p. 20-27. 4. Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press.
  • 60. 60 USABILITY EVALUATION METHODS FOR INTERACTIVE ADAPTIVE SYSTEMS: • Cognitive Walkthrough • Heuristic Evaluation • Focus Group • User-as-wizard • Task-experiment based • Simulated users 1.Dhouib, A., et al., A classification and comparison of usability evaluation methods for interactive adaptive systems. 2016: p. 246-251.
  • 61. 61 Usability evaluation methods: advantages and disadvantages. Cognitive walkthrough: (+) makes predictions about the design without involving users; (+) can be used early in the preliminary evaluation phase; (–) certain adaptation aspects not covered; (–) time consuming. Heuristic evaluation: (+) quick and low-cost evaluation method; (+) can be done early in the IAS development process; (–) evaluators must be experts; (–) need to choose the appropriate heuristics. Focus group: (+) fast to conduct; (+) can provide large amounts of data in a short time; (–) supports only subjective opinions; (–) depends on an experienced moderator. 1.Dhouib, A., et al., A classification and comparison of usability evaluation methods for interactive adaptive systems. 2016: p. 246-251.
  • 62. 62 8- Cognitive Assessment • Task-oriented • Assessment of the cognitive aspects of the interaction between an IT system and its users • Cognitive aspects deal with the extent to which the system functions in accordance with the way the users think and work, with respect to user interaction with the IT system in general and with the user dialogue in particular. 1- Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press.
  • 63. 63 9- Cognitive Walkthrough • The Cognitive Walkthrough is an analytical method designed to evaluate usability. • Assessment of user 'friendliness' on the basis of a system design, from specification, to mock-ups and early prototypes of the system, aimed at judging how well the system complies with the users' way of thinking. • The cognitive walkthrough is a formalized way of imagining people's thoughts and actions when they use an interface for the first time. 1. Wharton, C., et al., The cognitive walkthrough method: a practitioner's guide, in Usability inspection methods, N. Jakob and L.M. Robert, Editors. 1994, John Wiley & Sons, Inc. p. 105-140. 2. A. W. Kushniruk, E. M. Borycki, Human, Social, and Organizational Aspects of Health Information Systems. IGI Global, Premier Reference Source (2008). 3. Xiao, T., Broxham, W., Stitzlein, C., Croll, J., & Sanderson, P. Two human-centered approaches to health informatics: Cognitive systems engineering and usability. in Proceedings of the WCC IFIP-IMIA International eHealth Joint Conference. 2010. Brisbane, Qld.
  • 64. 64 Cognitive Walkthrough • CW is an inspection method for evaluating the usability of a user interface. • The questions are aids to simulating the user's cognitive process: (1) Will the user be trying to achieve the right effect? (2) Will the user discover that the correct action is available? (3) Will the user associate the correct action with the desired effect? (4) If the correct action is performed, will the user see that progress is being made? Each question is answered with YES or NO, together with the reasons why the user will succeed or fail in performing the action ("failure/success stories"). [1,2] 1.Blig, L.-O. and A.-L. Osvalder, Enhanced cognitive walkthrough: development of the cognitive walkthrough method to better predict, identify, and present usability problems. Adv. in Hum.-Comp. Int., 2013. 2013: p. 9-17. 2.Wharton, C., et al., The cognitive walkthrough method: a practitioner's guide, in Usability inspection methods, N. Jakob and L.M. Robert, Editors. 1994, John Wiley & Sons, Inc. p. 105-140.
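A minimal sketch of how walkthrough results could be recorded against the four questions, one record per task step; the task, the step, the answers and the failure story are invented purely for illustration, not drawn from the presentation.

```python
# Minimal sketch (assumed): recording cognitive-walkthrough answers per step.

CW_QUESTIONS = [
    "Will the user be trying to achieve the right effect?",
    "Will the user discover that the correct action is available?",
    "Will the user associate the correct action with the desired effect?",
    "If the correct action is performed, will the user see progress?",
]

walkthrough = [
    {
        "step": "Open the patient's medication list",
        "answers": [True, True, False, True],   # YES/NO per question above
        "story": "The label 'Rx view' is not associated with medications by new users.",
    },
]

for record in walkthrough:
    failures = [q for q, ok in zip(CW_QUESTIONS, record["answers"]) if not ok]
    outcome = "OK" if not failures else failures
    print(record["step"], "->", outcome, "|", record["story"])
```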
  • 65. 65 CW • The method focuses on simplicity in learning, especially through exploratory learning. CW has been employed to evaluate medical equipment, such as clinical information systems, patient information systems, clinical order systems, dialysis machines, and patient surveillance systems. [1] 1.Blig, L.-O. and A.-L. Osvalder, Enhanced cognitive walkthrough: development of the cognitive walkthrough method to better predict, identify, and present usability problems. Adv. in Hum.-Comp. Int., 2013. 2013: p. 9-17.
  • 66. 66 10-Focus Group Interview • A focus group is a form of qualitative research in which a group of people are asked about their perceptions, opinions, beliefs, and attitudes towards a product. • The moderator speaks very little and encourages the group to generate the information required by stimulating discussion through terse, provocative statements. • Challenge: analyzing qualitative data. 1.Rabiee, F., Focus-group interview and data analysis. Proc Nutr Soc, 2004. 63(4): p. 655-60. 2.Chapter 5: Personal Interviews. Marketing research and information systems (Marketing and Agribusiness Texts). [cited 10/31/2016]; Available from: http://www.fao.org/docrep/w3241e/w3241e06.htm.
  • 67. 67 Timing of the focus group interview: during the early analysis phases and during operations.
  • 68. 68 • Focus groups are used for eliciting descriptive data from population subgroups. • Usually, a group of eight to twelve persons is gathered together for a group interview or discussion on a focused topic. • Focus groups are widely used in the investigation of applied-research problems and are recognized as a distinct research method. • The method enables researchers to generate new hypotheses and to explore intermediate variables as a means of explaining certain relationships found in survey data. 1. Bender, D.E. and D. Ewbank, The focus group as a tool for health research: issues in design and analysis. Health Transit Rev, 1994. 4(1): p. 63-80.
  • 69. 69 11-Equity Implementation Model (EIM) • Examines users' reactions to the implementation of a new system. • It may be applied as a measuring instrument to provide the decision-making basis in relation to policy planning and priority setting, as well as for examination of barriers and drivers during the implementation of change. • The EIM helps provide a theory-based understanding for collecting and reviewing users' reactions to, and acceptance or rejection of, a new technology or system. 1. Lauer, T.W., K. Joshi, and T. Browdy, Use of the equity implementation model to review clinical system implementation efforts: a case report. J Am Med Inform Assoc, 2000. 7(1): p. 91-102. 2. Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press.
  • 70. 70 • The method provides a means for explaining system implementation events, leading to an understanding of user resistance to, or acceptance of, the new technology. • It may be applied as a measuring instrument to provide the decision-making basis in relation to policy planning and priority setting, as well as for examination of barriers and drivers during the implementation of change.
  • 71. 71 12- Functionality Assessment 1. Validation of objectives fulfillment 2. Impact Assessment 3. Identification of problems • It is best suited to qualitative studies 1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press.
  • 72. 72 13-Field Study • Observation of an organization to identify its practices and to expose the mechanisms that control change. • This method is widely used in psychology, sociology, anthropology, and so on, that is, in professions that deal with different perspectives of human factors, to identify what goes on and how. 1- Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press.
  • 73. 73 14-Grounded Theory • Grounded Theory is a supportive analytical method for data-acquisition methods that generate textual data, such as some open questionnaire methods and interviews. • This method is subjective. • Analysis takes place in a social context through an iterative and comparative process in 4 phases: • Open coding • Axial coding • Selective coding • Theoretical coding 1- Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press.
  • 74. 74 15-Heuristic Evaluation • Used when no other realizable possibilities exist. • This method can be used for nearly everything, but in practice it is most commonly used for the assessment of user interfaces. • Heuristic evaluation, an established evaluation method in the field of Human-Computer Interaction, was originally proposed by Jakob Nielsen. • Five phases: 1. Selection of appropriate heuristics, 2. An individual and independent inspection of the design using the heuristics to identify features that conflict with some aspect of best practice, 3. Editing the joint material to identify and resolve duplicates and related findings, 4. Prioritization to identify severities, 5. Analysis and report writing. 1.Westbrook, J. and J.I. Westbrook, Information Technology in Health Care 2007: Proceedings of the 3rd International Conference on Information Technology in Health Care: Socio-Technical Approaches. 2007: IOS Press. pp 205-207. 2- Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press.
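A minimal sketch of phases 3 and 4 above: pooling the findings of independent inspectors, resolving duplicates, and prioritizing by severity. The findings and the 0-4 severity scale (often attributed to Nielsen) are assumptions made for illustration, not material from the presentation.

```python
# Minimal sketch (assumed): merge duplicate heuristic-evaluation findings
# and rank them by severity (0 = not a problem .. 4 = usability catastrophe).

inspector_1 = [("H4 Consistency", "Two different labels for 'discharge'", 3)]
inspector_2 = [("H4 Consistency", "Two different labels for 'discharge'", 2),
               ("H9 Error recovery", "No undo after deleting an order", 4)]

merged = {}
for heuristic, finding, severity in inspector_1 + inspector_2:
    key = (heuristic, finding)
    merged[key] = max(merged.get(key, 0), severity)  # keep the worst rating

for (heuristic, finding), severity in sorted(merged.items(), key=lambda kv: -kv[1]):
    print(f"severity {severity}: [{heuristic}] {finding}")
```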
  • 75. 75
  • 76. 76 Zhang and colleagues incorporated Nielsen's 10 heuristics, Shneiderman's eight golden rules, and the results of their own research to formulate a list of 14 heuristics. 1.Allen, M., et al., Heuristic evaluation of paper-based Web pages: A simplified inspection usability methodology. Journal of Biomedical Informatics, 2006. 39(4): p. 412-423.
  • 77. 77 16- Impact Assessment • Measurement of the effect plus assessment of side effects. 1- Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press.
  • 78. 78 17- Interview (Nonstandardized) • For qualitative studies of subjective as well as objective circumstances. 1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press.
  • 79. 79 18- KUBI • Translated: Quality development through user involvement • It has many points in common with the Balanced Scorecard method 1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press.
  • 80. 80 19- SWOT • Allows the assessment of an organization from a neutral perspective through a detailed discussion of the organization's strengths, weaknesses, opportunities and threats.
  • 81. 81 SWOT • The method is useful as a brainstorming method to elucidate and evaluate each of the aspects brought forward. 1- Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press.
  • 82. 82 20- Think Aloud • Think Aloud is a method that requires users to speak as they interact with an IT-based system to solve a problem or perform a task. 1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press.
  • 83. 83 • The thinking-aloud technique dates back to the work of experimental psychology and was first described by Karl Duncker (1945) while he studied productive thinking. • TAP is popularly used by usability researchers today. But what do researchers think they get from TAP? Is it right to assume that there is a one-to-one mapping between verbal protocols and 'pure data'? • TAP adds cognitive load on users and can hinder primary tasks (Preece, 1994). So how do the users experience it? What do users think of it? Research Paper: Janni Nielsen, Torkil Clemmensen, and Carsten Yssing. 2002. Getting access to what goes on in people's heads?: Reflections on the think-aloud technique. In Proceedings of the second Nordic conference on Human-computer interaction (NordiCHI '02). ACM, New York, NY.
  • 84. 84 • Thinking is much more than what can be explicitly expressed in words. • To get access to human cognitive processes, a way forward may be to develop a practice of introspection: to expand our knowledge about the reflective activity of the user in the expert-guided think-aloud session. • The authors argue that access to subjective experience is possible in terms of introspection, where the user becomes a participant in the analysis of his or her own cognitive processes. • The paper suggests that the use of think-aloud should have, as a prerequisite, explicit descriptions of the design, the test procedure and the framework for analysis.
  • 85. 85 21- Prospective Time Series • Measurement of a development trend • Time series • Before-and-after studies • Measurement of a number of measures over time shows how an activity or an outcome changes as a function of either time alone or of different initiatives. • This method requires control throughout the period. • Carried out as a matched-pair design in: • Controlled studies, including RCTs • Simple before-and-after studies • Cohort studies with follow-up of cases over time • Decision-support systems with a before-and-after study 1- Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press. p 160.
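A minimal sketch of the simplest before-and-after comparison of a measure collected repeatedly over time; the measure and the numbers are invented, and a real analysis would usually model trend and seasonality (for example with an interrupted time-series regression) rather than simply compare means.

```python
# Minimal sketch (assumed): before-and-after comparison of a repeated measure,
# e.g. medication orders completed per hour, sampled weekly. Data invented.
from statistics import mean

before = [12, 14, 13, 15, 12, 13]   # weekly measurements pre-implementation
after  = [16, 17, 15, 18, 17, 16]   # weekly measurements post-implementation

print(f"mean before: {mean(before):.1f}")
print(f"mean after:  {mean(after):.1f}")
print(f"change:      {mean(after) - mean(before):+.1f}")
```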
  • 86. 86 22-Risk Assessment • Monitoring of risk factors in a development or assessment project to make it possible to take preemptive action. • Risk assessment, risk management, and risk control can be handled either retrospectively or prospectively. 1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press.
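One common, though here assumed, way to operationalize the prospective monitoring of risk factors is a small risk register in which each risk is scored as probability times impact and reviewed regularly. A minimal Python sketch with invented risks:

```python
# Minimal sketch (assumed): a prospective risk register scored as
# probability (1-5) x impact (1-5); highest scores are reviewed first.
risks = [
    {"risk": "Key clinician champions leave the project", "probability": 2, "impact": 4},
    {"risk": "Interface to the lab system is delayed",    "probability": 3, "impact": 3},
    {"risk": "Users bypass the system with paper notes",  "probability": 4, "impact": 2},
]

for r in sorted(risks, key=lambda r: r["probability"] * r["impact"], reverse=True):
    score = r["probability"] * r["impact"]
    print(f"score {score:>2}: {r['risk']}")
```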
  • 87. 87 23-Root Causes Analysis • Exploration of what, how, and why a given incident occurred to identify the root causes of undesirable events. 1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press.
  • 88. 88 24-Social Network Analysis • Social network analysis is carried out by means of diagramming techniques applied to the relationships between elements within an organization (such as individuals, professions, departments or other organizations). • Most commonly applied to help improve the effectiveness and efficiency of decision-making processes in commercial organizations. • The potential value of SNA to measure team function and to use the information to improve working processes was another finding of studies in different settings. 1- Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press. 2- Chambers, D., Wilson, P., Thompson, C., & Harden, M. (2012). Social Network Analysis in Healthcare Settings: A Systematic Scoping Review. PLoS ONE, 7(8), e41911. http://doi.org/10.1371/journal.pone.0041911
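A minimal sketch of representing ties between actors as a graph and computing degree centrality, using the third-party networkx library (the slides name no specific tool); the actors and ties are invented for illustration.

```python
# Minimal sketch (assumed): who-consults-whom in a ward as a graph,
# with degree centrality as one simple SNA measure. Requires networkx.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("nurse A", "physician X"),
    ("nurse B", "physician X"),
    ("nurse A", "pharmacist P"),
    ("physician X", "pharmacist P"),
])

# Degree centrality: the share of other actors each actor is directly tied to.
for actor, score in sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1]):
    print(f"{actor}: {score:.2f}")
```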
  • 89. 89 Example of a social network: the case of the eye care programme in the Brong Ahafo region, January 2010. Karl Blanchet and Philip James. Health Policy Plan. 2011; heapol.czr055. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2011; all rights reserved.
  • 90. 90 25-Stakeholder Analysis • Assessment of stakeholder features and their inner dynamics, aiming to identify participants for the completion of a given task, a problem-solving activity or a project. Namazzi, G., et al., Stakeholder analysis for a maternal and newborn health project in Eastern Uganda. BMC Pregnancy Childbirth, 2013. 13: p. 58.
  • 91. 91 Stakeholder Analysis. Service quality, 2016 [cited 11/2/2016]; Available from: http://asq.org/service/body-of-knowledge/tools-stakeholder-analysis.
  • 92. 92 26-Technical Verification • Functions are present, work correctly and are in compliance with the agreement. A. Completeness of the functionality B. Correctness of the functionality C. Coherence of the functionality D. Consistency of the functionality E. Interconnectivity of the functionality F. Reliability G. Performance 1- Brender, J., Handbook of Evaluation Methods for Health Informatics 2006: Academic Press.
  • 93. 93 27-User Acceptance and Satisfaction • Assessment of user opinions, attitudes, and perceptions of an IT system during daily operation. • Most common methods: • Interview techniques: interviews or focus-group interviews. • Questionnaires: most commonly used when measuring user satisfaction. • The Equity Implementation Model. *User-satisfaction questionnaires • Aim: verify and correct for aspects such as reliability, predictive value, accuracy, content validity and internal validity. 1- Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press.
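The slide asks evaluators to verify the reliability of user-satisfaction questionnaires but names no statistic; one common choice, assumed here, is Cronbach's alpha as a measure of internal consistency. A minimal Python sketch with invented Likert scores:

```python
# Minimal sketch (assumed): Cronbach's alpha for a small satisfaction scale.
from statistics import pvariance

# rows = respondents, columns = questionnaire items (1..5 Likert scores)
scores = [
    [4, 5, 4],
    [3, 4, 3],
    [5, 5, 4],
    [2, 3, 2],
]

k = len(scores[0])                                    # number of items
item_vars = [pvariance(col) for col in zip(*scores)]  # variance of each item
total_var = pvariance([sum(row) for row in scores])   # variance of total scores
alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")  # values above ~0.7-0.8 are usually read as acceptable
```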
  • 94. 94 28-Videorecording Video recording is a capability for recording videos of every new Automate test session that is requested on BrowserStack. While your test is running, you can see a live screencast that is recorded and saved with a proper name that can be accessed and downloaded later. https://www.browserstack.com/question/651
  • 95. 95 29-WHO: Framework for Assessment of Strategies • The method may be applied for the assessment of different (development) strategies, either individually or as a comparative analysis. • The method is particularly relevant early in the explorative phase for exploring the feasibility of basic ideas and solutions. 1- Brender, J., Handbook of Evaluation Methods for Health Informatics, 2006: Academic Press. p 222
  • 96. 96 30-WHO: Framework for Assessment of Strategies
  • 98. 98 Literature reviews: • Evaluation studies have been performed with different aims and objectives, and are concerned with different domains and areas of focus. 1.Rahimi, B. and V. Vimarlund, Methods to Evaluate Health Information Systems in Healthcare Settings: A Literature Review. Journal of Medical Systems, 2007. 31(5): p. 397-432.
  • 99. 99 Literature reviews about ICT evaluation • The most important study was conducted by Kaplan on CDSS evaluation [1]. • Conclusion of 27 reviews: the RCT is the standard evaluation approach for CDSS. • Problems of RCTs: they do not answer questions such as why some systems tend to be used while others are not, so the reasons for the ineffectiveness of some CDSS remain unrecognized. • 1,035 articles were reviewed by Ammenwerth and de Keizer, covering 1982 to 2002. • They concluded that the number of evaluation studies in the area of medical informatics is rising significantly. • Aspects studied include appropriateness of patient care, efficiency of patient care, user satisfaction, and software quality. • The quality of care processes and patient outcomes was found to have increased. • They interpreted this shift as a sign of the maturation of evaluation research in medical informatics. 1. Rahimi, B. and V. Vimarlund, Methods to Evaluate Health Information Systems in Healthcare Settings: A Literature Review. Journal of Medical Systems, 2007. 31(5): p. 397-432.
  • 100. 100 • Delpierre et al. reviewed 26 articles about CBPRS, from January 2000 to March 2003. • They found increased satisfaction of users and patients. • CBPRS could lead to significant changes in medical practice. • Most of the studies did not include qualitative factors such as characteristics of the disease and the tool. Rahimi, B. and V. Vimarlund, Methods to Evaluate Health Information Systems in Healthcare Settings: A Literature Review. Journal of Medical Systems, 2007. 31(5): p. 397-432.
  • 101. 101 • Some of the studies in this review evaluate the effects of the new system's implementation on the quality of work performance, such as user job performance, computer knowledge, and skill among users. • Most of the studies included in this paper used survey methodology as their research method. • Some of the studies used a clinical trial or cohort study to research the systems' outputs. • No studies have been conducted to explore the impact of IT on the system as a whole.
  • 102. 10
  • 103. 10
  • 104. 104 Free PPT Chart: ALLPPT.com - Free PowerPoint templates, diagrams and charts.

Editor's notes

  1. These applications are referred to generally as medical or clinical information systems or electronic medical records (EMRs).
  2. The word impact is used in the sense of ‘influence’. On the other hand the word ‘effect’ is used in the sense of ‘result’. This is the main difference between the two words impact and effect.
  3. A framework for systems analysis in health care was developed and applied in a health care setting. To provide a clear structure, the framework describes the potential views and levels of systems analyses in a health care environment.
  4. the tool is usually divided into four sections called quadrants. These are designed to help ensure the activities and initiatives being monitored are comprehensive and reflect a well-balanced approach to achieving the Vision
  5. Delphi is a systematic approach or method in research for eliciting the opinions of a group of experts about a topic or a question, or for reaching group consensus through a series of questionnaire rounds, while preserving the anonymity of the respondents and feeding the opinions back to the panel members.
  6. The central feature of this method of obtaining information from groups of people is that the interviewer strives to keep the discussion led by a moderator focused upon the issue of concern. The moderator behaves almost like a psycho-therapist who directs the group towards the focus of the researcher. In doing so, the moderator speaks very little, and encourages the group to generate the information required by stimulating discussion through terse provocative statements.
  7. Example of social network: the case of the eye care programme in the Brong Ahafo region, January 2010. Each square represents an actor and the arrow a relationship between two actors (i.e. the existence of a flow of information between two actors). (Source: Karl Blanchet)‏
  8. Kaplan: conclusion of 27 reviews: the RCT is the standard evaluation approach for CDSS. 1,035 articles were reviewed by Ammenwerth and de Keizer from 1982 to 2002. Delpierre et al. reviewed 26 articles about CBPRS, from January 2000 to March 2003.