Evaluation of Interactive Systems
Design/Prototype/Product
Md. Saifuddin Khalid
Assistant Professor
KANDIDATUDDANNELSEN I INFORMATIONSTEKNOLOGI, IT OG LÆRING,
MED SPECIALISERING I ORGANISATORISK OMSTILLING
(Master's Programme in Information Technology, IT and Learning,
with specialisation in Organisational Change)
Course placement: 8th semester
Course scope: 5 ECTS
Aalborg University, Aalborg, Denmark
Spring 2017
ID6-L1
Aims
• Evaluation is the fourth main phase of the interactive
systems design process
• Evaluation means reviewing, trying out or testing a design
idea, a piece of software, a product or a service to
discover whether it meets some criteria.
• After studying Benyon (2010, Chapter 10) and attending
today’s session you should be able to:
– Appreciate the uses of a range of generally applicable
evaluation techniques designed for use with and without users
1. Understand expert-based evaluation methods
2. Understand participant-based evaluation methods
3. Apply the techniques in appropriate contexts.
Categorization of design and systems
evaluation methods
• User-focused
– Analytical (without user) and
– Empirical (with user) Evaluation
• Product creation process (PCP) oriented
– Exploratory (before design or after release),
– Predictive (after design and before implementation)
– Formative (during design and implementation) and
– Summative (after implementation) Evaluation
• Expert, participant-based and context-
appropriated methods
Usability Engineering Lifecycle
Source: http://keithandrews.com/talks/2012/dd-2012-03-27/#(2)
1. Design-test-redesign
2. Design versus evaluation
Source: http://keithandrews.com/talks/2012/dd-2012-03-27/#(5)
See also, for details on Action Analysis (AA): http://user.medunigraz.at/andreas.holzinger/holzinger%20de/usability%20holzinger.html#Action
Analytic Evaluation (without users)
and Empirical Evaluation (with users)
• Analytic method
– For example, evaluation of an axe’s characteristics (e.g. design
of the bit, the weight distribution, the steel alloy used, etc.)
• Empirical method
– Studying a good axeman as he uses the tool
Rosson, M. B., & Carroll, J. M. (2002).
Formative Evaluation (during design)
and Summative Evaluation (at the end)
• Formative evaluation takes place during the design process.
– Goals: to identify aspects of a design that can be improved, to set
priorities, and in general to provide guidance in how to make changes
to a design.
– A typical formative evaluation would be to ask a user to think out
loud as he or she attempts a series of realistic tasks with a prototype
system.
• Summative evaluation happens at the end of a development process
– Goals: to answer — “Does the system meet its specified goals?” or “Is
this system better than its predecessors and competitors?”
– Most likely to happen at the end of a development process when the
system is tested to see if it has met its usability objectives.
– It can also take place at critical points during development to determine how close the system is to meeting its objectives, or to decide whether and how many additional resources to assign to a project.
Rosson, M. B., & Carroll, J. M. (2002).
ISO 9241-210: Human-centred design
for interactive systems
Source: http://uxlabs.co.uk/services/; adapted – red-coloured boxes added for 'PACT analysis' and 'Models: use cases, rich picture, user stories'
Note that the phases in Rogers et al. (2013) and ISO 9241-210 are not the same!
Evaluation is closely tied to …
• the key activities of interactive systems design: understanding, design and envisionment.
• who is involved in the evaluation
– Expert-based methods (what type of expert? A usability expert or an interaction designer?)
– Participant/end-user methods (what type of participant? Target users, other designers, students or others?)
Consider an exercise: Collect several advertisements for small, personal technologies
such as that shown in Figure 10-1. What claims are the advertisers making about
design features and benefits? What issues does this raise for their evaluation?
Source: Benyon, 2010, p. 226
The four roles that children may have in
the design of new technologies
Druin, A. (2002). The role of children in the design of new technology. Behaviour &
Information Technology, 21(1), 1–25. http://doi.org/10.1080/01449290110108659
EXPERT EVALUATION
Formal heuristic evaluation and informal expert review
Expert Evaluation: Heuristic evaluation
• Heuristic: enabling a person to
discover or learn something for
themselves.
• “Heuristic evaluation refers to a
number of methods in which a
person trained in HCI and
interaction design examines a
proposed design to see how it
measures up against a list of
principles, guidelines or ‘heuristics’
for good design. This review [of
interfaces] may be a quick
discussion over the shoulder of a
colleague, or may be a formal,
carefully documented process.”
List of the design principles – or heuristics [a specific rule-of-thumb or argument derived from experience] (see the sketch after this list):
1. Visibility
2. Consistency
3. Familiarity
4. Affordance
5. Navigation
6. Control
7. Feedback
8. Recovery
9. Constraints
10. Flexibility
11. Style
12. Conviviality
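To make the documentation step concrete, here is a minimal sketch of recording expert-review findings against the twelve heuristics above. It is not from Benyon; the `Finding` structure, the example problems and the 0–4 severity scale are illustrative assumptions (the severity scale follows common usability-engineering practice).

```python
from collections import Counter
from dataclasses import dataclass

# The twelve design principles listed above, used as the heuristic checklist.
HEURISTICS = [
    "Visibility", "Consistency", "Familiarity", "Affordance",
    "Navigation", "Control", "Feedback", "Recovery",
    "Constraints", "Flexibility", "Style", "Conviviality",
]

@dataclass
class Finding:
    """One documented problem from an expert review (illustrative structure)."""
    heuristic: str    # which principle is violated
    location: str     # screen, page or dialog where the problem was observed
    description: str  # what the evaluator saw
    severity: int     # assumed 0-4 scale: 0 = not a problem, 4 = catastrophe

def summarise(findings):
    """Count findings per heuristic so problem clusters stand out for the design team."""
    return Counter(f.heuristic for f in findings)

# Hypothetical findings from a quick review of a prototype.
findings = [
    Finding("Feedback", "checkout page", "no confirmation after submitting an order", 3),
    Finding("Recovery", "settings dialog", "no undo after deleting a profile", 4),
    Finding("Feedback", "search box", "no indication that a search is in progress", 2),
]
print(summarise(findings))  # Counter({'Feedback': 2, 'Recovery': 1})
```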
Expert Evaluation: Discount
Usability/Heuristic Evaluation
• Three overarching usability principles
– learnability (principles 1–4),
– effectiveness (principles 5–9)
– accommodation (principles 10–12).
• A ‘quick and dirty’ approach, for time-pressured evaluation
practitioners in need of feedback
• “Woolrych and Cockton (2000) conclude that the heuristics add
little advantage to an expert evaluation and the results of applying
them may be counter-productive.”
• “They (and other authors) suggest that more theoretically informed
techniques such as the cognitive walkthrough offer more robust
support for problem identification.”
Heuristic evaluation (cont.)
• Heuristic evaluation is valuable as formative evaluation, to help the
designer improve the interaction at an early stage.
• It should not be used as a summative assessment, to make claims about
the usability and other characteristics of a finished product.
• If that is what we need to do, then we must carry out properly designed
and controlled experiments with a much greater number of participants.
• Ecological validity: the results of most user testing can only ever be indicative of issues in real-life usage, because people adapt technology in ways it was not designed for. Complementary approaches include:
– Ethnographically informed observations of technologies in long-term
use
– Having users keep diaries, which can be audio-visual as well as written
– Collecting ‘bug’ reports – often these are usability problems – and
help centre queries.
Cognitive walkthrough
• The cognitive walkthrough entails a usability analyst stepping through the
cognitive tasks that must be carried out in interacting with technology.
• Inputs to the process are:
– An understanding of the people who are expected to use the system
– A set of concrete scenarios representing both (a) very common and (b)
uncommon but critical sequences of activities
– A complete description of the interface to the system, e.g. a hierarchical task analysis (HTA).
• The ‘cognitive jogthrough’ (Rowley and Rhoades, 1992) – video records
(rather than conventional minutes) are made of walkthrough meetings,
annotated to indicate significant items of interest, design suggestions are
permitted, and low level actions are aggregated wherever possible.
• The cognitive walkthrough is very often practiced (and taught) as a
technique executed by the analyst alone, to be followed in some cases by
a meeting with the design team.
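A hedged sketch of the analyst-alone variant described above: for each action in a concrete scenario, the analyst answers a set of walkthrough questions and records failures. The four questions are the commonly cited formulation from Wharton et al. (1994); the interactive recording loop and the example scenario are assumptions added for illustration.

```python
# Commonly cited cognitive walkthrough questions (Wharton et al., 1994);
# the surrounding recording loop is an illustrative assumption.
WALKTHROUGH_QUESTIONS = [
    "Will the user try to achieve the right effect?",
    "Will the user notice that the correct action is available?",
    "Will the user associate the correct action with the effect they want?",
    "If the correct action is performed, will the user see that progress is being made?",
]

def walk_through(scenario_steps):
    """Ask the analyst each question for each step; return (step, question) pairs that failed."""
    problems = []
    for step in scenario_steps:
        for question in WALKTHROUGH_QUESTIONS:
            answer = input(f"[{step}] {question} (y/n): ").strip().lower()
            if answer != "y":
                problems.append((step, question))
    return problems

if __name__ == "__main__":
    # Hypothetical concrete scenario: setting a recurring alarm on a phone.
    steps = ["open clock app", "select 'new alarm'", "set time", "enable repeat", "save"]
    for step, question in walk_through(steps):
        print(f"Potential problem at '{step}': {question}")
```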
PARTICIPANT-BASED EVALUATION
Usability Evaluation
Cooperative evaluation
• The technique is ‘cooperative’ because participants are not passive
subjects but work as co-evaluators.
Cooperative evaluation (cont.)
Sample questions during the evaluation:
What do you want to do?
What were you expecting to happen?
What is the system telling you?
Why has the system done that?
What are you doing now?
Sample questions after the session:
What was the best/worst thing about the
prototype?
What most needs changing?
How easy were the tasks?
How realistic were the tasks?
Did giving a commentary distract you?
Participatory heuristic evaluation
• The developers of participatory heuristic evaluation (Muller et al.,
1998) claim that it extends the power of heuristic evaluation
without adding greatly to the effort required.
• The procedure for the use of participatory heuristic evaluation is
just as for the expert version, but the participants are involved as
‘work-domain experts’ alongside usability experts and must be
briefed about what is required.
Co-discovery
• Co-discovery is a naturalistic,
informal technique that is particularly
good for capturing first impressions.
It is best used in the later stages of
design.
• Watching individual people
interacting with the technology, and
possibly ‘thinking aloud’ as they do
so, can be varied by having
participants explore new technology
in pairs.
• Depending on the data to be
collected, the evaluator can take an
active part in the session by asking
questions or suggesting activities, or
simply monitor the interaction either
live or using a video-recording.
Figure. Catroid Co-discovery test
Controlled experiments
• Controlled experiments are appropriate where the designer is
interested in particular features of a design, perhaps comparing one
design to another to see which is better.
• Identify the independent and dependent variables, and decide on the experimental design.
• E.g. you might judge which web design is better based on the number of clicks needed to achieve some task; alternatively, the speed of access to a function could be the dependent variable.
• Control for confounding variables – learning effects, the effects of different tasks, the effects of different background knowledge, etc.
• Considering participants’ differences, the next stage is to decide
whether each participant will participate in all conditions (so-called
within-subject design) or whether each participant will perform in
only one condition (so-called between-subject design).
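To illustrate the between-subject case, here is a minimal analysis sketch. It is not part of Benyon's text: the click counts are invented and the use of SciPy's independent-samples t-test is an assumption about how the comparison might be analysed; a within-subject design on paired scores would use `scipy.stats.ttest_rel` instead.

```python
from scipy import stats

# Dependent variable: number of clicks needed to complete the same task.
# Between-subject design: each participant used only one of the two web designs.
clicks_design_a = [7, 9, 6, 8, 10, 7, 9, 8]  # invented data
clicks_design_b = [5, 6, 4, 7, 5, 6, 5, 6]   # invented data

# Independent-samples t-test: are the mean click counts reliably different?
t_stat, p_value = stats.ttest_ind(clicks_design_a, clicks_design_b)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# If every participant had used both designs (within-subject design), the paired
# test stats.ttest_rel(...) would be used instead, and the task order would need
# counterbalancing to control the confounding learning effect mentioned above.
```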
EVALUATION IN PRACTICE
The trend in current practice
Trend
• A survey of 103 experienced practitioners of human-
centred design conducted in 2000 indicates that
– around 40 per cent of those surveyed conducted
‘usability evaluation’,
– around 30 per cent used ‘informal expert review’
– around 15 per cent used ‘formal heuristic
evaluation’
• What is the current trend?
Main steps of evaluation
project/phase
1. Establish the aims of the evaluation, the intended participants in
the evaluation, the context of use and the state of the technology;
obtain or construct scenarios illustrating how the application will
be used.
2. Select evaluation methods. These should be a combination of
expert-based review methods and participant methods.
3. Carry out expert review.
4. Plan participant testing; use the results of the expert review to
help focus this.
5. Recruit people and organize testing venue and equipment.
6. Carry out the evaluation.
7. Analyse results, document and report back to designers.
Perceived costs and benefits of
evaluation methods
Figure: A survey of user-centred design practice
(Benyon, 2010, p. 237, citing Vredenburg, K., Mao, J.-Y., Smith, P.W. and Carey, T., 2002)
Metrics (and measures)
• Three things to keep in mind when deciding metrics:
– Just because something can be measured, it doesn’t mean it
should be.
– Always refer back to the overall purpose and context of use
of the technology.
– Consider the usefulness of the data you are likely to obtain
against the resources it will take to test against the metrics.
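As a small, hypothetical illustration of keeping metrics proportionate to their purpose, two of the most common measures, task success rate and mean time on task, can be computed directly from session logs. Nothing here is from Benyon; the session records and the choice of metrics are assumptions.

```python
from statistics import mean

# Hypothetical session log: one record per participant for the same task.
sessions = [
    {"participant": "P1", "task": "book ticket", "completed": True,  "seconds": 95},
    {"participant": "P2", "task": "book ticket", "completed": False, "seconds": 240},
    {"participant": "P3", "task": "book ticket", "completed": True,  "seconds": 120},
    {"participant": "P4", "task": "book ticket", "completed": True,  "seconds": 80},
]

success_rate = sum(s["completed"] for s in sessions) / len(sessions)
time_on_task = mean(s["seconds"] for s in sessions if s["completed"])

print(f"Task success rate: {success_rate:.0%}")                           # 75%
print(f"Mean time on task (successful attempts): {time_on_task:.0f} s")   # 98 s
```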
People who will use the system
• Nielsen’s recommended sample of 3–5 participants has been
accepted wisdom in usability practice for over a decade.
• For a heterogeneous set of customers, run 3–5 people from each
group through your tests.
• If you cannot recruit any genuine participants then use convenient
‘user recruitment’ - one of your colleagues, a friend, your mother
or anyone you trust to give you a brutally honest reaction. But, be
extremely careful as to how far you generalize from your findings.
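The 3–5 participant guideline is usually justified with Nielsen and Landauer's (1993) problem-discovery model, which is not covered in Benyon's chapter and is added here as a hedged illustration: the proportion of problems found with n participants is estimated as 1 − (1 − λ)^n, where λ is the probability that one participant reveals a given problem (λ ≈ 0.31 is the often-quoted average, and should be treated as an assumption about the product being tested).

```python
# Problem-discovery model (Nielsen & Landauer, 1993):
# proportion of problems found with n users = 1 - (1 - lam)^n
LAM = 0.31  # often-quoted average discovery probability; an assumption, not a constant

for n in range(1, 11):
    found = 1 - (1 - LAM) ** n
    print(f"{n:2d} participants -> ~{found:.0%} of problems found")

# With lam = 0.31, five participants already reveal roughly 84 per cent of the
# problems, which is where the 3-5 participant rule of thumb comes from.
```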
The test plan and task specification
• A plan should be drawn up to guide the evaluation.
– Aims of the test session
– Practical details, including where and when it will be
conducted, how long each session will last, the specification of
equipment and materials for testing and data collection, and
any technical support that may be necessary
– Numbers and types of participant
– Tasks to be performed, with a definition of successful
completion. This section also specifies what data should be
collected and how it will be analysed.
• Conduct a pilot session and fix any unforeseen difficulties
Reporting usability evaluation results
to the design team
• The report should be ordered either by areas of the system
concerned, or by severity of problem.
• A face-to-face meeting may have more impact than a written
document alone (although this should always be produced as
supporting material) and this would be the ideal venue for showing
short video clips of participant problems.
• Usability problems can be fed into a ‘bug’ reporting system if one
exists.
• They can be turned into user stories as part of the sprint backlog in a Scrum development approach.
EVALUATION: FURTHER ISSUES
A mix of further methods and settings
Evaluation without being there
• Internet connectivity enables evaluation without the evaluator being physically present.
• If the application itself is Web-based, or can be installed remotely,
instructions can be supplied so that users can run test tasks and fill
in and return questionnaires in soft or hard copy.
• Online questionnaires and crowdsourcing methods are appropriate here (see Benyon, 2010, Chapter 7).
Physical and physiological measures:
Evidence of emotional reactions
• Eye-movement tracking (or ‘eye
tracking’) can show participants’
changing focus on different areas of the
screen.
• Physiological techniques in evaluation rely on the fact that our emotions produce measurable bodily changes. The most common measures are of changes in heart rate, the rate of respiration, skin temperature, blood volume pulse and galvanic skin response (an indicator of the amount of perspiration).
• Body-connected sensors are linked to software which converts the
results to numerical and graphical formats for analysis
Evaluating presence – in virtual reality
• Designers of virtual reality – and some multimedia – applications
are often concerned with the sense of presence, of being ‘there’ in
the virtual environment rather than ‘here’ in the room where the
technology is being used.
• Methods:
– Questionnaire
– Written accounts of experience/interview
– Observation in virtual environment
– Direct physiological measures
Evaluation at home
• Issues: Privacy, time
and motivation
• Methods
– Interviews
– Act out scenarios
– Diaries
“working with children is
a good way of drawing
parents into evaluation
activities.”
The investigator supplied users with Post-its
to capture their thoughts about design concepts
(Figure 10.5). An illustration of each different concept
was left in the home in a location where it might be
used, and users were encouraged to think about how
they would use the device and any issues that might
arise. These were noted on the Post-its, which were
then stuck to the illustration and collected later.
Questions and clarifications
References
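• Benyon, D. (2010). Designing Interactive Systems: A Comprehensive Guide to HCI and Interaction Design (2nd ed.). Harlow: Pearson Education.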
• Druin, A. (2002). The role of children in the design of new technology. Behaviour & Information
Technology, 21(1), 1–25. http://doi.org/10.1080/01449290110108659
• Rosson, M. B., & Carroll, J. M. (2002). Usability Engineering: Scenario-Based Development of Human-Computer Interaction (1st ed.). San Francisco: Academic Press.
Monday, 27 March 2017 Aalborg University

More Related Content

What's hot

Prototyping for Interaction Design
Prototyping for Interaction DesignPrototyping for Interaction Design
Prototyping for Interaction DesignPhilip van Allen
 
Usability test
Usability testUsability test
Usability testAnsviaLab
 
Creative research methods: Arts-based methods
Creative research methods: Arts-based methodsCreative research methods: Arts-based methods
Creative research methods: Arts-based methodsUniversity of Southampton
 
Lecture 2: Human-Computer Interaction: Conceptual Design (2014)
Lecture 2: Human-Computer Interaction: Conceptual Design (2014)Lecture 2: Human-Computer Interaction: Conceptual Design (2014)
Lecture 2: Human-Computer Interaction: Conceptual Design (2014)Lora Aroyo
 
An Introduction to Usability Testing
An Introduction to Usability TestingAn Introduction to Usability Testing
An Introduction to Usability TestingLennart Overkamp
 
Lecture 3: Human-Computer Interaction: HCI Design (2014)
Lecture 3: Human-Computer Interaction: HCI Design (2014)Lecture 3: Human-Computer Interaction: HCI Design (2014)
Lecture 3: Human-Computer Interaction: HCI Design (2014)Lora Aroyo
 
Interaction design beyond human computer interaction
Interaction design beyond human computer interactionInteraction design beyond human computer interaction
Interaction design beyond human computer interactionKenny Nguyen
 
Human Computer Interaction Chapter 3 HCI in the Software Process and Design ...
Human Computer Interaction Chapter 3 HCI in the Software Process and  Design ...Human Computer Interaction Chapter 3 HCI in the Software Process and  Design ...
Human Computer Interaction Chapter 3 HCI in the Software Process and Design ...VijiPriya Jeyamani
 
Introduction to HCI
Introduction to HCI Introduction to HCI
Introduction to HCI Deskala
 
Design thinking for Education, AUW Session 1
Design thinking  for Education, AUW Session 1Design thinking  for Education, AUW Session 1
Design thinking for Education, AUW Session 1Stefanie Panke
 
Interaction design and cognitive aspects
Interaction design and cognitive aspects Interaction design and cognitive aspects
Interaction design and cognitive aspects Andres Baravalle
 
Wimp interface
Wimp interfaceWimp interface
Wimp interfaceAbrish06
 
Qualitative research method
Qualitative research methodQualitative research method
Qualitative research methodmetalkid132
 
Human Computer Interaction Introduction
Human Computer Interaction IntroductionHuman Computer Interaction Introduction
Human Computer Interaction IntroductionN.Jagadish Kumar
 
Software prototyping
Software prototypingSoftware prototyping
Software prototypingBirju Tank
 
Introduction of stress testing
Introduction of stress testingIntroduction of stress testing
Introduction of stress testingPankaj Kumar
 

What's hot (20)

Prototyping for Interaction Design
Prototyping for Interaction DesignPrototyping for Interaction Design
Prototyping for Interaction Design
 
Usability test
Usability testUsability test
Usability test
 
Creative research methods: Arts-based methods
Creative research methods: Arts-based methodsCreative research methods: Arts-based methods
Creative research methods: Arts-based methods
 
Lecture 2: Human-Computer Interaction: Conceptual Design (2014)
Lecture 2: Human-Computer Interaction: Conceptual Design (2014)Lecture 2: Human-Computer Interaction: Conceptual Design (2014)
Lecture 2: Human-Computer Interaction: Conceptual Design (2014)
 
An Introduction to Usability Testing
An Introduction to Usability TestingAn Introduction to Usability Testing
An Introduction to Usability Testing
 
Lecture 3: Human-Computer Interaction: HCI Design (2014)
Lecture 3: Human-Computer Interaction: HCI Design (2014)Lecture 3: Human-Computer Interaction: HCI Design (2014)
Lecture 3: Human-Computer Interaction: HCI Design (2014)
 
Interaction design beyond human computer interaction
Interaction design beyond human computer interactionInteraction design beyond human computer interaction
Interaction design beyond human computer interaction
 
Human Computer Interaction Chapter 3 HCI in the Software Process and Design ...
Human Computer Interaction Chapter 3 HCI in the Software Process and  Design ...Human Computer Interaction Chapter 3 HCI in the Software Process and  Design ...
Human Computer Interaction Chapter 3 HCI in the Software Process and Design ...
 
Introduction to HCI
Introduction to HCI Introduction to HCI
Introduction to HCI
 
Design thinking for Education, AUW Session 1
Design thinking  for Education, AUW Session 1Design thinking  for Education, AUW Session 1
Design thinking for Education, AUW Session 1
 
Affordances
Affordances Affordances
Affordances
 
Interaction design and cognitive aspects
Interaction design and cognitive aspects Interaction design and cognitive aspects
Interaction design and cognitive aspects
 
Heuristic evaluation
Heuristic evaluationHeuristic evaluation
Heuristic evaluation
 
Uxpa design thinking workshop
Uxpa design thinking workshopUxpa design thinking workshop
Uxpa design thinking workshop
 
Wimp interface
Wimp interfaceWimp interface
Wimp interface
 
Chap05
Chap05Chap05
Chap05
 
Qualitative research method
Qualitative research methodQualitative research method
Qualitative research method
 
Human Computer Interaction Introduction
Human Computer Interaction IntroductionHuman Computer Interaction Introduction
Human Computer Interaction Introduction
 
Software prototyping
Software prototypingSoftware prototyping
Software prototyping
 
Introduction of stress testing
Introduction of stress testingIntroduction of stress testing
Introduction of stress testing
 

Viewers also liked

Introduction to Interaction Design Course 2017 (5 ECTS)
Introduction to Interaction Design Course 2017 (5 ECTS)Introduction to Interaction Design Course 2017 (5 ECTS)
Introduction to Interaction Design Course 2017 (5 ECTS)Khalid Md Saifuddin
 
Scrum and ISO 9241:210 Interaction Design Process and User Stories
Scrum and ISO 9241:210 Interaction Design Process and User StoriesScrum and ISO 9241:210 Interaction Design Process and User Stories
Scrum and ISO 9241:210 Interaction Design Process and User StoriesKhalid Md Saifuddin
 
Product Strategy and Product Success
Product Strategy and Product SuccessProduct Strategy and Product Success
Product Strategy and Product SuccessRoman Pichler
 
Lecture 1 rapid android development
Lecture 1   rapid android developmentLecture 1   rapid android development
Lecture 1 rapid android developmentFernando Loizides
 
Designing interactive systems - going beyond cognitive domain
Designing interactive systems - going beyond cognitive domainDesigning interactive systems - going beyond cognitive domain
Designing interactive systems - going beyond cognitive domainSanjay Goel
 
Interactive Design with your Left Brain
Interactive Design with your Left BrainInteractive Design with your Left Brain
Interactive Design with your Left BrainMing Chan
 
What does mapping controversies mean and why controversies are charted and vi...
What does mapping controversies mean and why controversies are charted and vi...What does mapping controversies mean and why controversies are charted and vi...
What does mapping controversies mean and why controversies are charted and vi...Khalid Md Saifuddin
 
Cartography of Controversies about MOOCs
Cartography of Controversies about MOOCsCartography of Controversies about MOOCs
Cartography of Controversies about MOOCsKhalid Md Saifuddin
 
Design Patterns for Interactive Video Narratives
Design Patterns for Interactive Video NarrativesDesign Patterns for Interactive Video Narratives
Design Patterns for Interactive Video NarrativesFlikli
 
Six Pattern of Interactive Storytelling
Six Pattern of Interactive StorytellingSix Pattern of Interactive Storytelling
Six Pattern of Interactive StorytellingChristian Riedel
 
COMP 4010: Lecture11 AR Interaction
COMP 4010: Lecture11 AR InteractionCOMP 4010: Lecture11 AR Interaction
COMP 4010: Lecture11 AR InteractionMark Billinghurst
 
Mercuri Urval Assessment Overview
Mercuri Urval Assessment OverviewMercuri Urval Assessment Overview
Mercuri Urval Assessment OverviewMartin Ystrøm
 
Virtual Reality: Sensing the Possibilities
Virtual Reality: Sensing the PossibilitiesVirtual Reality: Sensing the Possibilities
Virtual Reality: Sensing the PossibilitiesMark Billinghurst
 
Building VR Applications For Google Cardboard
Building VR Applications For Google CardboardBuilding VR Applications For Google Cardboard
Building VR Applications For Google CardboardMark Billinghurst
 

Viewers also liked (17)

Introduction to Interaction Design Course 2017 (5 ECTS)
Introduction to Interaction Design Course 2017 (5 ECTS)Introduction to Interaction Design Course 2017 (5 ECTS)
Introduction to Interaction Design Course 2017 (5 ECTS)
 
Scrum and ISO 9241:210 Interaction Design Process and User Stories
Scrum and ISO 9241:210 Interaction Design Process and User StoriesScrum and ISO 9241:210 Interaction Design Process and User Stories
Scrum and ISO 9241:210 Interaction Design Process and User Stories
 
Product Strategy and Product Success
Product Strategy and Product SuccessProduct Strategy and Product Success
Product Strategy and Product Success
 
Interactive Systems Design Project
Interactive Systems Design ProjectInteractive Systems Design Project
Interactive Systems Design Project
 
Lecture 1 rapid android development
Lecture 1   rapid android developmentLecture 1   rapid android development
Lecture 1 rapid android development
 
Designing interactive systems - going beyond cognitive domain
Designing interactive systems - going beyond cognitive domainDesigning interactive systems - going beyond cognitive domain
Designing interactive systems - going beyond cognitive domain
 
Interactive Design with your Left Brain
Interactive Design with your Left BrainInteractive Design with your Left Brain
Interactive Design with your Left Brain
 
What does mapping controversies mean and why controversies are charted and vi...
What does mapping controversies mean and why controversies are charted and vi...What does mapping controversies mean and why controversies are charted and vi...
What does mapping controversies mean and why controversies are charted and vi...
 
Cartography of Controversies about MOOCs
Cartography of Controversies about MOOCsCartography of Controversies about MOOCs
Cartography of Controversies about MOOCs
 
Design Patterns for Interactive Video Narratives
Design Patterns for Interactive Video NarrativesDesign Patterns for Interactive Video Narratives
Design Patterns for Interactive Video Narratives
 
Six Pattern of Interactive Storytelling
Six Pattern of Interactive StorytellingSix Pattern of Interactive Storytelling
Six Pattern of Interactive Storytelling
 
COMP 4010: Lecture11 AR Interaction
COMP 4010: Lecture11 AR InteractionCOMP 4010: Lecture11 AR Interaction
COMP 4010: Lecture11 AR Interaction
 
AR-VR Workshop
AR-VR WorkshopAR-VR Workshop
AR-VR Workshop
 
Mercuri Urval Assessment Overview
Mercuri Urval Assessment OverviewMercuri Urval Assessment Overview
Mercuri Urval Assessment Overview
 
Virtual Reality: Sensing the Possibilities
Virtual Reality: Sensing the PossibilitiesVirtual Reality: Sensing the Possibilities
Virtual Reality: Sensing the Possibilities
 
Loque debe tener un proyecto de investigación
Loque debe tener un proyecto de investigaciónLoque debe tener un proyecto de investigación
Loque debe tener un proyecto de investigación
 
Building VR Applications For Google Cardboard
Building VR Applications For Google CardboardBuilding VR Applications For Google Cardboard
Building VR Applications For Google Cardboard
 

Similar to Evaluation of Interactive Systems Design or Prototype or Product

User Centered Design Process to Develop a Multi-modal Family Needs Assessment...
User Centered Design Process to Develop a Multi-modal Family Needs Assessment...User Centered Design Process to Develop a Multi-modal Family Needs Assessment...
User Centered Design Process to Develop a Multi-modal Family Needs Assessment...Arthi Krishnaswami
 
Visualising Learner Behaviours in MOOCs - Ascilite 2018 presentation
Visualising Learner Behaviours in MOOCs - Ascilite 2018 presentationVisualising Learner Behaviours in MOOCs - Ascilite 2018 presentation
Visualising Learner Behaviours in MOOCs - Ascilite 2018 presentationUniversity of Newcastle, NSW.
 
2013 UX RESEARCH - Usability Testing Approaches
2013 UX RESEARCH - Usability Testing Approaches2013 UX RESEARCH - Usability Testing Approaches
2013 UX RESEARCH - Usability Testing ApproachesVanessa Speziale
 
2015-11-11 research seminar
2015-11-11 research seminar2015-11-11 research seminar
2015-11-11 research seminarifi8106tlu
 
11 - Evaluating Framework in Interaction Design_new.pptx
11 - Evaluating Framework in Interaction Design_new.pptx11 - Evaluating Framework in Interaction Design_new.pptx
11 - Evaluating Framework in Interaction Design_new.pptxZahirahZairul2
 
ISEC'18 Tutorial: Research Methodology on Pursuing Impact-Driven Research
ISEC'18 Tutorial: Research Methodology on Pursuing Impact-Driven ResearchISEC'18 Tutorial: Research Methodology on Pursuing Impact-Driven Research
ISEC'18 Tutorial: Research Methodology on Pursuing Impact-Driven ResearchTao Xie
 
Euro symposium Action Design Research practise 19092019
Euro symposium Action Design Research practise 19092019Euro symposium Action Design Research practise 19092019
Euro symposium Action Design Research practise 19092019Matti Rossi
 
Information Systems Action design research method
Information Systems Action design research methodInformation Systems Action design research method
Information Systems Action design research methodRaimo Halinen
 
How to Conduct Usability Studies: A Librarian Primer
How to Conduct Usability Studies: A Librarian PrimerHow to Conduct Usability Studies: A Librarian Primer
How to Conduct Usability Studies: A Librarian PrimerTao Zhang
 
HCI(Human Computer Interaction)-PPT-REPORT.pptx
HCI(Human Computer Interaction)-PPT-REPORT.pptxHCI(Human Computer Interaction)-PPT-REPORT.pptx
HCI(Human Computer Interaction)-PPT-REPORT.pptxvliencycapateiii
 
Usability heuristics for documentation by Anupama Gummaraju
Usability heuristics for documentation by Anupama GummarajuUsability heuristics for documentation by Anupama Gummaraju
Usability heuristics for documentation by Anupama GummarajuSTC India UX SIG
 
evaluation technique uni 2
evaluation technique uni 2evaluation technique uni 2
evaluation technique uni 2vrgokila
 
Measuring the user experience
Measuring the user experienceMeasuring the user experience
Measuring the user experienceAndres Baravalle
 
The Pragmatic Evaluation of Tool System Interoperability
The Pragmatic Evaluation of Tool System InteroperabilityThe Pragmatic Evaluation of Tool System Interoperability
The Pragmatic Evaluation of Tool System InteroperabilityCommunitySense
 
Ignacio panach ormeño et-al_caise2013
Ignacio panach   ormeño et-al_caise2013Ignacio panach   ormeño et-al_caise2013
Ignacio panach ormeño et-al_caise2013caise2013vlc
 
[2017/2018] RESEARCH in software engineering
[2017/2018] RESEARCH in software engineering[2017/2018] RESEARCH in software engineering
[2017/2018] RESEARCH in software engineeringIvano Malavolta
 
Assessment Analytics - EUNIS 2015 E-Learning Task Force Workshop
Assessment Analytics - EUNIS 2015 E-Learning Task Force WorkshopAssessment Analytics - EUNIS 2015 E-Learning Task Force Workshop
Assessment Analytics - EUNIS 2015 E-Learning Task Force WorkshopLACE Project
 
Unit 3_Evaluation Technique.pptx
Unit 3_Evaluation Technique.pptxUnit 3_Evaluation Technique.pptx
Unit 3_Evaluation Technique.pptxssuser50f868
 
UX Design Process | Sample Proposal
UX Design Process | Sample Proposal UX Design Process | Sample Proposal
UX Design Process | Sample Proposal Marta Fioni
 

Similar to Evaluation of Interactive Systems Design or Prototype or Product (20)

A brief overview of ux research
A brief overview of ux researchA brief overview of ux research
A brief overview of ux research
 
User Centered Design Process to Develop a Multi-modal Family Needs Assessment...
User Centered Design Process to Develop a Multi-modal Family Needs Assessment...User Centered Design Process to Develop a Multi-modal Family Needs Assessment...
User Centered Design Process to Develop a Multi-modal Family Needs Assessment...
 
Visualising Learner Behaviours in MOOCs - Ascilite 2018 presentation
Visualising Learner Behaviours in MOOCs - Ascilite 2018 presentationVisualising Learner Behaviours in MOOCs - Ascilite 2018 presentation
Visualising Learner Behaviours in MOOCs - Ascilite 2018 presentation
 
2013 UX RESEARCH - Usability Testing Approaches
2013 UX RESEARCH - Usability Testing Approaches2013 UX RESEARCH - Usability Testing Approaches
2013 UX RESEARCH - Usability Testing Approaches
 
2015-11-11 research seminar
2015-11-11 research seminar2015-11-11 research seminar
2015-11-11 research seminar
 
11 - Evaluating Framework in Interaction Design_new.pptx
11 - Evaluating Framework in Interaction Design_new.pptx11 - Evaluating Framework in Interaction Design_new.pptx
11 - Evaluating Framework in Interaction Design_new.pptx
 
ISEC'18 Tutorial: Research Methodology on Pursuing Impact-Driven Research
ISEC'18 Tutorial: Research Methodology on Pursuing Impact-Driven ResearchISEC'18 Tutorial: Research Methodology on Pursuing Impact-Driven Research
ISEC'18 Tutorial: Research Methodology on Pursuing Impact-Driven Research
 
Euro symposium Action Design Research practise 19092019
Euro symposium Action Design Research practise 19092019Euro symposium Action Design Research practise 19092019
Euro symposium Action Design Research practise 19092019
 
Information Systems Action design research method
Information Systems Action design research methodInformation Systems Action design research method
Information Systems Action design research method
 
How to Conduct Usability Studies: A Librarian Primer
How to Conduct Usability Studies: A Librarian PrimerHow to Conduct Usability Studies: A Librarian Primer
How to Conduct Usability Studies: A Librarian Primer
 
HCI(Human Computer Interaction)-PPT-REPORT.pptx
HCI(Human Computer Interaction)-PPT-REPORT.pptxHCI(Human Computer Interaction)-PPT-REPORT.pptx
HCI(Human Computer Interaction)-PPT-REPORT.pptx
 
Usability heuristics for documentation by Anupama Gummaraju
Usability heuristics for documentation by Anupama GummarajuUsability heuristics for documentation by Anupama Gummaraju
Usability heuristics for documentation by Anupama Gummaraju
 
evaluation technique uni 2
evaluation technique uni 2evaluation technique uni 2
evaluation technique uni 2
 
Measuring the user experience
Measuring the user experienceMeasuring the user experience
Measuring the user experience
 
The Pragmatic Evaluation of Tool System Interoperability
The Pragmatic Evaluation of Tool System InteroperabilityThe Pragmatic Evaluation of Tool System Interoperability
The Pragmatic Evaluation of Tool System Interoperability
 
Ignacio panach ormeño et-al_caise2013
Ignacio panach   ormeño et-al_caise2013Ignacio panach   ormeño et-al_caise2013
Ignacio panach ormeño et-al_caise2013
 
[2017/2018] RESEARCH in software engineering
[2017/2018] RESEARCH in software engineering[2017/2018] RESEARCH in software engineering
[2017/2018] RESEARCH in software engineering
 
Assessment Analytics - EUNIS 2015 E-Learning Task Force Workshop
Assessment Analytics - EUNIS 2015 E-Learning Task Force WorkshopAssessment Analytics - EUNIS 2015 E-Learning Task Force Workshop
Assessment Analytics - EUNIS 2015 E-Learning Task Force Workshop
 
Unit 3_Evaluation Technique.pptx
Unit 3_Evaluation Technique.pptxUnit 3_Evaluation Technique.pptx
Unit 3_Evaluation Technique.pptx
 
UX Design Process | Sample Proposal
UX Design Process | Sample Proposal UX Design Process | Sample Proposal
UX Design Process | Sample Proposal
 

More from Khalid Md Saifuddin

Learning Technology for Improving Teaching Quality at Scale
Learning Technology for Improving Teaching Quality at ScaleLearning Technology for Improving Teaching Quality at Scale
Learning Technology for Improving Teaching Quality at ScaleKhalid Md Saifuddin
 
Students find it difficult to explain what skills they need and have
Students find it difficult to explain what skills they need and haveStudents find it difficult to explain what skills they need and have
Students find it difficult to explain what skills they need and haveKhalid Md Saifuddin
 
Cross-location and Cross-disciplinary Collaborative Prototyping Using Virtual...
Cross-location and Cross-disciplinary Collaborative Prototyping Using Virtual...Cross-location and Cross-disciplinary Collaborative Prototyping Using Virtual...
Cross-location and Cross-disciplinary Collaborative Prototyping Using Virtual...Khalid Md Saifuddin
 
Problematizing the rapid changes in didactics, material and spatial condition...
Problematizing the rapid changes in didactics, material and spatial condition...Problematizing the rapid changes in didactics, material and spatial condition...
Problematizing the rapid changes in didactics, material and spatial condition...Khalid Md Saifuddin
 
Khalid and Peture E-Learn 2016 2 Washington DC
Khalid and Peture E-Learn 2016 2 Washington DCKhalid and Peture E-Learn 2016 2 Washington DC
Khalid and Peture E-Learn 2016 2 Washington DCKhalid Md Saifuddin
 
Considerations and Methods for Usability Testing with School Children
Considerations and Methods for Usability Testing with School ChildrenConsiderations and Methods for Usability Testing with School Children
Considerations and Methods for Usability Testing with School ChildrenKhalid Md Saifuddin
 
Workshop introduction to web 2.0 technologies and educational application of...
Workshop  introduction to web 2.0 technologies and educational application of...Workshop  introduction to web 2.0 technologies and educational application of...
Workshop introduction to web 2.0 technologies and educational application of...Khalid Md Saifuddin
 
Systematic review and meta-analysis of teachers’ development of digital literacy
Systematic review and meta-analysis of teachers’ development of digital literacySystematic review and meta-analysis of teachers’ development of digital literacy
Systematic review and meta-analysis of teachers’ development of digital literacyKhalid Md Saifuddin
 
CSCL 2015 Tracing Sequential Video Production
CSCL 2015 Tracing Sequential Video ProductionCSCL 2015 Tracing Sequential Video Production
CSCL 2015 Tracing Sequential Video ProductionKhalid Md Saifuddin
 
Competence and education development in the globalization perspective
Competence and education development in the globalization perspectiveCompetence and education development in the globalization perspective
Competence and education development in the globalization perspectiveKhalid Md Saifuddin
 
Software-based Clicker Systems for Classroom Activities of Secondary and Tert...
Software-based Clicker Systems for Classroom Activities of Secondary and Tert...Software-based Clicker Systems for Classroom Activities of Secondary and Tert...
Software-based Clicker Systems for Classroom Activities of Secondary and Tert...Khalid Md Saifuddin
 
Barriers to the Integration and Adoption of iPads in Schools: A Systematic Li...
Barriers to the Integration and Adoption of iPads in Schools: A Systematic Li...Barriers to the Integration and Adoption of iPads in Schools: A Systematic Li...
Barriers to the Integration and Adoption of iPads in Schools: A Systematic Li...Khalid Md Saifuddin
 
Barriers to the integration and adoption of ICT in Developing Countries
Barriers to the integration and adoption of ICT in Developing CountriesBarriers to the integration and adoption of ICT in Developing Countries
Barriers to the integration and adoption of ICT in Developing CountriesKhalid Md Saifuddin
 
Exploring the use of i pads in danish schools
Exploring the use of i pads in danish schoolsExploring the use of i pads in danish schools
Exploring the use of i pads in danish schoolsKhalid Md Saifuddin
 
A Belief Rule Based (BRB) Decision Support System to Assess Clinical Asthma S...
A Belief Rule Based (BRB) Decision Support System to Assess Clinical Asthma S...A Belief Rule Based (BRB) Decision Support System to Assess Clinical Asthma S...
A Belief Rule Based (BRB) Decision Support System to Assess Clinical Asthma S...Khalid Md Saifuddin
 
Facilitating Adoption of Web Tools for Problem and Project Based Learning Act...
Facilitating Adoption of Web Tools for Problem and Project Based Learning Act...Facilitating Adoption of Web Tools for Problem and Project Based Learning Act...
Facilitating Adoption of Web Tools for Problem and Project Based Learning Act...Khalid Md Saifuddin
 
School-Centered Diffusion of Learning Media and Technologies in Rural Bangladesh
School-Centered Diffusion of Learning Media and Technologies in Rural BangladeshSchool-Centered Diffusion of Learning Media and Technologies in Rural Bangladesh
School-Centered Diffusion of Learning Media and Technologies in Rural BangladeshKhalid Md Saifuddin
 
Pecha Kucha Made for JTEL Summer School 2011 Chania, Greece
Pecha Kucha Made for JTEL Summer School 2011 Chania, GreecePecha Kucha Made for JTEL Summer School 2011 Chania, Greece
Pecha Kucha Made for JTEL Summer School 2011 Chania, GreeceKhalid Md Saifuddin
 
ICT in Education-Secondary Technical Vocational Education and Training Instit...
ICT in Education-Secondary Technical Vocational Education and Training Instit...ICT in Education-Secondary Technical Vocational Education and Training Instit...
ICT in Education-Secondary Technical Vocational Education and Training Instit...Khalid Md Saifuddin
 
digital divide between teachers and students in urban bangladesh
digital divide between teachers and students in urban bangladeshdigital divide between teachers and students in urban bangladesh
digital divide between teachers and students in urban bangladeshKhalid Md Saifuddin
 

More from Khalid Md Saifuddin (20)

Learning Technology for Improving Teaching Quality at Scale
Learning Technology for Improving Teaching Quality at ScaleLearning Technology for Improving Teaching Quality at Scale
Learning Technology for Improving Teaching Quality at Scale
 
Students find it difficult to explain what skills they need and have
Students find it difficult to explain what skills they need and haveStudents find it difficult to explain what skills they need and have
Students find it difficult to explain what skills they need and have
 
Cross-location and Cross-disciplinary Collaborative Prototyping Using Virtual...
Cross-location and Cross-disciplinary Collaborative Prototyping Using Virtual...Cross-location and Cross-disciplinary Collaborative Prototyping Using Virtual...
Cross-location and Cross-disciplinary Collaborative Prototyping Using Virtual...
 
Problematizing the rapid changes in didactics, material and spatial condition...
Problematizing the rapid changes in didactics, material and spatial condition...Problematizing the rapid changes in didactics, material and spatial condition...
Problematizing the rapid changes in didactics, material and spatial condition...
 
Khalid and Peture E-Learn 2016 2 Washington DC
Khalid and Peture E-Learn 2016 2 Washington DCKhalid and Peture E-Learn 2016 2 Washington DC
Khalid and Peture E-Learn 2016 2 Washington DC
 
Considerations and Methods for Usability Testing with School Children
Considerations and Methods for Usability Testing with School ChildrenConsiderations and Methods for Usability Testing with School Children
Considerations and Methods for Usability Testing with School Children
 
Workshop introduction to web 2.0 technologies and educational application of...
Workshop  introduction to web 2.0 technologies and educational application of...Workshop  introduction to web 2.0 technologies and educational application of...
Workshop introduction to web 2.0 technologies and educational application of...
 
Systematic review and meta-analysis of teachers’ development of digital literacy
Systematic review and meta-analysis of teachers’ development of digital literacySystematic review and meta-analysis of teachers’ development of digital literacy
Systematic review and meta-analysis of teachers’ development of digital literacy
 
CSCL 2015 Tracing Sequential Video Production
CSCL 2015 Tracing Sequential Video ProductionCSCL 2015 Tracing Sequential Video Production
CSCL 2015 Tracing Sequential Video Production
 
Competence and education development in the globalization perspective
Competence and education development in the globalization perspectiveCompetence and education development in the globalization perspective
Competence and education development in the globalization perspective
 
Software-based Clicker Systems for Classroom Activities of Secondary and Tert...
Software-based Clicker Systems for Classroom Activities of Secondary and Tert...Software-based Clicker Systems for Classroom Activities of Secondary and Tert...
Software-based Clicker Systems for Classroom Activities of Secondary and Tert...
 
Barriers to the Integration and Adoption of iPads in Schools: A Systematic Li...
Barriers to the Integration and Adoption of iPads in Schools: A Systematic Li...Barriers to the Integration and Adoption of iPads in Schools: A Systematic Li...
Barriers to the Integration and Adoption of iPads in Schools: A Systematic Li...
 
Barriers to the integration and adoption of ICT in Developing Countries
Barriers to the integration and adoption of ICT in Developing CountriesBarriers to the integration and adoption of ICT in Developing Countries
Barriers to the integration and adoption of ICT in Developing Countries
 
Exploring the use of i pads in danish schools
Exploring the use of i pads in danish schoolsExploring the use of i pads in danish schools
Exploring the use of i pads in danish schools
 
A Belief Rule Based (BRB) Decision Support System to Assess Clinical Asthma S...
A Belief Rule Based (BRB) Decision Support System to Assess Clinical Asthma S...A Belief Rule Based (BRB) Decision Support System to Assess Clinical Asthma S...
A Belief Rule Based (BRB) Decision Support System to Assess Clinical Asthma S...
 
Facilitating Adoption of Web Tools for Problem and Project Based Learning Act...
Facilitating Adoption of Web Tools for Problem and Project Based Learning Act...Facilitating Adoption of Web Tools for Problem and Project Based Learning Act...
Facilitating Adoption of Web Tools for Problem and Project Based Learning Act...
 
School-Centered Diffusion of Learning Media and Technologies in Rural Bangladesh
School-Centered Diffusion of Learning Media and Technologies in Rural BangladeshSchool-Centered Diffusion of Learning Media and Technologies in Rural Bangladesh
School-Centered Diffusion of Learning Media and Technologies in Rural Bangladesh
 
Pecha Kucha Made for JTEL Summer School 2011 Chania, Greece
Pecha Kucha Made for JTEL Summer School 2011 Chania, GreecePecha Kucha Made for JTEL Summer School 2011 Chania, Greece
Pecha Kucha Made for JTEL Summer School 2011 Chania, Greece
 
ICT in Education-Secondary Technical Vocational Education and Training Instit...
ICT in Education-Secondary Technical Vocational Education and Training Instit...ICT in Education-Secondary Technical Vocational Education and Training Instit...
ICT in Education-Secondary Technical Vocational Education and Training Instit...
 
digital divide between teachers and students in urban bangladesh
digital divide between teachers and students in urban bangladeshdigital divide between teachers and students in urban bangladesh
digital divide between teachers and students in urban bangladesh
 

Recently uploaded

The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxMalak Abu Hammad
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)Gabriella Davis
 
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticsKotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticscarlostorres15106
 
From Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time AutomationFrom Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time AutomationSafe Software
 
Maximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxMaximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxOnBoard
 
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...HostedbyConfluent
 
Swan(sea) Song – personal research during my six years at Swansea ... and bey...
Swan(sea) Song – personal research during my six years at Swansea ... and bey...Swan(sea) Song – personal research during my six years at Swansea ... and bey...
Swan(sea) Song – personal research during my six years at Swansea ... and bey...Alan Dix
 
Benefits Of Flutter Compared To Other Frameworks
Benefits Of Flutter Compared To Other FrameworksBenefits Of Flutter Compared To Other Frameworks
Benefits Of Flutter Compared To Other FrameworksSoftradix Technologies
 
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Patryk Bandurski
 
Slack Application Development 101 Slides
Slack Application Development 101 SlidesSlack Application Development 101 Slides
Slack Application Development 101 Slidespraypatel2
 
[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdfhans926745
 
Breaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountBreaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountPuma Security, LLC
 
Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Allon Mureinik
 
Key Features Of Token Development (1).pptx
Key  Features Of Token  Development (1).pptxKey  Features Of Token  Development (1).pptx
Key Features Of Token Development (1).pptxLBM Solutions
 
08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking MenDelhi Call girls
 
My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationRidwan Fadjar
 
08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking MenDelhi Call girls
 
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking MenDelhi Call girls
 
Evaluation of Interactive Systems Design/Prototype/Product (slide transcript)

  • 9. Evaluation is closely tied to …
    • the other key activities of interactive systems design: understanding, design and envisionment
    • who is involved in the evaluation
      – Expert-based methods (what type of expert? A usability expert or an interaction designer?)
      – Participant/end-user methods (what type? Target users, other designers, students or others?)
    Consider an exercise: collect several advertisements for small, personal technologies such as that shown in Figure 10-1. What claims are the advertisers making about design features and benefits? What issues does this raise for their evaluation?
    Monday, 27 March 2017 Aalborg University
    Source: Benyon, 2010, p. 226
  • 10. The four roles that children may have in the design of new technologies
    Monday, 27 March 2017 Aalborg University
    Druin, A. (2002). The role of children in the design of new technology. Behaviour & Information Technology, 21(1), 1–25. http://doi.org/10.1080/01449290110108659
  • 11. EXPERT EVALUATION
    Formal heuristic evaluation and informal expert review
    Monday, 27 March 2017 Aalborg University
  • 12. Expert Evaluation: Heuristic evaluation
    • Heuristic: enabling a person to discover or learn something for themselves.
    • “Heuristic evaluation refers to a number of methods in which a person trained in HCI and interaction design examines a proposed design to see how it measures up against a list of principles, guidelines or ‘heuristics’ for good design. This review [of interfaces] may be a quick discussion over the shoulder of a colleague, or may be a formal, carefully documented process.”
    • List of the design principles, or heuristics (a specific rule-of-thumb or argument derived from experience):
      1. Visibility
      2. Consistency
      3. Familiarity
      4. Affordance
      5. Navigation
      6. Control
      7. Feedback
      8. Recovery
      9. Constraints
      10. Flexibility
      11. Style
      12. Conviviality
    Monday, 27 March 2017 Aalborg University
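To make working through a heuristic list concrete, here is a minimal Python sketch of how a reviewer might record findings against the twelve principles above. The severity scale, screen names and field names are illustrative assumptions, not part of the slides or of Benyon's method.

    # A minimal sketch of recording heuristic-evaluation findings.
    # All names and the 0-4 severity scale are illustrative assumptions.

    BENYON_PRINCIPLES = [
        "Visibility", "Consistency", "Familiarity", "Affordance",
        "Navigation", "Control", "Feedback", "Recovery",
        "Constraints", "Flexibility", "Style", "Conviviality",
    ]

    # 0 = not a problem ... 4 = usability catastrophe (assumed scale)
    findings = [
        {"screen": "checkout", "principle": "Feedback",
         "severity": 3, "note": "No confirmation after pressing 'Pay'."},
        {"screen": "search", "principle": "Recovery",
         "severity": 2, "note": "No way to undo a cleared filter set."},
    ]

    # Guard against typos in the principle names.
    for f in findings:
        assert f["principle"] in BENYON_PRINCIPLES

    def summarise(findings):
        """Group findings by principle, worst severity first."""
        by_principle = {}
        for f in findings:
            by_principle.setdefault(f["principle"], []).append(f)
        return sorted(by_principle.items(),
                      key=lambda kv: -max(f["severity"] for f in kv[1]))

    for principle, items in summarise(findings):
        print(principle, [i["note"] for i in items])

A structured record like this also makes it easy to order the final report by area or by severity, as discussed on slide 30 below.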
  • 13. Expert Evaluation: Discount Usability/Heuristic Evaluation
    • Three overarching usability principles:
      – learnability (principles 1–4)
      – effectiveness (principles 5–9)
      – accommodation (principles 10–12)
    • A ‘quick and dirty’ approach, for time-pressured evaluation practitioners in need of feedback.
    • “Woolrych and Cockton (2000) conclude that the heuristics add little advantage to an expert evaluation and the results of applying them may be counter-productive.”
    • “They (and other authors) suggest that more theoretically informed techniques such as the cognitive walkthrough offer more robust support for problem identification.”
    Monday, 27 March 2017 Aalborg University
  • 14. Heuristic evaluation (cont.)
    • Heuristic evaluation is valuable as formative evaluation, to help the designer improve the interaction at an early stage.
    • It should not be used as a summative assessment, to make claims about the usability and other characteristics of a finished product.
    • If that is what we need to do, then we must carry out properly designed and controlled experiments with a much greater number of participants.
    • Ecological validity: the results of most user testing can only ever be indicative of issues in real-life usage, because people adapt technology in ways it was not designed for. So consider:
      – Ethnographically informed observations of technologies in long-term use
      – Having users keep diaries, which can be audio-visual as well as written
      – Collecting ‘bug’ reports – often these are usability problems – and help centre queries
    Monday, 27 March 2017 Aalborg University
  • 15. Cognitive walkthrough
    • The cognitive walkthrough entails a usability analyst stepping through the cognitive tasks that must be carried out in interacting with technology.
    • Inputs to the process are:
      – An understanding of the people who are expected to use the system
      – A set of concrete scenarios representing both (a) very common and (b) uncommon but critical sequences of activities
      – A complete description of the interface to the system, such as a hierarchical task analysis (HTA)
    • The ‘cognitive jogthrough’ (Rowley and Rhoades, 1992): video records (rather than conventional minutes) are made of walkthrough meetings and annotated to indicate significant items of interest; design suggestions are permitted, and low-level actions are aggregated wherever possible.
    • The cognitive walkthrough is very often practised (and taught) as a technique executed by the analyst alone, followed in some cases by a meeting with the design team.
    Monday, 27 March 2017 Aalborg University
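As an illustration of the inputs listed above, the following Python sketch pairs a small hierarchical task analysis with the four walkthrough questions in the commonly cited formulation of Wharton et al. (1994); the task, helper names and data layout are assumptions for illustration only, not material from the slides.

    # A minimal sketch of a cognitive walkthrough driven by an HTA.
    # The four questions follow the commonly cited Wharton et al. (1994)
    # formulation; the scenario and structure are invented for illustration.

    WALKTHROUGH_QUESTIONS = [
        "Will the user try to achieve the right effect?",
        "Will the user notice that the correct action is available?",
        "Will the user associate the action with the effect they want?",
        "If the correct action is taken, will the user see that progress is being made?",
    ]

    # HTA for a simple scenario: a goal decomposed into ordered actions.
    hta = {
        "goal": "Send a photo to a contact",
        "actions": ["Open conversation", "Tap attachment icon",
                    "Choose 'Photo'", "Select photo", "Tap 'Send'"],
    }

    def walkthrough(hta):
        """Yield one record per action/question pair for the analyst to answer."""
        for step, action in enumerate(hta["actions"], start=1):
            for q in WALKTHROUGH_QUESTIONS:
                yield {"step": step, "action": action, "question": q, "answer": None}

    records = list(walkthrough(hta))
    print(len(records), "questions to answer for:", hta["goal"])

The point of the structure is simply that every action in the scenario gets the same set of questions, which is what makes the walkthrough systematic rather than an ad hoc review.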
  • 17. Cooperative evaluation
    • The technique is ‘cooperative’ because participants are not passive subjects but work as co-evaluators.
    Monday, 27 March 2017 Aalborg University
  • 18. Cooperative evaluation (cont.)
    Sample questions during the evaluation:
      – What do you want to do?
      – What were you expecting to happen?
      – What is the system telling you?
      – Why has the system done that?
      – What are you doing now?
    Sample questions after the session:
      – What was the best/worst thing about the prototype?
      – What most needs changing?
      – How easy were the tasks?
      – How realistic were the tasks?
      – Did giving a commentary distract you?
    Monday, 27 March 2017 Aalborg University
  • 19. Participatory heuristic evaluation
    • The developers of participatory heuristic evaluation (Muller et al., 1998) claim that it extends the power of heuristic evaluation without adding greatly to the effort required.
    • The procedure is just as for the expert version, but the participants are involved as ‘work-domain experts’ alongside usability experts and must be briefed about what is required.
    Monday, 27 March 2017 Aalborg University
  • 20. Co-discovery
    • Co-discovery is a naturalistic, informal technique that is particularly good for capturing first impressions. It is best used in the later stages of design.
    • Watching individual people interacting with the technology, and possibly ‘thinking aloud’ as they do so, can be varied by having participants explore new technology in pairs.
    • Depending on the data to be collected, the evaluator can take an active part in the session by asking questions or suggesting activities, or simply monitor the interaction either live or using a video recording.
    Monday, 27 March 2017 Aalborg University
    Figure: Catroid co-discovery test
  • 21. Controlled experiments
    • Controlled experiments are appropriate where the designer is interested in particular features of a design, perhaps comparing one design to another to see which is better.
    • Identify the independent and dependent variables, and design the experiment around them.
    • E.g. you might judge which web design is better by the number of clicks needed to achieve some task; speed of access could be the dependent variable for selecting a function.
    • Control for confounding variables: learning effects, the effects of different tasks, the effects of different background knowledge, etc.
    • Considering participants’ differences, the next stage is to decide whether each participant will take part in all conditions (so-called within-subject design) or in only one condition (so-called between-subject design).
    Monday, 27 March 2017 Aalborg University
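A minimal sketch of how data from such an experiment might be analysed, assuming a between-subject design with clicks-to-complete as the dependent variable. The numbers are invented, SciPy is assumed to be available, and Welch's t-test is one reasonable choice rather than the only one.

    # Hypothetical between-subject experiment: each participant used only one
    # of two web designs; the dependent variable is clicks to complete the task.

    from statistics import mean, stdev
    from scipy import stats  # assumed available; used for Welch's t-test

    clicks_design_a = [7, 9, 6, 8, 10, 7, 9]   # participants who saw design A
    clicks_design_b = [5, 6, 4, 7, 5, 6, 5]    # participants who saw design B

    print("A: mean", round(mean(clicks_design_a), 2), "sd", round(stdev(clicks_design_a), 2))
    print("B: mean", round(mean(clicks_design_b), 2), "sd", round(stdev(clicks_design_b), 2))

    # Welch's t-test does not assume equal variances in the two groups.
    result = stats.ttest_ind(clicks_design_a, clicks_design_b, equal_var=False)
    print("t =", round(result.statistic, 2), "p =", round(result.pvalue, 4))

In a within-subject design the same participants would appear in both lists (in counterbalanced order), and a paired test would be used instead.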
  • 22. EVALUATION IN PRACTICE
    The trend in current practice
    Monday, 27 March 2017 Aalborg University
  • 23. Trend
    • A survey of 103 experienced practitioners of human-centred design conducted in 2000 indicates that:
      – around 40 per cent of those surveyed conducted ‘usability evaluation’
      – around 30 per cent used ‘informal expert review’
      – around 15 per cent used ‘formal heuristic evaluation’
    • What is the current trend?
    Monday, 27 March 2017 Aalborg University
  • 24. Main steps of an evaluation project/phase
    1. Establish the aims of the evaluation, the intended participants in the evaluation, the context of use and the state of the technology; obtain or construct scenarios illustrating how the application will be used.
    2. Select evaluation methods. These should be a combination of expert-based review methods and participant methods.
    3. Carry out the expert review.
    4. Plan participant testing; use the results of the expert review to help focus this.
    5. Recruit people and organize the testing venue and equipment.
    6. Carry out the evaluation.
    7. Analyse results, document and report back to designers.
    Monday, 27 March 2017 Aalborg University
  • 25. Perceived costs and benefits of evaluation methods
    Monday, 27 March 2017 Aalborg University
    Figure: A survey of user-centred design practice (Benyon, 2010, p. 237, citing Vredenburg, K., Mao, J.-Y., Smith, P.W. and Carey, T., 2002)
  • 26. Metrics (and measures)
    • Three things to keep in mind when deciding metrics:
      – Just because something can be measured, it doesn’t mean it should be.
      – Always refer back to the overall purpose and context of use of the technology.
      – Consider the usefulness of the data you are likely to obtain against the resources it will take to test against the metrics.
    Monday, 27 March 2017 Aalborg University
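For illustration, here is a small Python sketch of turning raw session records into the kind of metrics discussed above, such as task success rate and mean time on task. The field names and data are invented.

    # Hypothetical session log entries for one task, one row per participant.
    sessions = [
        {"participant": "P1", "task": "book ticket", "success": True,  "seconds": 212, "errors": 1},
        {"participant": "P2", "task": "book ticket", "success": False, "seconds": 305, "errors": 4},
        {"participant": "P3", "task": "book ticket", "success": True,  "seconds": 178, "errors": 0},
    ]

    def task_metrics(sessions):
        """Compute simple per-task usability metrics from raw session records."""
        n = len(sessions)
        return {
            "success_rate": sum(s["success"] for s in sessions) / n,
            "mean_time_s": sum(s["seconds"] for s in sessions) / n,
            "mean_errors": sum(s["errors"] for s in sessions) / n,
        }

    print(task_metrics(sessions))
    # Whether these numbers are worth collecting still depends on the purpose
    # and context of use, and on the cost of gathering them (see the slide above).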
  • 27. Monday, 27 March 2017 Aalborg University
  • 28. People who will use the system
    • Nielsen’s recommended sample of 3–5 participants has been accepted wisdom in usability practice for over a decade.
    • For a heterogeneous set of customers, run 3–5 people from each group through your tests.
    • If you cannot recruit any genuine participants, then fall back on convenient ‘user recruitment’: one of your colleagues, a friend, your mother or anyone you trust to give you a brutally honest reaction. But be extremely careful about how far you generalize from your findings.
    Monday, 27 March 2017 Aalborg University
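The 3–5 participant guideline is often justified with the problem-discovery model associated with Nielsen and Landauer (1993), which is not stated on the slide itself. The sketch below applies that model; note that the per-participant discovery rate L varies widely between products and tasks, and 0.31 is only the commonly quoted average.

    # Expected proportion of usability problems found by n participants,
    # under the model: found(n) = 1 - (1 - L)^n, with L the probability that
    # a single participant reveals a given problem (0.31 is a quoted average).

    def proportion_found(n_participants, l=0.31):
        return 1 - (1 - l) ** n_participants

    for n in (1, 3, 5, 10, 15):
        print(n, "participants ->", round(proportion_found(n) * 100), "% of problems")

Under these assumptions about five participants reveal roughly 85 per cent of the problems, which is where the 3–5 rule of thumb comes from; with a lower L, more participants are needed.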
  • 29. The test plan and task specification
    • A plan should be drawn up to guide the evaluation:
      – Aims of the test session
      – Practical details, including where and when it will be conducted, how long each session will last, the specification of equipment and materials for testing and data collection, and any technical support that may be necessary
      – Numbers and types of participant
      – Tasks to be performed, with a definition of successful completion. This section also specifies what data should be collected and how it will be analysed.
    • Conduct a pilot session and fix any unforeseen difficulties.
    Monday, 27 March 2017 Aalborg University
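A minimal sketch of a test plan captured as structured data, so that the items listed above are written down before the pilot session; every value here is invented.

    # Hypothetical test plan covering the elements named on the slide.
    test_plan = {
        "aims": "Check that first-time users can book a ticket unaided",
        "where_when": "Usability lab, week 14; 45 minutes per session",
        "equipment": ["laptop with prototype build", "screen recorder", "consent forms"],
        "participants": {"target users": 5, "colleague (pilot session)": 1},
        "tasks": [
            {"description": "Book a one-way ticket to Copenhagen for tomorrow",
             "success": "Booking confirmation shown within 5 minutes, no facilitator help"},
        ],
        "data": ["task completion", "time on task", "think-aloud recording", "post-session ratings"],
    }

    # Run the pilot first and revise the plan before testing real participants.
    for task in test_plan["tasks"]:
        print("Task:", task["description"], "| success =", task["success"])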
  • 30. Reporting usability evaluation results to the design team
    • The report should be ordered either by the areas of the system concerned, or by the severity of the problems.
    • A face-to-face meeting may have more impact than a written document alone (although the document should always be produced as supporting material) and is the ideal venue for showing short video clips of participant problems.
    • Usability problems can be fed into a ‘bug’ reporting system if one exists.
    • They can also be turned into user stories for the sprint backlog in a Scrum development approach.
    Monday, 27 March 2017 Aalborg University
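A minimal sketch of one way a finding could be recorded so that it can be filed as a bug or rewritten as a user story for the sprint backlog; the class, field names and severity scale are assumptions for illustration, not an established reporting format.

    from dataclasses import dataclass

    @dataclass
    class UsabilityFinding:
        area: str           # part of the system concerned
        severity: int       # e.g. 1 (cosmetic) .. 4 (blocks the task) - assumed scale
        evidence: str       # pointer to a video clip or log excerpt
        recommendation: str

        def as_user_story(self, persona="a first-time customer"):
            """Rewrite the finding as a user story for the sprint backlog."""
            return (f"As {persona}, I want {self.recommendation} "
                    f"so that I can complete my task in the {self.area} area.")

    finding = UsabilityFinding(
        area="checkout",
        severity=3,
        evidence="clip_07.mp4 at 02:14",
        recommendation="a visible confirmation after payment",
    )
    print(finding.as_user_story())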
  • 31. EVALUATION: FURTHER ISSUES
    The mix category
    Monday, 27 March 2017 Aalborg University
  • 32. Evaluation without being there
    • Internet connectivity has enabled evaluations without the evaluator being physically present.
    • If the application itself is web-based, or can be installed remotely, instructions can be supplied so that users can run test tasks and fill in and return questionnaires in soft or hard copy.
    • Online questionnaires and crowdsourcing methods are appropriate here (see Benyon, 2010, Chapter 7).
    Monday, 27 March 2017 Aalborg University
  • 33. Physical and physiological measures: evidence of emotional reactions
    • Eye-movement tracking (or ‘eye tracking’) can show participants’ changing focus on different areas of the screen.
    • Physiological techniques in evaluation rely on the fact that our emotional reactions have measurable physiological correlates.
      – The most common measures are changes in heart rate, the rate of respiration, skin temperature, blood volume pulse and galvanic skin response (an indicator of the amount of perspiration).
    • Body-connected sensors are linked to software which converts the readings into numerical and graphical formats for analysis.
    Monday, 27 March 2017 Aalborg University
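To make the step from sensor readings to analysable numbers concrete, here is a small sketch comparing a galvanic skin response (GSR) trace recorded during a task with a resting baseline. The values and the threshold are synthetic and purely illustrative; real tools use calibrated, per-participant baselines and peak detection rather than a fixed cut-off.

    # Synthetic GSR values (microsiemens): resting baseline vs. during a task.
    baseline = [2.1, 2.0, 2.2, 2.1, 2.0]
    task = [2.2, 2.6, 3.1, 3.4, 3.0, 2.8]

    def mean(xs):
        return sum(xs) / len(xs)

    # A rise in mean skin conductance over baseline is often read as arousal.
    rise = mean(task) - mean(baseline)
    print("mean rise:", round(rise, 2), "uS",
          "- possible arousal" if rise > 0.3 else "- no clear change")
    # The 0.3 uS threshold is invented for this sketch only.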
  • 34. Evaluating presence – in virtual reality
    • Designers of virtual reality – and some multimedia – applications are often concerned with the sense of presence, of being ‘there’ in the virtual environment rather than ‘here’ in the room where the technology is being used.
    • Methods:
      – Questionnaire
      – Written accounts of experience/interview
      – Observation in the virtual environment
      – Direct physiological measures
    Monday, 27 March 2017 Aalborg University
  • 35. Evaluation at home
    • Issues: privacy, time and motivation
    • Methods:
      – Interviews
      – Acting out scenarios
      – Diaries
    • “Working with children is a good way of drawing parents into evaluation activities.”
    • In one study, the investigator supplied users with Post-its to capture their thoughts about design concepts (Figure 10.5). An illustration of each concept was left in the home in a location where it might be used, and users were encouraged to think about how they would use the device and any issues that might arise. These thoughts were noted on the Post-its, which were then stuck to the illustration and collected later.
    Monday, 27 March 2017 Aalborg University
  • 36. Questions and clarifications Monday, 27 March 2017 Aalborg University
  • 37. References
    • Druin, A. (2002). The role of children in the design of new technology. Behaviour & Information Technology, 21(1), 1–25. http://doi.org/10.1080/01449290110108659
    • Rosson, M. B., & Carroll, J. M. (2002). Usability engineering: Scenario-based development of human-computer interaction (1st ed.). San Francisco: Academic Press.
    Monday, 27 March 2017 Aalborg University

Editor's Notes

  1. Both of these methods are complementary to formative and summative goals. Empirical methods are popular in usability engineering, human-computer interaction, and other disciplines concerned with user experience design; these methods involve studies of, and studies with, actual users.