9. Tradition—Nielsen’s 10 heuristics
1. Visibility of system status
2. Match between system and real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation
J. Nielsen and R. Mack, eds., Usability Inspection Methods, 1994
10. The Nielsen Method
• Small set of evaluators
– 3 to 5 is the optimal cost-benefit point
– A single evaluator finds about 35% of problems (see the sketch after this list)
• Each evaluator inspects alone
– 1 to 2 hours
– Several passes through the interface
– Inspection based on the heuristics
– If evaluators are not SMEs, hints can be given
– Evaluator writes notes or a report
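The 3-to-5 figure traces to Nielsen and Landauer's cost-benefit curve: if a single evaluator finds a proportion λ of the problems, n independent evaluators should find about 1 - (1 - λ)^n of them. A minimal Python sketch using the slide's 35% for λ (published estimates vary by study):

```python
# Nielsen/Landauer model: share of problems found by n independent
# evaluators, given that one evaluator finds a share lam on average.
def problems_found(n: int, lam: float = 0.35) -> float:
    return 1 - (1 - lam) ** n

for n in range(1, 8):
    print(f"{n} evaluator(s): {problems_found(n):.0%}")
# 3 evaluators reach ~73%, 5 reach ~88%; each additional evaluator
# adds less, hence the 3-to-5 sweet spot.
```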
11. The Nielsen Method
After the individual evaluations are done, evaluators:
– Talk to each other, often with a facilitator
– Share reports/notes
– Collate findings (sketched below)
– Rank issues by severity
– Write a compiled report
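One way to picture the collation and ranking steps: pool every evaluator's notes, merge duplicate findings, and sort what remains. A hypothetical sketch (the findings are invented; severity uses Nielsen's usual 0-4 scale):

```python
from collections import defaultdict

# Hypothetical findings from three evaluators: (finding, severity 0-4).
reports = [
    [("No feedback after submit", 3), ("Jargon in labels", 2)],
    [("No feedback after submit", 4), ("Tiny click targets", 2)],
    [("Jargon in labels", 1)],
]

merged = defaultdict(list)
for report in reports:
    for finding, severity in report:
        merged[finding].append(severity)

# Rank by the worst severity assigned, then by how many evaluators
# independently reported the problem.
ranked = sorted(merged.items(),
                key=lambda kv: (max(kv[1]), len(kv[1])),
                reverse=True)
for finding, severities in ranked:
    print(f"sev {max(severities)} (x{len(severities)}): {finding}")
```

The compiled report then walks this ranked list, worst problems first.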
12. Nielsen variations on method
• Supply a typical usage scenario, listing the
steps a user would take to perform tasks
• Hold a design debrief with designers
• Use brainstorming to focus on possible
solutions
• Include positive findings
13. And the method is called…
“Discount Usability Engineering”
14. So, what do you get?
• A list of potential problems
• Also (sometimes) the positive findings
• Tied to a heuristic or rule of practice
• A ranking of findings by severity
• (Sometimes) recommendations for fixing
problems
• A report of findings
18. CUE-4, Hotel Pennsylvania, 2003
• Comparative evaluation of reservation process
• 17 teams
– 8 did expert review/heuristic evaluation
– Only 1 team used Nielsen’s heuristics
• Rolf Molich’s conclusions
– Findings “overly sensitive”: too many to manage
– Need to improve classification schemes
– Need more precise and usable recommendations
CHI 2003
Results available at Rolf Molich’s DialogDesign http://www.dialogdesign.dk/CUE-4.htm
21. Phase 2: Loose interpretation of Nielsen
• Dropped his heuristics
• Kept severity ratings
• Added screen captures
• Added callouts
• Added recommendations
22. Hyperspace, Shock, and Cardiac Arrest all require more clearly defined goals and objectives.
H = Hyperspace; C = Cardiac Arrest; S = Shock
• Finding: Objectives/goals for the modules not clear (severity 3)
– Problem: unclear why content is being presented; presentation lacks conciseness; definitions are required to work with the module/content
– Recommendation: develop a consistent structure that defines what’s noted in the bulleted points; avoid generic statements that don’t focus users on what they will be accomplishing
• Finding: Evaluation criteria and assessment methods unclear
– Recommendation: advise that there is an assessment used for evaluation and indicate if it’s at the end or interspersed in the module
• Finding: Direct tie between content and assessment measure unclear
– Recommendation: connect ideas in the goals and objectives with outcomes in the assessment
• Finding: Sequence of presentation does not follow logically from introduction
– Recommendation: follow the order of presentation defined at the beginning
• Finding: Quizzes do not challenge users
– Recommendation: develop interesting and challenging quiz questions; re-frame goals/objectives at the end of the module
28. A unique password between 6 and 16 characters was required. “Unique” is not defined; this is a problem with terminology.
Usually, passwords must be a combination of letters and numbers for higher security. An all-letter password, “Heuristics”, was accepted. A dictionary term is not a secure password and contradicts accepted conventions; accepting a dictionary word may undermine users’ trust in the site’s security (see the sketch below).
The username and security-question answer were rejected on submit. This result is confusing, as the name was confirmed on the previous screen. This relates to establishing conventions for the form of names/passwords on the input screen; input formats need to be defined on the relevant page.
Differences in spelling “username” vs. “user name” are subtle but are consistency issues.
The red banner is confusing, as the user chose the gold (Free Edition). This is a consistency issue.
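To make the conventions at issue concrete (a length range, letters mixed with digits, no dictionary words), here is a hypothetical validator; the exact rules and the word list are assumptions for illustration, not the evaluated site’s actual policy:

```python
# Hypothetical password check illustrating the conventions discussed
# above; the rules and word list are assumed, not the site's policy.
COMMON_WORDS = {"heuristics", "password", "welcome"}  # stand-in dictionary

def password_errors(pw: str) -> list[str]:
    errors = []
    if not 6 <= len(pw) <= 16:
        errors.append("must be 6 to 16 characters")
    if not (any(c.isalpha() for c in pw) and any(c.isdigit() for c in pw)):
        errors.append("must mix letters and numbers")
    if pw.lower() in COMMON_WORDS:
        errors.append("must not be a dictionary word")
    return errors

# "Heuristics" was accepted by the site but fails both checks here.
print(password_errors("Heuristics"))
```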
32. Strategy—Persona-based scenario review
• Ginny Redish and Dana Chisnell
• AARP report—58 pages, 50 websites
– Two personas—Edith and Matthew
– Evaluators “channel” the user via persona and tasks/goals
– The users’ stories emerge
Available from Redish & Associates: http://www.redish.net/images/stories/PDF/AARP-50Sites.pdf
33. While the clickable area is very large in the navigation blocks, Edith expected to click on the labels, so she was surprised when the menu appeared.
When trying to click an item in the menu above, Edith had trouble selecting because her mouse hovered close enough to the choices below to open that menu, obscuring the item she wanted to click.
Chisnell and Redish, Designing Web Sites for Older Adults: Expert Review of Usability for Older Adults at 50 Web Sites (for AARP)
34. Engage in conversation with your reader
Ginny Redish
“Every use of every website is a conversation started by the site visitor.”
Letting Go of the Words, Morgan Kaufmann, 2007 (new edition coming)
35. Tell the story of your user’s experience
Whitney Quesenbery and Kevin Brooks
“Stories organize facts in memorable ways.”
Storytelling for User Experience, Rosenfeld Media, 2010
36. Options for report deliverables
• No deliverable
• Quick findings
• Presentation
• Detailed report
37. Steve Krug’s approach
• All sites have usability problems
• All organizations have limited resources
• You’ll always find more problems than you have
resources to fix
• It’s easy to get distracted by less serious problems that are easier to solve…
• Which means that the worst ones often persist
• Therefore, you have to be intensely focused on
fixing the most serious problems first
Rocket Surgery Made Easy, New Riders, 2010
38. “Focus ruthlessly on a small number of the most important problems.”
Steve Krug
44. Your turn. Expert review.
• Scenario. You want to do user testing in Atlanta.
– You heard there might be a lab at Southern
Polytechnic State University www.spsu.edu
– See if you can find out whether they have a lab and will rent it to you
• Your task for this review:
– Work independently
– Jot down findings
– Then meet with a few others to organize findings
– Discuss how you will report the top findings
54. RITE method
• Rapid iterative testing and evaluation
• Developed by Microsoft Game Studios
• Requires full team commitment
– Observe
– Analyze findings immediately
– Change immediately
– Retest
– Do it again
59. Your turn. Option 1
• Goal—ease of use for finding an online graduate program
that supports UX interests
• Create post-task questions
• Select one person in your group to be the user
– User task: search for an online program in UX or related field at
www.spsu.edu
– What are the requirements for admission?
– What are the fees?
– What is the next application deadline?
• Observers take notes
• Discuss findings
• Determine top findings
60. Your turn. Option 2
• New device for mobile phone user
• Create a few tasks
• Write a few post-task questions
• Select a “new” user to be the participant
• Observers take notes
• Discuss findings
• Determine top findings
67. How to deal the cards
• Spread them out on a table
• Instruct the user to
– walk along the table and pick up cards that express their experience
– share the meaning of the cards
• The user’s story emerges
• In remote testing, provide a table or an Excel spreadsheet
– The user highlights selections and explains the choices
• Collate the results in clusters of similar/same cards (see the sketch below)
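Collating the picks is essentially a frequency count: cards chosen by several users form the clusters. A minimal sketch with invented selections (the card names follow slide 71):

```python
from collections import Counter

# Invented card picks from three participants.
selections = [
    ["Easy-to-use", "Fast", "Relevant"],
    ["Easy-to-use", "Helpful", "Fast"],
    ["Relevant", "Easy-to-use", "Reliable"],
]

# Flatten and count; most_common() surfaces the clusters.
tally = Counter(card for picks in selections for card in picks)
for card, count in tally.most_common():
    print(f"{card}: {count}")
```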
70. 3 TV weather websites
[Bar chart of positive vs. negative card selections per station. Station A: 26 positive, 13 negative; Station B: 39 positive, 5 negative; Station C: 24 positive, 17 negative]
71.
Easy-to-use
Helpful
Straightforward
Fast
Relevant
Reliable
Useful
Repeated positive card selections focused on ease of use, relevance, and speed.
73. “But the light bulb has to want to change”
Why do the most serious usability problems we uncover often go unfixed?
Steve Krug and Caroline Jarrett
#upa2012 Las Vegas
74. Survey says…
Conflicted with decision maker's belief or opinion
Not enough resources
Deferred until next major update/redesign
Not enough time
Too much else to do
No effective decision maker
Team did not have enough power to make it happen
Required too big a change to a business process
Technical team said it couldn't be done
Other events intervened before change could happen
Disagreements emerged later
Legal department objected
[Bar chart: number of times each reason was chosen, from 131 total usable responses]
76. Jarrett/Krug theme: Do basic UX better
• Do testing earlier
• Make stakeholders watch the sessions
• Present results better
– More explanations
– Use video clips