The document discusses outliers, or "single-user problems", found during usability testing: problems backed by data from only one participant. It notes that such problems are common, accounting for roughly half of all problems found in two published evaluations, yet guidance on how to handle them is limited. The document reports on a survey of 89 usability practitioners that found varied practices for accepting, downgrading, or rejecting single-user problems. It recommends that practitioners establish procedures for evaluating single-user problems, pay particular attention to sample size, check against guidelines, seek advice from others, and check whether issues are artifacts of the test situation.
1. Outliers in usability testing:
How to treat usability problems found for only
one test participant
Asbjørn Følstad, SINTEF
Effie Lai-Chong Law, University of Leicester
Kasper Hornbæk, University of Copenhagen
NordiCHI 2012
2. Content
1 Single-user problems
2 Yes, they are abundant
3 … but how to deal with them?
4 Current practices – straight from the horse's mouth
5 Recommendations
3. Single-user problems
The problems backed up with data from only a single participant in a usability test
4. Single-user problems
Are they relevant? They may be infrequent usability problems.
- Point estimate: .25 (Laplace)
- 95% conf. int.: .01-.58 (Adj. Wald)
Are they valid? They may be an artefact of the test situation.
"there is always a risk of being misled by the spurious behavior of a single person" (Nielsen, 2000 – useit.com)
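The slide's figures can be reproduced with the Laplace point estimate and the Adjusted Wald (Agresti-Coull) interval for a binomial proportion. A minimal sketch; the sample size is not stated on the slide, so the choice of one occurrence among six participants below is a hypothetical that happens to reproduce the cited values:

```python
import math

def laplace_estimate(x: int, n: int) -> float:
    """Laplace point estimate of problem frequency: (x + 1) / (n + 2)."""
    return (x + 1) / (n + 2)

def adjusted_wald_interval(x: int, n: int, z: float = 1.96) -> tuple:
    """Adjusted Wald (Agresti-Coull) confidence interval for a proportion.

    Adds z^2/2 pseudo-successes and z^2 pseudo-trials, then applies
    the ordinary Wald formula to the adjusted proportion.
    """
    n_adj = n + z ** 2
    p_adj = (x + z ** 2 / 2) / n_adj
    margin = z * math.sqrt(p_adj * (1 - p_adj) / n_adj)
    return max(0.0, p_adj - margin), min(1.0, p_adj + margin)

# Hypothetical case: 1 of 6 participants encountered the problem.
print(laplace_estimate(1, 6))            # 0.25
lo, hi = adjusted_wald_interval(1, 6)
print(round(lo, 2), round(hi, 2))        # 0.01 0.58
```

Both methods are standard small-sample corrections; the plain sample proportion (1/6 ≈ .17) and plain Wald interval behave poorly at these sample sizes, which is presumably why the slide uses the adjusted forms.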
5. Single-user problems are abundant
Office system evaluation, 15 participants: 77 of 145 problems were single-user problems (Nielsen and Landauer, 1993)
Content management system evaluation, 17 participants: 41 of 88 problems were single-user problems (Law and Hvannberg, 2004)
Law, E.L.-C., Hvannberg, E.T. Analysis of Combinatorial User Effect in International Usability Tests. In Proc. CHI '04, ACM Press (2004), 9-16.
Nielsen, J., Landauer, T.K. A mathematical model of the finding of usability problems. In Proc. CHI '93, ACM Press (1993), 206-213.
6. Advice on how to deal with them is scarce
Kjeldskov, Skov and Stage (2004): short discussion of single-user problems in a study of instant data analysis. View "unique problems as noise rather than real usability problems"; recommend to report these as outliers.
Woolrych and Cockton (2001): report single-user problems as real problems in a stress-test of problem predictions.
Kjeldskov, J., Skov, M.B., Stage, J. Instant Data Analysis: Evaluating Usability in a Day. In Proc. NordiCHI '04, ACM Press (2004), 233-240.
Woolrych, A., Cockton, G. Why and when five test users aren’t enough. In Proc. IHM-HCI 2001, Cépadèus Éditions (2001), 105-108.
7. Asking the practitioners for current practices
Opportunity: a larger survey on analysis practices in usability evaluation included a question on single-user problems.
89 usability practitioners answered this particular question:
Median 6 yrs. work experience
17 different countries
Usability tests with a median of 8 user participants
8. Potential outcomes for single-user problems
Participants as divided as the little advice provided in the literature:
8 accept
4 classify as low priority
4 record as outlier
6 reject
(22 items total on this theme)
9. Relevant conditions when making the call
A range of conditions reported as relevant, though some perhaps deserve to be reported more often:
18 Problem severity
9 Test participants' profile
6 Sample size
6 Artifact of the test situation?
5 Task importance
5 Other
(49 items total on this theme)
10. Resources and strategies when making the call
20 reported relying on their own professional knowledge and experience. However, several potentially useful resources and strategies were reported:
9 Discuss with experts or team members
9 New/extended evaluations
8 Check against heuristics / guidelines / principles
[…]
3 Specific process or policy
2 Confirmed hypotheses / previous experiences
2 Debrief with users
(63 items total on this theme)
11. When considering the study findings …
How did you handle
single-user problems
in your latest
usability test?
12. Recommendations
1 Procedure for handling single-user problems
2 Pay particular attention to sample size
3 Check against knowledge resources – guidelines, heuristics, or previous evaluations
4 Seek advice – from experts or team members
5 Be alert: artefact of the test situation?