"From Usability Study to Innovation: Implementing LibAnswers at Loyola Marymount University" is being presented at the 4th QQML 2012 International Conference in Limerick, Ireland.
1. From Usability Study to Innovation: Implementing LibAnswers at Loyola Marymount University
Kenneth Simon
Susan Gardner Archambault
2. Loyola Marymount University
• Private Catholic university in Los Angeles, California
• 5,900+ undergraduates and 1,900+ graduate students
• William H. Hannon Library Information Desk open 24/5
3. Research Question
• What is the most effective way to provide access to our Library FAQs?
• Specifically, a comparison of two products: which features of How Do I? and LibAnswers do students prefer, and which features lead to better performance?
8. Methodology
• Conducted usability testing with 20 undergraduate students at LMU
• The sample equally represented each class (freshmen through seniors), with a 60:40 ratio of females to males
9. Methodology
• Used a combination of the Performance Test methodology and the Think-Aloud methodology
10. Methodology
• Students were given 10 performance tasks to complete at a computer twice: once using LibAnswers as the starting point, and once using How Do I?
• After each performance task, students were given a questionnaire measuring satisfaction with the site
11. Performance Task Questions
• How to print in the library from a laptop
• How to request a research consultation
• How long can a graduate student check out a book
• How to search for a book by the author’s name
• Where are the library copy machines
• How to tell what books are on reserve for a class
• How to request a book from basement storage
• Where to access CRSPSift software in the library
• Can a Loyola law school student reserve a group study room in advance
• How much does it cost for an undergrad to request a magazine article from another library
14. Additional Questions
• How likely would you be to use each page again?
• What was your favorite aspect of each site?
• What was your least favorite aspect?
• Overall, do you prefer LibAnswers or How Do I?
15. Performance Scoring: Speed
• Start the clock when the person begins searching for the answer to a new question on the home page of the site they are testing
• Stop the clock when they copy the URL with the answer (a timing sketch follows below)
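A minimal sketch of how the timing could be captured and averaged (hypothetical Python, not the instrument actually used in the study; the site names and prompt are placeholders):

    import time

    timings = {"LibAnswers": [], "How Do I?": []}  # seconds per task, per site

    def time_task(site, task):
        # Start the clock when the participant begins searching on the site's
        # home page; stop it when they copy the URL with the answer.
        start = time.perf_counter()
        input(f"[{site}] Press Enter when the answer URL is copied for: {task}")
        elapsed = time.perf_counter() - start
        timings[site].append(elapsed)
        return elapsed

    def average_speed(site):
        # Average seconds per task for one site, as on the results slide.
        return sum(timings[site]) / len(timings[site])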
16. Performance Scoring: Accuracy
Was the answer… (check off the one that applies; see the tallying sketch below):
• Completely accurate: found the answer
• On the correct path to the information, but did not go far enough or took a wrong subsequent path
• On the correct page, but did not see the answer (supersedes everything else they tried on other attempts to answer)
• Pointed to a related question under the correct category, but incorrect page
• Incorrect and off topic
• Gave up: never found an answer
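For tallying, the checklist could be encoded as a simple enumeration (a hypothetical Python sketch; the shortened category names are mine, not the study's):

    from collections import Counter
    from enum import Enum

    class Accuracy(Enum):
        COMPLETELY_ACCURATE = 1   # found the answer
        CORRECT_PATH = 2          # did not go far enough / wrong subsequent path
        CORRECT_PAGE = 3          # right page, but did not see the answer
        RELATED_QUESTION = 4      # correct category, incorrect page
        OFF_TOPIC = 5             # incorrect and off topic
        GAVE_UP = 6               # never found an answer

    def tally(scores):
        # Count and percentage for each category across all task attempts,
        # as in the accuracy table on the results slides.
        counts = Counter(scores)
        total = len(scores)
        return {cat: (n, round(100 * n / total, 1)) for cat, n in counts.items()}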
17. Performance Scoring: Efficiency
• Count the number of times the person made a new attempt, or started down a new path, by returning to the home page *after* a previous attempt away from or on the home page failed (see the counting sketch below)
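One way to operationalize the rule (a sketch assuming a simple per-task event log; hypothetical, not the study's instrument):

    def count_wrong_paths(events):
        # Each return to the home page after a failed attempt counts as one
        # wrong path (the "+1" efficiency scores on the next slide).
        wrong_paths = 0
        attempt_open = False          # True while a path is being pursued
        for event in events:
            if event == "home":
                if attempt_open:      # came back after a failed path
                    wrong_paths += 1
                attempt_open = False
            elif event == "answer_found":
                attempt_open = False  # success ends the task with no penalty
            else:                     # any click away from home opens an attempt
                attempt_open = True
        return wrong_paths

    # home -> wrong page -> home -> correct page -> answer = 1 wrong path
    assert count_wrong_paths(["home", "page_a", "home", "page_b", "answer_found"]) == 1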
18. Sample Scoring Video
bit.ly/usabilityvideo

Site         Speed        Accuracy               Efficiency
How Do I?    46 seconds   Completely Accurate    +1 (clicked 1 wrong path)
LibAnswers   36 seconds   Completely Accurate    +1 (clicked 1 wrong path)
19. Performance Results

Speed           Average (seconds)
LibAnswers      40.55
How Do I?       33.90

Efficiency      Total Wrong Paths
LibAnswers      30
How Do I?       40
20. Performance Results

Accuracy                                                    LibAnswers    How Do I?
Completely accurate                                         182 (91%)     175 (87.5%)
Correct path, but did not go far enough or took a
wrong subsequent path                                       5 (2.5%)      15 (7.5%)
Correct page, but did not see the answer                    3 (1.5%)      3 (1.5%)
Pointed to a related question under the correct
category, but incorrect page                                6 (3%)        3 (1.5%)
Incorrect and off-topic                                     0 (0%)        3 (1.5%)
Gave up: never found answer                                 4 (2%)        1 (0.5%)

(Percentages are out of 200 task attempts per site: 20 students × 10 tasks.)
21. LibAnswers Features Used

Feature                   Number Who Used    Percent
Search Box                16                 80%
Auto-Suggest              12                 60%
Popular Answers           9                  45%
Tag Cloud                 8                  40%
Related Questions         4                  20%
Change Topic Drop-down    2                  10%
Recent Answers            2                  10%
24. Patterns
• Overall, 9 of 20 students performed worse with the site they said they preferred.
• 4 of 5 freshmen performed worse with the site they said they preferred; upperclassmen were more consistent.
• Females tended to perform better with their preferred site; males did not.
• 75% of the males preferred How Do I? over LibAnswers, while females were evenly divided.
25. LibAnswers
Likes
• Keyword search “like a search engine”
• Autosuggest in the search bar
• Popular topics list
• Friendly / pleasant to use
• Don’t have to read through categories
Dislikes
• Overwhelming, cluttered interface
• Long list of specific questions, but hard to find the info you want
• Less efficient than the How Do I? page
• Once you do a search, you lose your original question
• Autosuggestions are ambiguous or too broad, and sometimes don’t function properly
26. How Do I?
Likes
• Fast / efficient to use
• Everything is right there in front of you: “I don’t have to type, just click.”
• Simple, clearly laid-out categories
• Organized and clean-looking
Dislikes
• Less efficient than the LibAnswers page: have to read a lot
• Too restricted: needs a search box
• Have to guess a category to decide where to look
• Limited number of too-broad questions
• Boring / basic appearance
27. Sharing Results with Springshare
• Retain the question asked in the search results screen.
• Add stopwords to search, so typing “How do I” doesn’t drop down a long list of irrelevant results, and “where is” and “where are” aren’t mutually exclusive (see the sketch below).
• Remove “related LibGuides” content to reduce clutter.
• Control the list of “related questions” below an answer: they seem to be based only on the first topic assigned to a given question.
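For illustration, the stopword request amounts to matching suggestions on content words only (a hypothetical Python sketch of the requested behavior, not Springshare's code):

    STOPWORDS = {"how", "do", "i", "where", "is", "are", "the", "a", "an", "to"}

    def content_words(text):
        # Strip punctuation and stopwords so only meaningful terms remain.
        words = (w.strip("?.,!").lower() for w in text.split())
        return {w for w in words if w and w not in STOPWORDS}

    def suggest(query, questions):
        # Return FAQ entries sharing at least one content word with the query.
        terms = content_words(query)
        return [q for q in questions if terms & content_words(q)]

    faqs = ["How do I print from a laptop?", "Where are the copy machines?"]
    print(suggest("how do i print", faqs))  # matches on "print"; "how do i" alone matches nothing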
33. Conclusions
• Ended up with a balance between two extremes rather than one or the other
• Think-aloud method: gave up control, so no preconceived ideas could influence the outcome
• Sitting in silence watching the participants made them nervous; next time, maybe leave the room and have a self-guided test
• Efficiency is difficult to measure: moved away from counting clicks
34. Acknowledgements
Thank you:
• Shannon Billimore
• Jennifer Masunaga
• LMU Office of Assessment / Christine Chavez
• Springshare
• William H. Hannon Library Research Incentive Travel Grant
35. Bibliography
• Ericsson, K.A. and Simon, H.A. (1980). Verbal Reports as Data. Psychological Review, 87(3), 215-251.
• Magnerism. (2008, Nov. 20). Think Aloud Protocol Part 2. Retrieved May 3, 2012 from http://www.youtube.com/watch?v=dyQ_rtylJ3c&feature=related
• Norlin, E. (2002). Usability Testing for Library Web Sites: A Hands-On Guide. Chicago: American Library Association.
• Porter, J. (2003). Testing the Three-Click Rule. Retrieved from http://www.uie.com/articles/three_click_rule/
• Willis, G.B. (2005). Cognitive Interviewing: A Tool for Improving Questionnaire Design. Thousand Oaks, CA: Sage Publications.
36. Additional Information
Presentation Slides
• bit.ly/simongardner
Contact Us
Ken Simon
Reference & Instruction Technologies Librarian
Loyola Marymount University
Twitter: @ksimon
Email: kenneth.simon@lmu.edu

Susan [Gardner] Archambault
Head of Reference & Instruction
Loyola Marymount University
Twitter: @susanLMU
Email: susan.gardner@lmu.edu