Modern libraries provide a burgeoning array of digital services, all experienced through a myriad of touch-points. To name a few: catalogue; discovery layers; website; LibGuides; Learning Management Systems; chat; Skype; social media; YouTube; blogs; portals; email...
It's a complex picture! The tension between implementing innovative new services and maintaining legacy ones rarely results in seamless, unified library experiences. Unconnected touch-points often lead to broken user experiences. A good user experience requires research.
To increase satisfaction and delight library users, adopt an approach that gathers evidence, generates insights, and informs decision-making for iterative, incremental changes. This presentation explores some tried and tested user research methods to gather both qualitative and quantitative data from students and staff throughout all stages of project life-cycles. It aims to inspire you with examples of user research initiatives undertaken at Deakin University Library, including co-design workshops for a better homepage, and preliminary results from a longitudinal happiness tracking survey for continuous improvement.
Attendees will take away a digital set of research method card templates, and tips for conducting quality user research to improve project outcomes at their libraries.
9. A HIERARCHY OF EFFORT
@igniting
Fixing a broken user experience
Stefan Klocek
HTTPS://WWW.SMASHINGMAGAZINE.COM/2012/09/FIXING-BROKEN-USER-EXPERIENCE/
10. Success
BRINGING UX INTO LIBRARIES
@CraigMMacDonald
HTTP://DX.DOI.ORG/10.1002/PRA2.2015.145052010055
38. Best
'That you can go straight to a document (as PDF) from the first search window. And that you can easily get the citation (Harvard, for example).'
'Easy to use, great filters for searching for specific resources. Lots of resources available.'
39.
40.
41. Improving library search
'exclude reviews' – designed to resolve book reviews cluttering up results
'session keeper' – aims to resolve annoying timeouts
45. References
Alvarez, H. (2016, April 22). Consider these variables before you choose a UX research method.
Retrieved 2 June 2017, from https://www.usertesting.com/blog/2016/04/22/choose-research-
method/
Buley, L. (2013). The User Experience Team of One: A Research and Design Survival Guide.
Rosenfeld Media.
Chisholm, J. (n.d.). What is co-design? Retrieved from http://www.designforeurope.eu/what-co-
design
Hall, E. (2013, September 10). Interviewing Humans. Retrieved from
http://alistapart.com/article/interviewing-humans
Klocek, S. (2012, September 27). Fixing A Broken User Experience. Retrieved from
https://www.smashingmagazine.com/2012/09/fixing-broken-user-experience/
46. References
MacDonald, C. M. (2015). User experience librarians: user advocates, user researchers, usability
evaluators, or all of the above? Proceedings of the Association for Information Science and
Technology, 52(1), 1–10. https://doi.org/10.1002/pra2.2015.145052010055
MacDonald, C. M. (2017). 'It Takes a Village': On UX Librarianship and Building UX Capacity in
Libraries. Journal of Library Administration, 57(2), 194–214.
https://doi.org/10.1080/01930826.2016.1232942
Müller, H., & Sedley, A. (2014). HaTS: Large-scale In-product Measurement of User Attitudes
& Experiences with Happiness Tracking Surveys. Retrieved from
https://research.google.com/pubs/pub43221.html
Naranjo-Bock, C. (2012, April 24). Creativity-based Research: The Process of Co-Designing with
Users. Retrieved 18 August 2017, from http://uxmag.com/articles/creativity-based-research-
the-process-of-co-designing-with-users
47. References
NNgroup. (2017). Small vs. Big User Studies – What's Best? (Jakob Nielsen). Retrieved from
https://youtu.be/ZTsT8r4MWLk
PRESS RELEASE: Insync Surveys launches 2006 library results: CAUL hits an all time high. (2007,
April 5). Retrieved 18 August 2017, from http://www.insyncsurveys.com.au/news/2007/04/caul-
results/
Rohrer, C. (2014, October 12). When to Use Which User-Experience Research Methods.
Retrieved 2 June 2017, from https://www.nngroup.com/articles/which-ux-research-methods/
Sauro, J. (2013, March 12). What UX Methods to Use and When to Use Them. Retrieved from
https://measuringu.com/method-when/
Hi and welcome to Improving digital library services with user research.
My slide deck is available through this URL. If you'd prefer a copy with notes, message me.
I'll start with a warm-up on UX in digital libraries.
I'll give an overview of some user research methods.
Then I'll highlight two case studies of projects exemplifying the impact UX is having on digital library services at Deakin.
At the end, I'll give you some templates so you can start including more user research in projects at your library.
Let's start!
I'm sure those doing UX in libraries can relate to The User Experience Team of One, in which Leah Buley says 'user experience refers to encounters that people have with digital products, like software or a web app.'
Think about digital products and services in your library ...
Now this is the YOU part of this session. Everyone: shout out all the digital library products and services you can think of! ... catalogue, discovery layers, website, LibGuides, Reading Lists, Learning Management Systems, chat, Skype, social media, YouTube, blogs, portals, email, self-checkout kiosks, Wi-Fi, BrowZine ...
Thank you.
Leah Buley explains that to do UX means:
'to practice ... methods and techniques for researching what users want and need, [and to design products and services for them.]'
Our regular Library Client Survey is one such technique for researching users' wants and needs. So we've all been doing UX for years!
Back in 2005 and 2006, the InSync survey reported that 'the virtual library category is a sore point for some'. A major aspect of this category is the library search box, which many libraries have iterated on several times in the last 10 years.
In 2016 the Deakin Library search box was identified as an improvement opportunity.
How will we know whether search experiences are more satisfying after a change? And where experiences are broken, how might we better meet expectations?
Stefan Klocek wrote Fixing a Broken User Experience. Much like Bloom's Taxonomy, his hierarchy of effort requires work to begin at the bottom. Given their history and the many services you called out earlier, libraries must start from principle-based improvements to deliver learnable and desirable user experiences.
Points worth noting from Professor Craig MacDonald's 2015 paper: first, that 'User Experience (UX) is gaining momentum as a critical success factor across all industries and sectors, including libraries'. Second, within libraries there is 'a general preference for qualitative research methods'. And third, that 'some translation is required to bring professional UX knowledge into a library setting'.
In his 2017 article, MacDonald categorised libraries by perceived level of UX maturity. In these models, libraries journey from an unrecognised 'zero' state towards an institutionalised user-centred culture. One factor that distinguishes between stages of maturity is:
'The extent that UX methods and techniques are employed across the organisation.' (p. 207)
In a poster presentation with Deakin Library colleagues last December, I shared six tried and tested user research methods on cards. At the end of my presentation today, I'll give you 12 method cards to kick-start user research at your library.
Now let's look at where each method fits into three broad stages of a project life-cycle.
Before development, where the goal is exploration, the methods include co-design and user interviews.
During development, the goals are evaluation and validation. Here we use methods such as guerrilla (spontaneous) interviewing and copious amounts of prototyping.
At the live design or live service stage, the goal is measurement. Methods such as usability testing and intercept surveys fit here. Which research methods you choose is partly determined by the stage of the project life-cycle you're working in.
Another part is the business value associated with each method. For example, co-design encourages experimentation and iteration, while intercept surveys help optimise the user experience. Both methods support discovering growth opportunities.
Now I'd like to share with you two case studies that used these methods.
The first is an exploration study in the form of co-design workshops, with insights guiding our library homepage redesign. After that, I'll talk about our happiness tracking study: a continuous measurement of satisfaction with our library discovery layer.
Co-design workshops formed the initial research for our homepage redesign project (currently underway). I'll describe some benefits of co-design, what the method entails, our workshop procedure in which participants created collages for a better library homepage, the resulting artifacts for analysis, and the evidence of user needs.
Designing digital library services requires understanding both the demand side (user needs) and the supply side (technologies and processes). Bringing student, academic, and staff perspectives together in co-design workshops provides immediate benefits, such as generating better ideas with a high degree of originality and user value.
Co-design also offers longer-term benefits like better relationships between the library and our users.
Co-design is fun! It's where 'participants are given design elements or creative materials ... to construct their ideal experience in a concrete way, that expresses what matters to them most, and why.'
It's a hybrid method that 'allows users to interact with and rearrange design elements that could be part of a product experience, in order to discuss how their proposed solutions would better meet their needs and why they made certain choices.'
Huge thanks to Rachael Wilson, who was instrumental in the success of the workshops. Our procedure went like this: first, our team conducted environmental scanning to create clippings of homepage elements. Add blank paper, scissors, Blu Tack, tape, and sticky notes. Recruit a diverse range of participants. Then, in each workshop, participant groups create collages to explore ideas for an awesome homepage. They work through three phases:
They gather elements from the stock clippings & create any elements to be included.
To understand their priorities, they order / number elements from first to last.
Then they present their design, so we understand what they want and why it's important to them.
Before participants present, ask for their consent, then record the audio. You won't need to transcribe it, but it's crucial for empathy building to share the voice of the user with stakeholders who couldn't attend the workshop.
A nice aspect of co-design workshops is that tangible artifacts are produced by participants. We ended up with 18 collages from workshops at all 4 Deakin campuses. Then our job is to analyse. You can do this solo, but where possible, build a shared understanding by analysing cooperatively / collaboratively.
The collages were affinity analysed, clustering elements of the same type and labelling them with the priority participants gave them.
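To illustrate the clustering step, here is a minimal sketch using entirely hypothetical collage data (the element names and ranks below are invented, not our actual workshop results). Each collage is reduced to (element, priority rank) pairs, then elements of the same type are grouped across collages:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical collage data: each collage is a list of
# (element_type, priority_rank) pairs, where rank 1 = most important.
collages = [
    [("search box", 1), ("chat", 2), ("today's hours", 3)],
    [("search box", 1), ("today's hours", 2), ("news", 4)],
    [("chat", 1), ("search box", 2)],
]

# Affinity step: cluster ranks by element type across all collages.
clusters = defaultdict(list)
for collage in collages:
    for element, rank in collage:
        clusters[element].append(rank)

# Report frequency (how many collages included the element)
# and the mean priority participants gave it.
for element, ranks in sorted(clusters.items(),
                             key=lambda kv: (-len(kv[1]), mean(kv[1]))):
    print(f"{element}: in {len(ranks)} collages, mean priority {mean(ranks):.1f}")
```

Sorting by frequency first, then by mean rank, surfaces the elements most participants wanted near the top of the page.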
Summarising the data: there were popular elements such as chat and today's hours; there were elements staff included and students didn't, and vice versa.
Insights from our co-design workshops informed wireframe construction for the desktop and mobile experiences. The insights also validated adjustments to the priority of elements in subsequent prototypes.
Weâve built and pilot tested this prototype. When the beta homepage is ready, workshop participants will be first to test.
[Currently scheduled for late September]
An often invisible benefit of co-design activities is improved relationships with library constituents. For us, an example of this being realised was a related project to streamline our homepage search box.
Including diverse user groups from an early stage gives our team confidence that we're building the best design to fit user needs.
Now I'd like to turn to a longitudinal search happiness study.
Through a collaborative effort between EBSCO engineer Alvet Miranda, Hero MacDonald, Kathy Martin, and myself, we implemented a Happiness Tracking Survey for Deakin Library Search.
We're measuring satisfaction with Library Search and search-related tasks. Our approach is based on HaTS: Large-scale In-product Measurement of User Attitudes & Experiences with Happiness Tracking Surveys, a method developed by Google and deployed in their products to measure progress towards goals and inform decisions.
'Surveys measure and categorize attitudes or collect self-reported data that can help track or discover important issues to address.' We use an intercept survey method that's triggered during use of Library Search.
A survey invitation appears on the results display. If a user dismisses the invitation, or takes the survey, they won't see another invitation for 12 weeks.
The first question is the only mandatory one. It's a 7-point Likert scale question with labels at the extremes and mid-point. An aggregate of responses tells us how satisfied or dissatisfied users are with Library Search.
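As a rough sketch only: the 12-week cooldown and the 7-point scale come from the description above, but the function names, data shapes, and the top-2-box summary are my own assumptions, not the actual Deakin/EBSCO implementation.

```python
from datetime import datetime, timedelta
from statistics import mean
from typing import Optional

COOLDOWN = timedelta(weeks=12)  # no re-invitation for 12 weeks

def should_invite(last_seen: Optional[datetime], now: datetime) -> bool:
    """True if the user has never seen the invitation, or last saw it
    (dismissed or completed) at least 12 weeks ago."""
    return last_seen is None or now - last_seen >= COOLDOWN

def summarise(ratings):
    """Aggregate 7-point Likert responses (1 = very dissatisfied,
    7 = very satisfied) into a simple satisfaction summary."""
    return {
        "n": len(ratings),
        "mean": round(mean(ratings), 2),
        "top2": sum(r >= 6 for r in ratings) / len(ratings),  # share rating 6 or 7
    }

now = datetime(2017, 8, 23)
print(should_invite(None, now))                      # never invited -> True
print(should_invite(now - timedelta(weeks=4), now))  # too soon -> False
print(summarise([3, 4, 5, 4, 6, 3, 5]))
```

The mean tracks overall movement over time, while a top-2-box share is one common way to report the proportion of clearly satisfied users.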
So how are we tracking so far? Before we see results, would you care to speculate how your library might rate for search satisfaction?
In the first three weeks, up to the 23rd, we're hovering between 3 and 5: fairly middle of the road. The middle ground requires more in-depth analysis before we can direct efforts at improving. [Skip to next slide.]
If satisfaction ratings were consistently high, maintain status quo. If they stay low, take action and consider alternatives.
Following the overall rating are open-ended questions about frustrating or unappealing aspects, as well as new feature suggestions. Analysing responses to these questions can help direct improvement efforts.
This question may appear double-barrelled; however, through several experiments Google found that asking about frustrations and new capabilities in the same question increased response quantity and quality.
Let's look at common frustrations.
Analysis of frustrations has already revealed a handful of themes:
- Log in issues, session time out, interface woes, and book reviews dominating results.
Efforts are well underway to resolve our #1 issue. Kathy Martin is spearheading a massive project to unite Single Sign On across our platforms.
Next, our survey asks what people like best. Responses to this serve three purposes:
They prevent us from forming an unbalanced view of Library Search.
They tell us what aspects people like about Library Search, revealing missed marketing opportunities.
They help us be mindful, when planning changes, of possible knock-on effects.
A couple of quotes from what people like best:
Then there are six tasks; we ask respondents to select any they've attempted in the last month. Each selection then prompts the respondent to rate their satisfaction on a scale.
From this chart, you can see which tasks are attempted more, and prioritise accordingly. No analysis has yet been performed on the task satisfaction data, but baselines will be available when we're ready to tackle improving specific task flows.
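Computing those baselines is straightforward once responses are collected. This sketch uses invented responses (not real Deakin data) and assumed field names: each respondent's answer maps attempted tasks to a 1-7 satisfaction rating.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical survey responses: each maps the tasks a respondent
# attempted in the last month to their satisfaction rating (1-7).
responses = [
    {"find a known item": 6, "get full text": 3},
    {"find a known item": 5},
    {"get full text": 2, "export a citation": 5},
]

# Group ratings by task across all respondents.
ratings_by_task = defaultdict(list)
for response in responses:
    for task, rating in response.items():
        ratings_by_task[task].append(rating)

# Attempt counts show where to prioritise; mean satisfaction per task
# becomes the baseline for comparison after improvements ship.
baselines = {
    task: {"attempts": len(rs), "mean_satisfaction": round(mean(rs), 2)}
    for task, rs in ratings_by_task.items()
}
for task, stats in sorted(baselines.items(), key=lambda kv: -kv[1]["attempts"]):
    print(task, stats)
```

A frequently attempted task with low mean satisfaction (here, "get full text") is the natural first candidate for a task-flow improvement project.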
Summing up, it's early days for our Library Search happiness tracking.
I expect that once the Single Sign On project is complete, we'll see a drop in frustrations associated with log-in issues. Developments such as 'exclude reviews' and a 'session keeper' should address the other major issues identified. We may even see a rise in overall satisfaction. What's important is that we now have the means to quantify and qualify the improvements we implement on a major library service.
Now for some UX tools and templates for you.
UX has many research methods and a myriad of tools. My gift to you is a template of all my research methods and sample projects, wrapped in a free database tool.
This is what you'll see when you copy a redacted version of my UX methods and projects database. Looking at the methods and steps, you'll find you can easily use and adapt them to work for you. If you'd like to conduct user research at your library, this might be the head start you need.
[The template also contains some projects, results, and so on.]
Airtable is my go-to tool for user research (happiness tracking, co-design workshops, and more). The first link gets you a free account, and the second gives you a copy of my UX methods and projects database.
Equip yourself today so you can start including more user research in projects at your library.
I hope I've inspired you to be catalysts for more user research, to generate insights and inform decision-making for iterative, incremental changes at your library. Try UX at home.
Thank you.
And we have a few minutes for questions, so over to you.
People doing UX in libraries are widely dispersed. I've found it highly beneficial to participate in a global community of practice for UX in libraries. The LibUX community uses Slack to ask questions, share knowledge, give tips, and exchange ideas.
So join the virtual community of practice! via http://libux.co/slack Like today, Slack is free, and the LibUX community are welcoming and helpful.
My post from July was a welcome to Josh Boyer of NCSU Libraries UX Department, and a mention of their decades of user studies available for perusal.
These 180 people are library staff from 19 countries who attended the third UXLibs conference in Glasgow this June. Like you, they are from university libraries. However, only one is from Australia. If you can't spot the Aussie in the photo, then ... just look at me! [fist pump]
The big message from this year's conference is 'the value of getting all staff involved in research' so that initiatives get ample 'buy-in and staff become more customer focussed.'
As a member of a digital library team, it's easy to become caught up in technological advances and implications, and lose some of that customer focus. Getting buy-in is a constant challenge, particularly for an emerging field that is largely unknown to libraries.
June next year is set for the fourth UXLibs conference. It'll be in the UK, some 17,000 km from Melbourne. That may be difficult to attend, so chat with CAVAL staff if you're looking for another local event.
In qualitative studies, small samples with more frequent iterations are more valuable than large samples with fewer iterations.
The ethos of Erika Hall's book is to do no more than just enough research to garner insights that can inform decision-making. I thoroughly recommend the whole book, but if you're short on time, try the excerpt of Chapter 5, which is dedicated to user research.
Look into Googleâs HEART framework.
Clustered by stage of project life-cycle, these are the research methods I've used in projects so far. The observation you can make here is that we're not doing much research in the before-development phase, where the goal is exploration. In an episode of Digital Insights titled 10 Mistakes That Are Ruining the Success of Your Digital Projects, Paul Boag said 'many teams seem to skip over a discovery phase because of the pressure to deliver'.