1. Establishing best practices to improve
usefulness and usability of web interfaces
providing atmospheric data
Nina S. Oakley
Britta Daudert
2. Atmospheric data are increasingly important to a broad audience
Resource Managers
Ecologists
Social Scientists
Public Health Officials
Policy Makers
Farmers
Educators
Hydrologists
Geologists
Engineers
3. Web has become favored way to disseminate
atmospheric data
4. Data can be frustrating to access
I just want to download a
year of data for Reno Airport...
what do I click?
5. Usability addresses this issue
• Developed from e-commerce needs (for web)
• “The extent to which a web product can be used to achieve
goals with effectiveness, efficiency, and satisfaction.” (ISO, 2014)
• Focuses on user rather than developer needs
• Assumes users are busy people trying to accomplish tasks
• Users (not developers) decide whether a product is easy to use
(Dumas & Redish, 1999)
6. Why is usability important to atmospheric
science?
• Often many sites to access same data
• Users have “reservoir of goodwill”, leave site if frustrated
• E-commerce: loyal users spend more money than first-time users (Nielsen, 2000)
• Loyal following = funding?
• Usability testing relatively cheap!
Krug, 2005
7. Time spent on site before leaving
Nielsen, 2011
• Users leave web pages in ~10-20 seconds
• Make clear, strong value proposition to get them
to remain longer
8. How to employ usability?
• Follow general usability
guidelines
• literature, usability.gov
• Perform usability testing
on your site
• We attended training
• nngroup.com
14. Methods: Recruiting participants
• Performed 2 rounds of testing with 5 participants
• Made improvements between rounds
• Test representatives of target user group
• Not necessarily your colleague down the hall
• Sought people in ecology, resource management
• Compensate participants!
• Provides motivation to show up, give quality feedback
15. Methods: Designing test questions
• PART 1: Online, 3 Tasks
• 1) List data for all stations in Shasta County, CA that
recorded snowfall and precipitation data for all dates
December 15-December 31, 2013
• 2) Find the highest temperature ever recorded in March at
Winnemucca AP, Nevada
• 3) Find the lowest minimum temperature among grid points
approximately covering Pyramid Lake in December 2013
16. Methods: Designing test questions
• PART 2: Written, System Usability Scale (SUS)
• Standard usability test, results interpreted based on large
number of usability studies
• Produces valid results on small sample sizes (Brooke,
1986; Bangor et al., 2009)
• 10 questions on a five-point Likert scale, from
“strongly disagree” to “strongly agree”
• Gives a score or “grade” to the usability of a site
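The standard SUS scoring rule behind that 0-100 "grade" can be sketched in a few lines. This is a generic illustration of the published formula, not code from the project:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from 10 Likert responses (1-5).

    Odd-numbered items (positively worded) contribute (response - 1);
    even-numbered items (negatively worded) contribute (5 - response).
    The summed contributions are scaled by 2.5 to give a 0-100 score.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        if not 1 <= r <= 5:
            raise ValueError("responses must be on a 1-5 scale")
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# All-neutral answers (all 3s) land exactly in the middle:
print(sus_score([3] * 10))  # 50.0
```

Per-site means like the 63 and 67.5 reported later are simply averages of these per-participant scores.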
17. Methods: Designing test questions
• PART 3: Verbal, 7 Questions
• Questions on interpretation of common terms used in
climate data (raw data, tool, product, climate anomaly
map, etc)
• Card sorting activity to inform on how people search for
climate data (where, when, what, type, source)
• General questions, discussion about site
18. Methods: Researching general usability guidelines
• Overall goal: reduce cognitive load on user
• No need to reinvent wheel (in most cases)
• Not specific to climate data, web pages in general
Krug, 2005
19. Methods: Researching general usability guidelines
• Adhere to web conventions
• Navigation along top of page, links recognizable, search bar in
upper right or left, ?=help
21. Methods: Researching general usability guidelines
• Be consistent within pages
• Color scheme, formatting, layout same throughout SCENIC
• Navigation menu always available
• Provide help texts
• However, most users
muddle through first
23. Methods: Researching general usability guidelines
• Hide unnecessary
options
• Make labels clear and
meaningful
• A descriptive label was much more successful
than “submit”
• Page headings match link
name
25. Results: How did participants rate SCENIC?
• Round 1: Mean = 63
• Round 2: Mean = 67.5
26. Results: How did participants rate SCENIC?
• Round 1: Mean = 63
• Round 2: Mean = 67.5
Somewhere between
“OK” and “Good”
Bangor et al. 2009
27. Results: How participants search for data
• N = 14 (15 participants, 1 abstaining), meeting the
sample-size requirement for significant results (Tullis and Wood, 2004)
• WHERE (location) most important, SOURCE (originator
of data) least important
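The importance ordering above comes from aggregating card-sort rankings. A minimal sketch of mean-rank aggregation, using made-up responses (the names of the five facets are from the slide; the numbers are illustrative, not the study's data):

```python
# Hypothetical card-sort responses: each participant ranks the five
# facets from most (1) to least (5) important. Illustrative only.
rankings = [
    {"where": 1, "when": 2, "what": 3, "type": 4, "source": 5},
    {"where": 1, "when": 3, "what": 2, "type": 4, "source": 5},
    {"where": 2, "when": 1, "what": 3, "type": 5, "source": 4},
]

def mean_ranks(rankings):
    """Average rank per facet; a lower mean rank means more important."""
    facets = rankings[0].keys()
    return {f: sum(r[f] for r in rankings) / len(rankings) for f in facets}

ranked = sorted(mean_ranks(rankings).items(), key=lambda kv: kv[1])
print(ranked[0][0], ranked[-1][0])  # where source
```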
28. Results: Labeling Challenging
• Gridded or modeled data?
• All gridded data here are modeled, not observed
• Participants found gridded “more descriptive and useful”
• Climate anomaly maps and time series?
• Participants agreed on and liked these names
Time Series
Climate Anomaly Map
30. Results: Labeling Challenging
• Tool and product misleading
• General agreement “tool” manipulates data, “product” static
• Term “Data Tools” did not elicit desired response, changed to “Data
Analysis”
• Raw data
• Some thought it had QC applied, others did not
• Used “Data Lister”
• “Historic” data
• Confusing, replaced with “Data Lister”
32. Results: Biggest challenge
• Getting participants to utilize data
analysis tools
• All participants wanted to list the data first!
• Some say they would do analysis
themselves
• Are analysis tools valuable for this
audience?
• How to motivate people to know
they are available?
Analysis Tools
Data Lister
34. Conclusions
• Usability testing extremely valuable
• Applies to SCENIC and future projects
• Removed many issues on site
• Learned about how people use web
• Still improvements to be made
• Testing reveals issues, but it is
not always clear how to solve them
• Usability is challenging!
35. Conclusions- Recommendations
• Perform testing early and often
• Work with target audience
• Consider the way your audience searches
for atmospheric data
• Naming/labeling most challenging task
• Test names on participants, compare with other agencies
• Adhere to general usability guidelines
• Usability.gov; Krug, 2005, Don’t Make Me Think good places to start
36. Moving Forward
• Broad survey on how people look for climate data
• Standardization of terms (labels) across agencies
• Interpretation of atmospheric data
• Research what help tools are most effective
• video, forums, text, tutorials
• Other groups in atmospheric science share usability
testing results
37. Thank you!
World Usability Day
is coming up!
November 13 2014
This project was
supported by
DAS EDGES 2013
Speaker notes
GROWING audience of climate data
Not climatologists, meteorologists
Obligation to provide
very often have to build page though not web designers
easy to check out on Amazon
Lots of research into that
We want same for climate
If you can demonstrate loyal following, it may inspire funding agencies to support you
Page visits follow a Weibull distribution, borrowing the “time to failure” concept from reliability engineering and usability.
Two kinds of aging: positive aging (the longer something is in service, the more likely it is to fail);
negative aging (the more time it is in service, the less likely it is to fail).
Page visits exhibit negative aging.
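The negative-aging claim can be illustrated with the Weibull hazard function h(t) = (k/λ)(t/λ)^(k−1), which decreases over time whenever the shape k < 1. A minimal sketch with illustrative parameters (not fitted to any real page-visit data):

```python
def weibull_hazard(t, shape, scale=1.0):
    """Hazard (instantaneous leave rate) of a Weibull distribution.

    For shape < 1 the hazard decreases with time ("negative aging"):
    the longer a visitor has already stayed on a page, the less likely
    they are to leave in the next instant.
    """
    return (shape / scale) * (t / scale) ** (shape - 1)

# With shape 0.5, the leave rate early in a visit exceeds the rate later:
early = weibull_hazard(1.0, 0.5)
late = weibull_hazard(10.0, 0.5)
print(early > late)  # True
```

This is why the first ~10-20 seconds matter so much: survive them with a clear value proposition, and the user is increasingly likely to stay.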
Hopefully get loyal users
*general guidelines to be described later in presentation
1 of 8 CSCs that were established by DOI to deliver state-of-the-art climate research.
ACIS: daily, fine resolution.
Station and gridded data.
Temperature and precipitation variables.
Not WRCC
what are some basic things people might want to do on this site? Test several applications
In perfect scenario, wireframing of page before first line of code
We learned about usability after starting this project
User clicks on a link trying to access data and text (metadata) is displayed.
Thinks clicking on the text will get him to data
This info was reduced and put into a mouseover popup box, original spot he clicked leads to data
Users in round 2 did not have this issue.
Note the change in the question marks (help icons).
Only once in the first round did people really use help; we changed the color and size of the help icons.
Give example of how users had trouble with station finder, thought it would give data.
STOP early
The user sees a dropdown-style menu (a Chrome browser function) meant to indicate autofill.
It looks like a dropdown menu but is not; the user’s choices aren’t available, which is frustrating.
Note that only the county number was shown in the field. Once we put both the station name and number in and removed the arrow, all users recognized that autofill was available.
Note: these are from questions asked after testing
background: user asked to find data from DEC 2013. Thinks it is relatively current, looks at a dashboard that provides “current climate information”
background: user asked to find highest temperature recorded in Elko in March. Tries to use data lister and enter in “March” or POR