Pam Noreault and Tara Knapp from ACI Worldwide presented on improving content user experience (UX) through user research and validation testing. They described their methodology for selecting content to improve, testing the updated content with customers, and analyzing the results. Their testing showed that the updated content was located faster and tasks were completed more successfully. They emphasized the importance of user research and content validation, and suggested additional ways to get customer input, such as social media monitoring and customer focus groups.
8. Methodology now – inching upwards
• Personas (product level)
• User Research and Analysis (release level): usage patterns
• Information Model (release level)
• User Stories
• User-Centered Content:
  - Information Model (deliverable)
  - Concise writing
  - Topic-based writing
  - Task-based writing
  - Writing for translation
  - Writing for accessibility
• Validation Testing
9. How we select content to fix
1. Select deliverable
2. Select content to uplift
13. Road led us here – but prove the changes made a difference
• Contextual overviews
• Concise/clear content
• Reduced content/eliminated clicks
• Topic-based (text scanning)
• Accessibility: checklist of fixes
• Translation: checklist of fixes
14. How many of you are doing content validation with customers?
15. Validation methodology
Model for PDF documents
• Uplifted four documents
Methodology
• Teams of 4-7 writers per document
• Tested each document with at least 2 users from 2 customers
• Tested for 1 hour via WebEx
• 4 tasks tested on each doc
• Presented 2 docs: old + new
• Order of docs alternated (see the scheduling sketch below)
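Alternating which version a tester sees first is a simple form of counterbalancing, so order effects average out across participants. As a rough illustration only (the deck names no testers; the participant labels and the old/new labels below are assumptions), a schedule like theirs could be generated with a minimal Python sketch:

```python
from itertools import cycle

# Hypothetical participant labels; the presentation does not name testers.
participants = ["Customer A / User 1", "Customer A / User 2",
                "Customer B / User 1", "Customer B / User 2"]

# Alternate which document version is seen first for every other tester.
orders = cycle([("old", "new"), ("new", "old")])

schedule = dict(zip(participants, orders))
for tester, (first, second) in schedule.items():
    print(f"{tester}: first={first} doc, second={second} doc")
```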
16. Validation protocol
Directions
• Think aloud.
• Tell us when you have completed the task or when you give up.
Test protocol
• Each task was timed.
• Each task was marked as completed successfully or unsuccessfully; testers could give up.
• After all tasks were completed on one doc, testers rated the content on a scale from 1 to 5.
• After all tasks were completed for both docs, testers rated their overall experience on a scale from 1 (poor) to 7 (great).
• Data was recorded in a Google form (see the data-shape sketch below).
• Sessions were recorded with permission.
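Each session therefore produced task timings, pass/fail outcomes, and two rating scales. One way to picture that record (the deck used a Google form; the field and class names below are assumptions, and the example values are invented purely for illustration) is a small structured type:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TaskResult:
    task_id: int
    seconds: float          # each task was timed
    success: bool           # completed successfully or unsuccessfully
    gave_up: bool = False   # testers were allowed to give up

@dataclass
class SessionRecord:
    tester: str
    document: str                          # "old" or "new"
    tasks: List[TaskResult] = field(default_factory=list)
    content_rating: Optional[int] = None   # 1-5, asked after each doc
    overall_rating: Optional[int] = None   # 1 (poor) to 7 (great), after both docs

# Invented example entry, purely for illustration:
session = SessionRecord(
    tester="Customer A / User 1",
    document="new",
    tasks=[TaskResult(task_id=1, seconds=74.0, success=True)],
    content_rating=4,
)
print(session)
```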
17. Sample task
You are an operations manager put in charge of monitoring the system.
Task: Use the user guide to determine your two areas of responsibility in terms of configuration.
24. Average overall ranking – 1 (poor) to 7 (great)
[Bar chart comparing old-doc and new-doc overall rankings for Documents 1-4.]
25. • Content located faster in 3 of 4 new models.
• Validation tasks completed with increased success in 2 of 4 new models.
• Content rating higher in 3 of 4 new models.
• Overall content ranking higher in 2 of 4 new models (see the aggregation sketch below).
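The faster-location and higher-success comparisons above reduce to simple old-versus-new aggregates over the recorded sessions. A hedged sketch of that analysis, with invented sample numbers standing in for the real Google form responses:

```python
from statistics import mean

# Invented sample data purely for illustration; the real numbers came
# from the Google form responses described on the protocol slide.
sessions = [
    {"doc": "old", "times": [120, 95, 140, 80], "passes": [True, False, True, True]},
    {"doc": "new", "times": [70, 60, 90, 55],   "passes": [True, True, True, True]},
]

def summarize(sessions):
    """Mean task time and success rate per document version."""
    agg = {}
    for s in sessions:
        bucket = agg.setdefault(s["doc"], {"times": [], "passes": []})
        bucket["times"].extend(s["times"])
        bucket["passes"].extend(s["passes"])
    return {doc: {"mean_seconds": mean(b["times"]),
                  "success_rate": sum(b["passes"]) / len(b["passes"])}
            for doc, b in agg.items()}

print(summarize(sessions))
```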
26. Laugh and cry moments
• Surfing & browsing
• Clueless & perfectionist
• Change haters
• Fear of failure
• When is done really done?
• Aha moments
• Technical snafus
• Testing heavy-duty reference content was a bust
28. Lessons the writers came up with
• Do a dry run
• Observing the users' choices can be as useful as the data
• Cannot predict how users will do the tasks
• Define what "done" means
• Reference-based content should not be tested with the same methodology as task-based content
• Rebooting your computer prior to testing has its benefits
• Repeated contact with customers removes the fear factor over time
29. 1. We require user research and content validation, where appropriate.
2. We get creative:
• Collaborate with people who work with customers
• Monitor and mine data from social networking sites
• Join LinkedIn groups to ask questions and post surveys
• Seek input from people who represent the same personas as our users
• Participate in customer focus groups (Design Partner Programs)
Re(evolution)
Tara (slides 4-7): 3,500 documents, 5 authoring tools, 3 different teams, and 26 writers worldwide, all doing different things; no common publishing engine, each team with its own standards and processes; supporting ~30 products.