Designing in agile environments demands that many decisions be made in short periods of time. Informing these decisions with formative research enhances our understanding of what we're building, from the viability of concepts, to the effectiveness of designs, to the ultimate success of our solutions.
UserTesting 2016 webinar: Research to inform product design in Agile environments
1. Research to Inform Product Design in
Agile Environments
Steve Fadden, Ph.D.
Director, Salesforce Analytics UX Research
Professional Faculty, UC Berkeley School of Information
Presented for UserTesting Webinar, November 10, 2016
2. Agenda
1. Agile values and challenges
2. Research methods
3. Tips & tricks
Image: http://www.geograph.org.uk/photo/111487
5. Agile values
Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan
Reference: http://www.agilemanifesto.org/
6. Continuous delivery and change
“Our highest priority is to satisfy the customer through early and
continuous delivery of valuable software.”
“Welcome changing requirements, even late in development.
Agile processes harness change for the customer's competitive
advantage.”
Reference: http://agilemanifesto.org/principles.html
7. Sustainable, optimal work
“Agile processes promote sustainable development.”
“The sponsors, developers, and users should be able to maintain
a constant pace indefinitely.”
“Simplicity--the art of maximizing the amount of work not
done--is essential.”
Reference: http://agilemanifesto.org/principles.html
8. Self-organization and adjustment
“The best architectures, requirements, and designs emerge from
self-organizing teams.”
“At regular intervals, the team reflects on how to become more
effective, then tunes and adjusts its behavior accordingly.”
Reference: http://agilemanifesto.org/principles.html
14. Evidence of problems
Potential opportunities
● Recent events
● Specific details
● Feelings and perceptions
● Future responses
Critical incidents
Image: https://www.flickr.com/photos/vfwnationalhome/12436173623
15. Critical Incident Process
1. Confirm user profile
2. Identify last time
3. Gather details
a. Description
b. Actions taken
c. Feelings
d. Outcome
e. Future actions
Reference: Flanagan, J.C. (1954). The critical incident technique. Psychological Bulletin, 51(4), 327-358; http://www.usabilitynet.org/tools/criticalincidents.htm; Image: https://pixabay.com/en/photos/interview/
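The detail categories in step 3 map naturally onto a simple note-taking structure for recording each incident. This is a sketch of my own, not a tool from the talk; the field names are assumptions, and the example values paraphrase the file-sharing result shown on a later slide.

```python
from dataclasses import dataclass, field

@dataclass
class CriticalIncident:
    """One critical-incident record, mirroring the detail categories above."""
    user_profile: str                                        # 1. confirmed user profile
    when: str                                                # 2. last time it happened
    description: str                                         # 3a. what happened
    actions_taken: list[str] = field(default_factory=list)   # 3b. actions taken
    feelings: str = ""                                       # 3c. feelings
    outcome: str = ""                                        # 3d. outcome
    future_actions: str = ""                                 # 3e. future actions

# Example, paraphrasing the PDF-sharing incident from the "Example result" slide
incident = CriticalIncident(
    user_profile="occasional file-sharer",
    when="recently",
    description="Needed to share a PDF with a friend via a sync tool",
    actions_taken=["logged in", "dragged PDF to Files", "created folder", "sent link"],
    feelings="surprised the PDF opened instead of uploading",
    outcome="link sent successfully",
    future_actions="would create the folder before uploading next time",
)
```

Keeping every session's notes in the same shape makes it easier to compare incidents across participants and spot recurring opportunities.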
16. Example prompt
“Consider the last time you had to share
something online. How long ago did this
happen? What did you share? Describe the
steps you took to share, and highlight any
surprises or problems (if any) that
happened.”
17. Example result
● Validates problem
● Identifies opportunities
● Clarifies expectations
● Details scenarios
● Builds empathy
“I needed to share a PDF with a
friend, and we use ABC, but I
hadn’t used it in a while. I logged
in through my browser, dragged
the PDF to Files, and then saw
the PDF open. I expected ABC to
start uploading it. I hit back,
created a folder in ABC,
uploaded the PDF to it, clicked
share to add my friend, and sent
her the link.”
19. Scenario exploration
Initial confusions
Acceptability
Extensions and opportunities
● Description
● Flow and interaction
● Process illustration: steps,
images, storyboard, video
Reference: Carroll, J.M. (2003). Making Use: Scenario-Based Design of Human-Computer Interactions. MIT Press; Image: https://www.flickr.com/photos/brenneman/5273755180
20. Scenario exploration process
1. Present overall scenario
a. Ascertain understanding
b. Capture concerns/questions
2. Show steps of flow or interaction, capturing:
a. Concerns, confusions
b. Benefits, positives
c. Open questions
3. Gather final comments at end
21. Example prompt: Initial scenario
“Imagine you had a tool that provided the
ability to export data from any document that
contained numbers. You would be able to
select the data you wanted, and the tool
would export it to an analysis tool of your
choice. Discuss your initial thoughts about
this tool, highlighting any questions,
concerns, or benefits that come to mind.”
22. Example prompt: Storyboard review
“The following 4 slides illustrate how you
might interact with this tool. Review each
slide, and comment on anything you find to be
confusing, problematic, useful, or appealing
about the concept.”
24. Example feedback, slide 1
“Makes sense so far. I wonder how the
tool will handle tables that are embedded
in documents that contain a lot of text,
such as labels or superimposed
descriptions.”
25. Example feedback, slide 2
“I’m thinking that this process would
require a lot of clicks, even for a small
number of columns. It would be better if
the tool automatically recognized a lot of
this information, and then I could go in and
review/modify it.”
26. Example feedback, slide 3
“I understand this process, but am
concerned that people might give different
names to the same data. You should
embed best practices for naming here.
Otherwise, the result could be messy.”
27. Example feedback, slide 4
“I like that it shows progress, but it seems
that it should be pretty fast for documents
that don’t have a lot of tables embedded in
them. Will we be able to save the
mappings? That could save time in the
future.”
28. Example: Final comments
“It’s great that you don’t have to
jump around different parts of
the system to do this. It’s very
valuable to be able to complete
this from one place.”
“Hi, I wanted to follow up to
reiterate that this is a REALLY
COOL idea and it fills a much
needed requirement for our
use of the product. Please
consider me for future studies
like this, because we need this
functionality!”
30. Initial impressions
Visual appeal
Goals and intentions
Specific content
● Interface preview
● Time limit
● Survey questions
Image: https://commons.wikimedia.org/wiki/File:Claude_Monet,_Impression,_soleil_levant.jpg
31. First impressions matter
Formed within ~50 ms
● Primarily based on visual appeal
● Do not change with additional viewing time
Initial usability impression remains stable
● With 5s, 60s, and no time limit
● Even when site is manipulated to be more/less usable
References: Lindgaard, G., Fernandes, G., Dudek, C., & Brown, J.M. (2006). Attention web designers: You have 50 milliseconds to make a good first impression! Behaviour and Information Technology, 25(2),
115-126; http://usabilitynews.org/visual-appeal-vs-usability-which-one-influences-user-perceptions-of-a-website-more/; http://www.measuringu.com/five-second-tests.php
32. Impression test process
“View the following interface for a few
seconds, and then answer the questions that
follow.”
Present instructions → Show interface → Present questions
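The three-step flow above can be sketched as a tiny trial harness for an unmoderated impression test. This is illustrative only: the function names, callbacks, and the 5-second default exposure are my assumptions (5 s matches the "five-second test" cited below), not part of any real testing platform's API.

```python
import time

def run_impression_trial(show_interface, collect_answers, exposure_s=5.0):
    """Present instructions, show the interface for a fixed time, then ask questions."""
    phases = []
    print("View the following interface for a few seconds, "
          "and then answer the questions that follow.")
    phases.append("instructions")
    show_interface()                 # render the stimulus (stubbed here)
    phases.append("interface")
    time.sleep(exposure_s)           # fixed exposure window before questions appear
    answers = collect_answers()      # ratings plus the open "purpose" question
    phases.append("questions")
    return phases, answers
```

The key property the harness enforces is order: participants never see the questions while the interface is still visible, so ratings reflect the timed first impression.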
33. Present questions after viewing
The interface you just viewed appears:
Very Appealing - - - - - - - - - Very Unappealing
Very Easy - - - - - - - - - Very Hard
Very Efficient - - - - - - - - - Very Inefficient
What is the purpose of the interface?
34. Ratings and open feedback are helpful
“It looks very plain and almost
ugly. It’s obviously some kind of
class schedule. I remember seeing
assignments and icons that look
like standard office software. I
could probably figure out how to
use it, but I’m not sure I’d want
to.”
37. Expectation process: “Greeking” technique
1. Identify important tasks
2. Create scenario
3. Write “first step” questions
4. Present interface without text
5. Ask for group/category names
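Step 4, "present interface without text," is the greeking itself: labels are replaced with filler so layout and grouping survive but the words carry no meaning. A minimal sketch of that transformation (my own illustration, not code from Tullis's paper):

```python
import re

def greek(label: str, filler: str = "x") -> str:
    """Greek a UI label: keep length, spacing, and punctuation; hide the words."""
    # \w matches letters and digits, so "Edit 2 items" greeks to "xxxx x xxxxx"
    return re.sub(r"\w", filler, label)

print(greek("Send a secure email"))  # -> "xxxx x xxxxxx xxxxx"
```

Because only word characters are replaced, relative label lengths and separators are preserved, which is what lets participants reason about structure rather than content.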
Reference: Tullis, T.S. (1998). A method for evaluating web page design concepts. CHI 98 Conference Summary on Human Factors in Computing Systems.
38. Example prompt: Scenario and instructions
Scenario: “You are a college instructor using
a new online course management tool to
schedule and track assignments and
communicate with your students.”
Instruction: “Indicate where you would first
click to start each task.”
39. “Greeking” activity: Individual tasks
How would you expect to:
1. See a calendar with class
meetings?
2. Edit information about a
specific homework
assignment?
3. Send a secure email
message to a student?
40. “Greeking” activity: Groupings
Identify and prioritize any
groups of functionality
What would you call each
group?
Example group labels: “1. Main navigation”, “2. Quick tools”, “3. More details”
42. Look for problems
Assess effectiveness
Compare to competition
Validate performance
● Benchmark testing
● Discount usability
Both good, but...
Image: https://commons.wikimedia.org/wiki/File:Flickr_-_The_U.S._Army_-_Searching_for_opposing_forces.jpg
43. Fixing issues as a team
Task completion barriers
Improvement opportunities
Consensus on feasibility
● High fidelity interface
● Representative users
● Key stakeholders
Image: https://commons.wikimedia.org/wiki/File:Jigsaw_puzzle_01_by_Scouten.jpg
44. Rapid Iterative Testing & Evaluation (RITE)
Similar to conventional usability evaluation
Emphasis on fixing major problems between sessions
1. Schedule time for interface changes
2. Agree on critical tasks and success criteria
3. Record problems encountered
4. Discuss problems worth fixing
5. Implement fixes and continue
Reference: Medlock, M.C., Wixon, D., Terrano, M., Romero, R., & Fulton, B. (2002). Using the RITE method to improve products: A definition and a case study. Presented at the UPA conference.
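Steps 3-5 can be sketched as a loop over sessions, where only the problems the team agrees are worth fixing get addressed before the next participant arrives. This is a schematic of the process shape, under my own assumptions about data types; in practice the `worth_fixing` decision is the team discussion, not a function.

```python
def rite_loop(session_findings, worth_fixing, apply_fix):
    """Run a RITE study: after each session, fix agreed-on problems, then continue."""
    fixed = []
    for problems in session_findings:      # 3. problems recorded in one session
        for p in problems:
            if worth_fixing(p):            # 4. team decides it is worth fixing
                apply_fix(p)               # 5. implement the fix before next session
                fixed.append(p)
    return fixed

# Two sessions: the first surfaces a labeling problem, the second finds nothing new
fixed = rite_loop(
    session_findings=[["'Save' implies a persistent, reusable setting"], []],
    worth_fixing=lambda p: True,           # stand-in for the team discussion
    apply_fix=lambda p: None,              # stand-in for a real design change
)
```

The point of the structure is the ordering: fixes land between sessions, so later participants evaluate the improved design rather than re-confirming known problems.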
45. Result: Whole team understands issues better
Stakeholder: “Every participant expects
saving. Is there a design that better conveys
that it’s a one-time process?”
Stakeholder: “We could try the term
‘Apply’ plus some descriptive text to see if
it changes expectations?”
Stakeholder: “It’s a great idea, but we won’t
have the resources to implement saving.
This can only be a one-time process.”
Participant: “It says ‘save’
but I’m just modifying
the settings here. Unless
I can re-use this later?”