Payel UX Portfolio
1. Welcome to my UX journey
By Payel Bandyopadhyay
pbandyop@vt.edu
2. Who am I?
● A graduate student at Virginia Tech, Blacksburg, USA
(2017 - Present)
○ Pursuing MS in CS with thesis. Specialization:
HCI
○ UX Design Researcher, InfoViz Lab and 3D
Interaction Group, Virginia Tech
● UX Research intern @Informatica, Redwood City, CA
(summer 2019)
● UX and Data modelling intern @UPS, Advanced
Technology Group, GA (summer 2018)
● UX Research Assistant, Interface Ecology Lab, Texas
A&M University, TX (2016)
● UX Research Assistant, Ubiquitous Interaction (Uix)
research group, Helsinki Institute for Information
Technology, FI (2013-2015)
4. UX Design
Researcher
@InfoViz Lab and 3D Interaction
Group, Virginia Tech, VA
MY ROLE:
Lead UX Research Assistant. Worked with
researchers, designers, and stakeholders in this role,
as part of both the UX design and research teams.
METHODS:
Background research, surveys, questionnaire, field
observation, expert study, novice study, usability
testing, task based analysis, design experiments,
participant observation, data analysis
TOOLS:
Qualtrics, Sketch, Paper Prototype, Unity, Oculus,
HoloLens, In-lab study, Qualitative/Quantitative data
TIMELINE:
August 2017 - Present
5. Research - Immersive Analytics
● Complex data-sets are
often hard for users to
parse and extract meaning
● Creating storylines from the
data is becoming difficult
with Big Data
● Need more space to
analyze information
● Immersive analytics addresses this
by blending data analytics with
virtual environments.
6. Research Product:
Immersive Space to
Think (IST)
Research question:
How do users organize documents
inside a 3D immersive space?
● Understand how users organize
documents in 2D space
● What challenges are present in
2D space?
● How can 3D space overcome
those challenges?
8. An FBI analyst analyzing documents in 2D
(participant observation)
9. Target Persona GOALS:
● Gather and analyze crime data
● Find unique stories in the reports
● Synthesize variant information
Frustrations:
● Keeping track of different reports
● Overwhelming documents
● Lack of space to arrange documents in
2D space
● Difficulty in connecting the dots in the
information
DEMOGRAPHICS: NAME: Annie
Age group: 25-35
Role: FBI Analyst
Location: Washington, DC
11. IST tool
● Conducted a field study to
understand how analysts
conduct document analysis
● Gathered design
requirements
● Used iterative design
approach to develop the
prototype
● Conducted pilot studies
● Developed pre-study
questionnaires
● Designed experiments
● Conducted post study
interviews
13. TASK for user study
TASK:
You are an FBI analyst. Something big is going to happen. Your boss gave you a set of reports to analyze the plot, and you need to report
back on it. Using our tool, try to answer these questions:
Find out:
1. What is the plot?
2. Who is involved?
3. Where is it going to happen?
4. When?
5. At what time?
You have 60 minutes to complete the task.
14. Forms for User study
● Research Protocol:
http://people.cs.vt.edu/~pbandyop/homepageFiles/IST_Research_Protocol.pdf
● Consent form:
http://people.cs.vt.edu/~pbandyop/homepageFiles/Informed_Consent.pdf
● Recruitment email:
http://people.cs.vt.edu/~pbandyop/homepageFiles/Recruitment_Email.pdf
● Pre-study questionnaire:
http://people.cs.vt.edu/~pbandyop/homepageFiles/Prestudy_questionnaire.pdf
● Post-study Interview:
http://people.cs.vt.edu/~pbandyop/homepageFiles/Poststudy_Interview.pdf
15. My discoveries
1. Which font size is best suited to an immersive 3D
display?
2. Which color is best suited for the document
background?
3. How far away can users read documents within the 3D
immersive space?
4. What usability issues arise from walking
while wearing an immersive headset?
5. How can cognitive load be balanced against the physical
load involved in this task?
6. Do users understand how to use the tool?
7. Is the tool too complex for them?
8. How much technical knowledge is needed to use
the tool?
Left: A user using IST
with a HoloLens
headset
Below: Users using
IST with an Oculus
headset
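Questions 1 and 3 above come down to visual angle: text in an immersive display stays legible only if its glyphs subtend a large enough angle at the viewer's eye. A minimal sketch of that relationship follows; the 0.4-degree legibility threshold is an illustrative assumption, not a value measured in the study.

```python
import math

def min_text_height(distance_m: float, min_angle_deg: float = 0.4) -> float:
    """Smallest text height (in meters) whose glyphs subtend at least
    `min_angle_deg` of visual angle when viewed from `distance_m`.

    Uses h = 2 * d * tan(theta / 2).  The 0.4-degree default is an
    illustrative assumption, not a value from the study.
    """
    theta = math.radians(min_angle_deg)
    return 2 * distance_m * math.tan(theta / 2)

# A document floating 2 m away needs text roughly 1.4 cm tall
# to stay legible under this assumed threshold.
print(round(min_text_height(2.0) * 100, 2), "cm")  # → 1.4 cm
```

An estimate like this gives a starting point for the font-size and viewing-distance conditions, which the in-headset studies then test empirically.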
16. Outcome:
[Bandyopadhyay 2019] Bandyopadhyay, P., Lisle, L., North, C., Bowman, D., and Ragan, E. Immersive
Space to Think: the Role of 3D Immersive Space in Sensemaking of Textual Data. In review.
http://infovis.cs.vt.edu/sites/default/files/vrst19a-sub1093-i5.pdf
17. UX Design and
Researcher
@UX Lab, Informatica, CA
MY ROLE:
UX Design and Researcher. Worked with UX
researchers, designers, PMs, stakeholders, and
developers in this role.
METHODS:
Background research, competitive analysis, surveys,
questionnaire, expert study, usability testing, task
based analysis, design experiments, data analysis
TOOLS:
Qualtrics, Paper Prototype, Sketch, In-lab/Remote
study, UserTesting, Qualitative/Quantitative data
TIMELINE:
May 2019 - August 2019
18. UX WORKSHOP
Phases: BACKGROUND STUDY → BRAINSTORM → DESIGN → VALIDATION → DEVELOP
BACKGROUND STUDY
★ Understand Lineage
★ Competitor Analysis
★ Academic Analysis
★ EDC Product Research
★ Identify product pain points
BRAINSTORM IDEAS
★ Propose design concepts
★ Identify personas
★ Plan of execution
DESIGN: MAKE IT SIMPLE
★ Hand wireframe
★ Low-fidelity wireframe
★ High-fidelity wireframe
★ Mockups
★ Conduct interviews
VALIDATE DESIGNS (with)
★ EDC Designers
★ EDC PMs
★ EDC Salespersons
★ EDC Researchers
★ EDC Developers
★ End-user customers
19. What is the problem?
Visualizing hierarchical Big Data:
When the data set is small, the
lineage can be identified easily.
As the data size increases,
understanding the lineage
diagram becomes confusing
and cumbersome.
23. GOALS:
● Analyze business data
● Find how data changed over time
● Create charts from data lineage so his
manager can make business decisions
Frustrations:
● Keeping track of the lineage
● Overwhelming information
● Too many data-processing steps taking place
● Lineage takes too long to load
TARGET
PERSONA
Business
Analyst
DEMOGRAPHICS:
NAME: John
Age group: 25-35
Role: Business Analyst
Location: Redwood City, CA
32. Design Motivation
Google Maps visualizes data in layers:
➔ Overview first
Gain an overview of the entire
collection.
➔ Zoom
Zoom in on items of interest.
➔ Filter/Search
Filter or search out uninteresting items.
➔ Details on Demand
Select an item or group and get
details when needed.
➔ Relate
View relationships among items.
34. High fidelity
Mockups
(Sketch and Invision)
Iterative design process
Wireframes updated
periodically after each design
validation.
Let’s see a demo.
https://drive.google.com/file/
d/17kF8tZzvWPM5hwjsUH-JCt
9HNwos3xjB/view?usp=sharing
37. What people are saying
"The design addresses most of the current EDC pain points."
Awez Syed, EDC PM, USA
"Promising design concepts."
Fouad Boulbellout, EDC Salesperson, France
"The design has scalability power."
Vikram Tyrala, EDC Developer, India
"Powerful lower navigation bar."
Giri, EDC Designer, India
38. Design concepts
developed
● Tree Layout
● Hierarchy
D3
D3 provides APIs that can be
tweaked to satisfy the design
requirements.
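The actual prototype relied on D3's tree-layout APIs (d3.hierarchy and d3.tree). As a language-neutral illustration of the underlying idea, here is a naive tidy-tree layout sketched in Python: leaves get successive x slots, parents are centered over their children, and depth becomes y. The tree data is hypothetical, and D3 itself uses the more sophisticated Reingold-Tilford algorithm.

```python
from itertools import count

def layout(tree, depth=0, slot=None):
    """Assign (x, y) positions to a (name, children) tuple tree:
    leaves take successive x slots, parents are centered over their
    children, and y is the node's depth.  A deliberate simplification
    of what d3.tree() computes."""
    if slot is None:
        slot = count()
    name, children = tree
    if not children:
        return {name: (next(slot), depth)}
    pos = {}
    for child in children:
        pos.update(layout(child, depth + 1, slot))
    xs = [pos[child[0]][0] for child in children]
    pos[name] = (sum(xs) / len(xs), depth)
    return pos

# Hypothetical lineage hierarchy, not real EDC data.
tree = ("root", [("a", [("a1", []), ("a2", [])]), ("b", [])])
pos = layout(tree)
print(pos["root"], pos["b"])  # → (1.25, 0) (2, 1)
```

Centering parents over their children is what keeps a lineage diagram readable as the hierarchy grows, which is the design requirement the D3 tweaks targeted.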
39. Final Product
Under NDA!
● The mockup created during this internship will be released next year.
● Showcased at the internal design conference and intern demo day to the CEO
and 1000+ employees.
41. UX and Data
Modeling
Researcher
@Advanced Technology Group, UPS
HQ, GA
MY ROLE:
UX and Data Modeling intern. Worked with PMs and
stakeholders for this role.
METHODS:
Background research, surveys, expert study, design
decisions, usability analysis, task based analysis, data
visualization and analysis
TOOLS:
Qualtrics, InVision, Paper Prototype, In-lab study,
Qualitative/Quantitative data
TIMELINE:
May 2018 - August 2018
42. Summary
● UPS delivers packages to customers within US and
worldwide.
● I conducted UX research on the dataset collected by UPS
over a period of time.
● The results of the research helped internal stakeholders
make business decisions.
43. My Role
As the team's only UX Researcher, I owned many stages of the research process.
● Wrote research protocol by communicating with internal stakeholders to prioritize objectives and
matching them with appropriate methods
● Conducted Expert Review sessions with stakeholders to understand the data
● Recruited participants. Created personas.
● Wrote usability session protocol that included semi-structured interview questions and tasks
● Moderated remote usability sessions with 20 customers
● Ran research debriefs with team members to identify key findings
● Created mockup visualizations
● Worked with PM to validate the visualizations
● Designed the final presentation to stakeholders and identified action items for future work
46. Usability testing with the mockups
OBJECTIVE
1. Obtain feedback on the strengths, struggles, and failures of the visualizations.
EXAMPLE RESEARCH QUESTIONS
1. Task Success: Can stakeholders understand the data? Can they make the required business decision from the
visualizations?
2. Icons: Can stakeholders understand the colors? What do they expect them to do?
3. Errors: Do stakeholders experience errors while understanding the data?
4. Timing: How long do stakeholders take to understand the data and finally come to a business decision? Do they need
more options in the UI to understand the data? If yes, what are they?
47. Usability testing with the mockups
METHODOLOGY: TASK SCENARIOS IN A REMOTE USABILITY TEST
1. 20 live sessions (+ 1 dry run)
2. Think-aloud testing
3. 30 minutes each - 5 mins describing the study, 15 mins to complete the task and 10 mins for post study interview
4. Remote screen sharing software
5. Testing a Sketch and InVision prototype
6. Start: Provide Scenario
7. Middle: Have them use the visualization prototypes
8. End: Interview + System Usability Scale (SUS) survey
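Step 8 ends with a System Usability Scale survey. The standard SUS scoring rule (odd items contribute r - 1, even items 5 - r, and the sum is scaled by 2.5 to a 0-100 range) can be sketched as follows; the sample responses are hypothetical, not data from these sessions.

```python
def sus_score(responses):
    """Standard System Usability Scale scoring: odd-numbered items
    (index 0, 2, ...) contribute (r - 1), even-numbered items (5 - r);
    the sum is multiplied by 2.5 to yield a 0-100 score."""
    assert len(responses) == 10, "SUS has exactly ten items"
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

# Hypothetical participant: 4s on odd items, 2s on even items.
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # → 75.0
```

Scoring each session this way yields a single comparable number per participant alongside the qualitative interview findings.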
49. Difficulty specs
● Observation
○ When did this difficulty appear?
○ How many users faced this difficulty?
● Interpretation
○ Did the user use an alternative action to
complete the target goal?
● Recommendation
○ Provide a supporting recommendation to
mitigate the difficulty
○ “User quote regarding the difficulty”
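One way to keep these difficulty specs consistent across sessions is a small record type mirroring the observation / interpretation / recommendation structure. The field names and example values below are illustrative assumptions, not taken from the original study notes.

```python
from dataclasses import dataclass, field

@dataclass
class DifficultySpec:
    """One record per observed difficulty, mirroring the
    observation / interpretation / recommendation structure.
    Field names are illustrative, not from the original study."""
    when_observed: str        # observation: when did this difficulty appear?
    users_affected: int       # observation: how many users faced it?
    workaround: str           # interpretation: alternative action taken
    recommendation: str       # recommendation: suggested mitigation
    user_quotes: list = field(default_factory=list)

d = DifficultySpec(
    when_observed="while filtering the visualization",
    users_affected=3,
    workaround="scrolled through the full list instead",
    recommendation="make filter icons more discoverable",
)
d.user_quotes.append("I did not notice the filter at all.")
print(d.users_affected, len(d.user_quotes))  # → 3 1
```

Structured records like this make it easy to sort difficulties by the number of users affected when prioritizing the redesign.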
50. Turning insights into suggested design
principles
Five best practices for visualizing data for analysis
and business decisions:
● Icons for various filter options
● Notifications when new data is available to upload
● Provide recommendations based on user activity
● Drag-and-drop features
● Help features
51. What did I learn?
● To be an evangelist for the UX research practice: Communicate how research can have a positive impact to
stakeholders.
● That we are all designers: When presenting research outcomes, suggest design solutions.
● To be proactive after sharing research results: Make sure the team has a shared understanding of what action
items will come out of the research results, e.g. which re-designs are associated to which team members.
● Come with a point of view: When asking for feedback on a research approach or design method from a team
member, coming prepared with a point of view.
● Be an educator to your coworkers: Learn and assist coworkers with their continual learning by providing
advice or hosting workshops on new design tools, research methods, or visual techniques.
52. Impact
Under NDA!
● The visualizations are being successfully used by UPS researchers
● Showcased the visualization to 1000+ employees on demo day.
53. UX Design Researcher
Helsinki Institute for Information Technology
MY ROLE:
UX Research Assistant. Worked with Design
Researchers and HCI Researchers for this role.
METHODS:
Background research, surveys, expert study,
participant observation, diary logs, semi-structured
interviews, design decisions, usability testing, task
based analysis, data analysis
TOOLS:
In-lab/Remote study, Qualitative/Quantitative data
TIMELINE:
2013 - 2015
54. Research Product:
Scinet
An innovative way to search your data
Research question:
How do researchers search for information
on the web?
● How do users formulate queries?
● What challenges are present in
searching for desired information?
● Could we mine the relevance or
interest of the user directly from
the human mind?
55. Target Persona GOALS:
● Gather and read papers related to HCI
● Currently trying to keep up with new papers
in the field of ML
● Synthesize design, psychology, and ML
Frustrations:
● Formulating queries about papers
that are new to him
● Spends a lot of time reading papers to
determine whether they are relevant
● Loses track most of the time
● Even after spending hours searching for
high-quality papers, often ends up empty-handed
DEMOGRAPHICS: NAME: Sam
Age group: 35-45
Role: Researcher
Location: Helsinki, Finland
56. Traditional search engine UI pain points
● Sam searched for "user
interface" and "machine learning"
● Received results for papers
dated 1994, 2015, and 2009
● Sam is not happy with the
results
● Sam is fairly new to the field
and doesn't know how to
improve the search results
● The UI doesn't provide any
recommendations to Sam
● Sam spends hours reading the
old papers to gain more
knowledge about the field and
then tries to iterate on the search
query
60. Designed UI for AI assisted search engine
● Project Link: http://augmentedresearch.hiit.fi/ DEMO: https://youtu.be/zOoFNpF6eFk
61. Forms used in this study
● Recruitment email/poster:
https://docs.google.com/document/d/1If1zwpBzGJme7JwwOjVzCklB83cw22VLmVAoxarBVYg/edit?us
p=sharing
● Task for System 1:
https://docs.google.com/document/d/1p1Tx1qbQBOwG8Prz9yOktp0U0acsLK0l6ezmlodRXD4/edit?us
p=sharing
● Task for System 2:
https://docs.google.com/document/d/1Ct9Av2GhJJq5R4N4WXdvtPP0p1OmzcgmSETqB5qB94Y/edit?
usp=sharing
● Track of usernames:
https://docs.google.com/document/d/1bqBdUdkWD76D8i_3OtA3m0HwS23IKJZ39uMKoYayURc/edit
?usp=sharing
65. Outcome
● Chirayu Wongchockprasitti, Jaakko Peltonen, Tuukka Ruotsalo, Payel Bandyopadhyay, Giulio
Jacucci and Peter Brusilovsky
User Model In a Box: Cross-System User Model Transfer for Resolving Cold Start
Problems
In: Proceedings of 23rd International Conference on User Modeling, Adaptation, and
Personalization (UMAP 2015), 2015
Dublin, Ireland, June 29 - July 3, 2015. (Acceptance rate: 16% for long paper)
[Link]
● Payel Bandyopadhyay, Tuukka Ruotsalo, Antti Ukkonen, Giulio Jacucci
Navigating Complex Information Spaces: A Portfolio Theory Approach
Third International Workshop, Symbiotic 2014
Springer Lecture Notes in Computer Science, Volume 8820, Pages 130-136.
[Link]
68. ❏ What is an Augmented Reality (AR) application?
Any application with the following properties can be classified as an
augmented reality application [1]:
❏ combination of real and virtual objects in a real environment
❏ interactive and real time operation
❏ registration (alignment) of real and virtual objects with each other
Figure 1: Milgram’s reality–virtuality continuum [2]
Introduction
69. Chosen AR Prototype
❏ “Augment” - 3D Augmented Reality [3]
Figure 2: Logo of Augment [3]
70. “Augment is a mobile app that lets you and your
customers visualize your 3D models in Augmented
Reality, integrated in real time in their actual size
and environment. Augment is the perfect
Augmented Reality app to boost your sales and
bring your print to life in 3 simple steps.” [3]
Application Description
71. Software Requirement
❏ “Augment” is available only on the following OS:
❏ iOS
❏ Android
❏ OS used in this project:
❏ Android
❏ Devices used for evaluation:
❏ Mobile phone: Android 4.1.1 (Jelly Bean)
❏ Tablet: Android 4.0.4 (Ice Cream Sandwich)
72. ❏ The application can be used for the following two purposes:
❏ Sales and design
❏ Interactive print
Figure 3: (a) A screenshot of Scan user interface [3] (b) A screenshot of 3D Model user interface [3]
System functionality
73. Example of usage
❏ Select 3D model
❏ Select marker
❏ See the AR content
❏ Interact (scale, rotate,
take photo, share…)
Figure 4: A screenshot of user interface of Augment
77. ❏ What is a cognitive walkthrough?
❏ Exploratory learning
❏ Evaluators involved
❏ Scribe
❏ Facilitator
❏ CE+ theory [5]
❏ The user sets a goal to be accomplished;
❏ The user searches the interface for available actions;
❏ The user selects an action that seems likely to make progress toward the goal; and
❏ The user performs the action and checks to see whether the feedback indicates that
progress is being made towards the goal.
CW Description
79. ❏ Task description
“Select a furniture 3D model and place it
in your room in a desired place and save
it. The system will be in a state such that
someone could immediately start
testing.”
CW Experiment Design (2/3)
80. CW Experiment Design (3/3)
Figure 5: Correct actions, part of the cognitive walkthrough start-up sheet
84. CW Results (4/4)
Table 2: Specific issues identified during our cognitive walkthrough. Problems marked with an asterisk
(*) indicate problems that were discussed before the CW was done, but were also revealed by the CW.
86. ❏ What is a heuristic evaluation?
❏ Evaluator
❏ Based on Nielsen’s 10 heuristics
HE Description
87. Table 3: Nielsen's 10 heuristics [6]
1. Visibility of system status: The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.
2. Match between system and the real world: The system should speak the users' language, with words, phrases, and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.
3. User control and freedom: Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.
4. Consistency and standards: Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.
5. Error prevention: Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action.
6. Recognition rather than recall: Minimize the user's memory load by making objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.
7. Flexibility and efficiency of use: Accelerators, unseen by the novice user, may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.
8. Aesthetic and minimalist design: Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.
9. Help users recognize, diagnose, and recover from errors: Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.
10. Help and documentation: Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.
88. ❏ The members of the group acted as the evaluators
for the heuristics.
❏ The heuristic evaluation was based on Nielsen's 10
heuristics.
❏ Evaluator 1
❏ Smartphone
❏ Language of app: Spanish
❏ Evaluator 2
❏ Tablet
❏ Language of app: English
HE Experiment Design
89. HE Results (1/2)
❏ Evaluator 1 found the following usability
problems
❏ https://docs.google.com/document/d/1WiP7kMHcWfNj_I83uboZ7qC0igMIaGKpcEzGmmpJ1z4/edit
❏ Evaluator 2 found the following usability
problems
❏ https://docs.google.com/document/d/1-hKAAWRNbpxP3aiYc7iokqwdbR6wS-vMgWfHo2eB9D8/edit
90. Table 4: The total number of problems found in the heuristic evaluation phase. The second column shows
problems found in “Augment” installed on a smartphone; the third column shows problems found
when installed on a tablet.
91. ❏ More usability problems were found on the smartphone than
on the tablet:
❏ The smartphone is less powerful → more errors and
crashes
❏ Smaller screen on the smartphone → the app layout is
better designed for big screens → lack of consistency
❏ The smartphone was evaluated in Spanish → severe
translation problems
HE Results (2/2)
93. LO Description
❏ What is a laboratory observation?
❏ Conducted in a laboratory with test users
❏ Not necessarily in a dedicated “laboratory” [7]
❏ Controlled environment
❏ Can be conducted in various controlled environments [7]:
❏ office
❏ hallway
❏ simulator
❏ others
❏ Mimics a real-life scenario
❏ Users
❏ Video/audio recordings
94. LO Experimental Design
❏ Users recruited
❏ 7 users
❏ Smartphone users
❏ CS students
❏ Device provided
❏ Samsung Galaxy Tab 10.1
(Touchscreen)
❏ Model Number: GT-P7500
❏ Recordings
❏ Video
❏ Audio
Figure 5: Laboratory setup
95. LO Results (1/5)
❏ Links to the video and audio recordings:
❏ User 1: https://www.youtube.com/watch?v=o8DF-CXkX2c
❏ User 2: https://www.youtube.com/watch?v=7vORzJ0s55A
❏ User 3: https://www.youtube.com/watch?v=yGKHAsY_T_8
❏ User 4: https://www.youtube.com/watch?v=Nz-Ciq3XFvc
❏ User 5: https://www.youtube.com/watch?v=YnWGwb6_Tn8
❏ User 6: https://www.youtube.com/watch?v=PXwrRO4KVPs
❏ User 7: https://www.youtube.com/watch?v=1w934zTn0KE
96. LO Results (2/5)
Figure 7: Graph showing the task completion times of the 7 users. All 7 users were novices in
terms of using Augmented Reality applications, and all were shown a live demo of the
application and how to perform the task. User 4 saw the demo once, watched the previous user
perform the same task, and did a practice run herself before the experiment. Hence, the task
completion time of user 4 was approximately half the average of the other users, but she made
many errors while completing the task.
97. LO Results (3/5)
(a) (b)
Figure 7: (a): The blue color represents the percentage of users who could find the desired 3D model
very easily. The red color represents the percentage of users who could not find the desired 3D model
easily. (b) : The blue color represents the percentage of users who could easily create the tracker. The
red color represents the percentage of users who could not create the tracker easily.
98. LO Results (4/5)
Figure 8: Number of users who tried to rotate the 3D model in other axis (x-axis and z-axis) other than
the one provided in the current user interface.
99. LO Results (5/5)
Table 2: Number of identified usability problems (total number of individual problems found across all
users, and total number of common problems experienced by all users).
104. QE Results (1/2)
Figure 10: Results from the questionnaires. The image shows the frequency of every mark for each
statement. The reader should note that there were 4 statements (number 4, 5, 8 and 12) that were not
answered by all users.
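The frequency-per-mark tally behind a chart like Figure 10 is a simple count over Likert responses. The responses below are hypothetical placeholders, since the study's raw data is not reproduced here.

```python
from collections import Counter

# Hypothetical 1-5 Likert marks for one questionnaire statement;
# the actual data behind Figure 10 is not reproduced here.
responses = [5, 4, 4, 5, 3, 4, 2]

# Frequency of every mark for the statement, as tallied for the chart.
freq = Counter(responses)
for mark in range(1, 6):
    print(mark, freq[mark])  # Counter returns 0 for marks never given
```

Statements that some users skipped (4, 5, 8, and 12 in the study) would simply have a smaller total count than the others.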
105. ❏ Users gave positive ratings on the questionnaire
❏ They concentrated more on the
novelty of AR than on usability
❏ For the future:
❏ Tests without pre-instructing users
❏ Tests with smartphones
❏ Tests with users with previous knowledge of AR
❏ Tests in other languages
QE Results (2/2)
106. Conclusions (1/5)
❏ AR is a new technology
❏ Most users were novices (90%)
❏ The current learnability curve is
high (Figure 11)
❏ The learnability curve of the “Augment” user interface
should be lowered
❏ An alternative to the help menu is needed
❏ none of the users used it
❏ Informative feedback
❏ addition needed
❏ Usability methods from 3 categories should be
employed
❏ a few unique problems were found in each of the 4 methods
107. Figure 11: Red color represents percentage of users who faced both the problems. Prussian blue color
represents percentage of users who did not face any problem. Light blue color represents user who
found selecting 3D model a problem but could easily create the tracker. Yellow color represents
percentage of users who could easily select a 3D model but could not create the tracker.
108. Conclusions (2/5)
❏ Consistency should be added
❏ the phone version should explain why the tracker is not working
❏ the tablet version should explain why a tracker is required
❏ Rotation button
❏ y-axis (available)
❏ x-axis (addition required)
❏ z-axis (addition required)
❏ “Email” and share options
❏ need to be improved
110. Conclusions (4/5)
❏ “Augment” design guidelines:
❏ Follow the standards of the Android platform.
❏ Provide more intuitive interfaces and a more
organized “option dialog”.
❏ Improve the manipulation of 3D objects.
❏ Provide more information about the AR concept and
about what a good marker is.
❏ Provide help tips in the appropriate context.
❏ Provide better translations.
111. Conclusions (5/5)
❏ AR evaluation design guidelines:
❏ Combining one usability inspection method with one
usability testing method is recommended to obtain a
reliable outcome.
❏ Using more than one expert in the inspection
methods is suggested.
❏ If the questionnaire method is used, the
number of users should be large enough, and the pool should
contain a variety of users, including AR experts.
113. References
[1] Nektarios N. Kostaras and Michalis N. Xenos. Assessing the usability of augmented reality systems.
[2] P. Milgram and F. Kishino. A taxonomy of mixed reality visual displays. IEICE Trans. Information Systems, E77-D(12):1321–
1329, dec 1994.
[3] Jean-François Chianetta, Mickaël Jordan, Cyril Champier, April 2014, http://augmentedev.com/
[4] M.N. Mahrin, P. Strooper, and D. Carrington. Selecting usability evaluation methods for software process descriptions. In
Software Engineering Conference, 2009. APSEC ’09. Asia- Pacific, pages 523–529, Dec 2009.
[5] Xiangyu Wang. Using cognitive walkthrough procedure to prototype and evaluate dynamic menu interfaces: A design
improvement. In Computer Supported Cooperative Work in Design, 2008. CSCWD 2008. 12th International Conference on, pages
76–80, April 2008.
[6] Jakob Nielsen. http://www.nngroup.com/articles/ten-usability-heuristics/, 2005.
[7] J. Kjeldskov and C. Graham. A Review of MobileHCI Research Methods. In Proceedings of the 5th International Mobile HCI
2003 Conference, September 8-11, 2003, Udine, Italy. Lecture Notes in Computer Science, Springer, 2003.