5. Milgram’s Reality-Virtuality continuum
[Diagram: the Reality-Virtuality (RV) Continuum, running from the Real Environment through Augmented Reality (AR) and Augmented Virtuality (AV) to the Virtual Environment; Mixed Reality spans everything between the two extremes.]
"...anywhere between the extrema of the virtuality continuum."
P. Milgram and F. Kishino, A Taxonomy of Mixed Reality Visual Displays.
IEICE Transactions on Information and Systems, E77-D(12), pp. 1321-1329, 1994.
12. Natural Gesture
• Freehand gesture input
• Depth sensors for gesture capture
• Move beyond simple pointing
• Rich two-handed gestures
• e.g., Microsoft Research Hand Tracker (see the tracking sketch below)
• 3D hand tracking, 30 fps, single sensor
• Commercial systems
• Meta, Microsoft HoloLens, Oculus, Intel, etc.
Sharp, T., Keskin, C., Robertson, D., Taylor, J., Shotton, J., Kim, D., Rhemann, C., Leichter, I., ... & Izadi, S.
(2015). Accurate, Robust, and Flexible Real-time Hand Tracking. In Proc. CHI 2015. ACM.
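Not the Microsoft tracker itself, but as a rough illustration of a freehand tracking loop, the sketch below reads hand landmarks from an ordinary webcam using MediaPipe Hands and OpenCV; the parameter values (two hands, 0.7 detection confidence) are illustrative, and a depth-sensor pipeline like the one on the slide would expose similar per-joint data.

```python
# Minimal freehand tracking sketch using MediaPipe Hands on an RGB webcam
# (not the depth-sensor pipeline described on the slide). Stop with Ctrl-C.
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=2,           # two-handed gestures
                                 min_detection_confidence=0.7)
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB; OpenCV captures BGR.
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        for hand in result.multi_hand_landmarks:
            tip = hand.landmark[8]          # landmark 8 = index fingertip
            print(f"index tip: x={tip.x:.2f} y={tip.y:.2f} z={tip.z:.2f}")

cap.release()
```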
16. Multi-Scale Gesture
• Combine different gesture types
• In-air gestures – natural but imprecise
• Micro-gesture – fine scale gestures
• Gross motion + fine tuning interaction (see the sketch below)
Ens, B., Quigley, A., Yeo, H. S., Irani, P., Piumsomboon, T., & Billinghurst, M. (2018). Counterpoint: Exploring
Mixed-Scale Gesture Interaction for AR Applications. In Extended Abstracts of the 2018 CHI Conference on
Human Factors in Computing Systems (p. LBW120). ACM.
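As a toy illustration of the gross + fine pattern (not Counterpoint's actual implementation), the sketch below follows the hand directly while it moves quickly, then switches to low-gain micro-gesture deltas once it steadies; the threshold and gain values are invented for the example.

```python
# Hypothetical mixed-scale pointing: gross in-air motion positions the
# cursor; once the hand steadies, micro-gesture deltas refine it at low gain.
STILL_THRESHOLD = 0.005   # metres/frame; illustrative value
FINE_GAIN = 0.1           # micro-gestures move the cursor at 10% gain

def update_cursor(cursor, hand_pos, prev_hand_pos, micro_delta):
    speed = abs(hand_pos[0] - prev_hand_pos[0]) + abs(hand_pos[1] - prev_hand_pos[1])
    if speed > STILL_THRESHOLD:
        return hand_pos                              # gross phase: cursor follows the hand
    return (cursor[0] + FINE_GAIN * micro_delta[0],  # fine phase: low-gain tuning
            cursor[1] + FINE_GAIN * micro_delta[1])
```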
18. What Gesture Do People Want to Use?
• Limitations of previous work in AR
• Limited range of gestures
• Gestures designed for optimal recognition
• Gestures studied as an add-on to speech
• Solution – elicit desired gestures from users
• e.g., gestures for surface computing [Wobbrock]
• Previous work on unistroke gestures, mobile gestures
19. User Defined Gesture Study
• Use AR view
• HMD + AR tracking
• Present AR animations
• 40 tasks in six categories
• Editing, transforms, menu, etc.
• Ask users to produce gestures causing the animations
• Record gesture (video, depth)
Piumsomboon, T., Clark, A., Billinghurst, M., & Cockburn, A. (2013). User-defined gestures for
augmented reality. In CHI'13 Extended Abstracts on Human Factors in Computing Systems
20. Data Recorded
• 20 participants
• Gestures recorded (video, depth data)
• 800 gestures from 40 tasks
• Subjective rankings
• Likert ranking of goodness, ease of use
• Think aloud transcripts
21. Results
• Gestures grouped according to similarity (320 groups; see the agreement sketch below)
• 44 consensus gestures (62% of all gestures), 11 hand poses observed
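The consensus grouping above follows the elicitation method of Wobbrock et al., where agreement for a task is commonly scored as A = Σ(|Pi|/|P|)², summed over groups Pi of identical proposals P. A minimal sketch with invented data:

```python
from collections import Counter

def agreement(proposals):
    """Wobbrock-style agreement score: sum of squared group proportions."""
    n = len(proposals)
    return sum((count / n) ** 2 for count in Counter(proposals).values())

# Illustrative data: 20 participants propose a gesture for "move object".
gestures = ["grab-drag"] * 12 + ["pinch-drag"] * 5 + ["push"] * 3
print(agreement(gestures))   # 0.445 -> moderate agreement
```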
22. Lessons Learned
• AR animation can elicit desired gestures
• For some tasks there is a high degree of similarity in user-defined gestures
• Especially command gestures (e.g., Open) and selection
• Less agreement in manipulation gestures
• Move (40%), rotate (30%), grouping (10%)
• Small portion of two-handed gestures (22%)
• Scaling, group selection
23. Multimodal Input
• Combine gesture and speech input
• Gesture good for qualitative input
• Speech good for quantitative input
• Support combined commands
• “Put that there” + pointing
• e.g., HIT Lab NZ multimodal input
• 3D hand tracking, speech
• Multimodal fusion module (see the sketch below)
• Complete tasks faster with MMI, fewer errors
Billinghurst, M., Piumsomboon, T., & Bai, H. (2014). Hands in Space: Gesture Interaction with
Augmented-Reality Interfaces. IEEE computer graphics and applications, (1), 77-80.
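A minimal sketch of time-window multimodal fusion in the "put that there" style: a spoken command containing a deictic word is paired with the pointing event nearest in time. The 1.5 s window, event format, and deictic word list are assumptions for illustration, not the HIT Lab NZ module.

```python
# Hypothetical multimodal fusion: pair a spoken command containing a
# deictic word ("this", "that", "there") with the closest pointing event.
FUSION_WINDOW = 1.5   # seconds; illustrative

def fuse(speech_event, pointing_events):
    """speech_event: (timestamp, text); pointing_events: [(timestamp, target)].
    Resolves a single deictic reference for simplicity."""
    t, text = speech_event
    if not any(word in text for word in ("this", "that", "there")):
        return text, None                    # speech-only command
    candidates = [(abs(pt - t), target) for pt, target in pointing_events
                  if abs(pt - t) <= FUSION_WINDOW]
    if not candidates:
        return text, None                    # no gesture to resolve the deixis
    return text, min(candidates)[1]          # closest pointing event wins

print(fuse((10.2, "put that there"), [(10.0, "red cube"), (12.5, "table")]))
```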
24. Gaze Interaction
• Eye tracking in MR displays
• Commercially available
• Fast/robust
• What type of interaction?
• Conscious vs. unconscious interaction
25. Eye Tracking Input
• Smaller/cheaper eye-tracking systems
• More HMDs with integrated eye-tracking
• Research questions
• How can eye gaze be used for interaction?
• What interaction metaphors are natural?
• What technology can be used for eye-tracking?
26. Eye Gaze Interaction Methods
• Gaze for interaction
• Implicit vs. explicit input
• Exploring different gaze interaction techniques
• Duo-reticles – use eye saccade input
• Radial pursuit – use smooth pursuit motion (see the sketch below)
• Nod and roll – use the vestibulo-ocular reflex
• Hardware
• HTC Vive + Pupil Labs integrated eye-tracking
• User study to compare between methods for 3DUI
Piumsomboon, T., Lee, G., Lindeman, R. W., & Billinghurst, M. (2017, March).
Exploring natural eye-gaze-based interaction for immersive virtual reality. In 3D User
Interfaces (3DUI), 2017 IEEE Symposium on (pp. 36-39). IEEE.
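Radial Pursuit builds on smooth-pursuit selection, which is typically implemented by correlating the gaze trace against each moving target's trace and selecting the best match above a threshold. The sketch below shows that standard correlation test; it is an assumption about the mechanism, not the paper's exact algorithm.

```python
import numpy as np

def pursuit_match(gaze_xy, target_xy, threshold=0.8):
    """Pearson-correlate gaze and target traces on both axes (N x 2 arrays)."""
    rx = np.corrcoef(gaze_xy[:, 0], target_xy[:, 0])[0, 1]
    ry = np.corrcoef(gaze_xy[:, 1], target_xy[:, 1])[0, 1]
    return min(rx, ry) > threshold      # select only if both axes track the target

# Illustrative: gaze noisily follows a target moving on a circle.
t = np.linspace(0, 2 * np.pi, 60)
target = np.column_stack([np.cos(t), np.sin(t)])
gaze = target + np.random.normal(0, 0.05, target.shape)
print(pursuit_match(gaze, target))      # True: gaze is pursuing this target
```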
27. Duo-Reticles (DR)
[Figure: Duo-Reticles selection, frames A-1 to A-3. Two reticles are shown: an Inertial Reticle (IR) and a Real-time Reticle (RR), originally called the Eye-gaze Reticle. Once RR and IR are aligned, an alignment timer counts down, and selection completes when it expires; a sketch of this logic follows.]
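A minimal sketch of the alignment countdown described above; the 2° alignment radius and 0.8 s dwell time are illustrative values, not those used in the study.

```python
# Hypothetical duo-reticle selection loop: align RR (eye gaze) with IR
# (inertially stabilised) and hold the alignment until the timer expires.
ALIGN_RADIUS = 2.0    # degrees of visual angle; illustrative
DWELL_TIME = 0.8      # seconds to confirm selection

def step(rr, ir, timer, dt):
    dist = ((rr[0] - ir[0]) ** 2 + (rr[1] - ir[1]) ** 2) ** 0.5
    if dist > ALIGN_RADIUS:
        return DWELL_TIME, False           # misaligned: reset the countdown
    timer -= dt
    return timer, timer <= 0               # aligned: count down, fire at zero

# Per frame: timer, fired = step(rr_pos, ir_pos, timer, dt)
```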
32. Dependent Variables
• Part 1 – objective: task completion time, # errors; subjective: usability ratings, semi-structured interview
• Part 2 – objective: task completion time, # errors; subjective: usability ratings, semi-structured interview
• Part 3 – objective: none; subjective: usability ratings, semi-structured interview
33. Usability Ratings
[Table: median Likert ratings (1-7) and p-values for nine usability statements ("I preferred this technique", "I felt tired using it", "It was frustrating to use", "It was fun to use", "I need to concentrate to use it", "It was easy for me to use", "I felt satisfied using it", "It felt natural to use", "I could interact precisely"), comparing Gaze-Dwell 1 vs. Duo-Reticles (Part 1) and Gaze-Dwell 2 vs. Radial Pursuit (Part 2), with Nod & Roll rated alone (Part 3).]
• No performance difference (as expected)
• Most participants preferred Duo-Reticles over Gaze-Dwell 1
• Radial Pursuit was more satisfying and less fatiguing than Gaze-Dwell 2
34. Gaze Summary
• Three novel eye-gaze-based interaction techniques inspired by natural eye movements
• An initial study found positive results: our techniques matched Gaze-Dwell in performance but offered a superior user experience
• Continue applying the same principles to improve the user experience of eye gaze in immersive VR
35. PinPointing
• Combining pointing + refinement
• Pointing: Head, eye gaze
• Refinement: none, gesture, device (clicker), head (see the sketch below)
Kytö, M., Ens, B., Piumsomboon, T., Lee, G. A., & Billinghurst, M. (2018). Pinpointing: Precise
Head- and Eye-Based Target Selection for Augmented Reality. In Proceedings of the 2018 CHI
Conference on Human Factors in Computing Systems (p. 81). ACM.
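A toy sketch of the coarse-then-refine pattern PinPointing studies: a head or eye ray supplies the coarse point, then refinement input nudges it at reduced gain before confirmation. The function name, gain, and data format are hypothetical.

```python
# Hypothetical two-stage selection: coarse ray pointing plus fine refinement.
REFINE_GAIN = 0.05    # refinement moves the point at 5% of input motion

def pinpoint(coarse_point, refinement_deltas):
    """coarse_point from the head/eye ray; deltas from gesture, clicker, or head."""
    x, y = coarse_point
    for dx, dy in refinement_deltas:        # fine-tuning phase
        x += REFINE_GAIN * dx
        y += REFINE_GAIN * dy
    return x, y                             # confirmed selection point

print(pinpoint((100.0, 60.0), [(4.0, -2.0), (1.0, 0.5)]))  # (100.25, 59.925)
```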
45. Empathy
“Seeing with the Eyes of another,
Listening with the Ears of another,
and Feeling with the Heart of another."
Alfred Adler
46. Lab Research Focus
Can we develop systems that allow us to share what we are seeing, hearing, and feeling with others?
Piumsomboon, T., Lee, Y., Lee, G. A., Dey, A., & Billinghurst, M. (2017). Empathic
Mixed Reality: Sharing What You Feel and Interacting with What You See. In Ubiquitous
Virtual Reality (ISUVR), 2017 International Symposium on (pp. 38-41). IEEE.
47. Using AR/VR/Wearables for Empathy
• Remove technology barriers
• Enhance communication
• Change perspective
• Share experiences
• Enhance interaction in real world
48. Example Projects
• Remote collaboration in Wearable AR
• Sharing of non-verbal cues (gaze, pointing, facial expression)
• Shared Empathic VR experiences
• Use VR to put a viewer inside the player's view
• Measuring emotion
• Detecting emotion from heart rate, GSR, eye gaze, etc.
49. Empathy Glasses
• Combine eye-tracking, display, and facial expression detection
• Implicit cues – eye gaze, facial expression
• Hardware: Pupil Labs + Epson BT-200 + AffectiveWear
Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of
the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
50. Remote Collaboration
• Eye gaze pointer and remote pointing
• Facial expression display
• Implicit cues for remote collaboration
53. Ranking Results
"I ranked the (A) condition best, because I could easily point to
communicate, and when I needed it I could check the facial
expression to make sure I was being understood.”
[Charts: ranking frequencies for Q2 (communication) and Q3 (understanding partner), shown separately for the HMD (local user) and computer (remote user) interfaces.]
54. Shared Sphere – 360 Video Sharing
[Figure: the host user shares a live 360 video sphere with the guest user; see the mapping sketch below.]
Lee, G. A., Teo, T., Kim, S., & Billinghurst, M. (2017). Mixed reality collaboration
through sharing a live panorama. In SIGGRAPH Asia 2017 Mobile Graphics &
Interactive Applications (p. 14). ACM.
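Sharing a live panorama implies mapping each user's view direction into the equirectangular frame, e.g. so the host can see where the guest is looking. Below is a sketch of that standard yaw/pitch-to-pixel mapping; it is an assumed implementation detail, not code from the paper.

```python
import math

def direction_to_equirect(yaw, pitch, width, height):
    """Map a view direction (radians) to pixel coordinates in a
    width x height equirectangular 360 frame."""
    u = (yaw + math.pi) / (2 * math.pi) * width     # yaw in [-pi, pi] -> x
    v = (math.pi / 2 - pitch) / math.pi * height    # pitch in [-pi/2, pi/2] -> y
    return int(u) % width, min(int(v), height - 1)

# Guest looking slightly right and up in a 3840x1920 panorama.
print(direction_to_equirect(0.3, 0.2, 3840, 1920))
```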
60. AR and VR for Empathic Computing
• VR systems are ideal for trying experiences:
• Strong storytelling medium
• Provide total immersion/3D experience
• Easy to change virtual body scale and representation
• AR systems are ideal for live sharing:
• Allow overlay on real-world view/can share viewpoints
• Support remote annotation/communication
• Enhance real-world tasks
62. MR Technology Trends
• Advanced displays
• Real time space capture
• Natural gesture interaction
• Robust eye-tracking
• Emotion sensing/sharing
→ Empathic Tele-Existence
63. Empathic Tele-Existence
• Move from Observer to Participant
• Explicit to Implicit communication
• Experiential collaboration – doing together