MULTIMODAL, MULTISENSORY
INTERACTION FOR MIXED
REALITY
Mark Billinghurst
September 20th 2018
Who am I?
• 1986: BCMS (Hons), Waikato
• 1990: MPhil, Waikato
• 1994: PhD (EE), Univ. Washington
• 1992–2002: HIT Lab (Univ. Washington), Virtual Reality; ARToolKit, MagicBook, SharedSpace; MIT Media Lab; BT Labs
Timeline
• 2002: Director, HIT Lab NZ (NZ)
• 2005: AR Tennis; Nokia HUT
• 2009: AR Advertising; AR Conf.; CityViewAR
• 2013: ARToolWorks (CEO); Google Glass
• 2016: Univ. South Australia; Empathic Computing; Envisage AR (CEO); SuperVentures
Milgram’s Reality-Virtuality continuum
Mixed Reality: the Reality - Virtuality (RV) Continuum
Real Environment → Augmented Reality (AR) → Augmented Virtuality (AV) → Virtual Environment
"...anywhere between the extrema of the virtuality continuum."
P. Milgram and A. F. Kishino, A Taxonomy of Mixed Reality Visual Displays. IEICE Transactions on Information and Systems, E77-D(12), pp. 1321-1329, 1994.
Evolution of MR Displays (1968–2018)
Interaction
User Interaction with MR Displays
• Headworn
• Handheld controller
• Head pose, touch
• Gesture, speech
• Handheld
• Touch, stylus, button
• Device motion
Multisensory Input
Natural Interaction Modalities
• Speech
• Gesture
• Touch
• Gaze
• Body motion
• Autonomic
• Etc..
Natural Gesture
• Freehand gesture input
• Depth sensors for gesture capture
• Move beyond simple pointing
• Rich two handed gestures
• E.g. Microsoft Research Hand Tracker
• 3D hand tracking, 30 fps, single sensor
• Commercial systems
• Meta, Microsoft HoloLens, Oculus, Intel, etc.
Sharp, T., Keskin, C., Robertson, D., Taylor, J., Shotton, J., Kim, D., ... & Izadi, S. (2015, April). Accurate, Robust, and Flexible Real-time Hand Tracking. In Proc. CHI 2015. ACM.
https://www.youtube.com/watch?v=QTz1zQAnMcU
https://www.youtube.com/watch?v=LblxKvbfEoo
https://www.youtube.com/watch?v=635PVGicxng
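To make the idea concrete, here is a minimal sketch of how an application might consume 3D hand-tracking output (per-frame joint positions from a depth sensor) to detect a pinch-select gesture. The tracker feed and the thresholds are hypothetical, not a specific SDK's API; systems such as Leap Motion or HoloLens expose equivalent joint data.

```python
import math

PINCH_ON = 0.025   # metres; hypothetical thresholds, with hysteresis
PINCH_OFF = 0.045

def dist(a, b):
    """Euclidean distance between two 3D points (x, y, z) in metres."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

class PinchDetector:
    """Turns per-frame thumb/index fingertip positions into pinch events.

    Two thresholds (hysteresis) prevent flicker near the boundary.
    """
    def __init__(self):
        self.pinching = False

    def update(self, thumb_tip, index_tip):
        d = dist(thumb_tip, index_tip)
        if not self.pinching and d < PINCH_ON:
            self.pinching = True
            return "pinch_start"   # e.g. grab the object under the hand ray
        if self.pinching and d > PINCH_OFF:
            self.pinching = False
            return "pinch_end"     # release
        return None

# Usage with a hypothetical 30 fps tracker feed:
# detector = PinchDetector()
# for frame in tracker_frames():   # frame.thumb_tip, frame.index_tip
#     event = detector.update(frame.thumb_tip, frame.index_tip)
```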
Multi-Scale Gesture
• Combine different gesture types
• In-air gestures – natural but imprecise
• Micro-gesture – fine scale gestures
• Gross motion + fine tuning interaction
Ens, B., Quigley, A., Yeo, H. S., Irani, P., Piumsomboon, T., & Billinghurst, M. (2018). Counterpoint: Exploring
Mixed-Scale Gesture Interaction for AR Applications. In Extended Abstracts of the 2018 CHI Conference on
Human Factors in Computing Systems (p. LBW120). ACM.
https://www.youtube.com/watch?v=TRfqNtt1VxY&t=23s
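A sketch of the gross-plus-fine pattern follows. It assumes the tracker reports both an in-air hand delta and an optional micro-gesture delta (e.g. the thumb sliding along the index finger); the names and gains are illustrative, not the Counterpoint implementation.

```python
class MultiScaleCursor:
    """Coarse in-air positioning plus fine micro-gesture tuning.

    In-air hand motion moves the cursor at full gain; when a
    micro-gesture is active, its small deltas are applied at reduced
    gain for precision.
    """
    COARSE_GAIN = 1.0
    FINE_GAIN = 0.1   # illustrative: 10x finer control for micro-gestures

    def __init__(self):
        self.pos = [0.0, 0.0, 0.0]

    def update(self, hand_delta, micro_delta=None):
        # Micro-gesture input, when present, takes over at fine gain.
        gain, delta = (self.FINE_GAIN, micro_delta) if micro_delta \
                      else (self.COARSE_GAIN, hand_delta)
        self.pos = [p + gain * d for p, d in zip(self.pos, delta)]
        return self.pos
```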
What Gestures Do People Want to Use?
• Limitations of previous work in AR
• Limited range of gestures
• Gestures designed for optimal recognition
• Gestures studied as an add-on to speech
• Solution – elicit desired gestures from users
• E.g. gestures for surface computing [Wobbrock]
• Previous work on unistroke gestures, mobile gestures
User Defined Gesture Study
• Use AR view
• HMD + AR tracking
• Present AR animations
• 40 tasks in six categories
• Editing, transforms, menu, etc.
• Ask users to produce gestures causing the animations
• Record gesture (video, depth)
Piumsomboon, T., Clark, A., Billinghurst, M., & Cockburn, A. (2013). User-defined gestures for augmented reality. In CHI '13 Extended Abstracts on Human Factors in Computing Systems. ACM.
Data Recorded
• 20 participants
• Gestures recorded (video, depth data)
• 800 gestures from 40 tasks
• Subjective rankings
• Likert ranking of goodness, ease of use
• Think aloud transcripts
Results
• Gestures grouped according to similarity (320 groups)
• 44 consensus gestures (62% of all gestures), 11 hand poses seen
Lessons Learned
• AR animation can elicit desired gestures
• For some tasks there is a high degree of similarity in user-defined gestures
• Especially command gestures (e.g. Open) and selection
• Less agreement on manipulation gestures
• Move (40%), rotate (30%), grouping (10%)
• Small proportion of two-handed gestures (22%)
• Scaling, group selection
Multimodal Input
• Combine gesture and speech input
• Gesture good for qualitative input
• Speech good for quantitative input
• Support combined commands
• “Put that there” + pointing
• E.g. HIT Lab NZ multimodal input
• 3D hand tracking, speech
• Multimodal fusion module
• Tasks completed faster with MMI, with fewer errors
Billinghurst, M., Piumsomboon, T., & Bai, H. (2014). Hands in Space: Gesture Interaction with
Augmented-Reality Interfaces. IEEE computer graphics and applications, (1), 77-80.
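As an illustration of the fusion idea (a sketch, not the HIT Lab NZ module itself), a time-window fusion step can pair each deictic word from the speech recognizer with the pointing event nearest to it in time:

```python
from dataclasses import dataclass

FUSION_WINDOW = 1.0  # seconds; illustrative speech/gesture tolerance

@dataclass
class SpeechEvent:
    t: float     # timestamp (s)
    word: str    # e.g. "that", "there"

@dataclass
class PointEvent:
    t: float
    target: str  # id of the object/location under the hand ray

def fuse(speech_events, point_events):
    """Resolve each deictic word to the pointing event closest in time.

    For "put that there" spoken with two pointing gestures, returns
    e.g. [("that", "cube_3"), ("there", "table_spot_7")].
    """
    bindings = []
    for s in speech_events:
        if s.word not in ("this", "that", "here", "there"):
            continue  # only deictic words need a gesture referent
        nearest = min(point_events, key=lambda p: abs(p.t - s.t),
                      default=None)
        if nearest and abs(nearest.t - s.t) <= FUSION_WINDOW:
            bindings.append((s.word, nearest.target))
    return bindings
```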
Gaze Interaction
• Eye tracking in MR displays
• Commercially available
• Fast/robust
• What type of interaction?
• Conscious vs. unconscious interaction
Eye Tracking Input
• Smaller/cheaper eye-tracking systems
• More HMDs with integrated eye-tracking
• Research questions
• How can eye gaze be used for interaction?
• What interaction metaphors are natural?
• What technology can be used for eye-tracking?
Eye Gaze Interaction Methods
• Gaze for interaction
• Implicit vs. explicit input
• Exploring different gaze interaction techniques
• Duo-Reticles – use eye saccade input
• Radial Pursuit – use smooth pursuit motion
• Nod and Roll – use the vestibulo-ocular reflex
• Hardware
• HTC Vive + Pupil Labs integrated eye-tracking
• User study comparing the methods for 3D UI
Piumsomboon, T., Lee, G., Lindeman, R. W., & Billinghurst, M. (2017, March).
Exploring natural eye-gaze-based interaction for immersive virtual reality. In 3D User
Interfaces (3DUI), 2017 IEEE Symposium on (pp. 36-39). IEEE.
Duo-Reticles (DR)
• Two reticles: the Real-time Reticle (RR, originally the Eye-gaze Reticle) follows the eyes; the Inertial Reticle (IR) follows it with inertia
• A-1 to A-3: as RR and IR are aligned, the alignment time counts down until selection is completed
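A minimal sketch of the alignment logic, assuming per-frame RR (eye) and IR positions; the radius and dwell time are illustrative values, not those from the study:

```python
ALIGN_RADIUS = 0.03  # illustrative: reticles within 3 cm count as aligned
DWELL_TIME = 0.6     # illustrative countdown before selection fires

class DuoReticles:
    """Selection fires after RR and IR stay aligned for DWELL_TIME."""
    def __init__(self):
        self.aligned_for = 0.0

    def update(self, rr_pos, ir_pos, dt):
        d = sum((a - b) ** 2 for a, b in zip(rr_pos, ir_pos)) ** 0.5
        if d < ALIGN_RADIUS:
            self.aligned_for += dt        # countdown running
            if self.aligned_for >= DWELL_TIME:
                self.aligned_for = 0.0
                return "select"           # A-3: selection completed
        else:
            self.aligned_for = 0.0        # misaligned: reset countdown
        return None
```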
Radial Pursuit (RP)
• Real-time Reticle (RR); B-1 to B-4: selection by smoothly pursuing a moving target
• The selected object is the one whose trajectory stays closest to the gaze path:
$d_{min} = \min(d_1, d_2, \ldots, d_n)$, where $d_i = \sum_{t=t_{initial}}^{t_{select}} \lvert p(i)_t - p'_t \rvert$, $p(i)_t$ is the position of object $i$ and $p'_t$ the gaze point at time $t$
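The formula picks the object whose recent motion best matches the gaze path; a direct, illustrative implementation over 2D screen-space samples:

```python
def radial_pursuit_select(object_paths, gaze_path):
    """Return the object whose path is closest to the gaze path.

    object_paths: {object_id: [(x, y), ...]} sampled positions p(i)_t
    gaze_path:    [(x, y), ...] gaze samples p'_t over the same window
    Implements d_i = sum_t |p(i)_t - p'_t| and returns argmin_i d_i.
    """
    def path_distance(path):
        return sum(
            ((px - gx) ** 2 + (py - gy) ** 2) ** 0.5
            for (px, py), (gx, gy) in zip(path, gaze_path)
        )
    return min(object_paths, key=lambda oid: path_distance(object_paths[oid]))
```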
Nod and Roll (NR)
• Head-gaze Reticle (HR) and Real-time Reticle (RR)
• C-1 to C-3: head motion (nod or roll) while the eyes hold the target drives the interaction
https://www.youtube.com/watch?v=EpCGqxkmBKE
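A sketch of exploiting the vestibulo-ocular reflex: when the head rotates while the eyes stay locked on a target, the head-gaze and eye-gaze reticles diverge, and that divergence can drive input. The deadzone and normalization are illustrative assumptions:

```python
def nod_and_roll_input(head_pitch, head_roll, rr_on_target, deadzone=5.0):
    """Map VOR-driven head motion to input while gaze holds a target.

    head_pitch / head_roll: head rotation in degrees relative to the
    pose at selection start; rr_on_target: eye reticle still on target.
    Returns (pitch_axis, roll_axis) control values, e.g. for scrolling
    (nod) and rotating (roll) the selected object.
    """
    if not rr_on_target:
        return (0.0, 0.0)  # eyes left the target: no VOR input

    def axis(angle):
        # Ignore small tremor, then normalize to roughly [-1, 1].
        return 0.0 if abs(angle) < deadzone else angle / 45.0

    return (axis(head_pitch), axis(head_roll))
```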
Initial Study
• Independent variable: interaction technique
• Part 1: Duo-Reticles vs. Gaze-Dwell 1 (GD1)
• Part 2: Radial Pursuit vs. Gaze-Dwell 2 (GD2)
• Part 3: explorative (Nod and Roll)
Dependent Variables
• Parts 1 and 2: task completion time and number of errors (objective); usability ratings and semi-structured interview (subjective)
• Part 3: usability ratings and semi-structured interview only
Usability Ratings
• Ratings on a 7-point Likert scale; p-values compare the conditions within each part:
• I preferred this technique: Part 1 p = 0.02; Part 2 p = 0.12
• I felt tired using it: p = 0.09; p = 0.03; NR median 4
• It was frustrating to use: p = 0.14; p = 0.30; NR median 3
• It was fun to use: p = 0.14; p = 0.07; NR median 6
• I need to concentrate to use it: p = 0.33; p = 0.09; NR median 5
• It was easy for me to use: p = 0.07; p = 0.07; NR median 5
• I felt satisfied using it: p = 0.14; p = 0.03; NR median 5
• It felt natural to use: p = 0.17; p = 0.07; NR median 4
• I could interact precisely: p = 0.23; p = 0.17; NR median 4
• (Per-statement medians for GD1/DR and GD2/RP were shown as charts on the slide)
• No performance difference (as expected)
• Most participants preferred Duo-Reticles over Gaze-Dwell 1
• Radial Pursuit was more satisfying and less fatiguing than Gaze-Dwell 2
Gaze Summary
• Three novel eye-gaze-based interaction techniques inspired by natural eye movements
• An initial study found positive results: the techniques performed similarly to Gaze-Dwell but gave a superior user experience
• We continue to apply the same principles to improve the user experience of eye-gaze input for immersive VR
PinPointing
• Combining pointing + refinement
• Pointing: Head, eye gaze
• Refinement: None, gesture, device (clicker), head
Kytö, M., Ens, B., Piumsomboon, T., Lee, G. A., & Billinghurst, M. (2018). Pinpointing: Precise Head- and Eye-Based Target Selection for Augmented Reality. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (p. 81). ACM.
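A sketch of the two-phase pattern: a coarse ray from head or eye gaze proposes a selection point, then an optional refinement modality (gesture, device, or head) nudges it before confirmation. The class names, gain, and raycast interface are illustrative assumptions:

```python
class PinPointing:
    """Coarse gaze pointing plus optional fine refinement offset."""
    REFINE_GAIN = 0.2  # illustrative: refinement moves at 1/5 speed

    def __init__(self, raycast):
        self.raycast = raycast  # hypothetical function: ray -> 3D hit point
        self.point = None

    def coarse(self, gaze_ray):
        """Phase 1: a head- or eye-gaze ray places the initial point."""
        self.point = list(self.raycast(gaze_ray))
        return self.point

    def refine(self, delta):
        """Phase 2: gesture / device / head offset fine-tunes the point."""
        self.point = [p + self.REFINE_GAIN * d
                      for p, d in zip(self.point, delta)]
        return self.point
```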
Multimodal Input Modes
https://www.youtube.com/watch?v=q9cbAfxKAPI
Behaviour
• Two-phase movement: coarse pointing toward the target, then fine refinement
Results – time
Results – accuracy
Empathic Computing
• Natural Collaboration
• Implicit Understanding
• Experience Capture
Empathy
“Seeing with the Eyes of another,
Listening with the Ears of another,
and Feeling with the Heart of another.”
Alfred Adler
Lab Research Focus
Can we develop systems that allow us to share what we are seeing, hearing and feeling with others?
Piumsomboon, T., Lee, Y., Lee, G. A., Dey, A., & Billinghurst, M. (2017). Empathic
Mixed Reality: Sharing What You Feel and Interacting with What You See. In Ubiquitous
Virtual Reality (ISUVR), 2017 International Symposium on (pp. 38-41). IEEE.
Using AR/VR/Wearables for Empathy
• Remove technology barriers
• Enhance communication
• Change perspective
• Share experiences
• Enhance interaction in real world
Example Projects
• Remote collaboration in Wearable AR
• Sharing of non-verbal cues (gaze, pointing, face expression)
• Shared Empathic VR experiences
• Use VR to put a viewer inside the player's view
• Measuring emotion
• Detecting emotion from heart rate, GSR, eye gaze, etc.
Empathy Glasses
• Combine eye-tracking, display, and face expression sensing
• Implicit cues – eye gaze, face expression
Pupil Labs + Epson BT-200 + AffectiveWear
Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of
the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
Remote Collaboration
• Eye gaze pointer and remote pointing
• Face expression display
• Implicit cues for remote collaboration
Demo Video
https://www.youtube.com/watch?v=CdgWVDbMwp4&t=6s
Task Performance
• Performance Time (seconds)
Ranking Results
"I ranked the (A) condition best, because I could easily point to
communicate, and when I needed it I could check the facial
expression to make sure I was being understood.”
Q2: Communication; Q3: Understanding partner
(HMD – local user; computer – remote user)
Shared Sphere – 360 Video Sharing
• A host user shares live 360 video with a remote guest user
Lee, G. A., Teo, T., Kim, S., & Billinghurst, M. (2017). Mixed reality collaboration
through sharing a live panorama. In SIGGRAPH Asia 2017 Mobile Graphics &
Interactive Applications (p. 14). ACM.
https://www.youtube.com/watch?v=q_giuLot76k
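One way such sharing can work (an illustrative sketch, not the paper's implementation): the host streams equirectangular 360 frames, and each guest samples their own viewport from the frame using their own head orientation, so views stay independent:

```python
import math

def equirect_uv(yaw, pitch):
    """Map a view direction to normalized (u, v) coordinates in an
    equirectangular 360 frame, so each guest can crop an independent
    viewport from the shared panorama.

    yaw in [-pi, pi], pitch in [-pi/2, pi/2] (radians).
    """
    u = (yaw + math.pi) / (2 * math.pi)   # longitude -> [0, 1]
    v = (math.pi / 2 - pitch) / math.pi   # latitude  -> [0, 1]
    return u, v

# Example: a guest looking 90 degrees right and slightly up samples
# the shared frame around (u, v) = equirect_uv(math.pi / 2, 0.2)
```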
Sharing VR Experiences
• HTC Vive HMD
• Empathic glove
• Empatica E4
VR Environments
• Butterfly World: calm scene, collect butterflies
• Zombie Attack: scary scene, fighting zombies
CoVAR - AR/VR Collaboration
• HTC Vive (VR User)
• HoloLens (AR User)
• Room scale tracking
• Gesture input (Leap Motion)
Demo: Multi-scale Collaboration
https://www.youtube.com/watch?v=K_afCWZtExk
AR and VR for Empathic Computing
• VR systems are ideal for trying experiences:
• Strong storytelling medium
• Provide total immersion/3D experience
• Easy to change virtual body scale and representation
• AR systems are ideal for live sharing:
• Allow overlay on the real-world view/can share viewpoints
• Support remote annotation/communication
• Enhance real-world tasks
MR Technology Trends
• Advanced displays
• Real time space capture
• Natural gesture interaction
• Robust eye-tracking
• Emotion sensing/sharing
Empathic Tele-Existence
• Move from Observer to Participant
• Explicit to Implicit communication
• Experiential collaboration – doing together
www.empathiccomputing.org
@marknb00
mark.billinghurst@auckland.ac.nz