4. The ultimate display would, of course, be a room within
which the computer can control the existence of matter. A
chair displayed in such a room would be good enough to sit
in. Handcuffs displayed in such a room would be confining,
and a bullet displayed in such a room would be fatal.
Sutherland, I. E. (1965). The Ultimate Display. In Proceedings of IFIP Congress (pp. 506-508).
21. Today vs. Tomorrow
              VR in 2021                         VR in 2045
Graphics      High quality                       Photo-realistic
Display       110-150 degrees                    Total immersion
Interaction   Handheld controller/some gesture   Full gesture/body/gaze
Navigation    Limited movement                   Natural movement
Multiuser     Few users                          Millions of users
24. “... the technologies that will significantly affect our lives
over the next 10 years have been around for a decade. The future
is with us ... The trick is learning how to spot it.”
25. Key Technologies for MR Systems
• Stimulating the visual, auditory, and touch senses
• Changing the viewpoint, with registered content
• Supporting user input
37. Natural Gesture
• Freehand gesture input
• Depth sensors for gesture capture
• Move beyond simple pointing
• Rich two handed gestures
• E.g. Microsoft Research Hand Tracker
• 3D hand tracking, 30 fps, single sensor
• Commercial Systems
• HoloLens 2, Oculus, Intel, Magic Leap, etc.
Sharp, T., Keskin, C., Robertson, D., Taylor, J., Shotton, J., Kim, D., Rhemann, C., Leichter, I., ... & Izadi, S.
(2015, April). Accurate, Robust, and Flexible Real-time Hand Tracking. In Proceedings of CHI 2015. ACM.
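As a concrete illustration of freehand gesture input, here is a minimal sketch of classifying a pinch gesture from per-frame 3D hand-joint positions, the kind of output a depth-sensor hand tracker like the one cited above might provide. The joint names and the 2 cm threshold are illustrative assumptions, not any vendor's API.

```python
import math

PINCH_THRESHOLD_M = 0.02  # thumb-index distance below 2 cm counts as a pinch

def distance(a, b):
    """Euclidean distance between two (x, y, z) points in metres."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def is_pinch(joints):
    """joints: dict mapping joint name -> (x, y, z) position in metres."""
    return distance(joints["thumb_tip"], joints["index_tip"]) < PINCH_THRESHOLD_M

# Example frame: thumb and index tips 1 cm apart -> pinch detected.
frame = {"thumb_tip": (0.00, 0.00, 0.30), "index_tip": (0.01, 0.00, 0.30)}
print(is_pinch(frame))  # True
```

Richer two-handed gestures would build on the same idea: geometric or learned predicates over the tracked joint set.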
39. Multi-Scale Gesture
• Combine different gesture types
• In-air gestures – natural but imprecise
• Micro-gestures – fine-scale gestures
• Gross motion + fine tuning interaction
Ens, B., Quigley, A., Yeo, H. S., Irani, P., Piumsomboon, T., & Billinghurst, M. (2018). Counterpoint:
Exploring Mixed-Scale Gesture Interaction for AR Applications. In Extended Abstracts of the 2018 CHI
Conference on Human Factors in Computing Systems (p. LBW120). ACM.
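An illustrative sketch of the "gross motion + fine tuning" combination in the spirit of the Counterpoint paper above: a coarse in-air hand position places a cursor roughly, then small micro-gesture deltas refine it at a damped scale. The scale factors are assumptions for illustration only.

```python
COARSE_SCALE = 1.0   # in-air motion maps 1:1 to cursor space
FINE_SCALE = 0.1     # micro-gestures are damped 10x for precision

def cursor_position(coarse_pos, micro_delta):
    """Combine a gross in-air position with a fine micro-gesture offset."""
    return tuple(c * COARSE_SCALE + d * FINE_SCALE
                 for c, d in zip(coarse_pos, micro_delta))

# Coarse placement at (0.50, 0.20), then a micro-gesture nudge of
# (0.04, -0.02) moves the cursor only 4 mm / 2 mm.
print(cursor_position((0.50, 0.20), (0.04, -0.02)))
```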
50. Multimodal Input
• Combine gesture and speech input
• Gesture good for qualitative input
• Speech good for quantitative input
• Support combined commands
• “Put that there” + pointing
• E.g. HIT Lab NZ multimodal input
• 3D hand tracking, speech
• Multimodal fusion module
• Tasks completed faster with MMI, with fewer errors
Billinghurst, M., Piumsomboon, T., & Bai, H. (2014). Hands in Space: Gesture Interaction with
Augmented-Reality Interfaces. IEEE computer graphics and applications, (1), 77-80.
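A hedged sketch of a multimodal fusion step for "put that there" style commands: deictic words in the speech stream are resolved against time-stamped pointing events. The timestamps, fusion window, and object names are illustrative assumptions, not the HIT Lab NZ implementation.

```python
FUSION_WINDOW_S = 0.5  # a pointing event within 0.5 s resolves a deictic word

def fuse(speech_tokens, pointing_events):
    """speech_tokens: list of (time_s, word); pointing_events: list of
    (time_s, target). Returns the command with deictic words resolved."""
    resolved = []
    for t, word in speech_tokens:
        if word in ("that", "there"):
            # Pick the pointing event closest in time, if close enough.
            nearest = min(pointing_events, key=lambda p: abs(p[0] - t))
            if abs(nearest[0] - t) <= FUSION_WINDOW_S:
                word = nearest[1]
        resolved.append(word)
    return " ".join(resolved)

speech = [(0.0, "put"), (0.4, "that"), (0.9, "there")]
pointing = [(0.5, "red_cube"), (1.0, "table")]
print(fuse(speech, pointing))  # "put red_cube table"
```

Real fusion modules also handle ambiguity and confidence scores; temporal alignment is the core idea shown here.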
51. Intelligent Interfaces
• Move from explicit to implicit input
• Recognize user behaviour
• Provide adaptive feedback
• Move beyond check-lists of actions
• E.g. AR + Intelligent Tutoring
• Constraint based ITS + AR
• PC Assembly (Westerfield, 2015)
• 30% faster, 25% better retention
Westerfield, G., Mitrovic, A., & Billinghurst, M. (2015). Intelligent Augmented Reality Training for
Motherboard Assembly. International Journal of Artificial Intelligence in Education, 25(1), 157-172.
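A minimal sketch of the constraint-based tutoring check used in systems like the AR motherboard-assembly ITS above: each constraint pairs a relevance condition with a satisfaction condition, and feedback is given only for constraints that are relevant but violated. The component names and feedback strings are illustrative assumptions.

```python
constraints = [
    {
        "relevance": lambda s: "cpu" in s["placed"],
        "satisfied": lambda s: s.get("cpu_socket_locked", False),
        "feedback": "Lock the CPU socket lever after seating the CPU.",
    },
    {
        "relevance": lambda s: "ram" in s["placed"],
        "satisfied": lambda s: s.get("ram_clips_closed", False),
        "feedback": "Close the RAM retention clips.",
    },
]

def check(state):
    """Return feedback for every relevant constraint the state violates."""
    return [c["feedback"] for c in constraints
            if c["relevance"](state) and not c["satisfied"](state)]

state = {"placed": ["cpu"], "cpu_socket_locked": False}
print(check(state))  # ["Lock the CPU socket lever after seating the CPU."]
```

This is what distinguishes a constraint-based ITS from a fixed checklist: feedback adapts to whatever state the learner is actually in.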
54. Evolution of Tracking
• Location based, marker based
• Image based, hybrid tracking
• Model based
55. Model Based Tracking
• Track from known 3D model
• Use depth + colour information
• Match input to model template
• Use CAD model of targets
• Recent innovations
• Learn models online
• Tracking from cluttered scene
• Track from deformable objects
Hinterstoisser, S., Lepetit, V., Ilic, S., Holzer, S., Bradski, G., Konolige, K., & Navab, N. (2013). Model based training, detection
and pose estimation of texture-less 3D objects in heavily cluttered scenes. In Computer Vision–ACCV 2012 (pp. 548-562).
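A toy sketch of the template-matching idea behind model-based tracking: the observed depth input is compared against depth templates rendered from a known 3D model at candidate poses, and the best-scoring pose wins. Real systems such as the cited ACCV work use far richer templates and search strategies; the tiny "images" here are purely illustrative.

```python
def match_score(observed, template):
    """Sum of absolute depth differences; lower means a better match."""
    return sum(abs(o - t) for o, t in zip(observed, template))

def best_pose(observed, templates):
    """templates: dict mapping pose label -> rendered depth template."""
    return min(templates, key=lambda pose: match_score(observed, templates[pose]))

templates = {
    "front": [0.50, 0.50, 0.52, 0.55],  # depth template rendered at 'front' pose
    "side":  [0.50, 0.60, 0.70, 0.80],  # depth template rendered at 'side' pose
}
observed = [0.51, 0.50, 0.53, 0.55]
print(best_pose(observed, templates))  # "front"
```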
57. Environmental Tracking (3+ yrs)
• Environment capture
• Use depth sensors to capture scene & track from model
• InfiniTAM (www.robots.ox.ac.uk/~victor/infinitam/)
• Real time scene capture, dense or sparse capture, open source
• iPad Pro LiDAR
• Scene scanning up to 5m
61. Wide Area Outdoor Tracking
• Combine panoramas into point cloud model (offline)
• Initialize camera tracking from point cloud
• Update pose by aligning camera image to point cloud
• Accurate to 25 cm and 0.5 degrees over a very wide area
Ventura, J., & Hollerer, T. (2012). Wide-area scene mapping for mobile visual tracking. In Mixed
and Augmented Reality (ISMAR), 2012 IEEE International Symposium on (pp. 3-12). IEEE.
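A simplified sketch of the pose-update step above: camera-image feature points are aligned to a prebuilt point-cloud map by testing candidate camera offsets and keeping the one that best explains the observations. Real systems solve this with feature matching and PnP or bundle adjustment; the 2D grid search here is purely illustrative.

```python
def alignment_error(features, map_points, offset):
    """Mean distance from each offset feature to its nearest map point."""
    total = 0.0
    for fx, fy in features:
        px, py = fx + offset[0], fy + offset[1]
        total += min(((px - mx) ** 2 + (py - my) ** 2) ** 0.5
                     for mx, my in map_points)
    return total / len(features)

def update_pose(features, map_points, candidates):
    """Pick the candidate camera offset with the lowest alignment error."""
    return min(candidates, key=lambda off: alignment_error(features, map_points, off))

map_points = [(1.0, 1.0), (2.0, 1.0), (3.0, 1.0)]   # offline point-cloud model
features = [(0.9, 0.9), (1.9, 0.9), (2.9, 0.9)]     # camera sees the map shifted
candidates = [(0.0, 0.0), (0.1, 0.1), (-0.1, -0.1)]
print(update_pose(features, map_points, candidates))  # (0.1, 0.1)
```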
63. AR Cloud Based Tracking
• AR Cloud
• a machine-readable 1:1 scale model of the real world
• processing recognition/tracking data in the cloud
• Can create cloud from input from multiple devices
• Store key visual features in cloud, Stitch features from multiple devices
• Retrieve for tracking/interaction
• AR Cloud Companies
• 6D.ai, Vertical.ai, Ubiquity6, etc.
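A hedged sketch of the AR Cloud workflow above: devices upload key visual features tagged with a coarse location, the cloud stitches features from multiple devices into one shared store, and clients retrieve nearby features for tracking. The grid-cell size and feature format are illustrative assumptions, not any vendor's design.

```python
CELL_SIZE_M = 10.0  # features are bucketed into 10 m grid cells

def cell_for(position):
    """Map a 2D position to its coarse grid cell."""
    return (int(position[0] // CELL_SIZE_M), int(position[1] // CELL_SIZE_M))

class ARCloud:
    def __init__(self):
        self.cells = {}  # grid cell -> list of (device_id, feature)

    def upload(self, device_id, position, features):
        """Stitch one device's features into the shared map."""
        self.cells.setdefault(cell_for(position), []).extend(
            (device_id, f) for f in features)

    def retrieve(self, position):
        """Fetch all features near a position, whoever contributed them."""
        return [f for _, f in self.cells.get(cell_for(position), [])]

cloud = ARCloud()
cloud.upload("phone_a", (3.0, 4.0), ["corner_17", "edge_02"])
cloud.upload("phone_b", (6.0, 2.0), ["corner_17", "sign_09"])  # same cell
print(cloud.retrieve((5.0, 5.0)))
```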
66. AR/VR as Perceptual Phenomena
• Virtual Reality
• Do I perceive myself as being in the Virtual Environment?
• Sense of Presence
• Augmented Reality
• Is that virtual object part of my real world?
• Sense of Object Presence
• What perceptual cues create a sense of Presence/Object Presence?
• How can Presence/Object Presence be measured?
67. Measuring Presence
• Presence is very subjective, so how can it be measured?
• Subjective Measures
• Self report questionnaire
• University College London Questionnaire (Slater 1999)
• Witmer and Singer Presence Questionnaire (Witmer 1998)
• ITC Sense Of Presence Inventory (Lessiter 2000)
• Continuous measure
• Person moves slider bar in VE depending on Presence felt
• Objective Measures
• reflex/flinch measure, startle response
• Physiological measures
• change in heart rate, skin conductance, skin temperature
68. Using Neuro-Physiological Presence Measures
• Put people in High Presence/Low Presence VR
• Measure physiological cues
• EEG, ECG, GSR
• Measure subjective cues
• Presence surveys
• SUS, Witmer-Singer
• Correlate subjective and physiological results
Dey, A., Phoon, J., Saha, S., Dobbins, C., & Billinghurst, M. (2020, November). A Neurophysiological Approach for
Measuring Presence in Immersive Virtual Environments. In 2020 IEEE International Symposium on Mixed and Augmented
Reality (ISMAR) (pp. 474-485). IEEE.
• Significant difference in subjective presence scores between HP/LP VE
• Significant difference in EEG power and heart rate between HP/LP VE
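The last step above, correlating subjective and physiological results, can be sketched as a Pearson correlation between per-participant presence scores and a physiological signal. The numbers below are made up for demonstration; the cited study reports its own data and statistics.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

presence_scores = [2.0, 3.0, 4.0, 5.0, 6.0]       # survey score per participant
heart_rate_bpm  = [62.0, 65.0, 64.0, 70.0, 72.0]  # mean HR in the same condition

print(round(pearson_r(presence_scores, heart_rate_bpm), 2))  # 0.94
```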
70. Perception Based Graphics
• Eye Physiology
• Cones in the eye centre (fovea) = colour vision; rods in the periphery = motion, B+W
• Foveated Rendering
• Use eye tracking to render at highest resolution where the user is looking
• Reduces the required graphics throughput
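A minimal sketch of the foveated-rendering idea: render resolution falls off with angular distance (eccentricity) from the tracked gaze point. The band boundaries and rate factors below are illustrative assumptions, not any headset's actual settings.

```python
def shading_rate(eccentricity_deg):
    """Fraction of full resolution to render at a given eccentricity."""
    if eccentricity_deg < 5.0:     # foveal region: full detail
        return 1.0
    elif eccentricity_deg < 20.0:  # parafoveal band: quarter detail
        return 0.25
    else:                          # periphery: coarse detail only
        return 0.0625

print([shading_rate(e) for e in (2.0, 10.0, 40.0)])  # [1.0, 0.25, 0.0625]
```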
72. Making AR Content Appear Real
• Audio cues
• Touch/Haptic feedback
73. Key Perceptual Issues in AR
• Classification of Perceptual Issues
• Environment, Capturing, Augmentation
• Display device, User
Kruijff, E., Swan, J. E., & Feiner, S. (2010, October). Perceptual issues in augmented reality revisited.
In 2010 IEEE International Symposium on Mixed and Augmented Reality (pp. 3-12). IEEE.
76. Social Acceptance
• People don’t want to look silly
• Only 12% of 4,600 adults would be willing to wear AR glasses
• 20% of mobile AR browser users experience social issues
• Acceptance is driven more by social than technical issues
• Needs further study (ethnographic, field tests, longitudinal)
80. Ethical Issues
• Persuasive Technology
• Affecting emotions
• Behaviour modification
• Privacy Concerns
• Facial recognition
• Space capture
• Personal data
• Safety Concerns
• Sim sickness, Distraction
• Long term effects
Pase, S. (2012). Ethical considerations in augmented reality applications. In Proceedings of the International Conference on
e-Learning, e-Business, Enterprise Information Systems, and e-Government (EEE) (p. 1). The Steering Committee of The
World Congress in Computer Science, Computer Engineering and Applied Computing (WorldComp).
85. Shared Sphere – 360 Video Sharing
[Figure: a Host user shares live 360 video with a remote Guest user]
Lee, G. A., Teo, T., Kim, S., & Billinghurst, M. (2017). Mixed reality collaboration through sharing a
live panorama. In SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications (pp. 1-4).
95. Sharing: Separating Cues from Body
• What happens when you can’t see your colleague/agent?
Piumsomboon, T., Lee, G. A., Hart, J. D., Ens, B., Lindeman, R. W., Thomas, B. H., & Billinghurst, M. (2018, April). Mini-me: An adaptive
avatar for mixed reality remote collaboration. In Proceedings of the 2018 CHI conference on human factors in computing systems (pp. 1-13).
[Figure: collaborating normally vs. collaborator out of view]
96. Mini-Me Communication Cues in MR
• When the user loses sight of their collaborator, a Mini-Me avatar appears
• Miniature avatar in the real world
• Mini-Me points to shared objects and shows communication cues
• Redirected gaze, gestures
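The Mini-Me trigger described above can be sketched as a visibility test: if the remote collaborator falls outside the local user's field of view, a miniature stand-in avatar is shown instead. The 2D geometry and the 90-degree field of view are simplifying assumptions for illustration.

```python
import math

FOV_DEG = 90.0  # assumed horizontal field of view

def in_view(viewer_pos, viewer_heading_deg, target_pos):
    """True if target lies within the viewer's field of view (2D)."""
    dx = target_pos[0] - viewer_pos[0]
    dy = target_pos[1] - viewer_pos[1]
    bearing = math.degrees(math.atan2(dy, dx))
    # Wrap the bearing difference into [-180, 180) before comparing.
    diff = (bearing - viewer_heading_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= FOV_DEG / 2.0

def show_mini_me(viewer_pos, viewer_heading_deg, collaborator_pos):
    return not in_view(viewer_pos, viewer_heading_deg, collaborator_pos)

# Collaborator straight ahead: full avatar; collaborator behind: Mini-Me.
print(show_mini_me((0, 0), 0.0, (5, 0)))   # False
print(show_mini_me((0, 0), 0.0, (-5, 0)))  # True
```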
102. Research Issues
• How to easily create?
• How realistic should they be?
• How can you communicate social cues?
• Hybrid Interfaces
• How can you provide equity across different devices?
• Social Presence
• How can you objectively measure Social Presence?
• How to use AR/VR cues to increase Social Presence?
103. Collaboration Technology Trends
1. Improved Content Capture
• Move from sharing faces to sharing places
2. Increased Network Bandwidth
• Sharing natural communication cues
3. Implicit Understanding
• Recognizing behaviour and emotion
106. Empathic Computing
Can we develop systems
that allow us to share what
we are seeing, hearing and
feeling with others?
Piumsomboon, T., Lee, Y., Lee, G. A., Dey, A., & Billinghurst, M. (2017). Empathic Mixed
Reality: Sharing What You Feel and Interacting with What You See. In Ubiquitous Virtual
Reality (ISUVR), 2017 International Symposium on (pp. 38-41). IEEE.
107. Empathy Glasses (CHI 2016)
• Combines eye tracking, a see-through display, and face-expression sensing
• Implicit cues – eye gaze, facial expression
[Components: Pupil Labs eye tracker, Epson BT-200 display, AffectiveWear face-expression sensor]
Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of
the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
122. The Metaverse
• Neal Stephenson’s “Snow Crash”
• VR successor to the internet
• The Metaverse is the convergence of:
• 1) virtually enhanced physical reality
• 2) physically persistent virtual space
• Metaverse Roadmap
• AR/VR/MR is becoming commonly available
• Significant advances over 50+ years
• To achieve Sutherland’s vision, research is needed in
• Display, Tracking, Input
• New MR technologies will enable this to happen
• Display devices, Interaction, Tracking technologies
• There are still significant areas for research
• Social Acceptance, Perception, Collaboration, etc.