AR INTERACTION
COMP 4010 Lecture Four
Mark Billinghurst
August 18th 2022
mark.billinghurst@unisa.edu.au
REVIEW
AR Requires Tracking and Registration
• Registration
• Positioning virtual object wrt real world
• Fixing virtual object on real object when view is fixed
• Calibration
• Offline measurements
• Measure camera relative to head mounted display
• Tracking
• Continually locating the user’s viewpoint when the view is moving
• Position (x,y,z), Orientation (r,p,y)
Sources of Registration Errors
•Static errors
• Optical distortions (in HMD)
• Mechanical misalignments
• Tracker errors
• Incorrect viewing parameters
•Dynamic errors
• System delays (largest source of error)
• 1 ms delay = 1/3 mm registration error
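As a rough worked check of that rule of thumb (assuming lateral head motion of roughly 1/3 m/s, a plausible value):

```latex
\text{error} \approx v_{\text{head}} \cdot \Delta t
             = 0.33\,\mathrm{m/s} \times 0.001\,\mathrm{s}
             \approx 0.33\,\mathrm{mm}
```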
Reducing Static Errors
•Distortion compensation
• For lens or display distortions
•Manual adjustments
• Have user manually align AR and VR content
•View-based or direct measurements
• Have user measure eye position
•Camera calibration (video AR)
• Measuring camera properties
Reducing dynamic errors (1)
•Reduce system lag
•Faster components/system modules
•Reduce apparent lag
•Image deflection
•Image warping
TRACKING
Frames of Reference
• World-stabilized
• E.g., billboard or signpost
• Body-stabilized
• E.g., virtual tool-belt
• Screen-stabilized
• Heads-up display
Tracking Requirements
• Augmented Reality Information Display
• World Stabilized
• Body Stabilized
• Head Stabilized
Increasing tracking requirements: head stabilized → body stabilized → world stabilized.
Tracking Technologies
§ Active
• Mechanical, Magnetic, Ultrasonic
• GPS, WiFi, cell location
§ Passive
• Inertial sensors (compass, accelerometer, gyro)
• Computer Vision
• Marker based, Natural feature tracking
§ Hybrid Tracking
• Combined sensors (e.g., Vision + Inertial)
Tracking Types
• Mechanical Tracker
• Magnetic Tracker
• Ultrasonic Tracker
• Inertial Tracker
• Optical Tracker
  • Marker-Based Tracking
  • Markerless Tracking, specialized into:
    • Edge-Based Tracking
    • Template-Based Tracking
    • Interest Point Tracking
Why Optical Tracking for AR?
• Many AR devices have cameras
• Mobile phone/tablet, Video see-through display
• Provides precise alignment between video and AR overlay
• Using features in video to generate pixel-perfect alignment
• Real world has many visual features that can be tracked
• Computer Vision is a well-established discipline
• Over 40 years of research to draw on
• Older non-real-time algorithms can run in real time on today’s devices
Common AR Optical Tracking Types
• Marker Tracking
• Tracking known artificial markers/images
• e.g. ARToolKit square markers
• Markerless Tracking
• Tracking from known features in real world
• e.g. Vuforia image tracking
• Unprepared Tracking
• Tracking in unknown environment
• e.g. SLAM tracking
Marker Based Tracking: ARToolKit
http://www.artoolkit.org
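ARToolKit itself is a C library; as a hedged illustration of square-marker pose tracking, the sketch below uses OpenCV's ArUco markers and solvePnP instead (the camera intrinsics are placeholder values standing in for an offline calibration):

```python
# Minimal square-marker pose tracking sketch using OpenCV's ArUco module
# as a stand-in for ARToolKit-style markers (not ARToolKit itself).
import cv2
import numpy as np

MARKER_SIZE = 0.05  # marker side length in metres (assumed)
camera_matrix = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])  # placeholder
dist_coeffs = np.zeros(5)

# Marker corner positions in the marker's own coordinate frame
half = MARKER_SIZE / 2
object_points = np.array([[-half, half, 0], [half, half, 0],
                          [half, -half, 0], [-half, -half, 0]], np.float32)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary)  # OpenCV >= 4.7 API

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is not None:
        for marker_corners in corners:
            # 6DOF pose of the marker relative to the camera: the transform
            # used to register virtual content on the marker
            ok, rvec, tvec = cv2.solvePnP(object_points, marker_corners[0],
                                          camera_matrix, dist_coeffs)
```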
Natural Feature Tracking
• Use Natural Cues of Real Elements
• Edges
• Surface Texture
• Interest Points
• Model or Model-Free
• No visual pollution
(Examples of natural features: contours, feature points, surfaces.)
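As an illustration of interest-point tracking, here is a minimal sketch with OpenCV's ORB features (an open stand-in; commercial trackers such as Vuforia use their own proprietary features, and "target.jpg" is a hypothetical target image):

```python
# Minimal natural-feature matching against a stored keypoint database.
import cv2

target = cv2.imread("target.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical target
orb = cv2.ORB_create(nfeatures=500)
kp_target, des_target = orb.detectAndCompute(target, None)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def match_frame(frame_gray):
    """Match the frame's keypoints against the target database."""
    kp, des = orb.detectAndCompute(frame_gray, None)
    if des is None:
        return []
    matches = matcher.match(des_target, des)
    # Enough good matches allow a homography/pose estimate for the AR overlay
    return sorted(matches, key=lambda m: m.distance)[:50]
```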
Detection and Tracking
• Detection (from Start, and whenever the tracking target is not detected or lost)
  • Recognize target type
  • Detect target
  • Initialize camera pose
• Incremental tracking (entered once the tracking target is detected, continues while incremental tracking is ok)
  • Fast
  • Robust to blur, lighting changes
  • Robust to tilt
Tracking and detection are complementary approaches.
After successful detection, the target is tracked incrementally.
If the target is lost, detection is activated again.
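A minimal sketch of this detect-then-track loop follows; detect_target and track_incrementally are hypothetical placeholders for a real detector and incremental tracker.

```python
# Sketch of the detection/tracking state machine described above.
def detect_target(frame):
    """Slow but robust: recognize the target and initialize the camera pose."""
    return None  # placeholder: return a pose once the target is found

def track_incrementally(frame, prev_pose):
    """Fast frame-to-frame update: returns None when the target is lost."""
    return prev_pose  # placeholder

def run(frames):
    pose = None
    for frame in frames:
        if pose is None:
            pose = detect_target(frame)              # detection state
        else:
            pose = track_incrementally(frame, pose)  # incremental tracking
        yield pose  # None while the target is not detected or lost
```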
Marker vs. Natural Feature Tracking
• Marker tracking
• Usually requires no database to be stored
• Markers can be an eye-catcher
• Tracking is less demanding
• The environment must be instrumented
• Markers usually work only when fully in view
• Natural feature tracking
• A database of keypoints must be stored/downloaded
• Natural feature targets might catch the attention less
• Natural feature targets are potentially everywhere
• Natural feature targets work also if partially in view
Model Based Tracking
• Tracking from 3D object shape
• Example: OpenTL - www.opentl.org
• General purpose library for model based visual tracking
Tracking from an Unknown Environment
• What to do when you don’t know any features?
• Very important problem in mobile robotics - Where am I?
• SLAM
• Simultaneously Localize And Map the environment
• Goal: to recover both camera pose and map structure
while initially knowing neither.
• Mapping:
• Building a map of the environment which the robot is in
• Localisation:
• Navigating this environment using the map while keeping
track of the robot’s relative position and orientation
Parallel Tracking and Mapping
• Tracking thread (runs for every frame)
  • Estimate camera pose
  • Sends new keyframes to the mapping thread
• Mapping thread (slow update rate)
  • Extend map
  • Improve map
  • Sends map updates back to the tracking thread
Parallel tracking and mapping uses two concurrent threads, one for tracking and one for mapping, which run at different speeds.
Parallel Tracking and Mapping
• The video stream feeds new frames to the tracking thread (FAST), which outputs the tracked local pose.
• The mapping thread (SLOW) receives new keyframes and returns map updates.
Demo: Simultaneous localization and mapping (SLAM) in small workspaces (Klein/Drummond, U. Cambridge).
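To make the split concrete, here is a toy two-thread sketch in the spirit of PTAM (purely illustrative; poses and map points are placeholders, not a real tracker):

```python
# Toy sketch of PTAM's thread split: a fast tracking loop that promotes
# keyframes, and a slow mapping thread that refines a shared map.
import queue
import threading
import time

keyframes = queue.Queue()     # tracking -> mapping: new keyframes
shared_map = {"points": []}   # mapping -> tracking: map updates (lock in practice)

def mapping_thread():
    while True:
        kf = keyframes.get()
        if kf is None:                    # shutdown sentinel
            return
        shared_map["points"].append(kf)   # extend / improve the map
        time.sleep(0.2)                   # slow: e.g. bundle adjustment

def tracking_thread():
    for frame_id in range(100):           # stand-in for the video stream
        pose = frame_id                   # placeholder pose estimate
        if frame_id % 10 == 0:
            keyframes.put(frame_id)       # promote occasional keyframes
        time.sleep(0.01)                  # fast: must keep up with the camera

mapper = threading.Thread(target=mapping_thread)
mapper.start()
tracking_thread()
keyframes.put(None)
mapper.join()
```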
Visual SLAM
• Early SLAM systems (1986 - )
• Computer vision and sensors (e.g. IMU, laser, etc.)
• One of the most important algorithms in Robotics
• Visual SLAM
• Using cameras only, such as stereo view
• MonoSLAM (single camera) developed in 2007 (Davison)
Combining Sensors andVision
• Sensors
• Produce noisy output (= jittering augmentations)
• Are not sufficiently accurate (= wrongly placed augmentations)
• Give us initial information on where we are in the world, and what we are looking at
• Vision
• Is more accurate (= stable and correct augmentations)
• Requires choosing the correct keypoint database to track from
• Requires registering our local coordinate frame (online-generated model) to the global one (world)
ARKit – Visual Inertial Odometry
• Uses both computer vision + inertial sensing
• Tracking position twice
• Computer Vision – feature tracking, 2D plane tracking
• Inertial sensing – using the phone IMU
• Output combined via Kalman filter
• Determine which output is most accurate
• Pass pose to ARKit SDK
• Each system complements the other
• Computer vision – needs visual features
• IMU - drifts over time, doesn’t need features
ARKit – Visual Inertial Odometry
• Slow camera
• Fast IMU
• If the camera drops out, the IMU takes over
• Camera corrects IMU errors
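ARKit's actual filter is proprietary; the sketch below is a simple one-dimensional complementary filter that only captures the idea (fast, drifting IMU prediction corrected by slow, drift-free camera measurements):

```python
# Illustrative 1-D vision/IMU fusion (not ARKit code): the IMU prediction is
# fast but drifts; the camera estimate is slow but drift-free.
def fuse(position, imu_velocity, camera_position, dt, alpha=0.98):
    predicted = position + imu_velocity * dt   # fast IMU integration
    if camera_position is None:                # camera dropped out:
        return predicted                       # IMU takes over
    # Camera corrects accumulated IMU drift
    return alpha * predicted + (1 - alpha) * camera_position
```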
Conclusions
• Tracking and Registration are key problems
• Registration error
• Measures against static error
• Measures against dynamic error
• AR typically requires multiple tracking technologies
• Computer vision most popular
• Research Areas:
• SLAM systems, Deformable models, Mobile outdoor tracking
3: AR INTERACTION
Augmented Reality technology
• Combines Real and Virtual Images
• Needs: Display technology
• Interactive in real-time
• Needs: Input and interaction technology
• Registered in 3D
• Needs: Viewpoint tracking technology
How Do You Design an Interface for This?
AR Interaction
• Designing AR Systems = Interface Design
• Using different input and output technologies
• Objective is a high quality of user experience
• Ease of use and learning
• Performance and satisfaction
Typical Interface Design Path
1/ Prototype Demonstration
2/ Adoption of Interaction Techniques from
other interface metaphors
3/ Development of new interface metaphors
appropriate to the medium
4/ Development of formal theoretical models
for predicting and modeling user actions
(Desktop WIMP, Virtual Reality, and Augmented Reality interfaces sit at different stages along this path.)
Interacting with AR Content
• You can see spatially registered AR... how can you interact with it?
Different Types of AR Interaction
• Browsing Interfaces
• simple (conceptually!), unobtrusive
• 3D AR Interfaces
• expressive, creative, require attention
• Tangible Interfaces
• Embedded into conventional environments
• Tangible AR
• Combines TUI input + AR display
AR Interfaces as Data Browsers
• 2D/3D virtual objects are
registered in 3D
• “VR in Real World”
• Interaction
• 2D/3D virtual viewpoint control
• Applications
• Visualization, training
AR Information Browsers
• Information is registered to real-world context
• Hand held AR displays
• Interaction
• Manipulation of a window
into information space
• Applications
• Context-aware information
displays
Rekimoto, et al. 1997
NaviCam Demo (1997)
Navicam Architecture
Current AR Information Browsers
• Mobile AR
• GPS + compass
• Many Applications
• Wikitude
• Yelp
• Google maps
• …
Example: Google Maps AR Mode
• AR Navigation Aid
• GPS + compass, 2D/3D object placement
Advantages and Disadvantages
• Important class of AR interfaces
• Wearable computers
• AR simulation, training
• Limited interactivity
• Modification of virtual
content is difficult
Rekimoto, et al. 1997
3D AR Interfaces
• Virtual objects displayed in 3D
physical space and manipulated
• HMDs and 6DOF head-tracking
• 6DOF hand trackers for input
• Interaction
• Viewpoint control
• Traditional 3D user interface
interaction: manipulation, selection,
etc.
Kiyokawa, et al. 2000
AR 3D Interaction (2000)
Example: AR Graffiti
www.nextwall.net
Advantages and Disadvantages
• Important class of AR interfaces
• Entertainment, design, training
• Advantages
• User can interact with 3D virtual
object everywhere in space
• Natural, familiar interaction
• Disadvantages
• Usually no tactile feedback
• User has to use different devices for
virtual and physical objects
Oshima, et al. 2000
3. Augmented Surfaces and Tangible Interfaces
• Basic principles
• Virtual images are projected
on a surface
• Physical objects are used as
controls for virtual objects
• Support for collaboration
Wellner, P. (1993). Interacting with paper on the
DigitalDesk. Communications of the ACM, 36(7), 87-96.
Augmented Surfaces
• Rekimoto, et al. 1999
• Front projection
• Marker-based tracking
• Multiple projection surfaces
• Object interaction
Rekimoto, J., & Saitoh, M. (1999, May). Augmented
surfaces: a spatially continuous work space for hybrid
computing environments. In Proceedings of the SIGCHI
conference on Human Factors in Computing
Systems (pp. 378-385).
Augmented Surfaces Demo (1999)
https://www.youtube.com/watch?v=r4g_fvnjVCA
Tangible User Interfaces (Ishii 97)
• Create digital shadows
for physical objects
• Foreground
• graspable UI
• Background
• ambient interfaces
Tangible Interfaces - Ambient
• Dangling String
• Jeremijenko 1995
• Ambient ethernet monitor
• Relies on peripheral cues
• Ambient Fixtures
• Dahley, Wisneski, Ishii 1998
• Use natural material qualities
for information display
Tangible Interface: ARgroove
• Collaborative Instrument
• Exploring Physically Based
Interaction
• Map physical actions to
Midi output
• Translation, rotation
• Tilt, shake
ARGroove Demo (2001)
ARgroove in Use
Visual Feedback
•Continuous Visual Feedback is Key
•Single Virtual Image Provides:
• Rotation
• Tilt
• Height
i/O Brush (Ryokai, Marti, Ishii) - 2004
Ryokai, K., Marti, S., & Ishii, H. (2004, April). I/O brush: drawing with everyday objects as ink.
In Proceedings of the SIGCHI conference on Human factors in computing systems (pp. 303-310).
i/O Brush Demo (2005)
https://www.youtube.com/watch?v=04v_v1gnyO8
Many Other Examples
• Triangles (Gorbet 1998)
• Triangle-based storytelling
• ActiveCube (Kitamura 2000-)
• Cubes with sensors
• Reactable (2007- )
• Cube based music interface
Lessons from Tangible Interfaces
• Physical objects make us smart
• Norman’s “Things that Make Us Smart”
• encode affordances, constraints
• Objects aid collaboration
• establish shared meaning
• Objects increase understanding
• serve as cognitive artifacts
But There are TUI Limitations
• Difficult to change object properties
• can’t tell state of digital data
• Limited display capabilities
• projection screen = 2D
• dependent on physical display surface
• Separation between object and display
• ARgroove – Interact on table, look at screen
Advantages and Disadvantages
•Advantages
• Natural - user’s hands are used for interacting
with both virtual and real objects.
• No need for special purpose input devices
•Disadvantages
• Interaction is limited to the 2D surface
• Full 3D interaction and manipulation is difficult
Orthogonal Nature of Interfaces
• Spatial Gap
  • 3D AR interfaces: No – interaction is everywhere
  • Tangible interfaces: Yes – interaction is only on 2D surfaces
• Interaction Gap
  • 3D AR interfaces: Yes – separate devices for physical and virtual objects
  • Tangible interfaces: No – same devices for physical and virtual objects
4. Tangible AR: Back to the Real World
• AR overcomes display limitation of TUIs
• enhance display possibilities
• merge task/display space
• provide public and private views
• TUI + AR = Tangible AR
• Apply TUI methods to AR interface design
Billinghurst, M., Kato, H., & Poupyrev, I. (2008). Tangible augmented reality. ACM Siggraph Asia, 7(2), 1-10.
Space vs. Time - Multiplexed
• Space-multiplexed
• Many devices each with one function
• Quicker to use, more intuitive, clutter
• Real Toolbox
• Time-multiplexed
• One device with many functions
• Space efficient
• mouse
Tangible AR: Tiles (Space Multiplexed)
• Tiles semantics
• data tiles
• operation tiles
• Operation on tiles
• proximity
• spatial arrangements
• space-multiplexed
Poupyrev, I., Tan, D. S., Billinghurst, M., Kato, H., Regenbrecht, H., & Tetsutani, N. (2001,
July). Tiles: A Mixed Reality Authoring Interface. In Interact (Vol. 1, pp. 334-341).
Space-multiplexed Interface
Data authoring in Tiles
Tiles Demo (2001)
Proximity-based Interaction
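A toy sketch of the proximity operation: an operation tile fires when it comes within a threshold distance of a data tile (the threshold value is an assumption, not Tiles' actual parameter):

```python
# Illustrative proximity check between two tracked tiles.
import numpy as np

PROXIMITY_THRESHOLD = 0.08  # metres (assumed)

def tiles_in_proximity(data_tile_pos, op_tile_pos):
    """Positions are 3D tile centres from the marker tracker."""
    d = np.linalg.norm(np.asarray(data_tile_pos) - np.asarray(op_tile_pos))
    return d < PROXIMITY_THRESHOLD
```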
Tangible AR: Time-multiplexed Interaction
• Use of natural physical object manipulations to control
virtual objects
• VOMAR Demo
• Catalog book:
• Turn over the page
• Paddle operation:
• Push, shake, incline, hit, scoop
Kato, H., Billinghurst, M., Poupyrev, I., Imamoto, K., & Tachibana, K. (2000, October). Virtual object manipulation on a table-top AR
environment. In Proceedings IEEE and ACM International Symposium on Augmented Reality (ISAR 2000) (pp. 111-119). IEEE.
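As an illustration of mapping paddle motion to commands, a toy classifier over the tracked paddle pose might look like the following (thresholds are assumptions, not VOMAR's actual values):

```python
# Toy paddle-gesture classifier from the tracked paddle pose.
def classify_paddle(tilt_deg, speed_m_s):
    """tilt_deg: paddle tilt from horizontal; speed_m_s: paddle speed."""
    if speed_m_s > 1.0:
        return "shake/hit"                        # fast motion
    if tilt_deg > 45.0:
        return "incline (tip object off paddle)"  # steep tilt
    return "hold/scoop"
```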
VOMAR Interface
VOMAR Demo (2001)
Advantages and Disadvantages
•Advantages
• Natural interaction with virtual and physical tools
• No need for special purpose input devices
• Spatial interaction with virtual objects
• 3D manipulation with virtual objects anywhere in space
•Disadvantages
• Requires Head Mounted Display
5. Natural AR Interfaces
• Goal:
• Interact with AR content the same
way we interact in the real world
• Using natural user input
• Body motion
• Gesture
• Gaze
• Speech
• Input recognition
• Natural gestures, gaze
• Multimodal input
FingARtips (2004)
Tinmith (2001)
External Fixed Cameras
• Overhead depth sensing camera
• Capture real time hand model
• Create point cloud model
• Overlay graphics on AR view
• Perform gesture interaction
Billinghurst, M., Piumsomboon, T., & Bai, H. (2014). Hands in space: Gesture interaction with
augmented-reality interfaces. IEEE computer graphics and applications, 34(1), 77-80.
Examples
https://www.youtube.com/watch?v=7FLyDEdQ_vk
PhobiAR (2013)
https://www.youtube.com/watch?v=635PVGicxng
Google Glass (2013)
Head Mounted Cameras
• Attach cameras/depth sensor to HMD
• Connect to high end PC
• Computer vision capture/processing on PC
• Perform tracking/gesture recognition on PC
• Use custom tracking hardware
• Leap Motion (Structured IR)
• Intel RealSense (Stereo depth)
Project NorthStar (2018)
Meta2 (2016)
Project NorthStar Hand Interaction
Self Contained Systems
• Sensors and processors on device
• Fully mobile
• Customized hardware/software
• Example: Hololens 2 (2019)
• 3D hand tracking
• 21 points/hand tracked
• Gesture driven interface
• Constrained set of gestures
• Multimodal input (gesture, gaze, speech)
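HoloLens 2's hand tracker is proprietary, but 21-point hand tracking can be illustrated with Google's MediaPipe Hands (a stand-in, not the HoloLens pipeline):

```python
# Illustrative 21-landmark hand tracking with MediaPipe (not HoloLens code).
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=2)
cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        for hand in results.multi_hand_landmarks:
            tip = hand.landmark[8]  # one of 21 landmarks: index fingertip
            print(f"index tip: ({tip.x:.2f}, {tip.y:.2f}, {tip.z:.2f})")
```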
Hololens 2 Gesture Input Demo (MRTK)
https://www.youtube.com/watch?v=qfONlUCSWdg
Speech Input
• Reliable speech recognition
• Windows speech, Watson, etc.
• Indirect input with AR content
• No need for gesture
• Match with gaze/head pointing
• Look to select target
• Good for Quantitative input
• Numbers, text, etc.
• Keyword trigger
• “select”, “hey cortana”, etc.
https://www.youtube.com/watch?v=eHMkOpNUtR8
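A minimal keyword-trigger sketch using the Python SpeechRecognition package (an illustrative stand-in; a HoloLens app would use the Windows speech APIs):

```python
# Illustrative keyword trigger for indirect speech input.
import speech_recognition as sr

recognizer = sr.Recognizer()
with sr.Microphone() as source:
    print("Listening...")
    audio = recognizer.listen(source)
try:
    text = recognizer.recognize_google(audio)  # cloud recognizer
    if "select" in text.lower():
        print("Trigger: select current gaze target")  # match with gaze pointing
except sr.UnknownValueError:
    pass  # speech was unintelligible
```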
Eye Tracking Interfaces
• Use IR light to find gaze direction
• IR sources + cameras in HMD
• Support implicit input
• Always look before interact
• Natural pointing input
• Multimodal Input
• Combine with gesture/speech
(Figure: eye-tracking camera with IR light source, the resulting IR view, and the processed image.)
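One common gaze interaction built on this is dwell-based selection: select whatever the user has looked at continuously for a short time. A sketch (the dwell time is an assumption):

```python
# Illustrative dwell-based gaze selection.
class DwellSelector:
    def __init__(self, dwell_time=0.8):   # seconds (assumed)
        self.dwell_time = dwell_time
        self.target = None
        self.since = 0.0

    def update(self, gaze_target, now):
        """gaze_target: object hit by the gaze ray, or None."""
        if gaze_target != self.target:
            self.target, self.since = gaze_target, now
        elif gaze_target is not None and now - self.since >= self.dwell_time:
            self.since = now      # fire once, then restart the dwell timer
            return gaze_target    # selection event
        return None
```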
Hololens 2 Gaze Demo
https://www.youtube.com/watch?v=UPH4lk1jAWs
Evolution of AR Interfaces
Increasing expressiveness and intuitiveness:
• Browsing
  • Simple input
  • Viewpoint control
• 3D AR
  • 3D UI
  • Dedicated controllers
  • Custom devices
• Tangible UI
  • Augmented surfaces
  • Object interaction
  • Familiar controllers
  • Indirect interaction
• Tangible AR
  • Tangible input
  • AR overlay
  • Direct interaction
• Natural AR
  • Freehand gesture
  • Speech, gaze
DESIGNING AR INTERFACES
Interaction Design
“Designing interactive products to support
people in their everyday and working lives”
Preece, J., (2002). Interaction Design
• Design of User Experience with Technology
Bill Verplank on Interaction Design
https://www.youtube.com/watch?v=Gk6XAmALOWI
•Interaction Design involves answering three questions:
•What do you do? - How do you affect the world?
•What do you feel? – What do you sense of the world?
•What do you know? – What do you learn?
Bill Verplank
Typical Interaction Design Cycle
Develop alternative prototypes/concepts and compare them, and iterate, iterate, iterate...
PROTOTYPING
How Do You Prototype This?
Example: Google Glass
View Through Google Glass
Google Glass Prototyping
https://www.youtube.com/watch?v=d5_h1VuwD6g
Early Glass Prototyping
Tom Chi’s Prototyping Rules
1. Find the quickest path to experience
2. Doing is the best kind of thinking
3. Use materials that move at the speed of
thought to maximize your rate of learning
How can we quickly prototype
XR experiences with little or no
coding?
Prototyping in Interaction Design
Key Prototyping Steps
● Quick visual design
● Capture key interactions
● Focus on user experience
● Communicate design ideas
● “Learn by doing/experiencing”
Why Prototype?
www.empathiccomputing.org
@marknb00
mark.billinghurst@unisa.edu.au

Weitere ähnliche Inhalte

Was ist angesagt?

Comp4010 2021 Lecture2-Perception
Comp4010 2021 Lecture2-PerceptionComp4010 2021 Lecture2-Perception
Comp4010 2021 Lecture2-PerceptionMark Billinghurst
 
2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: PerceptionMark Billinghurst
 
Comp4010 lecture6 Prototyping
Comp4010 lecture6 PrototypingComp4010 lecture6 Prototyping
Comp4010 lecture6 PrototypingMark Billinghurst
 
2022 COMP4010 Lecture1: Introduction to XR
2022 COMP4010 Lecture1: Introduction to XR2022 COMP4010 Lecture1: Introduction to XR
2022 COMP4010 Lecture1: Introduction to XRMark Billinghurst
 
Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality Mark Billinghurst
 
COMP 4010 Lecture7 3D User Interfaces for Virtual Reality
COMP 4010 Lecture7 3D User Interfaces for Virtual RealityCOMP 4010 Lecture7 3D User Interfaces for Virtual Reality
COMP 4010 Lecture7 3D User Interfaces for Virtual RealityMark Billinghurst
 
Comp4010 Lecture8 Introduction to VR
Comp4010 Lecture8 Introduction to VRComp4010 Lecture8 Introduction to VR
Comp4010 Lecture8 Introduction to VRMark Billinghurst
 
Empathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole MetaverseEmpathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole MetaverseMark Billinghurst
 
Comp4010 Lecture9 VR Input and Systems
Comp4010 Lecture9 VR Input and SystemsComp4010 Lecture9 VR Input and Systems
Comp4010 Lecture9 VR Input and SystemsMark Billinghurst
 
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationTalk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationMark Billinghurst
 
Comp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research DirectionsComp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research DirectionsMark Billinghurst
 
Evaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesEvaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesMark Billinghurst
 
Application in Augmented and Virtual Reality
Application in Augmented and Virtual RealityApplication in Augmented and Virtual Reality
Application in Augmented and Virtual RealityMark Billinghurst
 
Comp4010 Lecture7 Designing AR Systems
Comp4010 Lecture7 Designing AR SystemsComp4010 Lecture7 Designing AR Systems
Comp4010 Lecture7 Designing AR SystemsMark Billinghurst
 
COMP 4010 Lecture9 AR Interaction
COMP 4010 Lecture9 AR InteractionCOMP 4010 Lecture9 AR Interaction
COMP 4010 Lecture9 AR InteractionMark Billinghurst
 
COMP 4010 Lecture10: AR Tracking
COMP 4010 Lecture10: AR TrackingCOMP 4010 Lecture10: AR Tracking
COMP 4010 Lecture10: AR TrackingMark Billinghurst
 
Comp 4010 2021 Lecture1-Introduction to XR
Comp 4010 2021 Lecture1-Introduction to XRComp 4010 2021 Lecture1-Introduction to XR
Comp 4010 2021 Lecture1-Introduction to XRMark Billinghurst
 

Was ist angesagt? (20)

Comp4010 2021 Lecture2-Perception
Comp4010 2021 Lecture2-PerceptionComp4010 2021 Lecture2-Perception
Comp4010 2021 Lecture2-Perception
 
2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception
 
Comp4010 lecture6 Prototyping
Comp4010 lecture6 PrototypingComp4010 lecture6 Prototyping
Comp4010 lecture6 Prototyping
 
2022 COMP4010 Lecture1: Introduction to XR
2022 COMP4010 Lecture1: Introduction to XR2022 COMP4010 Lecture1: Introduction to XR
2022 COMP4010 Lecture1: Introduction to XR
 
Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality
 
COMP 4010 Lecture7 3D User Interfaces for Virtual Reality
COMP 4010 Lecture7 3D User Interfaces for Virtual RealityCOMP 4010 Lecture7 3D User Interfaces for Virtual Reality
COMP 4010 Lecture7 3D User Interfaces for Virtual Reality
 
Lecture 4: VR Systems
Lecture 4: VR SystemsLecture 4: VR Systems
Lecture 4: VR Systems
 
Comp4010 Lecture8 Introduction to VR
Comp4010 Lecture8 Introduction to VRComp4010 Lecture8 Introduction to VR
Comp4010 Lecture8 Introduction to VR
 
Empathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole MetaverseEmpathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole Metaverse
 
Comp4010 Lecture9 VR Input and Systems
Comp4010 Lecture9 VR Input and SystemsComp4010 Lecture9 VR Input and Systems
Comp4010 Lecture9 VR Input and Systems
 
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationTalk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
 
Comp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research DirectionsComp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research Directions
 
Lecture 9 AR Technology
Lecture 9 AR TechnologyLecture 9 AR Technology
Lecture 9 AR Technology
 
Evaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesEvaluation Methods for Social XR Experiences
Evaluation Methods for Social XR Experiences
 
Application in Augmented and Virtual Reality
Application in Augmented and Virtual RealityApplication in Augmented and Virtual Reality
Application in Augmented and Virtual Reality
 
Comp4010 Lecture7 Designing AR Systems
Comp4010 Lecture7 Designing AR SystemsComp4010 Lecture7 Designing AR Systems
Comp4010 Lecture7 Designing AR Systems
 
COMP 4010 Lecture9 AR Interaction
COMP 4010 Lecture9 AR InteractionCOMP 4010 Lecture9 AR Interaction
COMP 4010 Lecture9 AR Interaction
 
Lecture1 introduction to VR
Lecture1 introduction to VRLecture1 introduction to VR
Lecture1 introduction to VR
 
COMP 4010 Lecture10: AR Tracking
COMP 4010 Lecture10: AR TrackingCOMP 4010 Lecture10: AR Tracking
COMP 4010 Lecture10: AR Tracking
 
Comp 4010 2021 Lecture1-Introduction to XR
Comp 4010 2021 Lecture1-Introduction to XRComp 4010 2021 Lecture1-Introduction to XR
Comp 4010 2021 Lecture1-Introduction to XR
 

Ähnlich wie 2022 COMP4010 Lecture4: AR Interaction

Comp4010 Lecture4 AR Tracking and Interaction
Comp4010 Lecture4 AR Tracking and InteractionComp4010 Lecture4 AR Tracking and Interaction
Comp4010 Lecture4 AR Tracking and InteractionMark Billinghurst
 
Mobile Augmented Reality
Mobile Augmented RealityMobile Augmented Reality
Mobile Augmented RealityMarios Bikos
 
Elevation mapping using stereo vision enabled heterogeneous multi-agent robot...
Elevation mapping using stereo vision enabled heterogeneous multi-agent robot...Elevation mapping using stereo vision enabled heterogeneous multi-agent robot...
Elevation mapping using stereo vision enabled heterogeneous multi-agent robot...Aritra Sarkar
 
Lecture 5: 3D User Interfaces for Virtual Reality
Lecture 5: 3D User Interfaces for Virtual RealityLecture 5: 3D User Interfaces for Virtual Reality
Lecture 5: 3D User Interfaces for Virtual RealityMark Billinghurst
 
pick and place robotic arm
pick and place robotic armpick and place robotic arm
pick and place robotic armANJANA ANILKUMAR
 
Mainprojpresentation 150617092611-lva1-app6892
Mainprojpresentation 150617092611-lva1-app6892Mainprojpresentation 150617092611-lva1-app6892
Mainprojpresentation 150617092611-lva1-app6892ANJANA ANILKUMAR
 
Mobile AR Lecture 10 - Research Directions
Mobile AR Lecture 10 - Research DirectionsMobile AR Lecture 10 - Research Directions
Mobile AR Lecture 10 - Research DirectionsMark Billinghurst
 
FastCampus 2018 SLAM Workshop
FastCampus 2018 SLAM WorkshopFastCampus 2018 SLAM Workshop
FastCampus 2018 SLAM WorkshopDong-Won Shin
 
Computer-Vision based Centralized Multi-agent System on Matlab and Arduino Du...
Computer-Vision based Centralized Multi-agent System on Matlab and Arduino Du...Computer-Vision based Centralized Multi-agent System on Matlab and Arduino Du...
Computer-Vision based Centralized Multi-agent System on Matlab and Arduino Du...Aritra Sarkar
 
2016 AR Summer School - Lecture 5
2016 AR Summer School - Lecture 52016 AR Summer School - Lecture 5
2016 AR Summer School - Lecture 5Mark Billinghurst
 
Fcv core szeliski_zisserman
Fcv core szeliski_zissermanFcv core szeliski_zisserman
Fcv core szeliski_zissermanzukun
 
My presentation-Augmented Reality at DDIT Nadiad
My presentation-Augmented Reality at DDIT NadiadMy presentation-Augmented Reality at DDIT Nadiad
My presentation-Augmented Reality at DDIT NadiadVisualBee.com
 
Mobile AR Lecture 2 - Technology
Mobile AR Lecture 2 - TechnologyMobile AR Lecture 2 - Technology
Mobile AR Lecture 2 - TechnologyMark Billinghurst
 
Augmented reality (Access virtual world)
Augmented reality (Access virtual world)Augmented reality (Access virtual world)
Augmented reality (Access virtual world)chirag thakkar
 
Intelligente visie maakt drones autonoom
Intelligente visie maakt drones autonoomIntelligente visie maakt drones autonoom
Intelligente visie maakt drones autonoomEUKA
 
Introductory Level of SLAM Seminar
Introductory Level of SLAM SeminarIntroductory Level of SLAM Seminar
Introductory Level of SLAM SeminarDong-Won Shin
 
Augmented reality in E-commerce
Augmented reality in E-commerceAugmented reality in E-commerce
Augmented reality in E-commerceAshwin P
 

Ähnlich wie 2022 COMP4010 Lecture4: AR Interaction (20)

Comp4010 Lecture4 AR Tracking and Interaction
Comp4010 Lecture4 AR Tracking and InteractionComp4010 Lecture4 AR Tracking and Interaction
Comp4010 Lecture4 AR Tracking and Interaction
 
Mobile Augmented Reality
Mobile Augmented RealityMobile Augmented Reality
Mobile Augmented Reality
 
ICS1020CV_2022.pdf
ICS1020CV_2022.pdfICS1020CV_2022.pdf
ICS1020CV_2022.pdf
 
Elevation mapping using stereo vision enabled heterogeneous multi-agent robot...
Elevation mapping using stereo vision enabled heterogeneous multi-agent robot...Elevation mapping using stereo vision enabled heterogeneous multi-agent robot...
Elevation mapping using stereo vision enabled heterogeneous multi-agent robot...
 
Lecture 5: 3D User Interfaces for Virtual Reality
Lecture 5: 3D User Interfaces for Virtual RealityLecture 5: 3D User Interfaces for Virtual Reality
Lecture 5: 3D User Interfaces for Virtual Reality
 
pick and place robotic arm
pick and place robotic armpick and place robotic arm
pick and place robotic arm
 
Mainprojpresentation 150617092611-lva1-app6892
Mainprojpresentation 150617092611-lva1-app6892Mainprojpresentation 150617092611-lva1-app6892
Mainprojpresentation 150617092611-lva1-app6892
 
Mobile AR Lecture 10 - Research Directions
Mobile AR Lecture 10 - Research DirectionsMobile AR Lecture 10 - Research Directions
Mobile AR Lecture 10 - Research Directions
 
2013 Lecture3: AR Tracking
2013 Lecture3: AR Tracking 2013 Lecture3: AR Tracking
2013 Lecture3: AR Tracking
 
FastCampus 2018 SLAM Workshop
FastCampus 2018 SLAM WorkshopFastCampus 2018 SLAM Workshop
FastCampus 2018 SLAM Workshop
 
Computer-Vision based Centralized Multi-agent System on Matlab and Arduino Du...
Computer-Vision based Centralized Multi-agent System on Matlab and Arduino Du...Computer-Vision based Centralized Multi-agent System on Matlab and Arduino Du...
Computer-Vision based Centralized Multi-agent System on Matlab and Arduino Du...
 
2016 AR Summer School - Lecture 5
2016 AR Summer School - Lecture 52016 AR Summer School - Lecture 5
2016 AR Summer School - Lecture 5
 
Fcv core szeliski_zisserman
Fcv core szeliski_zissermanFcv core szeliski_zisserman
Fcv core szeliski_zisserman
 
My presentation-Augmented Reality at DDIT Nadiad
My presentation-Augmented Reality at DDIT NadiadMy presentation-Augmented Reality at DDIT Nadiad
My presentation-Augmented Reality at DDIT Nadiad
 
Mobile AR Lecture 2 - Technology
Mobile AR Lecture 2 - TechnologyMobile AR Lecture 2 - Technology
Mobile AR Lecture 2 - Technology
 
Augmented reality (Access virtual world)
Augmented reality (Access virtual world)Augmented reality (Access virtual world)
Augmented reality (Access virtual world)
 
Intelligente visie maakt drones autonoom
Intelligente visie maakt drones autonoomIntelligente visie maakt drones autonoom
Intelligente visie maakt drones autonoom
 
Introductory Level of SLAM Seminar
Introductory Level of SLAM SeminarIntroductory Level of SLAM Seminar
Introductory Level of SLAM Seminar
 
PPT s01-machine vision-s2
PPT s01-machine vision-s2PPT s01-machine vision-s2
PPT s01-machine vision-s2
 
Augmented reality in E-commerce
Augmented reality in E-commerceAugmented reality in E-commerce
Augmented reality in E-commerce
 

Mehr von Mark Billinghurst

Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsMark Billinghurst
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024Mark Billinghurst
 
Future Research Directions for Augmented Reality
Future Research Directions for Augmented RealityFuture Research Directions for Augmented Reality
Future Research Directions for Augmented RealityMark Billinghurst
 
Empathic Computing: Delivering the Potential of the Metaverse
Empathic Computing: Delivering  the Potential of the MetaverseEmpathic Computing: Delivering  the Potential of the Metaverse
Empathic Computing: Delivering the Potential of the MetaverseMark Billinghurst
 
Empathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseEmpathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseMark Billinghurst
 
Empathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseEmpathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseMark Billinghurst
 
2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VR2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VRMark Billinghurst
 
Novel Interfaces for AR Systems
Novel Interfaces for AR SystemsNovel Interfaces for AR Systems
Novel Interfaces for AR SystemsMark Billinghurst
 
Empathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsEmpathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsMark Billinghurst
 
Research Directions in Transitional Interfaces
Research Directions in Transitional InterfacesResearch Directions in Transitional Interfaces
Research Directions in Transitional InterfacesMark Billinghurst
 
Comp4010 lecture11 VR Applications
Comp4010 lecture11 VR ApplicationsComp4010 lecture11 VR Applications
Comp4010 lecture11 VR ApplicationsMark Billinghurst
 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARMark Billinghurst
 

Mehr von Mark Billinghurst (14)

Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024
 
Future Research Directions for Augmented Reality
Future Research Directions for Augmented RealityFuture Research Directions for Augmented Reality
Future Research Directions for Augmented Reality
 
Empathic Computing: Delivering the Potential of the Metaverse
Empathic Computing: Delivering  the Potential of the MetaverseEmpathic Computing: Delivering  the Potential of the Metaverse
Empathic Computing: Delivering the Potential of the Metaverse
 
Empathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseEmpathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the Metaverse
 
Empathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseEmpathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader Metaverse
 
2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VR2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VR
 
ISS2022 Keynote
ISS2022 KeynoteISS2022 Keynote
ISS2022 Keynote
 
Novel Interfaces for AR Systems
Novel Interfaces for AR SystemsNovel Interfaces for AR Systems
Novel Interfaces for AR Systems
 
Empathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsEmpathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive Analytics
 
Metaverse Learning
Metaverse LearningMetaverse Learning
Metaverse Learning
 
Research Directions in Transitional Interfaces
Research Directions in Transitional InterfacesResearch Directions in Transitional Interfaces
Research Directions in Transitional Interfaces
 
Comp4010 lecture11 VR Applications
Comp4010 lecture11 VR ApplicationsComp4010 lecture11 VR Applications
Comp4010 lecture11 VR Applications
 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise AR
 

Kürzlich hochgeladen

Introduction to Multilingual Retrieval Augmented Generation (RAG)
Introduction to Multilingual Retrieval Augmented Generation (RAG)Introduction to Multilingual Retrieval Augmented Generation (RAG)
Introduction to Multilingual Retrieval Augmented Generation (RAG)Zilliz
 
Mcleodganj Call Girls 🥰 8617370543 Service Offer VIP Hot Model
Mcleodganj Call Girls 🥰 8617370543 Service Offer VIP Hot ModelMcleodganj Call Girls 🥰 8617370543 Service Offer VIP Hot Model
Mcleodganj Call Girls 🥰 8617370543 Service Offer VIP Hot ModelDeepika Singh
 
FWD Group - Insurer Innovation Award 2024
FWD Group - Insurer Innovation Award 2024FWD Group - Insurer Innovation Award 2024
FWD Group - Insurer Innovation Award 2024The Digital Insurer
 
Platformless Horizons for Digital Adaptability
Platformless Horizons for Digital AdaptabilityPlatformless Horizons for Digital Adaptability
Platformless Horizons for Digital AdaptabilityWSO2
 
AWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of TerraformAWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of TerraformAndrey Devyatkin
 
Boost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfBoost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfsudhanshuwaghmare1
 
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024Victor Rentea
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerThousandEyes
 
"I see eyes in my soup": How Delivery Hero implemented the safety system for ...
"I see eyes in my soup": How Delivery Hero implemented the safety system for ..."I see eyes in my soup": How Delivery Hero implemented the safety system for ...
"I see eyes in my soup": How Delivery Hero implemented the safety system for ...Zilliz
 
Apidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, Adobe
Apidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, AdobeApidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, Adobe
Apidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, Adobeapidays
 
DBX First Quarter 2024 Investor Presentation
DBX First Quarter 2024 Investor PresentationDBX First Quarter 2024 Investor Presentation
DBX First Quarter 2024 Investor PresentationDropbox
 
CNIC Information System with Pakdata Cf In Pakistan
CNIC Information System with Pakdata Cf In PakistanCNIC Information System with Pakdata Cf In Pakistan
CNIC Information System with Pakdata Cf In Pakistandanishmna97
 
MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024MIND CTI
 
Apidays New York 2024 - Passkeys: Developing APIs to enable passwordless auth...
Apidays New York 2024 - Passkeys: Developing APIs to enable passwordless auth...Apidays New York 2024 - Passkeys: Developing APIs to enable passwordless auth...
Apidays New York 2024 - Passkeys: Developing APIs to enable passwordless auth...apidays
 
Strategies for Landing an Oracle DBA Job as a Fresher
Strategies for Landing an Oracle DBA Job as a FresherStrategies for Landing an Oracle DBA Job as a Fresher
Strategies for Landing an Oracle DBA Job as a FresherRemote DBA Services
 
DEV meet-up UiPath Document Understanding May 7 2024 Amsterdam
DEV meet-up UiPath Document Understanding May 7 2024 AmsterdamDEV meet-up UiPath Document Understanding May 7 2024 Amsterdam
DEV meet-up UiPath Document Understanding May 7 2024 AmsterdamUiPathCommunity
 
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...DianaGray10
 
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers:  A Deep Dive into Serverless Spatial Data and FMECloud Frontiers:  A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FMESafe Software
 

Kürzlich hochgeladen (20)

Introduction to Multilingual Retrieval Augmented Generation (RAG)
Introduction to Multilingual Retrieval Augmented Generation (RAG)Introduction to Multilingual Retrieval Augmented Generation (RAG)
Introduction to Multilingual Retrieval Augmented Generation (RAG)
 
Mcleodganj Call Girls 🥰 8617370543 Service Offer VIP Hot Model
Mcleodganj Call Girls 🥰 8617370543 Service Offer VIP Hot ModelMcleodganj Call Girls 🥰 8617370543 Service Offer VIP Hot Model
Mcleodganj Call Girls 🥰 8617370543 Service Offer VIP Hot Model
 
FWD Group - Insurer Innovation Award 2024
FWD Group - Insurer Innovation Award 2024FWD Group - Insurer Innovation Award 2024
FWD Group - Insurer Innovation Award 2024
 
Platformless Horizons for Digital Adaptability
Platformless Horizons for Digital AdaptabilityPlatformless Horizons for Digital Adaptability
Platformless Horizons for Digital Adaptability
 
AWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of TerraformAWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of Terraform
 
Boost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfBoost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdf
 
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected Worker
 
"I see eyes in my soup": How Delivery Hero implemented the safety system for ...
"I see eyes in my soup": How Delivery Hero implemented the safety system for ..."I see eyes in my soup": How Delivery Hero implemented the safety system for ...
"I see eyes in my soup": How Delivery Hero implemented the safety system for ...
 
Apidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, Adobe
Apidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, AdobeApidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, Adobe
Apidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, Adobe
 
DBX First Quarter 2024 Investor Presentation
DBX First Quarter 2024 Investor PresentationDBX First Quarter 2024 Investor Presentation
DBX First Quarter 2024 Investor Presentation
 
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
 
CNIC Information System with Pakdata Cf In Pakistan
CNIC Information System with Pakdata Cf In PakistanCNIC Information System with Pakdata Cf In Pakistan
CNIC Information System with Pakdata Cf In Pakistan
 
MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024
 
Apidays New York 2024 - Passkeys: Developing APIs to enable passwordless auth...
Apidays New York 2024 - Passkeys: Developing APIs to enable passwordless auth...Apidays New York 2024 - Passkeys: Developing APIs to enable passwordless auth...
Apidays New York 2024 - Passkeys: Developing APIs to enable passwordless auth...
 
Strategies for Landing an Oracle DBA Job as a Fresher
Strategies for Landing an Oracle DBA Job as a FresherStrategies for Landing an Oracle DBA Job as a Fresher
Strategies for Landing an Oracle DBA Job as a Fresher
 
DEV meet-up UiPath Document Understanding May 7 2024 Amsterdam
DEV meet-up UiPath Document Understanding May 7 2024 AmsterdamDEV meet-up UiPath Document Understanding May 7 2024 Amsterdam
DEV meet-up UiPath Document Understanding May 7 2024 Amsterdam
 
Understanding the FAA Part 107 License ..
Understanding the FAA Part 107 License ..Understanding the FAA Part 107 License ..
Understanding the FAA Part 107 License ..
 
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
 
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers:  A Deep Dive into Serverless Spatial Data and FMECloud Frontiers:  A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
 

2022 COMP4010 Lecture4: AR Interaction

  • 1. AR INTERACTION COMP 4010 Lecture Four Mark Billinghurst August 18th 2022 mark.billinghurst@unisa.edu.au
  • 3. AR RequiresTracking and Registration • Registration • Positioning virtual object wrt real world • Fixing virtual object on real object when view is fixed • Calibration • Offline measurements • Measure camera relative to head mounted display • Tracking • Continually locating the user’s viewpoint when view moving • Position (x,y,z), Orientation (r,p,y)
  • 4. Sources of Registration Errors •Static errors • Optical distortions (in HMD) • Mechanical misalignments • Tracker errors • Incorrect viewing parameters •Dynamic errors • System delays (largest source of error) • 1 ms delay = 1/3 mm registration error
  • 5. Reducing Static Errors •Distortion compensation • For lens or display distortions •Manual adjustments • Have user manually alighn AR andVR content •View-based or direct measurements • Have user measure eye position •Camera calibration (video AR) • Measuring camera properties
  • 6. Reducing dynamic errors (1) •Reduce system lag •Faster components/system modules •Reduce apparent lag •Image deflection •Image warping
  • 8. Frames of Reference • Word-stabilized • E.g., billboard or signpost • Body-stabilized • E.g., virtual tool-belt • Screen-stabilized • Heads-up display
  • 9. Tracking Requirements • Augmented Reality Information Display • World Stabilized • Body Stabilized • Head Stabilized Increasing Tracking Requirements Head Stabilized Body Stabilized World Stabilized
  • 10. Tracking Technologies § Active • Mechanical, Magnetic, Ultrasonic • GPS, Wifi, cell location § Passive • Inertial sensors (compass, accelerometer, gyro) • Computer Vision • Marker based, Natural feature tracking § Hybrid Tracking • Combined sensors (eg Vision + Inertial)
  • 12. Why Optical Tracking for AR? • Many AR devices have cameras • Mobile phone/tablet, Video see-through display • Provides precise alignment between video and AR overlay • Using features in video to generate pixel perfect alignment • Real world has many visual features that can be tracked from • Computer Vision well established discipline • Over 40 years of research to draw on • Old non real time algorithms can be run in real time on todays devices
  • 13. Common AR Optical Tracking Types • Marker Tracking • Tracking known artificial markers/images • e.g. ARToolKit square markers • Markerless Tracking • Tracking from known features in real world • e.g. Vuforia image tracking • Unprepared Tracking • Tracking in unknown environment • e.g. SLAM tracking
  • 14. Marker Based Tracking: ARToolKit http://www.artoolkit.org
  • 15. Natural Feature Tracking • Use Natural Cues of Real Elements • Edges • Surface Texture • Interest Points • Model or Model-Free • No visual pollution Contours Features Points Surfaces
  • 16. Detection and Tracking Detection Incremental tracking Tracking target detected Tracking target lost Tracking target not detected Incremental tracking ok Start + Recognize target type + Detect target + Initialize camera pose + Fast + Robust to blur, lighting changes + Robust to tilt Tracking and detection are complementary approaches. After successful detection, the target is tracked incrementally. If the target is lost, the detection is activated again
  • 17. Marker vs.Natural FeatureTracking • Marker tracking • Usually requires no database to be stored • Markers can be an eye-catcher • Tracking is less demanding • The environment must be instrumented • Markers usually work only when fully in view • Natural feature tracking • A database of keypoints must be stored/downloaded • Natural feature targets might catch the attention less • Natural feature targets are potentially everywhere • Natural feature targets work also if partially in view
  • 18. Model BasedTracking • Tracking from 3D object shape • Example: OpenTL - www.opentl.org • General purpose library for model based visual tracking
  • 19. Tracking from an Unknown Environment • What to do when you don’t know any features? • Very important problem in mobile robotics - Where am I? • SLAM • Simultaneously Localize And Map the environment • Goal: to recover both camera pose and map structure while initially knowing neither. • Mapping: • Building a map of the environment which the robot is in • Localisation: • Navigating this environment using the map while keeping track of the robot’s relative position and orientation
  • 20. Parallel Tracking and Mapping Tracking Mapping New keyframes Map updates + Estimate camera pose + For every frame + Extend map + Improve map + Slow updates rate Parallel tracking and mapping uses two concurrent threads, one for tracking and one for mapping, which run at different speeds
  • 21. Parallel Tracking and Mapping Video stream New frames Map updates Tracking Mapping Tracked local pose FAST SLOW Simultaneous localization and mapping (SLAM) in small workspaces Klein/Drummond, U. Cambridge
  • 22. Visual SLAM • Early SLAM systems (1986 - ) • Computer visions and sensors (e.g. IMU, laser, etc.) • One of the most important algorithms in Robotics • Visual SLAM • Using cameras only, such as stereo view • MonoSLAM (single camera) developed in 2007 (Davidson)
  • 23. Combining Sensors andVision • Sensors • Produces noisy output (= jittering augmentations) • Are not sufficiently accurate (= wrongly placed augmentations) • Gives us first information on where we are in the world, and what we are looking at • Vision • Is more accurate (= stable and correct augmentations) • Requires choosing the correct keypoint database to track from • Requires registering our local coordinate frame (online- generated model) to the global one (world)
  • 24. ARKit – Visual Inertial Odometry • Uses both computer vision + inertial sensing • Tracking position twice • Computer Vision – feature tracking, 2D plane tracking • Inertial sensing – using the phone IMU • Output combined via Kalman filter • Determine which output is most accurate • Pass pose to ARKit SDK • Each system compliments the other • Computer vision – needs visual features • IMU - drifts over time, doesn’t need features
  • 25. ARKit –Visual Inertial Odometry • Slow camera • Fast IMU • If camera drops out IMU takes over • Camera corrects IMU errors
  • 26. Conclusions • Tracking and Registration are key problems • Registration error • Measures against static error • Measures against dynamic error • AR typically requires multiple tracking technologies • Computer vision most popular • Research Areas: • SLAM systems, Deformable models, Mobile outdoor tracking
  • 28. Augmented Reality technology • Combines Real and Virtual Images • Needs: Display technology • Interactive in real-time • Needs: Input and interaction technology • Registered in 3D • Needs: Viewpoint tracking technology
  • 29. How Do You Design an Interface for This?
  • 30. AR Interaction • Designing AR Systems = Interface Design • Using different input and output technologies • Objective is a high quality of user experience • Ease of use and learning • Performance and satisfaction
  • 31. Typical Interface Design Path 1/ Prototype Demonstration 2/ Adoption of Interaction Techniques from other interface metaphors 3/ Development of new interface metaphors appropriate to the medium 4/ Development of formal theoretical models for predicting and modeling user actions Desktop WIMP Virtual Reality Augmented Reality
  • 32. Interacting with AR Content • You can see spatially registered AR.. how can you interact with it?
  • 33. Different Types of AR Interaction • Browsing Interfaces • simple (conceptually!), unobtrusive • 3D AR Interfaces • expressive, creative, require attention • Tangible Interfaces • Embedded into conventional environments • Tangible AR • Combines TUI input + AR display
  • 34. AR Interfaces as Data Browsers • 2D/3D virtual objects are registered in 3D • “VR in Real World” • Interaction • 2D/3D virtual viewpoint control • Applications • Visualization, training
  • 35. AR Information Browsers • Information is registered to real-world context • Hand held AR displays • Interaction • Manipulation of a window into information space • Applications • Context-aware information displays Rekimoto, et al. 1997
  • 38. Current AR Information Browsers • Mobile AR • GPS + compass • Many Applications • Wikitude • Yelp • Google maps • …
  • 39. Example: Google Maps AR Mode • AR Navigation Aid • GPS + compass, 2D/3D object placement
  • 40.
  • 41. Advantages and Disadvantages • Important class of AR interfaces • Wearable computers • AR simulation, training • Limited interactivity • Modification of virtual content is difficult Rekimoto, et al. 1997
  • 42. 3D AR Interfaces • Virtual objects displayed in 3D physical space and manipulated • HMDs and 6DOF head-tracking • 6DOF hand trackers for input • Interaction • Viewpoint control • Traditional 3D user interface interaction: manipulation, selection, etc. Kiyokawa, et al. 2000
  • 45.
  • 46. Advantages and Disadvantages • Important class of AR interfaces • Entertainment, design, training • Advantages • User can interact with 3D virtual object everywhere in space • Natural, familiar interaction • Disadvantages • Usually no tactile feedback • User has to use different devices for virtual and physical objects Oshima, et al. 2000
  • 47. 3. Augmented Surfaces and Tangible Interfaces • Basic principles • Virtual images are projected on a surface • Physical objects are used as controls for virtual objects • Support for collaboration Wellner, P. (1993). Interacting with paper on the DigitalDesk. Communications of the ACM, 36(7), 87-96.
  • 48. Augmented Surfaces • Rekimoto, et al. 1999 • Front projection • Marker-based tracking • Multiple projection surfaces • Object interaction Rekimoto, J., & Saitoh, M. (1999, May). Augmented surfaces: a spatially continuous work space for hybrid computing environments. In Proceedings of the SIGCHI conference on Human Factors in Computing Systems (pp. 378-385).
  • 49. Augmented Surfaces Demo (1999) https://www.youtube.com/watch?v=r4g_fvnjVCA
  • 50. Tangible User Interfaces (Ishii 97) • Create digital shadows for physical objects • Foreground • graspable UI • Background • ambient interfaces
  • 51. Tangible Interfaces - Ambient • Dangling String • Jeremijenko 1995 • Ambient ethernet monitor • Relies on peripheral cues • Ambient Fixtures • Dahley, Wisneski, Ishii 1998 • Use natural material qualities for information display
  • 52. Tangible Interface: ARgroove • Collaborative Instrument • Exploring Physically Based Interaction • Map physical actions to Midi output • Translation, rotation • Tilt, shake
  • 55. Visual Feedback •Continuous Visual Feedback is Key •Single Virtual Image Provides: • Rotation • Tilt • Height
  • 56. i/O Brush (Ryokai, Marti, Ishii) - 2004 Ryokai, K., Marti, S., & Ishii, H. (2004, April). I/O brush: drawing with everyday objects as ink. In Proceedings of the SIGCHI conference on Human factors in computing systems (pp. 303-310).
  • 57. i/O Brush Demo (2005) https://www.youtube.com/watch?v=04v_v1gnyO8
  • 58. Many Other Examples • Triangles (Gorbet 1998) • Triangle-based storytelling • ActiveCube (Kitamura 2000-) • Cubes with sensors • Reactable (2007- ) • Cube-based music interface
  • 59. Lessons from Tangible Interfaces • Physical objects make us smart • Norman’s “Things that Make Us Smart” • encode affordances, constraints • Objects aid collaboration • establish shared meaning • Objects increase understanding • serve as cognitive artifacts
  • 60. But There are TUI Limitations • Difficult to change object properties • can’t tell state of digital data • Limited display capabilities • projection screen = 2D • dependent on physical display surface • Separation between object and display • ARgroove – Interact on table, look at screen
  • 61. Advantages and Disadvantages • Advantages • Natural – user’s hands are used for interacting with both virtual and real objects • No need for special-purpose input devices • Disadvantages • Interaction is limited to the 2D surface • Full 3D interaction and manipulation is difficult
  • 62. Orthogonal Nature of Interfaces • 3D AR interfaces • Spatial gap: No – interaction is everywhere • Interaction gap: Yes – separate devices for physical and virtual objects • Tangible interfaces • Spatial gap: Yes – interaction is only on 2D surfaces • Interaction gap: No – same devices for physical and virtual objects
  • 64. 4. Tangible AR: Back to the Real World • AR overcomes display limitation of TUIs • enhance display possibilities • merge task/display space • provide public and private views • TUI + AR = Tangible AR • Apply TUI methods to AR interface design Billinghurst, M., Kato, H., & Poupyrev, I. (2008). Tangible augmented reality. ACM Siggraph Asia, 7(2), 1-10.
  • 65. Space vs. Time – Multiplexed • Space-multiplexed • Many devices, each with one function • Quicker to use, more intuitive, but adds clutter • e.g., a real toolbox • Time-multiplexed • One device with many functions • Space efficient • e.g., a mouse
  • 66. Tangible AR: Tiles (Space Multiplexed) • Tiles semantics • data tiles • operation tiles • Operation on tiles • proximity • spatial arrangements • space-multiplexed Poupyrev, I., Tan, D. S., Billinghurst, M., Kato, H., Regenbrecht, H., & Tetsutani, N. (2001, July). Tiles: A Mixed Reality Authoring Interface. In Interact (Vol. 1, pp. 334-341).
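The proximity rule can be stated in a few lines: an operation fires when an operation tile's tracked pose comes close enough to a data tile. A hedged Python sketch, where the tile representation and the 50 mm threshold are assumptions, not Tiles' actual values:

```python
import numpy as np

PROXIMITY_MM = 50.0  # assumed trigger distance between tile centres

def proximity_events(op_tiles, data_tiles):
    """Yield (operation, data) pairs whenever an operation tile is
    brought next to a data tile; positions come from the marker tracker."""
    for op_name, op_pos in op_tiles:
        for data_name, data_pos in data_tiles:
            d = np.linalg.norm(np.asarray(op_pos) - np.asarray(data_pos))
            if d < PROXIMITY_MM:
                yield op_name, data_name  # e.g. ('delete', 'photo_3')
```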
  • 70. Tangible AR: Time-multiplexed Interaction • Use of natural physical object manipulations to control virtual objects • VOMAR Demo • Catalog book: • Turn over the page • Paddle operation: • Push, shake, incline, hit, scoop Kato, H., Billinghurst, M., Poupyrev, I., Imamoto, K., & Tachibana, K. (2000, October). Virtual object manipulation on a table-top AR environment. In Proceedings IEEE and ACM International Symposium on Augmented Reality (ISAR 2000) (pp. 111-119). IEEE.
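The paddle gestures reduce to simple tests on the tracked paddle pose over time. A heuristic Python sketch for two of them; the window size and thresholds are illustrative, not the values VOMAR used:

```python
from collections import deque
import numpy as np

class PaddleGestures:
    """Detect 'tilt' and 'shake' from a short history of paddle poses."""
    def __init__(self, window=15):
        self.positions = deque(maxlen=window)

    def update(self, position_mm, up_vector):
        """position_mm: paddle centre; up_vector: unit normal of the paddle."""
        self.positions.append(np.asarray(position_mm, dtype=float))
        events = []
        # Tilt: paddle normal leans more than ~45 degrees from world up,
        # e.g. to pour a virtual object off the paddle
        if np.dot(up_vector, [0.0, 0.0, 1.0]) < np.cos(np.radians(45)):
            events.append('tilt')
        # Shake: large back-and-forth travel within the recent window
        if len(self.positions) == self.positions.maxlen:
            pts = np.stack(list(self.positions))
            if (pts.max(axis=0) - pts.min(axis=0)).max() > 80.0:  # mm, assumed
                events.append('shake')
        return events
```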
  • 73. Advantages and Disadvantages • Advantages • Natural interaction with virtual and physical tools • No need for special-purpose input devices • Spatial interaction with virtual objects • 3D manipulation of virtual objects anywhere in space • Disadvantages • Requires a head-mounted display
  • 74. 5. Natural AR Interfaces • Goal: • Interact with AR content the same way we interact in the real world • Using natural user input • Body motion • Gesture • Gaze • Speech • Input recognition • Natural gestures, gaze • Multimodal input • Images: FingARtips (2004), Tinmith (2001)
  • 75. External Fixed Cameras • Overhead depth sensing camera • Capture real time hand model • Create point cloud model • Overlay graphics on AR view • Perform gesture interaction Billinghurst, M., Piumsomboon, T., & Bai, H. (2014). Hands in space: Gesture interaction with augmented-reality interfaces. IEEE computer graphics and applications, 34(1), 77-80.
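The "create point cloud model" step is the standard pinhole back-projection of a depth image. A minimal NumPy sketch; the function name and intrinsics are generic, not tied to a specific sensor:

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (metres per pixel) into camera-space 3D
    points using the pinhole model; fx, fy, cx, cy are camera intrinsics."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop invalid (zero-depth) pixels
```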
  • 77. Head Mounted Cameras • Attach cameras/depth sensor to HMD • Connect to high-end PC • Computer vision capture/processing on PC • Perform tracking/gesture recognition on PC • Use custom tracking hardware • Leap Motion (Structured IR) • Intel RealSense (Stereo depth) • Images: Project NorthStar (2018), Meta2 (2016)
  • 78. Project NorthStar Hand Interaction
  • 79. Self Contained Systems • Sensors and processors on device • Fully mobile • Customized hardware/software • Example: HoloLens 2 (2019) • 3D hand tracking • 21 points/hand tracked • Gesture driven interface • Constrained set of gestures • Multimodal input (gesture, gaze, speech)
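With per-joint hand tracking available, the core "air tap"-style gesture can be detected from just two joints. A hedged Python sketch with hysteresis so the pinch state doesn't flicker; the thresholds are illustrative, not HoloLens 2's actual values:

```python
import numpy as np

PINCH_ON_MM, PINCH_OFF_MM = 15.0, 25.0  # hysteresis thresholds (assumed)

class PinchDetector:
    """Detect a pinch from thumb-tip and index-tip positions (mm)."""
    def __init__(self):
        self.pinching = False

    def update(self, thumb_tip, index_tip):
        d = np.linalg.norm(np.asarray(thumb_tip) - np.asarray(index_tip))
        if not self.pinching and d < PINCH_ON_MM:
            self.pinching = True
            return 'pinch_start'   # e.g. begin dragging the gazed-at object
        if self.pinching and d > PINCH_OFF_MM:
            self.pinching = False
            return 'pinch_end'
        return None
```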
  • 80. HoloLens 2 Gesture Input Demo (MRTK) https://www.youtube.com/watch?v=qfONlUCSWdg
  • 81. Speech Input • Reliable speech recognition • Windows speech, Watson, etc. • Indirect input with AR content • No need for gesture • Match with gaze/head pointing • Look to select target • Good for quantitative input • Numbers, text, etc. • Keyword trigger • “select”, “Hey Cortana”, etc. https://www.youtube.com/watch?v=eHMkOpNUtR8
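A keyword trigger is just a lookup over the recognizer's transcripts. A minimal Python sketch with the speech engine (Windows speech, Watson, etc.) abstracted away as a stream of recognized phrases; the keyword-to-action table is an assumed example:

```python
# Map trigger keywords to actions; the actions here are placeholders.
KEYWORDS = {
    "select": lambda: print("selecting gazed-at object"),
    "hey cortana": lambda: print("activating assistant"),
}

def on_phrase(transcript):
    """Call the action for any keyword found in a recognized phrase."""
    text = transcript.lower()
    for keyword, action in KEYWORDS.items():
        if keyword in text:
            action()

on_phrase("Hey Cortana, open the menu")  # -> activating assistant
```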
  • 82. Eye Tracking Interfaces • Use IR light to find gaze direction • IR sources + cameras in HMD • Support implicit input • Always look before interacting • Natural pointing input • Multimodal input • Combine with gesture/speech • Images: IR light sources and camera, IR view, processed image (HoloLens 2)
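The "look to select" pattern combines a gaze ray test with a dwell timer. A minimal Python sketch; the dwell time, target representation, and ray-sphere test are generic assumptions, not a particular vendor's API:

```python
import time
import numpy as np

DWELL_S = 0.8  # assumed dwell time before a fixation becomes a selection

def gaze_hits(origin, direction, center, radius):
    """Ray-sphere test: does the gaze ray pass through a target's bounds?"""
    oc = np.asarray(center, dtype=float) - np.asarray(origin, dtype=float)
    d = np.asarray(direction, dtype=float) / np.linalg.norm(direction)
    t = np.dot(oc, d)                      # closest approach along the ray
    closest = np.linalg.norm(oc - t * d)   # perpendicular distance
    return t > 0 and closest < radius

class DwellSelector:
    """Select the target the user keeps looking at for DWELL_S seconds."""
    def __init__(self):
        self.target, self.since = None, 0.0

    def update(self, origin, direction, targets):
        """targets: list of (name, center, radius); returns a name on select."""
        hit = next((name for name, c, r in targets
                    if gaze_hits(origin, direction, c, r)), None)
        if hit != self.target:
            self.target, self.since = hit, time.time()
        elif hit and time.time() - self.since > DWELL_S:
            self.since = float('inf')  # fire once per fixation
            return hit
        return None
```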
  • 84. Evolution of AR Interfaces (increasing expressiveness and intuitiveness) • Browsing: simple input, viewpoint control • 3D AR: 3D UI, dedicated controllers, custom devices • Tangible UI: augmented surfaces, object interaction, familiar controllers, indirect interaction • Tangible AR: tangible input, AR overlay, direct interaction • Natural AR: freehand gesture, speech, gaze
  • 86. Interaction Design • “Designing interactive products to support people in their everyday and working lives” – Preece, J. (2002). Interaction Design • Design of User Experience with Technology
  • 87. Bill Verplank on Interaction Design https://www.youtube.com/watch?v=Gk6XAmALOWI
  • 88. Interaction Design involves answering three questions (Bill Verplank): • What do you do? – How do you affect the world? • What do you feel? – What do you sense of the world? • What do you know? – What do you learn?
  • 89. Typical Interaction Design Cycle • Develop alternative prototypes/concepts and compare them, and iterate, iterate, iterate…
  • 91. How Do You Prototype This?
  • 101. Tom Chi’s Prototyping Rules 1. Find the quickest path to experience 2. Doing is the best kind of thinking 3. Use materials that move at the speed of thought to maximize your rate of learning
  • 102. How can we quickly prototype XR experiences with little or no coding?
  • 103. Prototyping in Interaction Design • Key Prototyping Steps
  • 104. Why Prototype? • Quick visual design • Capture key interactions • Focus on user experience • Communicate design ideas • “Learn by doing/experiencing”