Background: Introduction to Augmented Reality
Projection-based Augmented Reality
Ongoing Research of the Speaker
Ending remarks: Further Research & Future Path
2. Speaker Biography
Jung-Hyun Byun
junghyun.ian.byun@gmail.com
• Education
▪ B.Sc. in Computer Science and Engineering, Yonsei University, Korea
– 2011.03 ~ 2015.02
▪ Integrated Ph.D. course in Computer Science, Yonsei University, Korea
– 2015.09 ~ 2020.08 (expected)
– Media System Lab. (Advisor: Prof. Tack-Don Han)
• Publications
▪ Byun et al. "AIR: Anywhere Immersive Reality with User-Perspective Projection." Eurographics 2017
▪ Byun et al. "Accurate Control of a Pan-Tilt System Based on Parameterization of Rotational Motion." Eurographics 2018
• Research Interest
▪ augmented reality, projection mapping, point cloud processing and scene reconstruction
3. Table of Contents
1. Introduction: Augmented Reality and Projection
▪ Augmented reality continuum/definition/categories
▪ Projection AR comparison/research trend
2. Ongoing research of the speaker
▪ Motivation
▪ Pervasive Augmented Reality
▪ Perspective Projection Mapping
▪ Pan-Tilt Control and Registration
3. Ending remarks: Further Research & Future Path
▪ Summary
▪ Deep Learning for Mixed Reality
▪ Context-aware Pervasive AR
▪ Consumer Robotics with Pervasive AR
5. Augmented Reality
• Augmented reality has been around in fiction for some time …
• and has recently come to life for real
Public image
(1st row, from left) Star Wars, Horizon Zero Dawn, Memories of the Alhambra; (2nd row, from left) Pokémon GO, Spatial Systems, Inc.
6. Augmented Reality
• Head-Mounted Displays
▪ Microsoft HoloLens 2016
▪ Magic Leap 2018
• Mobile Augmented Reality
▪ Apple ARKit 2017
▪ Google ARCore 2017
• But there’s one more: Spatial Augmented Reality
Services & Hardware
7. Augmented Reality
• Reality–Virtuality continuum, Milgram, ITIS 1994
The continuum
Milgram, Paul, et al. "Augmented reality: A class of displays on the reality-virtuality continuum." Telemanipulator and telepresence technologies. Vol. 2351. International Society for Optics and Photonics, 1995.
https://en.wikipedia.org/wiki/Reality%E2%80%93virtuality_continuum
Real World – Mixed World – Virtual World
8. Augmented Reality
Definition and categorization
[1] Azuma, Ronald T. "A survey of augmented reality." Presence: Teleoperators & Virtual Environments 6.4 (1997): 355-385.
[2] Bimber, Oliver, and Ramesh Raskar. "Modern approaches to augmented reality." ACM SIGGRAPH 2006 Courses. ACM, 2006.
• By definition, augmented reality systems … [1]
1. combine real elements and virtual images
2. are capable of interacting with the user in real-time
3. are registered in the 3D real world
• Spatial Augmented Reality [2]
▪ Virtual objects are directly superimposed onto the real world
▪ Free of instruments that cause users discomfort and fatigue
▪ Uses displays with spatial characteristics – usually projectors
• Projection-based Augmented Reality (Projection AR) means …
▪ technically, any augmented reality generated by wearable, mobile or spatial projector(s)
▪ commonly, a spatial augmented reality generated by projector(s) that are detached from the user
Image generation for augmented reality displays [2]
9. Augmented Reality
Characteristics and comparison
Bimber, Oliver, and Ramesh Raskar. Spatial augmented reality: merging real and virtual worlds. CRC Press, 2005.
Head-Worn Displays
▪ strong presence
▪ freely movable/active
▪ small (virtual) display
▪ fatigue/discomfort
▪ focus != convergence
Mobile Devices
▪ accessible by many
▪ easily carried
▪ limits user’s sight
▪ small (virtual) display
Projection-Based
▪ highly immersive
▪ theoretically unlimited FOV/display
▪ projection distortion
▪ device perspective
10. Projection Augmented Reality
• Steadily studied at the University of Tokyo, MIT Media Lab, and Microsoft Research
Research trend (timeline)
▪ 1999 – Augmented Surfaces, Jun Rekimoto, Sony, CHI 1999
▪ 2001 – The Everywhere Displays Projector, IBM, UbiComp 2001
▪ 2003 – iLamps, Ramesh Raskar, MIT Media Lab, SIGGRAPH 2003
▪ 2009 – SixthSense, Pranav Mistry, MIT Media Lab, CHI 2009
▪ 2010 – LuminAR, Pattie Maes, MIT Media Lab, UIST 2010
▪ 2011 – OmniTouch, Chris Harrison, Microsoft, UIST 2011
▪ 2012 – Beamatron, Andrew Wilson, Microsoft, UIST 2012
▪ 2013 – IllumiRoom, Hrvoje Benko, Microsoft, CHI 2013
▪ 2014 – Dyadic Projected SAR, Hrvoje Benko, Microsoft, UIST 2014
▪ 2014 – RoomAlive, Brett Jones, Microsoft, UIST 2014
▪ 2015 – FoveAR, Hrvoje Benko, Microsoft, UIST 2015
▪ 2016 – Room2Room, Tomislav Pejsa, Microsoft, CSCW 2016
▪ 2017 – MeetAlive, Andreas Fender, Microsoft, ISS 2017
▪ 2018 – ExtVision, Naoki Kimura, the University of Tokyo, CHI 2018
12. Pervasive AR interaction platform
• AR / VR continuum in conjunction with Pervasive AR concept
Overview
National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP) (No. NRF-2015R1A2A1A10055673).
< Pervasive AR concept diagram >
Real World – Mixed World – Virtual World [ Mixed Reality, Milgram, ITIS 1994 ]
▪ Tangible User Interface (TUI) [ Tangible Bits, Hiroshi Ishii, SIGCHI 1997 ]
▪ Natural User Interface (NUI) [ Natural User Interface, Petersen, ISMAR 2009 ]
▪ Augmented Reality (AR) [ A survey of augmented reality, Azuma, 1997 ]
– Spatial AR (SAR) [ Spatial AR, Ramesh Raskar, ISMAR 2004 ]
– Mobile AR (MAR) [ Mobile AR, Hollerer, Telegeoinformatics, 2004 ]
– Projection AR (stationary) [ IllumiRoom, H. Benko, ACM CHI 2013 ]
▪ Augmented Virtuality (AV)
▪ Virtual Reality (VR) [ Virtual Reality: CAVE, Cruz-Neira, SIGGRAPH, 1993 ]
▪ Wall Display & Pervasive Display [ Pervasive Display Applications, Strohbach, M. Martin, 2011 ]
▪ Immersive Display [ Immersive displays, Lantz, Ed., ACM SIGGRAPH, 1996 ]
▪ Pervasive AR – spans these regions of the continuum
13. Pervasive AR interaction platform
• This work defines a new concept, Pervasive AR, and conducts the research projects that constitute it
Overview
National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP) (No. NRF-2015R1A2A1A10055673).
▪ Pervasive AR Display environment
- Users can project their needed information onto any desired space and surface
- Overcomes the limitations of traditional, stationary projection technology by integrating an autonomous robot with augmented reality
▪ Seamless Interaction functionality
- Provides AR-based interaction in real-time, in conjunction with varying spaces and situations
- In addition to user-device interaction, provides device-device interaction
▪ Pervasive Interface design
- Combines TUI and NUI technologies to provide intuitive, natural interaction
- Provides an interface that combines motion sensors with computer vision technology
▪ Scalable Pervasive AR platform
- Develops a scalable, heterogeneous, integrated interaction platform that provides a consistent method to integrate a variety of technologies
14. Motivation
• Ordinary meeting room
▪ Projection screen in front
▪ Maybe whiteboard on side
▪ Most of the space is just there
• What can be done to make use of the remaining space?
▪ “The office of the future”
Ongoing Research
Raskar, Ramesh, et al. "The office of the future: A unified approach to image-based modeling and spatially immersive displays." Proceedings of the 25th annual conference on Computer graphics and interactive techniques. ACM, 1998.
< Engineering Hall D, Room 816: projection screen, whiteboard, vacant wall, projector >
15. Motivation
• Smart meeting room with Pervasive AR
▪ Projection screen in front -> Projection on any surface
▪ Maybe whiteboard on side -> Digital board
▪ Most of the space is just there -> Interactive wall
• Pervasive AR transforms a typical room into smart space with
▪ Minimal hardware
▪ Immersive environment
▪ Responsive system
▪ Seamless interaction
Ongoing Research
< Engineering Hall D, Room 816: projection on any surface, digital board, interactive wall, 360° steerable projector >
16. Contribution
• Pervasive AR Key features × Research aspects × Literature contributions
Immersive Projection Environment
Pervasive AR Key features:
▪ Minimal hardware
▪ Immersive environment
▪ Responsive system
▪ Seamless interaction
Pervasive AR Research aspects:
▪ Steerable platform with pan-tilt servos
▪ User-perspective 360° projection AR
▪ Geometric distortion-corrected projection
▪ Point cloud registration of indoor scenes
▪ Spatial manipulation of virtual contents
Literature Contributions:
▪ Anywhere Immersive, EG 2017
▪ Accurate Control, EG 2018
▪ Dynamic Perspective*, 2019
▪ Axis Bound Registration*, 2019
▪ Pervasive Interface^, 2019
* in submission, ^ work in progress
18. Proposed System
• The front camera captures the data of the surface geometry that is to be projection-mapped
• The rear camera tracks the user’s position and interaction in an AR scene
• The pan-tilt platform is steered to provide 360° projection with correct user’s perspective
System Environment
< The ceiling-type design >
Frontal RGB-D Camera – Acquire geometry
Projector – Correct distortions
Rear RGB-D Camera – Track user’s viewpoint
Pan/tilting Servos – Steer platform pose
< The table-type design >
19. Proposed System
System Prototype
• Key components
▪ Front-facing camera
▪ Rear-facing camera
▪ Projector
▪ Steerable platform
• H/W configurations
▪ Microsoft Kinect v2 (front camera)
▪ Microsoft Kinect 360 (rear camera)
▪ Epson 1771W projector
▪ HS-785HB servos + Arduino board
< Prototypes: 360 PROJECTOR mini (Asus Xtion, PICO projector, pan-tilt servos); 360 PROJECTOR V1 and V2 (Kinect 360, Kinect V2, Epson projector, pan-tilt platform, Arduino) >
20. Proposed System
• The front camera captures the data of the surface geometry that is to be projection-mapped
• The rear camera tracks the user’s position and interaction in an AR scene
• The pan-tilt platform is steered to provide 360° projection with correct user’s perspective
• Ceiling-type design implemented in a 3.7 × 4.0 × 2.25 m (W × D × H) room
System Implementation
< The schematic design > < The implemented environment >
21. Proposed System
• Overview: the proposed system …
▪ tracks a user’s position and interaction
▪ renders augmented virtual contents from correct perspective
▪ performs projection mapping without geometric distortion
▪ acquires and reconstructs spatial information
System Flow
< System flow diagram >
▪ Rear camera → user tracking → user viewpoint / user interaction → AR scene rendering
▪ Front camera → surface geometry → texture generation → projective mapping → projector (on the pan-tilt platform)
▪ Numbered stages: 1. Rotation axis calibration, 2. Distortion correction, 3. Spatial projection AR, 4. Scene reconstruction, 5. Spatial manipulation
22. PAN-TILT AXIS CALIBRATION AND CONTROL
< System flow diagram – stage 1: Rotation axis calibration highlighted >
“Accurate Control of a Pan-tilt System Based on Parameterization of Rotational Motion”
Eurographics 2018
23. Motivation
• We want to control a pan-tilt system in a way that …
▪ can accurately “target” a point in 3D space
▪ can find the rotation so that the target point captured after rotation lies on the optical axis
▪ i.e., the captured target point should coincide with the optical center
▪ can model its rotation trajectories and plan its motion ahead
• To accurately control a pan-tilt system, we implemented:
▪ A general pan-tilt assembly model without specifications
▪ A physics-consistent pan-tilt rotation model
▪ A calibration process to recover rotation parameters
▪ An inverse kinematics interpretation for manipulation
Accurate Control of a Pan-tilt System
< Pan rotation, tilt rotation, and a target point in 3D space >
24. Pan-Tilt Rotation Modeling
• Two-step process to recover rotation parameters
▪ estimate the direction vector of the rotation plane
▪ estimate the center pivot of the circular trajectory
• Points rotated about an axis form a closed circular rotation trajectory on a 3-dimensional plane, where
▪ the rotation axis is the normal of the plane
▪ the circle center is the intersection with the plane
• Capture multiple frames of a large checkerboard with a pre-known structure during rotation
▪ all the trajectories can be represented with respect to the trajectory of the top-leftmost corner ($x_{00}$)
▪ thus a stable global optimization of the trajectory is possible
Rotation Parameters Acquisition
< Rotation model formation: optical center, rotation center, pan axis, tilt axis >
26. Pan-Tilt Rotation Modeling
• For the rotation of the corner at the i-th row and j-th column
▪ Let us denote
– the distance between the feet of vertically adjacent corners on the rotation axis: $d_h$
– the distance between the feet of horizontally adjacent corners on the rotation axis: $d_w$
▪ Then the equations form as:
– the rotation circle center $p_{ij} = (a_{ij}, b_{ij}, c_{ij})^T$:
  $a_{ij} = a - n_x (i \cdot d_h + j \cdot d_w)$
  $b_{ij} = b - n_y (i \cdot d_h + j \cdot d_w)$
  $c_{ij} = c - n_z (i \cdot d_h + j \cdot d_w)$
– the rotation plane:
  $n_x x + n_y y + n_z z + d + i \cdot d_h + j \cdot d_w = 0$
– the rotation trajectory:
  $(x - a_{ij})^2 + (y - b_{ij})^2 + (z - c_{ij})^2 = r_{ij}^2$
Rotation Parameters Acquisition
< Corner trajectories on the checkerboard: each corner (i, j) traces a circle of radius $r_{ij}$ around $(a_{ij}, b_{ij}, c_{ij})$ on its rotation plane, offset from the reference corner $x_{00}$ by $d_h$ per row and $d_w$ per column >
27. Pan-Tilt Rotation Modeling
• For the k-th captured checkerboard frame
▪ Let us denote
– the 3D coordinate of the corner at the i-th row and j-th column: $v_{ijk} = (x_{ijk}, y_{ijk}, z_{ijk})^T$
▪ Minimize the errors of two objective functions using the global least squares method
– Find the parameters $n_x, n_y, n_z, d, d_h, d_w$ that minimize the planar term: $\sum_{ijk} (n_x x_{ijk} + n_y y_{ijk} + n_z z_{ijk} + d + i \cdot d_h + j \cdot d_w)^2$
– With the recovered parameters, find the parameters $a, b, c, r_{ij}$ that minimize the circular term: $\sum_{ijk} ((x_{ijk} - a_{ij})^2 + (y_{ijk} - b_{ij})^2 + (z_{ijk} - c_{ij})^2 - r_{ij}^2)^2$
Rotation Parameters Acquisition
Chen, Ping, et al. "Rotation axis calibration of a turntable using constrained global optimization." Optik 125.17 (2014): 4831-4836.
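For concreteness, here is a minimal numpy/scipy sketch of this two-term least-squares recovery. The array shapes, initial guesses, and the random stand-in corner measurements are illustrative only; this is a sketch of the approach, not the authors' implementation.

```python
# Sketch: recover rotation-plane direction, then the circle pivot, from
# checkerboard corner trajectories. `corners[k, i, j]` stands in for real
# 3D measurements of an R x C board over K frames of rotation.
import numpy as np
from scipy.optimize import least_squares

K, R, C = 12, 7, 10                     # frames, board rows, board cols
corners = np.random.rand(K, R, C, 3)    # placeholder measurements

ii, jj = np.meshgrid(np.arange(R), np.arange(C), indexing="ij")

def planar_residuals(p):
    # p = [nx, ny, nz, d, dh, dw]: one plane per corner, offset by i*dh + j*dw
    n, d, dh, dw = p[:3], p[3], p[4], p[5]
    offs = d + ii * dh + jj * dw                    # (R, C) per-corner offsets
    res = (corners @ n + offs[None]).ravel()        # signed point-plane distances
    return np.append(res, 10.0 * (np.linalg.norm(n) - 1.0))  # keep n unit length

plane = least_squares(planar_residuals, x0=[0, 0, 1, 0, 0.01, 0.01]).x
n = plane[:3] / np.linalg.norm(plane[:3])

def circular_residuals(q):
    # q = [a, b, c]: pivot of corner (0, 0); other pivots follow the slide's model
    centers = q[None, None] - n[None, None] * (ii * plane[4] + jj * plane[5])[..., None]
    dist = np.linalg.norm(corners - centers[None], axis=-1)   # (K, R, C)
    return (dist - dist.mean(axis=0, keepdims=True)).ravel()  # constant radius r_ij

pivot = least_squares(circular_residuals, x0=corners[0, 0, 0]).x
print("axis direction:", n, "pivot of corner (0,0):", pivot)
```

Splitting the problem this way mirrors the slide: the planar term fixes the axis direction and grid spacings first, so the circular term only has to recover the pivot and radii.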
28. Evaluation
• Captured 28 frames in pan rotation (-25° ~ 15°) and 11 frames in tilt rotation (-17° ~ 16°)
• Pan and tilt rotation trajectories were computed, and from those the visual odometry of the camera can also be tracked
System Configuration and Calibration
< Point cloud accumulation: leftmost/rightmost pan, topmost/bottommost tilt >
29. Evaluation
• The rotation axis calibration results of the pan-tilt system
• Pan and tilt trajectories were calculated (green circle)
• The point cloud of the projector was captured from behind by the rear camera
• Thus the front camera was occluded, and its positions and poses were estimated as the “pan-tilt camera” (white cube)
System Configuration and Calibration
< Calibration result scene: rear Kinect 360, pan-tilt platform, front Kinect V2, projector >
30. Evaluation
• The task at hand:
▪ Want to orient the system so that it can accurately “target” a point in 3D space
▪ Given the target point, the optical axis of the camera should “pass through” the target object
▪ In other words, the target should coincide with the optical center after the rotation
• Interpretation in inverse kinematics terms
▪ Think of the optical axis as a “linear actuator”
▪ and the target point on the axis as an “end effector”
Servo Control with Inverse Kinematics
Buss, Samuel R. "Introduction to inverse kinematics with jacobian transpose, pseudoinverse and damped least squares methods." IEEE Journal of Robotics and Automation 17.1-19 (2004): 16.
< Pan and tilt rotation angles steer the optical axis; the target point on the axis acts as the "end effector" >
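A minimal sketch of this targeting task, posed as a small numeric least-squares problem (a stand-in for the Jacobian-transpose/damped least-squares solvers in Buss's notes). The axes, pivots, and home-pose optics below are placeholders, not calibration values:

```python
import numpy as np
from scipy.optimize import least_squares

def rot_about(axis, pivot, theta):
    """4x4 rigid transform: rotate by theta about a line (axis through pivot)."""
    axis = axis / np.linalg.norm(axis)
    kx, ky, kz = axis
    K = np.array([[0, -kz, ky], [kz, 0, -kx], [-ky, kx, 0]])
    R = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)  # Rodrigues
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = pivot - R @ pivot      # rotation about a pivot, not the origin
    return T

# placeholder calibrated axes: note they need not pass through the optical center
pan_axis,  pan_pivot  = np.array([0., 1., 0.]), np.array([0.00, 0.00, 0.05])
tilt_axis, tilt_pivot = np.array([1., 0., 0.]), np.array([0.02, 0.00, 0.08])
cam_center  = np.array([0., 0., 0.])      # optical center at the home pose
cam_forward = np.array([0., 0., 1.])      # optical axis at the home pose
target = np.array([0.8, 0.4, 2.0])        # 3D point to bring onto the axis

def offaxis_error(angles):
    a, b = angles
    T = rot_about(pan_axis, pan_pivot, a) @ rot_about(tilt_axis, tilt_pivot, b)
    c = (T @ np.append(cam_center, 1.0))[:3]    # rotated optical center
    f = T[:3, :3] @ cam_forward                 # rotated optical axis direction
    return np.cross(target - c, f)              # zero iff target lies on the axis

sol = least_squares(offaxis_error, x0=[0.0, 0.0])
print("pan, tilt (rad):", sol.x)
```

The residual is the cross product of the axis direction with the vector to the target, which vanishes exactly when the "end effector" (the target point) sits on the optical-axis "linear actuator".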
31. Evaluation
• Testing of the accurate targeting capability
▪ The system is tasked to adjust its attitude so that the target point is at the optical center of the camera
▪ The coordinates of the 70 checkerboard corners are used as target points
▪ The proposed inverse kinematics control method and the Single Point Calibration Method (SPCM) are compared
• Errors between the intended destination of a point and the actually captured point after the rotation are measured
▪ Root Mean Squared Errors (RMSE) of L2-norms in XY image pixels and real distances (mm)
▪ Mean Absolute Errors (MAE) of L2-norms in each X, Y (and Z) directions in both pixels and mm
Experiment Result
Li, Yunting, et al. "Method for pan-tilt camera calibration using single control point." JOSA A 32.1 (2015): 156-163.
32. GEOMETRIC DISTORTION COMPENSATED PROJECTION
< System flow diagram – stage 2: Distortion correction highlighted >
“AIR: Anywhere Immersive Reality with User-Perspective Projection”
Eurographics 2017
33. Motivation
• Geometric calibration is required by most projector-camera applications [1]
▪ Geometrically register the projectors to the surface and enable the generation of a consistent projection onto complex surface geometry
• User Perspective Rendering in tablet-based augmented reality [2]
▪ Comparison of device-perspective rendering, user-perspective rendering, and a ground truth
User-Perspective Projection
[1] Grundhöfer, Anselm, and Daisuke Iwai. "Recent advances in projection mapping algorithms, hardware and applications." Computer Graphics Forum. Vol. 37. No. 2. 2018.
[2] Tomioka, Makoto, Sei Ikeda, and Kosuke Sato. "Approximated user-perspective rendering in tablet-based augmented reality." 2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). IEEE, 2013.
34. Motivation
• Distortion Correction and User Perspective for projection AR
User-Perspective Projection
Tomioka, Makoto, Sei Ikeda, and Kosuke Sato. "Approximated user-perspective rendering in tablet-based augmented reality." 2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). IEEE, 2013.
Adapt &
substitute
35. Proposed System
• Schematic model and process overview of the proposed system
▪ consists of a projector, two RGB-D cameras, pan-tilt servos, and a control board
▪ designed to be portable, so that it can be placed on a tabletop with unknown surroundings
System Overview
< Schematic model >
Front RGB-D Camera – Acquire geometry
Projector – Correct distortion
Rear RGB-D Camera – Track user’s viewpoint
Pan/tilting Servos – Steer platform pose
< Workflow: RGB-D cameras + pan/tilt motors → world geometry and user viewpoint tracking → view-dependent rendering with a perspective projection matrix → projective texturing (texture mapping on geometry) → distortion-corrected projection >
36. Proposed System
• The cameras and the projector are calibrated with gray code patterns.
• First, the pixel correspondences between the projector and both the front and rear RGB cameras are computed.
• Then, the corresponding points in the color images are transformed to depth image points.
• Finally, the depth image points are un-projected to 3D points, which are used to calibrate the projector and the transform between the front and rear cameras.
Projector-Camera Calibration
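A minimal sketch of the gray-code correspondence step, assuming pattern display and capture are handled elsewhere; the projector width and the fake perfectly aligned "capture" are illustrative:

```python
# Sketch: encode projector columns as binary-reflected gray-code stripe
# patterns, then decode each camera pixel back to the projector column it saw.
import numpy as np

PROJ_W = 1280
n_bits = int(np.ceil(np.log2(PROJ_W)))

def gray_patterns(width, bits):
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)                       # binary -> gray code
    # one stripe image per bit: a pixel is white where that gray bit is 1
    return [((gray >> b) & 1).astype(np.uint8) * 255 for b in range(bits)]

def decode(captured, threshold=128):
    """captured: list of camera images (H, W), one per displayed bit pattern."""
    bits = [(img > threshold).astype(np.uint32) for img in captured]
    gray = np.zeros_like(bits[0])
    for b, plane in enumerate(bits):
        gray |= plane << b                          # reassemble the gray code
    binary = gray.copy()                            # gray -> binary conversion
    shift = gray >> 1
    while shift.any():
        binary ^= shift
        shift >>= 1
    return binary                                   # projector column per pixel

patterns = gray_patterns(PROJ_W, n_bits)
captured = [np.tile(p, (4, 1)) for p in patterns]   # pretend a 1:1 camera view
cols = decode(captured)
assert np.all(cols[0] == np.arange(PROJ_W))
```

In the real pipeline the decoded per-pixel columns (and rows, from a second stripe direction) give the projector-camera correspondences that are then lifted to 3D via the depth images.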
37. Proposed System
• Calibration & Two-pass Rendering
• Calibration: geometry construction
▪ Register poses of cameras, a projector and a user
• First-pass: perspective rendering
▪ Track the user and render scenes from perspective
• Second-pass: distortion correction
▪ Projectively map texture onto geometry
• Finally, display mapped rendering from the projector
System Flow
< Pipeline: Camera-Projector-Servo Registration → User-Perspective Projection → Projection Distortion Correction >
< Per-stage blocks: Front Camera / Rear Camera / Projector + Servos → Track User View → Perspective Matrix → Offscreen Rendering → Projective Matrix → Texture Coordinate → Projective Mapping >
38. Proposed System
• Register the separate coordinate systems front, proj, and rear in one global system, world
• $P_{world} = T_{rear \to world} P_{rear} = R_{pan}(\alpha)\, R_{tilt}(\beta)\, P_{front}$
• $s \cdot x_{proj} = A_{proj}\, T_{front \to proj}\, P_{front}$
• Solve internal and external parameters as a reverse-camera model
System Flow
< Global world coordinate: Front Camera (front) changes w.r.t. servos; Rear Camera (rear) fixed in world; Pan/tilt Servos (servo) rotate in world; Projector (proj) subordinate to front; User Perspective (user) subordinate to rear >
< Pipeline stage: Camera-Projector-Servo Registration >
39. Proposed System
• The rear camera tracks the user’s viewpoint $E_{user}$
▪ Set up the perspective matrix $A_{user}$ from $E_{user}$
• Configure the perspective equation: $A_{upr} = A_{user} T_{world \to rear}$
• First pass: render the off-screen scene to a texture, projTex, with a FrameBuffer Object
System Flow
TOMIOKA M., IKEDA S., SATO K.: Approximated user perspective rendering in tablet-based augmented reality. In Mixed and Augmented Reality (ISMAR), 2013 IEEE International Symposium on (2013), IEEE, pp. 21–28.
< Pipeline stage: User-Perspective Projection – the rear camera (fixed in world) tracks the user perspective (subordinate to rear) through $A_{upr}$ >
40. Proposed System
• Set up the projective matrix $A_{upr}^{-1}$
• Configure the projective equation: $s \cdot A_{upr}^{-1}\, \mathrm{projTex}(\mathrm{projCoord}) = P_{geometry}$
• Second pass: map projTex onto the geometry with projCoord
▪ Display the user-perspective rendering from the projector’s viewpoint
System Flow
< Pipeline stage: Projection Distortion Correction – coordinate systems as on the previous slide >
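A minimal numpy sketch of the second-pass texture lookup (in the real system this runs in vertex/fragment shaders during projective texturing); the perspective matrix and vertices are illustrative:

```python
# Sketch: project each surface vertex through the user-perspective matrix and
# remap the NDC result to [0, 1] texture coordinates into projTex.
import numpy as np

def perspective(fov_y, aspect, near, far):
    f = 1.0 / np.tan(np.deg2rad(fov_y) / 2)
    return np.array([[f / aspect, 0, 0, 0],
                     [0, f, 0, 0],
                     [0, 0, (far + near) / (near - far), 2 * far * near / (near - far)],
                     [0, 0, -1, 0]])

A_upr = perspective(60, 16 / 9, 0.1, 10.0)       # user-perspective matrix (placeholder)
verts = np.array([[0.2, 0.1, -2.0, 1.0],         # surface geometry, homogeneous,
                  [-0.5, 0.3, -3.0, 1.0]])       # expressed in the user-view frame

clip = verts @ A_upr.T                           # project each vertex
ndc = clip[:, :3] / clip[:, 3:4]                 # perspective divide
projCoord = (ndc[:, :2] + 1.0) / 2.0             # NDC [-1, 1] -> texture [0, 1]
print(projCoord)                                 # per-vertex lookup into projTex
```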
41. Evaluation
• Geometric distortion correction results of XY grid images. Before & after
Implementation Result
(a) Pan = 0°, Tilt = 0° (b) Pan = 0°, Tilt = 45° (c) Pan = -45°, Tilt = 45°
< Surface distortion (before) vs. distortion corrected (after) >
42. • To quantify the image degradation, the MirageTable [1] authors …
▪ computed Root Mean Square (RMS) errors of absolute intensity differences of corrected projections, pixel by pixel
▪ varying color conditions yield substantially greater, irrelevant RMS errors
• Instead we … (using the metric $\frac{1}{N} \sum_{i=1}^{N} \lVert P_{base} - P_i \rVert_2^2$, where $N$ is the number of corners)
▪ compute differences in structural compositions, point by point
▪ set up the base image by projecting a checkerboard onto a planar surface
▪ place different obstacles and correct the projection output accordingly
▪ match the checkerboard corners with the base image and compute dislocations
Quantitative metric
Evaluation
[1] Benko, Hrvoje, Ricardo Jota, and Andrew Wilson. "MirageTable: freehand interaction on a projected augmented reality tabletop." Proceedings of the SIGCHI conference on human factors in computing systems. ACM, 2012.
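A minimal OpenCV sketch of this corner-dislocation metric; the image paths and board size are illustrative:

```python
# Sketch: detect checkerboard corners in the base (planar) projection and in a
# corrected projection over an obstacle, then score the mean squared dislocation.
import cv2
import numpy as np

BOARD = (9, 6)  # inner corners (cols, rows) of the projected checkerboard

def corners_of(path):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    ok, pts = cv2.findChessboardCorners(gray, BOARD)
    assert ok, f"checkerboard not found in {path}"
    return pts.reshape(-1, 2)

base = corners_of("base_planar.png")        # projection on a planar surface
test = corners_of("corrected_drawer.png")   # corrected projection over an obstacle

# mean squared L2 dislocation between matched corners, as on this slide
metric = np.mean(np.sum((base - test) ** 2, axis=1))
print(f"dislocation score: {metric:.2f}")
```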
43. • Distortion correction works well on
▪ (d), (e) complex geometry
▪ (f) a freely deformable object
Results
Evaluation
Sarbolandi, Hamed, Damien Lefloch, and Andreas Kolb. "Kinect range sensing: Structured-light versus time-of-flight kinect." Computer vision and image understanding 139 (2015): 1-20.
(a) Base (b) Drawer (14.98) (c) Heater (10.97) (d) Panels (6.28) (e) Box (6.45) (f) Plastic (5.21)
44. • Highest dislocation errors on
▪ (b) rigid geometry
▪ (c) curved geometry
Results
Evaluation
Sarbolandi, Hamed, Damien Lefloch, and Andreas Kolb. "Kinect range sensing: Structured-light versus time-of-flight kinect." Computer vision and image understanding 139 (2015): 1-20.
(a) Base (b) Drawer (14.98) (c) Heater (10.97) (d) Panels (6.28) (e) Box (6.45) (f) Plastic (5.21)
45. • Slopes in the red dotted region …
▪ are almost parallel to the depth camera’s line of sight
▪ result in inaccurate depth data acquisition
▪ the current system cannot cope with such inherently invalid values
• “KinectToF, affected by a low incident angle (angles below 10°), detects the corrupted pixels, resulting in extreme range errors up to 800 mm.”
• “For incident angles between 10° and 30° the KinectToF delivers up to 100% invalid pixels.”
Limitation
Evaluation
Sarbolandi, Hamed, Damien Lefloch, and Andreas Kolb. "Kinect range sensing: Structured-light versus time-of-flight kinect." Computer vision and image understanding 139 (2015): 1-20.
(a) Base (b) Drawer (14.98) (c) Heater (10.97)
46. DYNAMIC PERSPECTIVE PROJECTION MAPPING
< System flow diagram – stage 3: Spatial projection AR highlighted >
“PPAP: Perspective Projection Augment Platform with Dynamic Control of Steerable Pan-Tilt System”
In submission
47. Motivation
• Cues for Perceiving Spatial Relationships
Spatial Perception in CGI
Grossman, Tovi, and Ravin Balakrishnan. "An evaluation of depth perception on volumetric displays." Proceedings of the working conference on Advanced visual interfaces. ACM, 2006.
Wanger, Leonard R., James Ferwerda, and Donald P. Greenberg. "Perceiving spatial relationships in computer-generated images." IEEE Computer Graphics and Applications 12.3 (1992): 44-58.
Benko, Hrvoje, Andrew D. Wilson, and Federico Zannier. "Dyadic projected spatial augmented reality." Proceedings of the 27th annual ACM symposium on User interface software and technology. ACM, 2014.
Perspective projection
▪ Render with projective geometry – gives a perspective
▪ The basics of graphics rendering; feels off when done wrong, but there’s not much to add
Shading and Shadow
▪ Shadow cast from object to surface – gives positional information
▪ Degrades rendering quality
Stereopsis
▪ Uses the difference in binocular vision – gives a sense of depth
▪ Requires wearing 3D glasses – not instrument-free for the user
Motion parallax
▪ Change in view position with motion – gives spatial relations
▪ Enables perception of 3D at far distances, works in monocular imagery, and is a good fit with the pan-tilt mechanism
→ Our focus in monocular projection
48. System Description
• (a) The user’s position and perspective are tracked by the rear camera (gray arrow).
• (b) As the user inspects the virtual object (green circle), the user’s line of sight (blue arrow) is ray-cast to locate its view center on the surface geometry (green rectangle). The view center is projected (yellow arrow) onto the projector’s image domain, and the appropriate pan and tilt angles (α, β) are computed.
• (c) The projector rotates to match its projection center with the user’s view center, rendering more of the virtual object.
Servo Control with User Perspective
49. System Description
• In dynamic perspective projection, the projector can rotate itself using the pan-tilt servo motors
• The pan-tilt projector automatically rotates to stay focused on the perspective-mapped surface from the user’s point of view
• Therefore, the dynamic perspective projection system can greatly expand the viewing angle compared to static perspective projection, providing a more immersive augmented reality experience to the user
Servo Control with User Perspective
< Static Perspective Projection vs. Dynamic Perspective Projection >
50. System Description
• The user’s perspective is tracked by the rear camera and ray-cast to determine its view center on the surface in the reference coordinate system.
• The view center coordinate is sequentially transformed into the front camera and then the projector coordinates.
• Finally, the coordinate is projected onto the image domain of the projector to determine the corresponding point in the projection texture.
Dynamic Projection with Perspective Mapping
51. System Description
• The pan-tilt platform is rotated to match the view center and the projection center, so that the user can perceive the augmented content over a wider viewing angle.
• Given the user’s viewpoint, the goal is to find the rotation angles α and β that minimize the displacement error between the two centers in the image domain.
Dynamic Projection with Perspective Mapping
< Servo Rotation Control | Projection Viewpoint Control >
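A minimal sketch of this view-center control, simplifying the surface to a plane and the pan/tilt axes to ones through the projector origin (assumptions for illustration, not the paper's model):

```python
# Sketch: ray-cast the user's line of sight to the wall, then pick pan/tilt
# angles so the projector's principal ray passes through that view center.
import numpy as np

eye  = np.array([0.0, 1.6, -2.0])          # tracked user viewpoint
gaze = np.array([0.3, -0.1, 1.0])          # user line of sight (toward the wall)
n, p0 = np.array([0.0, 0.0, -1.0]), np.array([0.0, 0.0, 2.5])  # wall plane

t = np.dot(p0 - eye, n) / np.dot(gaze, n)  # ray-plane intersection parameter
view_center = eye + t * gaze               # where the user is looking

# pan about +Y, then tilt about +X, so the projector's +Z ray hits view_center
d = view_center / np.linalg.norm(view_center)   # projector assumed at the origin
alpha = np.arctan2(d[0], d[2])                  # pan angle
beta  = -np.arcsin(d[1])                        # tilt angle
print(f"pan {np.degrees(alpha):.1f} deg, tilt {np.degrees(beta):.1f} deg")
```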
52. User Experiments
• H1. The participants are able to perceive the spatial representation of the virtual objects projected by the proposed system.
• H2. The participants are able to rate the size and distance more correctly when they can view a wider range of the augmented object.
• H3. The further the virtual object is from the projected wall surface, the more negatively the participants’ perceived presence of it is affected.
Main Hypotheses
< Size and Distance configuration | Projection configuration – Static vs. Dynamic >
53. User Experiments
• Investigated other factors that can affect the experiment results: Gender, Gaming, Driving
• Participants were asked to rate the size and distance of the virtual objects with
▪ three different sizes (30 cm, 40 cm and 50 cm radii)
▪ three different distances (1.6 m, 2.0 m and 2.4 m away)
▪ two different conditions (static projection vs dynamic projection)
Experiment Design
< Size and Distance configuration > < Projection configuration – Static vs. Dynamic >
54. User Experiments
• 11 participants aged from 25 to 31 (M = 27.1 years, SD = 1.8 years).
• 18 configurations (Size (3) x Distance (3) x Condition (2)) with 3 trials => Total 54 ratings per participant
• Within-subject design -> the presentation order of static and dynamic conditions counterbalanced
• 15-second time limit per configuration; once the time elapsed, the projection was turned off
Experiment Procedure
< Size and Distance configuration > < Projection configuration – Static vs. Dynamic >
55. Results and Analysis
• The participants were able to correctly rate 61.1% of size variations and 68.7% of distance variations
• The participants were able to correctly rate 47.0% of both size & distance combinations (Overall correctness)
• A naïve guess -> 11.1% chance of being correct (1 out of 9 size*distance combinations)
• H1. The participants are able to perceive the spatial representation of the virtual objects projected by the proposed system. => Accepted
Hypothesis H1
56. Results and Analysis
• Only 4.7% of the total ratings missed by more than one option, such as mistaking a “near” distance as “far”.
• Encoded participants’ responses into a binary scale, either “correct” or “incorrect”
▪ allows us to analyze the performance results using binomial regression
▪ simplifies the interpretation and analysis of effective factors on spatial presence and perception (H2, H3)
• Within-subject design + categorical responses -> the repeated measures logistic regression
▪ analyzed using Generalized Estimating Equations (GEE) with IBM SPSS Statistics software
▪ used to estimate the parameters of a model with a possible unknown correlation between outcomes
• Wald Chi-Square test -> to find the correlation between categorical factors
• Three variables as predictors: Size, Distance and Condition w/ binary correctness
• Significance level as α = 0.05 (literature standard)
▪ “Assuming that the null hypothesis is true, we may reject the null only if the observed data are so unusual that
they would have occurred by chance at most 5 % of the time.“
▪ “α sets the standard for how extreme the data must be before we can reject the null hypothesis.”
▪ “The p-value indicates how extreme the data are.” -> reject the null hypothesis, if p <= 0.05.
Hypothesis H2 & H3
Dr. Janxin Leu, “What is the difference between an alpha level and a p-value”, Fundamentals of Psychological Research, University of Washington
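A minimal sketch of such a repeated-measures logistic regression with Generalized Estimating Equations, using statsmodels as a stand-in for the IBM SPSS workflow; the DataFrame below is synthetic:

```python
# Sketch: GEE logistic regression of binary correctness on size, distance, and
# condition, with an exchangeable working correlation within each participant.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 11 * 54                                   # 11 participants x 54 ratings
df = pd.DataFrame({
    "subject":   np.repeat(np.arange(11), 54),
    "size":      rng.choice(["30cm", "40cm", "50cm"], n),
    "distance":  rng.choice(["1.6m", "2.0m", "2.4m"], n),
    "condition": rng.choice(["static", "dynamic"], n),
    "correct":   rng.integers(0, 2, n),       # binary correctness response
})

model = smf.gee("correct ~ C(size) + C(distance) + C(condition)",
                groups="subject", data=df,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
result = model.fit()
print(result.summary())                        # per-factor Wald tests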
57. Results and Analysis
• The condition (static or dynamic) was found to be highly significant (χ² = 12.431, df = 1, p < .001) over size/distance
▪ 34.3% correct in the static condition
▪ 59.6% correct in the dynamic condition
• H2. The participants are able to rate the size and distance more correctly when they can view a wider range of the augmented object. => Accepted
• Participants reported that the increased viewing angle (150° for the dynamic and 102° for the static condition) helped them feel more comfortable and immersed, making it easier to deduce the spatial relationships of the virtual object.
Hypothesis H2
58. Results and Analysis
• The distance was at first found to be statistically non-significant (χ² = 5.213, df = 2, p = .074)
• This result was contradictory, as several studies have reported that …
▪ the quality of the projection degrades as the virtual object is distanced from the projection surface
▪ and thus users find the virtual object less present
Hypothesis H3
59. Results and Analysis
• Analyses on the separated subsets were found significant (χ² = 5.992, df = 2, p = .049 in the dynamic condition)
▪ the projector FOV is fixed, so the visible area is limited by the projection distance
▪ the dynamic projection rotates the projector to match the user’s view, increasing the effective FOV of the projector
▪ increased viewing angle -> more virtual parallax -> more immersion and presence in AR
• H3. The further the virtual object is from the projected wall surface, the more negatively the participants’ perceived presence of it is affected. => Accepted
Hypothesis H3
60. PAN-TILT POINT CLOUD REGISTRATION
< System flow diagram – stage 4: Scene reconstruction highlighted >
“Fast and Accurate Reconstruction of Pan-Tilt RGB-D Scans via Axis Bound Registration”
In submission
61. Motivation
• Spatial Mapping for Pervasive AR
• SLAM/ICP-based registration
• Calibration-based registration
Spatial Mapping for Pervasive AR
Wilson, Andrew, et al. "Steerable augmented reality with the beamatron." Proceedings of the 25th annual ACM symposium on User interface software and technology. ACM, 2012.
Fender, Andreas Rene, Hrvoje Benko, and Andy Wilson. "MeetAlive: Room-Scale Omni-Directional Display System for Multi-User Content and Control Sharing." Proceedings of the 2017 ACM International Conference on Interactive Surfaces and Spaces. ACM, 2017.
https://msrstudio99.wordpress.com/2016/01/25/a-panorama-of-skies/
62. Motivation
• Iterative Closest Point (point-to-point)
▪ a method that iteratively disregards outliers in order to improve upon the previous estimates of the rotation and translation parameters
• (a) obtains point correspondences by searching for the nearest-neighbor target point for each source point
• (b) estimates the rotation R and translation t by minimizing the squared distance between the corresponding pairs
• ICP then iteratively solves (a) and (b) to improve upon the estimates of the previous iterations
General ICP Formulation
Bellekens, Ben, et al. "A survey of rigid 3D pointcloud registration algorithms." AMBIENT 2014: the Fourth International Conference on Ambient Computing, Applications, Services and Technologies, August 24-28, 2014, Rome, Italy. 2014.
< ICP steps (a) and (b) >
Uncertain point correspondences induce the iterative nature, which leads to high computational cost
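A minimal sketch of one point-to-point ICP loop as formulated here, with the closed-form SVD (Kabsch) step for (b); real pipelines add outlier rejection and better correspondence search, which is exactly where the cost comes from:

```python
# Sketch: alternate (a) nearest-neighbor correspondence and (b) closed-form
# [R|t] estimation via SVD, accumulating the increment each iteration.
import numpy as np
from scipy.spatial import cKDTree

def icp(source, target, iters=30):
    R, t = np.eye(3), np.zeros(3)
    tree = cKDTree(target)
    for _ in range(iters):
        moved = source @ R.T + t
        _, idx = tree.query(moved)               # (a) nearest target per point
        p, q = moved, target[idx]
        pc, qc = p - p.mean(0), q - q.mean(0)
        U, _, Vt = np.linalg.svd(pc.T @ qc)      # (b) optimal rotation (Kabsch)
        D = np.diag([1, 1, np.sign(np.linalg.det(Vt.T @ U.T))])
        dR = Vt.T @ D @ U.T
        dt = q.mean(0) - dR @ p.mean(0)
        R, t = dR @ R, dR @ t + dt               # compose with previous estimate
    return R, t

# toy check: recover a known small rotation and translation
src = np.random.rand(500, 3)
Rz = np.array([[0.995, -0.0998, 0], [0.0998, 0.995, 0], [0, 0, 1.]])
R, t = icp(src, src @ Rz.T + 0.05)
```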
63. Motivation
• Formulating a light-weight solution for indoor scene registration by …
1. Utilizing pan-tilt rotation trajectories as visual odometry
▪ Initial rough estimation of the camera pose
▪ Rotation trajectory-based matching and rejection
2. Dividing the problem into sub-problems and optimizing them alternatingly
▪ Pairwise axis-bound transform estimation
Axis Bound Registration
< Rotation axis calibration: pan rotation, tilt rotation, target point >
64. Axis Bound Registration
• Offline calibration
▪ Rotation axis calibration*
▪ Pulse-width mapping*
• Online registration (inputs: RGB and depth frames from the RGB-D camera, pan-tilt servo state)
▪ Global registration: 1. initial global pose estimation → 2. trajectory-based feature rejection
▪ Local registration: 3D keypoint cloud generation → descriptor-based feature matching → 3. corresponding point pairs check → 4. pairwise axis bound transform → final local registration matrix
Overview of the pipeline
* offline process
65. Axis Bound Registration
• Servo motors …
▪ use potentiometers to orient themselves toward certain directions
▪ rotate by an angle that is linearly proportional to the applied pulse width
• The camera pose can therefore be roughly estimated from the servo control
• The rotation axis calibration yields the global transform and the pulse-width mapping
1. Initial global pose estimation
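A minimal sketch of the linear pulse-width-to-angle mapping; the endpoint constants are illustrative, with the real mapping coming from the offline rotation-axis calibration:

```python
# Sketch: a servo's rotation angle is linear in the commanded pulse width, so
# the last pulses sent give a rough initial estimate of the camera pose.
PW_MIN, PW_MAX = 600, 2400          # servo pulse widths (microseconds)
ANG_MIN, ANG_MAX = -90.0, 90.0      # calibrated rotation range (degrees)

def pulse_to_angle(pw):
    return ANG_MIN + (pw - PW_MIN) * (ANG_MAX - ANG_MIN) / (PW_MAX - PW_MIN)

def angle_to_pulse(angle):
    return PW_MIN + (angle - ANG_MIN) * (PW_MAX - PW_MIN) / (ANG_MAX - ANG_MIN)

# rough initial pose: rotate the calibrated home pose by the commanded angles
pan_deg, tilt_deg = pulse_to_angle(1500), pulse_to_angle(1200)
print(pan_deg, tilt_deg, angle_to_pulse(0.0))
```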
66. Axis Bound Registration
• A trajectory constraint is applied to reject outlier matching pairs between two proximate frames
▪ (a) Keypoints are extracted from the color images and matched using feature descriptors.
▪ (b) Keypoint pairs are un-projected to 3D points in global space, represented with halos.
▪ (c) Keypoint pairs whose distance is below the threshold are accepted as correspondence pairs.
▪ (d) Keypoint pairs whose distance is above the threshold are rejected as outlier matches.
2. Trajectory-based feature rejection
67. Axis Bound Registration
• Outliers are rejected if the distance between estimated pairs in global coordinates exceeds a certain threshold (e.g., 100 mm).
• Notice that the removed keypoint pairs …
▪ are matched in diagonal directions (opposed to the camera movement)
▪ have weak correspondences (erroneous depth data around edges and on black/reflective surfaces)
3. Corresponding point pairs check
Initial matching of 364 keypoint pairs (based on descriptor distance).
Outlier rejection of 97 keypoint pairs (based on trajectory distance).
Final correspondence result of 267 keypoint pairs.
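A minimal sketch of this trajectory-based rejection, assuming servo-predicted global poses for the two frames; the 100 mm threshold follows the slide:

```python
# Sketch: un-project matched keypoints to global 3D with the servo-predicted
# poses, then discard pairs that land farther apart than the threshold.
import numpy as np

def reject_outliers(pts_a, pts_b, pose_a, pose_b, thresh=0.1):
    """pts_*: (N, 3) camera-space keypoints; pose_*: 4x4 predicted global poses."""
    ga = pts_a @ pose_a[:3, :3].T + pose_a[:3, 3]      # to global coordinates
    gb = pts_b @ pose_b[:3, :3].T + pose_b[:3, 3]
    keep = np.linalg.norm(ga - gb, axis=1) < thresh    # trajectory constraint
    return pts_a[keep], pts_b[keep], keep

# toy data: identical poses, two inliers and one gross mismatch
pose = np.eye(4)
a = np.array([[0., 0., 1.], [0.5, 0., 1.2], [1., 1., 2.]])
b = np.array([[0.01, 0., 1.], [0.5, 0.01, 1.2], [2., -1., 0.]])
_, _, keep = reject_outliers(a, b, pose, pose)
print(keep)    # [ True  True False]
```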
68. Axis Bound Registration
• The rotation axes …
▪ reduce the 6-DoF [R|t] transform estimation to two rotation variables α, β
▪ serve as a constraint for efficient and robust local registration
4. Pairwise axis bound transform
< Sub-problem division and alternating optimization, solving each axis via SVD >
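A minimal sketch of the axis-bound idea: with known axes, the alignment collapses to two angles, each solvable in closed form about its axis and alternated until convergence. Axes through the origin are a simplifying assumption here, and the closed-form single-axis step is a small Procrustes-style solve rather than the authors' exact formulation:

```python
# Sketch: alternately solve the optimal pan angle (tilt fixed) and tilt angle
# (pan fixed), each in closed form about its known rotation axis.
import numpy as np

def rot(u, th):
    u = u / np.linalg.norm(u)
    K = np.array([[0, -u[2], u[1]], [u[2], 0, -u[0]], [-u[1], u[0], 0]])
    return np.eye(3) + np.sin(th) * K + (1 - np.cos(th)) * (K @ K)

def best_angle(u, p, q):
    """Closed-form angle about known axis u minimizing sum ||R(th) p - q||^2."""
    u = u / np.linalg.norm(u)
    p_perp = p - np.outer(p @ u, u)                 # components orthogonal to u
    q_perp = q - np.outer(q @ u, u)
    A = np.sum(p_perp * q_perp)
    B = np.sum(np.cross(p_perp, q_perp) @ u)
    return np.arctan2(B, A)

def axis_bound_align(p, q, pan, tilt, iters=10):
    alpha = beta = 0.0
    for _ in range(iters):                          # alternating optimization
        alpha = best_angle(pan, (rot(tilt, beta) @ p.T).T, q)
        beta = best_angle(tilt, p, (rot(pan, -alpha) @ q.T).T)
    return alpha, beta

# toy check: recover known pan/tilt angles from correspondence pairs
pan, tilt = np.array([0., 1., 0.]), np.array([1., 0., 0.])
p = np.random.rand(200, 3)
q = (rot(pan, 0.3) @ rot(tilt, -0.15) @ p.T).T
print(axis_bound_align(p, q, pan, tilt))            # ~ (0.3, -0.15)
```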
69. Evaluation
• No public dataset is available for pan-tilt RGB-D scan registration that includes rotation-axis calibration data
• The dataset was therefore made in-house and used for the evaluation
Pan-tilt registration dataset
(a) Scene 0 and 10 (13.6°) (b) Scene 0 and 20 (27.1°) (c) Scene 0 and 30 (40.7°)
(d) Scene 39 and 45 (12.7°) (e) Scene 39 and 49 (21.2°) (f) Scene 39 and 55 (33.9°)
(g) Scene 0 and 47 (27.1°/8.5°)
(h) Scene 47 and 37 (23°/8.5°)
Pan rotation dataset Pan & tilt rotation dataset
Tilt rotation dataset
70. Evaluation
• Registration without ICP
• Consecutive scenes 0~9
• Pan rotation ~12.2°
• Fast – under 0.0002 sec
Qualitative Result
71. Evaluation
• RGBD-Calib [1], ORB-SLAM2 [2], FGR [3], and S4PCS [4] are compared with the proposed method.
• The RMS Euclidean distance error of N closest point pairs is measured [1].
▪ The shortest time to complete is highlighted in yellow.
▪ The lowest RMS error is highlighted in red.
▪ The algorithm with both the shortest time and the lowest RMSE is shown in bold.
• Out of 8 test cases, the proposed method was able to
▪ score the lowest error in 5 cases
▪ complete as the fastest in 5 cases
▪ and be both the fastest and the most accurate in 3 cases
Quantitative Result
[1] Chi-Yi Tsai and Chih-Hung Huang. Indoor scene point cloud registration algorithm based on rgb-d camera calibration. Sensors, 17(8):1874, 2017.
[2] Raúl Mur-Artal and Juan D. Tardós. ORB-SLAM2: an open-source SLAM system for monocular, stereo and RGB-D cameras. IEEE Transactions on Robotics, 33(5):1255–1262, 2017.
[3] Qian-Yi Zhou, Jaesik Park, and Vladlen Koltun. Fast global registration. In European Conference on Computer Vision, pages 766–782. Springer, 2016.
[4] Nicolas Mellado, Dror Aiger, and Niloy J Mitra. Super 4pcs fast global pointcloud registration via smart indexing. In Computer Graphics Forum, volume 33, pages 205–215. Wiley Online Library, 2014.
72. PERVASIVE INTERFACE REGISTRATION
< System flow diagram – stage 5: Spatial manipulation highlighted >
“PRISM: Interactive Projection System for Pervasive Registration of Interface with Spatial Manipulation”
Work in progress
73. Motivation
• Pervasive computing environment in MR / VR / AR
• Applications and widgets in the form of windows or grids
• Intuitive & Natural & Comfortable UI/UX for Pervasive Computing in Immersive AR
Pervasive Computing & Immersive AR
Microsoft HoloLens demo onstage at BUILD 2015, https://www.youtube.com/watch?v=3AADEqLIALk
Introducing Oculus Dash, https://www.youtube.com/watch?v=SvP_RI_S-bw
Fender, Andreas Rene, Hrvoje Benko, and Andy Wilson. "MeetAlive: Room-scale omni-directional display system for multi-user content and control sharing." Proceedings of the 2017 ACM International Conference on Interactive Surfaces and Spaces. ACM, 2017.
< Microsoft HoloLens > < Oculus Dash > < Microsoft MeetAlive >
74. Project Overview
• Implementation of a pervasive projection AR environment and spatial interaction
▪ 3D reconstruction of an indoor scene
▪ Tracking of the user’s behavior and interaction
▪ Selection and proposal of regions appropriate for projection
▪ Window-based registration and mapping of applications and widgets
▪ Spatial interaction for manipulation and control in the AR environment
PRISM: Pervasive Registration of Interface with Spatial Manipulation
(a) Spatial Interaction – Tracking (b) Spatial Interaction – Selection (c) Mobile Interaction – Cursor
75. Implementation Detail
1. Axis Bound Registration for Indoor Scene Reconstruction
2. Hough Plane Detection for Projection Region Selection
3. Planar Grid Partitioning for Multi-Window Generation
4. Touch and Ray Casting based Window Manipulation
PRISM: Pervasive Registration of Interface with Spatial Manipulation
< Pipeline >
▪ Scene Reconstruction: rotation axis calibration → pan-tilt RGB-D scan → axis-bound registration → point accumulation
▪ Plane Detection: coordinate polarization → Hough transform → plane formation → inlier check → distance check
▪ Region Selection: region clustering → largest inner rectangle
▪ Window Generation: XY grid projection → contour & boundary check → multi-window partitioning
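A minimal sketch of the plane-detection step, using Open3D's RANSAC plane segmentation as a stand-in for the Hough-based detection named in the pipeline (the interface is similar: a plane model plus its inlier points); the synthetic scene is illustrative:

```python
# Sketch: segment the dominant plane (a wall candidate for projection) from a
# reconstructed indoor point cloud.
import numpy as np
import open3d as o3d

# stand-in reconstructed scene: a noisy wall plane plus clutter
wall = np.column_stack([np.random.rand(2000, 2) * 3.0,
                        np.full(2000, 2.5) + np.random.randn(2000) * 0.005])
clutter = np.random.rand(500, 3) * 3.0
pcd = o3d.geometry.PointCloud()
pcd.points = o3d.utility.Vector3dVector(np.vstack([wall, clutter]))

plane, inliers = pcd.segment_plane(distance_threshold=0.02,
                                   ransac_n=3, num_iterations=500)
a, b, c, d = plane
print(f"plane: {a:.2f}x + {b:.2f}y + {c:.2f}z + {d:.2f} = 0, "
      f"{len(inliers)} inlier points")   # candidate projection region
```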
76. Implementation Detail
1. Axis Bound Registration for Indoor Scene Reconstruction
2. Hough Plane Detection for Projection Region Selection
3. Planar Grid Partitioning for Multi-Window Generation
4. Touch and Ray Casting based Window Manipulation
PRISM: Pervasive Registration of Interface with Spatial Manipulation
< Skeleton Ray Casting – Debug View > < Skeleton Ray Casting – User View >
78. Summary
• I have …
1. presented a novel concept of Pervasive AR space and its implementation
2. proposed a geometric distortion correction method to realize projection on any surface
3. showed that enhanced presence can be delivered in monocular views with dynamic perspective projection
4. proposed an inverse kinematics servo control method in conjunction with rotation axis calibration
5. proposed a fast and efficient method to register point clouds with pan-tilt rotation priors
79. Deep Learning for Mixed Reality
• Magic Leap Principal Engineer Tomasz Malisiewicz:
▪ Deep learning is going to be the most important ingredient for this mixed reality …
▪ because it is the "master algorithm" for working with perceptual data
▪ Virtual characters are positioned behind actual surfaces – one has to reason about the geometry and the ordering of the surfaces around the characters
Further Research
Tomasz Malisiewicz – Deep Learning for Augmented Reality Applications, https://vimeo.com/229184982
80. Deep Learning for Mixed Reality
• “Towards Pervasive Augmented Reality”, TVCG 2017
▪ Focuses on context-awareness in augmented reality
▪ Proposed and defined the concept of Pervasive Augmented Reality as context-aware AR
▪ Compares and contrasts various aspects of Conventional and Pervasive AR
Further Research
Grubert, Jens, et al. "Towards pervasive augmented reality: Context-awareness in augmented reality.” IEEE transactions on visualization and computer graphics 23.6 (2017): 1706-1724.
Aspect | Conventional Augmented Reality | Pervasive Augmented Reality
Use | Sporadic | Continuous
Control | User-Controlled | Context-Controlled
Applications | Specific or Niche | Multi-Purpose
Hardware | General Purpose | Tailored/Specific
Context of Use | Specific/Restricted | Multi-Purpose/Adaptive/Aware
User Interface | Prototypical/No Standard/Obtrusive | Subtle/Disappearing/Unobtrusive
Mode of Use | Task- or Goal-Oriented | Context-Driven
Information Access | Information Overlay | Information Augmentation
Information Visualization | Added | Integrated/Embedded
Environment | Indoors OR Outdoors | Indoors AND Outdoors
Flow of Information | User Seeking Information | Information Seeking Users
Use of Device | One Size Fits All | Individualized
81. Context-aware Pervasive AR platform
• Pervasive AR continuum in conjunction with context-awareness
Overview
National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP) (No. NRF-2018R1A2A1A10055673).
< Context-aware Pervasive AR concept diagram >
Context-aware Pervasive AR = Pervasive AR + Vision-based Artificial Intelligence:
▪ Scene Understanding [ ImageNet Classification, Krizhevsky, NIPS 2012 ]
▪ Context Awareness [ Modeling 3d environments, Jiang, TPAMI 2016 ]
▪ Geometry Learning [ PointNet, Qi, CVPR 2017 ]
< Pervasive AR concept diagram (same as slide 12) >
82. Context-aware Pervasive AR platform
• Provide in-situ information/content/UI in accordance with user intent and situation by integrating a projection augmented reality system with context-awareness technology
• Provide multi-modal interactions in a context-aware Pervasive AR space, enabling seamless interaction with the system
• Design and implement applications that can be used instantly in smart environments without any prior knowledge
Overview
National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP) (No. NRF-2018R1A2A1A10055673).
< System diagram: an RGB-D camera (input) feeds spatial mapping, user recognition (e.g., User #1, standing, position (x, y, z)), and scene understanding; context-awareness drives projection AR (output) with an adaptive interface, contextual information, and in-situ content, in an environment without prior knowledge; the user interacts with real-world information/content/UI in a Pervasive AR environment built via context analysis >
83. Consumer Robotics with Pervasive AR
• Personal assistant for smart home with AI & AR
▪ Clova AI device + projection AR system -> R2-D2?
Future Path
Gugenheimer, Jan, et al. "Ubibeam: exploring the interaction space for home deployed projector-camera systems." Human-Computer Interaction. Springer, Cham, 2015.
84. Consumer Robotics with Pervasive AR
• ADAS system with AR-enhanced situational awareness
▪ AHEAD device + projection AR system -> Mercedes-Maybach's Digital Light?
Future Path
https://newatlas.com/mercedes-maybach-digital-light-smart-headlights/53678/ March 6th, 2018
https://www.naverlabs.com/storyDetail/69 2018.11.09
85. THANK YOU FOR YOUR ATTENTION
Jung-Hyun Byun
junghyun.ian.byun@gmail.com