Jung-Hyun Byun
junghyun.byun@msl.yonsei.ac.kr
Media System Lab, Yonsei University
PROJECTION MAPPING AND AUGMENTED REALITY
FOR PERVASIVE AR ENVIRONMENT
March 4th, 2019 / NAVER Tech Talk
Speaker Biography
Jung-Hyun Byun
junghyun.ian.byun@gmail.com
• Education
▪ B.Sc. in Computer Science and Engineering, Yonsei University, Korea
– 2011.03 ~ 2015.02
▪ Integrated Ph.D. course in Computer Science, Yonsei University, Korea
– 2015.09 ~ 2020.08 (expected)
– Media System Lab. (Advisor: Prof. Tack-Don Han)
• Publication
▪ Byun et al. "AIR: Anywhere Immersive Reality with User-Perspective Projection." Eurographics 2017
▪ Byun et al. "Accurate control of a pan-tilt system based on parameterization of rotational motion." Eurographics 2018
• Research Interest
▪ augmented reality, projection mapping, point cloud processing and scene reconstruction
Projection Mapping and Augmented Reality for Pervasive AR Environment
2
Table of Contents
1. Introduction: Augmented Reality and Projection
▪ Augmented reality continuum/definition/categories
▪ Projection AR comparison/research trend
2. Ongoing research of the speaker
▪ Motivation
▪ Pervasive Augmented Reality
▪ Perspective Projection Mapping
▪ Pan-Tilt Control and Registration
3. Ending remarks: Further Research & Future Path
▪ Summary
▪ Deep Learning for Mixed Reality
▪ Context-aware Pervasive AR
▪ Consumer Robotics with Pervasive AR
AUGMENTED REALITY AND PROJECTION
Augmented Reality
• Augmented reality has been around in fiction for some time …
• and has recently come to life for real
Public image
(1st row from left) Star Wars, Horizon Zero Dawn, Memories of the Alhambra, (2nd row from left) Pokémon GO, Spatial Systems, Inc. 5
Augmented Reality
• Head-Mounted Displays
▪ Microsoft HoloLens 2016
▪ Magic Leap 2018
• Mobile Augmented Reality
▪ Apple ARKit 2017
▪ Google ARCore 2017
• But there’s one more:
Services & Hardware
6
Spatial Augmented Reality
Augmented Reality
• Reality–Virtuality continuum, Milgram, ITIS 1994
The continuum
Milgram, Paul, et al. "Augmented reality: A class of displays on the reality-virtuality continuum." Telemanipulator and telepresence technologies. Vol. 2351. International Society for Optics and Photonics, 1995.
https://en.wikipedia.org/wiki/Reality%E2%80%93virtuality_continuum 7
Real World ↔ Mixed World ↔ Virtual World
Augmented Reality
Definition and categorization
[1] Azuma, Ronald T. "A survey of augmented reality." Presence: Teleoperators & Virtual Environments 6.4 (1997): 355-385.
[2] Bimber, Oliver, and Ramesh Raskar. "Modern approaches to augmented reality." ACM SIGGRAPH 2006 Courses. ACM, 2006. 8
• Definitionally augmented reality systems … [1]
1. combine real elements and virtual images
2. are capable of interacting with the user in real-time
3. are registered in the 3D real world
• Spatial Augmented Reality [2]
▪ Virtual objects are directly superimposed onto the real world
▪ Free of instruments that cause users discomfort and fatigue
▪ Uses displays with spatial characteristics – usually projectors
• Projection-based Augmented Reality (Projection AR) means …
▪ technically, any augmented reality generated by wearable, mobile or spatial projector(s)
▪ commonly, a spatial augmented reality generated by projector(s) that are detached from the user
Image generation for augmented reality displays [2]
Augmented Reality
Characteristics and comparison
Bimber, Oliver, and Ramesh Raskar. Spatial augmented reality: merging real and virtual worlds. CRC press, 2005. 9
Head Worn Displays
strong presence
freely movable/active
small display (virtual)
fatigue/discomfort
focus != convergence
Mobile devices
accessible by many
easily carried
limits user’s sight
small display (virtual)
Projection-based
highly immersive
theoretically unlimited
FOV/display
projection distortion
device perspective
Projection Augmented Reality
• Steadily studied at the University of Tokyo (orange), MIT Media Lab (green) and Microsoft Research (blue)
Research trend
10
▪ 1999: Augmented Surfaces, Jun Rekimoto, Sony, CHI 1999
▪ 2001: The Everywhere Displays Projector, IBM, UbiComp 2001
▪ 2003: iLamps, Ramesh Raskar, MIT Media Lab, SIGGRAPH 2003
▪ 2009: SixthSense, Pranav Mistry, MIT Media Lab, CHI 2009
▪ 2010: LuminAR, Pattie Maes, MIT Media Lab, UIST 2010
▪ 2011: OmniTouch, Chris Harrison, Microsoft, UIST 2011
▪ 2012: Beamatron, Andrew Wilson, Microsoft, UIST 2012
▪ 2013: illumiRoom, Hrvoje Benko, Microsoft, CHI 2013
▪ 2014: Dyadic Project, Hrvoje Benko, Microsoft, UIST 2014
▪ 2014: RoomAlive, Brett Jones, Microsoft, UIST 2014
▪ 2015: FoveAR, Hrvoje Benko, Microsoft, UIST 2015
▪ 2016: Room2Room, Tomislav Pejsa, Microsoft, CSCW 2016
▪ 2017: MeetAlive, Andreas Fender, Microsoft, ISS 2017
▪ 2018: ExtVision, Naoki Kimura, University of Tokyo, CHI 2018
ONGOING RESEARCH OF THE SPEAKER
Pervasive AR interaction platform
• AR / VR continuum in conjunction with Pervasive AR concept
Overview
National Research Foundation of Korea(NRF) grant funded by the Korea government(MSIP) (No. NRF-2015R1A2A1A10055673). 12
< Pervasive AR concept diagram > (Real World ↔ Mixed World ↔ Virtual World)
▪ Tangible User Interface (TUI) [ Tangible Bits, Hiroshi Ishii, CHI 1997 ]
▪ Natural User Interface (NUI) [ Natural User Interface, Petersen, ISMAR 2009 ]
▪ Augmented Reality (AR) [ A survey of augmented reality, Azuma, 1997 ]
– Spatial AR (SAR) [ Spatial AR, Ramesh Raskar, ISMAR 2004 ]
– Mobile AR (MAR) [ Mobile AR, Hollerer, Telegeoinformatics, 2004 ]
▪ Augmented Virtuality (AV) and the Mixed World [ Mixed Reality, Milgram, ITIS 1994 ]
▪ Virtual Reality (VR) [ Virtual Reality: CAVE, Cruz-Neira, SIGGRAPH, 1993 ]
▪ Wall Display & Pervasive Display [ Pervasive Display Applications, Strohbach, M. Martin, 2011 ]
▪ Immersive Display [ Immersive displays, Lantz, Ed., ACM SIGGRAPH, 1996 ]
▪ Projection AR (stationary) [ Illumiroom, H. Benko, ACM CHI 2013 ]
▪ Pervasive AR: where these threads meet
Pervasive AR interaction platform
• This work defines a new concept, Pervasive AR, and conducts the research that constitutes it
Overview
National Research Foundation of Korea(NRF) grant funded by the Korea government(MSIP) (No. NRF-2015R1A2A1A10055673). 13
▪ Pervasive AR Display environment
– Users can project their needed information onto any desired space and surface
– Overcomes the limitations of traditional, stationary projection technology by integrating an autonomous robot with augmented reality
▪ Seamless Interaction functionality
– Provides AR-based interaction in real time, in conjunction with varying spaces and situations
– In addition to user-device interaction, provides device-device interaction
▪ Pervasive Interface design
– Combines TUI and NUI technology to provide intuitive, natural interaction
– Provides an interface that combines motion sensors with computer vision technology
▪ Scalable Pervasive AR platform
– Develops a scalable, heterogeneous, integrated interaction platform that provides a consistent method to integrate a variety of technologies
Motivation
• Ordinary meeting room
▪ Projection screen in front
▪ Maybe whiteboard on side
▪ Most of the space is just there
• What can be done to make use of the remaining space?
▪ “The office of the future”
Ongoing Research
Raskar, Ramesh, et al. "The office of the future: A unified approach to image-based modeling and spatially immersive displays." Proceedings of the 25th annual conference on Computer graphics and interactive techniques. ACM, 1998. 14
Engineering Hall D, Room 816
Projection screen
Whiteboard
Vacant wall
Projector
Motivation
• Smart meeting room with Pervasive AR
▪ Projection screen in front -> Projection on any surface
▪ Maybe whiteboard on side -> Digital board
▪ Most of the space is just there -> Interactive wall
• Pervasive AR transforms a typical room into smart space with
▪ Minimal hardware
▪ Immersive environment
▪ Responsive system
▪ Seamless interaction
Ongoing Research
15
Engineering Hall D, Room 816
Projection on
any surface
Digital
board
Interactive wall
360° steerable Projector
Contribution
• Pervasive AR Key features x Research aspects x Literature contributions
Immersive Projection Environment
16
Minimal hardware
Immersive environment
Responsive system
Seamless interaction
Steerable platform with pan-tilt servos
User perspective 360° projection AR
Geometric distortion-corrected projection
Point cloud registration of indoor scene
Spatial manipulation of virtual contents
Anywhere Immersive, EG 2017
Accurate Control, EG 2018
Dynamic Perspective*, 2019
Axis Bound Registration*, 2019
Pervasive Interface^, 2019
Pervasive AR Key features
Pervasive AR Research aspects Literature Contributions
*in submission
^ work in progress
PROPOSED SYSTEM DESCRIPTION
Proposed System
• The front camera captures the data of the surface geometry that is to be projection-mapped
• The rear camera tracks the user’s position and interaction in an AR scene
• The pan-tilt platform is steered to provide 360° projection with correct user’s perspective
System Environment
18
< The ceiling-type design > < The table-type design >
▪ Frontal RGB-D Camera: acquire geometry
▪ Projector: correct distortions
▪ Rear RGB-D Camera: track user’s viewpoint
▪ Pan/tilting Servos: steer platform pose
Proposed System
System Prototype
19
• Key components
▪ Front-facing camera
▪ Rear-facing camera
▪ Projector
▪ Steerable platform
• H/W configurations
▪ Microsoft Kinect v2
▪ Microsoft Kinect 360
▪ Epson 1771W Projector
▪ HS-785HB Servos + Arduino board
< 360 PROJECTOR mini: Asus Xtion, PICO Projector, Pan-Tilt Servos >
< 360 PROJECTOR V1: Kinect 360, Epson Projector, Pan-Tilt Platform >
< 360 PROJECTOR V2: Kinect V2, Kinect 360, Epson Projector, Pan-Tilt Platform, Arduino >
Proposed System
• The front camera captures the data of the surface geometry that is to be projection-mapped
• The rear camera tracks the user’s position and interaction in an AR scene
• The pan-tilt platform is steered to provide 360° projection with correct user’s perspective
• Ceiling-type design implemented in a 3.7 × 4.0 × 2.25 m (W × D × H) space
System Implementation
20
< The schematic design > < The implemented environment >
Proposed System
• Overview: the proposed system …
▪ tracks a user’s position and interaction
▪ renders augmented virtual contents from correct perspective
▪ performs projection mapping without geometric distortion
▪ acquires and reconstructs spatial information
System Flow
21
▪ R. Camera → User tracking → User viewpoint / User interaction → AR scene rendering → Projector
▪ F. Camera → Surface geometry → Texture generation → Projective mapping → Projector
▪ Modules: 1. Rotation axis calibration (pan-tilt platform) / 2. Distortion correction / 3. Spatial projection AR / 4. Scene reconstruction / 5. Spatial manipulation
PAN-TILT AXIS CALIBRATION AND CONTROL
“Accurate Control of a Pan-tilt System Based on Parameterization of Rotational Motion”
Eurographics 2018
Motivation
• We want to control a pan-tilt system in a way that …
▪ can accurately “target” a point in 3D space
▪ can find the rotation so that the target point captured after rotation lies on the optical axis
▪ i.e., the captured target point should coincide with the optical center
▪ can model its rotation trajectories and plan its motion ahead
• To accurately control a pan-tilt system, we implemented:
▪ A general pan-tilt assembly model without specifications
▪ A physics-consistent pan-tilt rotation model
▪ A calibration process to recover rotation parameters
▪ An inverse kinematics interpretation for manipulation
Accurate Control of a Pan-tilt System
23
Tilt rotation
Pan rotation
Target Point
Pan-Tilt Rotation Modeling
• Two-step process to recover rotation parameters
▪ estimate the direction vector of the rotation plane
▪ estimate the center pivot of the circular trajectory
• Points rotated about an axis form a closed circular rotation trajectory on a 3-dimensional plane, where
▪ the rotation axis is the normal of the plane
▪ the circle center is the intersection with the plane
• Capture multiple frames of a large checkerboard with a pre-known structure, during rotation
▪ all the trajectories can be represented with respect to the trajectory of the top-leftmost corner (𝒙 𝟎𝟎)
▪ thus stable global optimization of the trajectory is possible
Rotation Parameters Acquisition
24
Optical
Center
Rotation
Center
Pan Axis
Tilt Axis
Rotation model formation
Pan-Tilt Rotation Modeling
• For the rotation of the upper-left corner $\mathbf{x}_{00}$
▪ Let us denote
– the rotation axis: $\frac{x - a}{n_x} = \frac{y - b}{n_y} = \frac{z - c}{n_z}$
– the rotation direction vector: $\mathbf{n} = (n_x, n_y, n_z)^T$, $\lVert \mathbf{n} \rVert = 1$
– the rotation circle center: $\mathbf{p} = (a, b, c)^T$
▪ Then the equations form as
– the rotation plane: $n_x x + n_y y + n_z z + d = 0$
– the rotation trajectory: $(x - a)^2 + (y - b)^2 + (z - c)^2 = r^2$, with radius $r$
Rotation Parameters Acquisition
25
Pan-Tilt Rotation Modeling
• For the rotation of the corner at the $i$-th row and $j$-th column
▪ Let us denote
– the distance between the feet on the rotation axis of vertically adjacent corners: $d_h$
– the distance between the feet on the rotation axis of horizontally adjacent corners: $d_w$
▪ Then the equations form as
– the rotation circle center: $\mathbf{p}_{ij} = (a_{ij}, b_{ij}, c_{ij})^T$, where
$a_{ij} = a - n_x (i \cdot d_h + j \cdot d_w)$, $b_{ij} = b - n_y (i \cdot d_h + j \cdot d_w)$, $c_{ij} = c - n_z (i \cdot d_h + j \cdot d_w)$
– the rotation plane: $n_x x + n_y y + n_z z + d + (i \cdot d_h + j \cdot d_w) = 0$
– the rotation trajectory: $(x - a_{ij})^2 + (y - b_{ij})^2 + (z - c_{ij})^2 = r_{ij}^2$
Rotation Parameters Acquisition
26
Pan-Tilt Rotation Modeling
• For the $k$-th captured checkerboard frame
▪ Let us denote
– the 3D coordinate of the corner at the $i$-th row and $j$-th column: $\mathbf{v}_{ijk} = (x_{ijk}, y_{ijk}, z_{ijk})^T$
▪ Minimize the errors of two objective functions using the global least squares method
– Find the parameters $n_x, n_y, n_z, d, d_h, d_w$ that minimize the planar term:
$\sum_{i,j,k} \left( n_x x_{ijk} + n_y y_{ijk} + n_z z_{ijk} + d + i \cdot d_h + j \cdot d_w \right)^2$
– With the recovered parameters, find the parameters $a, b, c, r_{ij}$ that minimize the circular term:
$\sum_{i,j,k} \left( (x_{ijk} - a_{ij})^2 + (y_{ijk} - b_{ij})^2 + (z_{ijk} - c_{ij})^2 - r_{ij}^2 \right)^2$
Rotation Parameters Acquisition
Chen, Ping, et al. "Rotation axis calibration of a turntable using constrained global optimization." Optik-International Journal for Light and Electron Optics 125.17 (2014): 4831-4836.
27
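The two-step recovery above (plane direction first, then circle pivot) can be sketched in NumPy. This is my own illustrative sketch, not the paper's implementation: `fit_rotation_axis` fits a single corner's trajectory, whereas the actual calibration optimizes all corner trajectories globally with the shared parameters $d_h, d_w$.

```python
import numpy as np

def fit_rotation_axis(points):
    """Recover the rotation-plane normal and circle center from one
    corner's trajectory (Nx3 points captured during rotation)."""
    # Step 1: plane normal = direction of least variance (smallest
    # right singular vector of the centered points).
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    n = vt[2] / np.linalg.norm(vt[2])

    # Step 2: circle center. Express points in an in-plane basis and
    # solve the linearized circle equation 2c.q + k = |q|^2 by
    # least squares, where k = r^2 - |c|^2.
    u, v = vt[0], vt[1]                      # in-plane orthonormal basis
    q = np.stack([(points - centroid) @ u,
                  (points - centroid) @ v], axis=1)
    A = np.hstack([2 * q, np.ones((len(q), 1))])
    b = (q ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy = sol[0], sol[1]
    r = np.sqrt(sol[2] + cx ** 2 + cy ** 2)
    center = centroid + cx * u + cy * v      # back to 3D coordinates
    return n, center, r
```

Even a partial arc (the slides capture roughly 40° of pan) suffices, since both the plane fit and the linearized circle fit are well-posed on an arc.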
Evaluation
• Captured 28 frames in pan rotation (-25°~15°), and 11 frames in tilt rotation (-17° ~ 16°)
• Pan and tilt rotation trajectories were computed, and from those the visual odometry of the camera can also be tracked
System Configuration and Calibration
28
Bottommost Tilt
Topmost Tilt
Leftmost Pan
Rightmost Pan
Point cloud accumulation
Evaluation
• The rotation axis calibration results of the pan-tilt system
• Pan and tilt trajectories were calculated (green circle)
• The point cloud of the projector was captured from behind by the rear camera
• Thus the front camera was occluded, and its positions and poses were estimated as the “pan-tilt camera” (white cube)
System Configuration and Calibration
29
Rear Kinect 360
Pan-Tilt Platform
Front Kinect V2
Projector
Evaluation
• The task at hand:
▪ Want to orient the system so that it can accurately “target” a point in 3D space
▪ Given the target point, the optical axis of the camera should “pass through” the target object
▪ In other words, the target should coincide with the optical center after the rotation
• Interpretation in inverse kinematics terms
▪ Think of the optical axis as a “linear actuator”
▪ and the target point on the axis as an “end effector”
Servo Control with Inverse Kinematics
Buss, Samuel R. "Introduction to inverse kinematics with jacobian transpose, pseudoinverse and damped least squares methods." IEEE Journal of Robotics and Automation 17.1-19 (2004): 16. 30
Tilt rotation angle
Pan rotation angle
“End effector”
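For intuition, in the ideal case the calibration recovers (pan and tilt axes passing through the optical center), targeting reduces to a closed form. This sketch and its names are mine; the full inverse-kinematics solution additionally handles the axis offsets recovered by the rotation model:

```python
import math

def pan_tilt_to_target(target):
    """Closed-form pan/tilt angles that bring the optical axis (+Z of
    the rotation frame) through a 3D target point (x, y, z)."""
    x, y, z = target
    pan = math.atan2(x, z)                   # yaw about the vertical axis
    tilt = math.atan2(y, math.hypot(x, z))   # then elevate toward the point
    return pan, tilt
```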
Evaluation
• Testing of the accurate targeting capability
▪ The system is tasked to adjust its attitude so that the target point is at the optical center of the camera
▪ The coordinates of the 70 checkerboard corners are used as target points
▪ Proposed inverse kinematics control method and Single Point Calibration Method (SPCM) are compared
• Errors between the intended destination of a point and the actually captured point after the rotation are measured
▪ Root Mean Squared Errors (RMSE) of L2-norms in XY image pixels and real distances (mm)
▪ Mean Absolute Errors (MAE) of L2-norms in each X, Y (and Z) directions in both pixels and mm
Experiment Result
Li, Yunting, et al. "Method for pan-tilt camera calibration using single control point." JOSA A 32.1 (2015): 156-163. 31
GEOMETRIC DISTORTION COMPENSATED PROJECTION
“AIR: Anywhere Immersive Reality with User-Perspective Projection”
Eurographics 2017
Motivation
• Geometric Calibration required by most projector-camera applications [1]
▪ Geometrically register the projectors to the surface and to enable the generation of a consistent projection onto a
complex surface geometry
• User Perspective Rendering in tablet-based augmented reality [2]
▪ Comparison of device-perspective rendering, user-perspective rendering, and a ground truth
User-Perspective Projection
[1] Grundhöfer, Anselm, and Daisuke Iwai. "Recent advances in projection mapping algorithms, hardware and applications." Computer Graphics Forum. Vol. 37. No. 2. 2018.
[2] Tomioka, Makoto, Sei Ikeda, and Kosuke Sato. "Approximated user-perspective rendering in tablet-based augmented reality." 2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). IEEE, 2013. 33
Motivation
• Distortion Correction and User Perspective for projection AR
User-Perspective Projection
Tomioka, Makoto, Sei Ikeda, and Kosuke Sato. "Approximated user-perspective rendering in tablet-based augmented reality." 2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). IEEE, 2013. 34
Adapt &
substitute
Proposed System
• Schematic model and process overview of the proposed system
▪ consists of a projector, two RGB-D cameras, pan-tilt servos, and a control board
▪ designed to be portable, so that it can be placed on a table-top with unknown surroundings
System Overview
35
Front RGB-D Camera
- Acquire geometry
Projector
- Correct distortion
Rear RGB-D Camera
- Track user’s view pt.
Pan/tilting Servos
- Steer platform pose
< Workflow > RGB-D cameras + pan/tilt motors → world geometry → user viewpoint tracking → view-dependent rendering (perspective projection matrix) → projective texturing (texture mapped on geometry) → distortion-corrected projection
Proposed System
• Calibrating between cameras and a projector with gray code patterns.
• First, the pixel correspondences between the projector and both front and rear RGB cameras are computed.
• Then, corresponding points in the color images are transformed to depth image points.
• Finally, the depth image points are un-projected to 3D points, which are used to calibrate the projector and between
front and rear cameras.
Projector-Camera Calibration
36
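The core of the gray-code step is decoding: each camera pixel observes a bit sequence across the pattern images, which decodes to the projector column (or row) it sees. A generic sketch of that decoder (function name and array layout are my assumptions, not the system's exact code):

```python
import numpy as np

def decode_gray(bit_images):
    """Decode a stack of thresholded Gray-code images (B x H x W arrays
    of 0/1, most significant bit first) into per-pixel projector indices."""
    bits = np.asarray(bit_images).astype(np.uint32)
    # Gray -> binary: b[0] = g[0]; b[i] = b[i-1] XOR g[i]
    binary = bits.copy()
    for i in range(1, len(bits)):
        binary[i] = np.bitwise_xor(binary[i - 1], bits[i])
    # Pack the binary bits into one integer index per camera pixel.
    index = np.zeros(bits.shape[1:], dtype=np.uint32)
    for b in binary:
        index = (index << 1) | b
    return index
```

The resulting camera-pixel-to-projector-pixel map, combined with the depth image, yields the 2D-3D correspondences used for the calibration.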
Proposed System
• Calibration & Two-pass Rendering
• Calibration: geometry construction
▪ Register poses of cameras, a projector and a user
• First-pass: perspective rendering
▪ Track the user and render scenes from perspective
• Second-pass: distortion correction
▪ Projectively map texture onto geometry
• Finally, display mapped rendering from the projector
System Flow
< Pipeline: Camera-Projector-Servo Registration → User-Perspective Projection → Projection Distortion Correction >
▪ Registration: front camera, rear camera, projector/servos
▪ First pass: track user view → perspective matrix → offscreen rendering
▪ Second pass: projective matrix → texture coordinates → projective mapping
Proposed System
• Register the separate coordinate systems front, proj and rear in one global system, world
• $\mathbf{P}_{world} = T_{rear \to world} \mathbf{P}_{rear} = R_{pan}(\alpha)\, R_{tilt}(\beta)\, \mathbf{P}_{front}$
• $s \cdot \mathbf{x}_{proj} = A_{proj}\, T_{front \to proj}\, \mathbf{P}_{front}$
• Solve the internal and external parameters as a reverse-camera model
System Flow
38
Front Camera (front)
- Changes w.r.t servos Rear Camera (rear)
- Fixed in world
Pan/tilt Servos (servo)
- Rotate in world
User Perspective (user)
- Subordinate to rear
Global world coordinate
Projector (proj)
- Subordinate to front
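At the code level, the registration equation is just a composition of two rotations applied to front-camera points. The axis conventions below (pan about y, tilt about x) are my assumption, and the translation part of the transform is omitted for brevity:

```python
import numpy as np

def rot_y(a):
    """Pan rotation about the vertical (y) axis."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_x(b):
    """Tilt rotation about the horizontal (x) axis."""
    c, s = np.cos(b), np.sin(b)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def front_to_world(p_front, pan, tilt):
    """P_world = R_pan(alpha) R_tilt(beta) P_front, as in the
    registration equation above."""
    return rot_y(pan) @ rot_x(tilt) @ p_front
```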
Proposed System
• The rear camera tracks the user’s viewpoint $E_{user}$
▪ Set up the perspective matrix $A_{user}$ from $E_{user}$
• Configure the perspective equation: $A_{upr} = A_{user} T_{world \to rear}$
• First pass: render the off-screen scene to a texture, projTex, with a FrameBuffer Object
System Flow
TOMIOKA M., IKEDA S., SATO K.: Approximated user perspective rendering in tablet-based augmented reality. In Mixed and Augmented Reality (ISMAR), 2013 IEEE International Symposium on (2013), IEEE, pp. 21–28.
39
Global world coordinate
Rear Camera (rear)
- Fixed in world
$A_{upr}$
User Perspective (user)
- Subordinate to rear
Proposed System
• Set up the projective matrix $A_{upr}^{-1}$
• Configure the projective equation: $s \cdot A_{upr}^{-1}\, \text{projTex}(\text{projCoord}) = P_{geometry}$
• Second pass: map projTex onto the geometry with projCoord
▪ Display the user-perspective rendering from the projector’s viewpoint
System Flow
40
Global world coordinate
Pan/tilt Servos (servo)
- Rotate in world
User Perspective (user)
- Subordinate to rear
Rear Camera (rear)
- Fixed in world
Front Camera (front)
- Changes w.r.t servos
Projector (proj)
- Subordinate to front
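At the matrix level, the second pass amounts to classic projective texturing: push each world-space vertex through the user's view-projection matrix and remap the clip-space result to texture space. A NumPy sketch (my own; in the real system this runs per-fragment in a shader against the FBO texture, and `A_upr` here stands for any 4×4 view-projection matrix):

```python
import numpy as np

def projective_texcoords(P_world, A_upr):
    """Project world-space vertices (Nx3) through the 4x4 user
    view-projection matrix and map NDC [-1, 1] to texture [0, 1]."""
    P_h = np.hstack([P_world, np.ones((len(P_world), 1))])  # homogeneous
    clip = P_h @ A_upr.T
    ndc = clip[:, :2] / clip[:, 3:4]        # perspective divide
    return ndc * 0.5 + 0.5                  # NDC -> texture coordinates
```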
Evaluation
• Geometric distortion correction results of XY grid images. Before & after
Implementation Result
41
(c) Pan = -45°, Tilt = 45°(a) Pan = 0°, Tilt = 0° (b) Pan = 0°, Tilt = 45°
Distortion corrected
Surface distortion
• To quantify the image degradation, the MirageTable [1] authors…
▪ computed Root Mean Square (RMS) errors of absolute intensity differences of corrected projections, pixel by pixel
▪ varying color conditions yield substantially greater, irrelevant RMS errors
• Instead we compute
$\frac{1}{N} \sum_{i=1}^{N} \lVert P_{base} - P_i \rVert_2^2$, where $N$ is the number of corners, and…
▪ compute differences in structural compositions, point-by-point
▪ setup the base image by projecting a checkerboard on a planar surface
▪ place different obstacles and correct projection output accordingly
▪ match corners of checkerboards with the base image and compute dislocations
Quantitative metric
Evaluation
[1] Benko, Hrvoje, Ricardo Jota, and Andrew Wilson. "MirageTable: freehand interaction on a projected augmented reality tabletop." Proceedings of the SIGCHI conference on human factors in computing systems. ACM, 2012.
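The point-by-point metric above is simple to state in code; corner extraction itself (e.g. with OpenCV's chessboard detector) is outside this sketch, and the function name is mine:

```python
import numpy as np

def corner_dislocation(base_corners, test_corners):
    """Mean squared L2 dislocation between matched checkerboard corners
    of the base (planar) projection and a corrected projection:
        (1/N) * sum_i || P_base_i - P_i ||_2^2
    Both inputs are Nx2 arrays of corner positions, matched by index."""
    d = np.asarray(base_corners) - np.asarray(test_corners)
    return float((d ** 2).sum(axis=1).mean())
```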
• Distortion correction works well on
▪ (d), (e) complex geometry
▪ (f) a freely deformable object
Results
Evaluation
Sarbolandi, Hamed, Damien Lefloch, and Andreas Kolb. "Kinect range sensing: Structured-light versus time-of-flight kinect." Computer vision and image understanding 139 (2015): 1-20.
(a) Base (b) Drawer (14.98) (c) Heater (10.97)
(d) Panels (6.28) (e) Box (6.45) (f) Plastic (5.21)
• Highest dislocation errors on
▪ (b) rigid geometry
▪ (c) curved geometry
Results
Evaluation
Sarbolandi, Hamed, Damien Lefloch, and Andreas Kolb. "Kinect range sensing: Structured-light versus time-of-flight kinect." Computer vision and image understanding 139 (2015): 1-20.
(a) Base (b) Drawer (14.98) (c) Heater (10.97)
(d) Panels (6.28) (e) Box (6.45) (f) Plastic (5.21)
• Slopes in the red-dotted region …
▪ are almost parallel to the depth camera
▪ result in inaccurate depth data acquisition
▪ the current system cannot cope with these inherently invalid values
• “KinectToF, affected by a low incident angle (angles below 10°), detects corrupted pixels resulting in extreme range errors up to 800 mm.”
• “For incident angles between 10° and 30° the KinectToF delivers up to 100% invalid pixels.”
Limitation
Evaluation
Sarbolandi, Hamed, Damien Lefloch, and Andreas Kolb. "Kinect range sensing: Structured-light versus time-of-flight kinect." Computer vision and image understanding 139 (2015): 1-20.
(a) Base (b) Drawer (14.98) (c) Heater (10.97)
DYNAMIC PERSPECTIVE PROJECTION MAPPING
“PPAP: Perspective Projection Augment Platform with Dynamic Control of Steerable Pan-Tilt System”
In submission
Motivation
• Cues for Perceiving Spatial Relationships
Spatial Perception in CGI
Grossman, Tovi, and Ravin Balakrishnan. "An evaluation of depth perception on volumetric displays." Proceedings of the working conference on Advanced visual interfaces. ACM, 2006.
Wanger, Leonard R., James Ferwerda, and Donald P. Greenberg. "Perceiving spatial relationships in computer-generated images." IEEE Computer Graphics and Applications 12.3 (1992): 44-58.
Benko, Hrvoje, Andrew D. Wilson, and Federico Zannier. "Dyadic projected spatial augmented reality." Proceedings of the 27th annual ACM symposium on User interface software and technology. ACM, 2014. 47
▪ Perspective projection: render with projective geometry; gives a perspective
▪ Shading and shadow: shadow from object to surface; gives positional information
– Both are basics of graphics rendering and feel off when done wrong, but there is not much more to add
▪ Stereopsis: uses the difference in vision; gives a sense of depth
– Degrades rendering quality; needs 3D glasses; not instrument-free for the user
▪ Motion parallax: change in position with motion; gives spatial relations
– Perception of 3D at a far distance; works in monocular imagery; a good fit with the pan-tilt mechanism
→ Our focus in monocular projection
System Description
• (a) The user’s position and perspective are tracked by the rear camera (gray arrow).
• (b) As the user inspects the virtual object (green circle), the user’s line-of-sight (blue arrow) is ray-casted to locate its
view center on the surface geometry (green rectangle). The view center is projected (yellow arrow) on the projector’s
image domain, and appropriate pan and tilt angles (a, b) are computed.
• (c) The projector rotates to match its projection center and the user’s view center to render more parts of the virtual object.
Servo Control with User Perspective
48
System Description
• In dynamic perspective projection, the projector can rotate itself using the pan-tilt servo motors
• The pan-tilt projector automatically rotates to keep the perspective-mapped surface centered on the user's point of view
• Therefore, dynamic perspective projection can greatly expand the viewing angle compared to static perspective projection, providing a more immersive augmented reality experience to the user
Servo Control with User Perspective
49
< Static Perspective Projection > < Dynamic Perspective Projection >
System Description
• The user’s perspective is tracked by the rear camera and ray-casted to determine its view center on the surface in the
reference coordinate.
• The view center coordinate is sequentially transformed to the front camera and the projector coordinates.
• Finally, the coordinate is projected onto the image domain of the projector to determine the corresponding point in the
projection texture.
Dynamic Projection with Perspective Mapping
50
System Description
• The pan-tilt platform is rotated to match the view center and the projection center, so that the user can perceive the
augmented content in more widened viewing angle.
• Given the user’s viewpoint, the goal is to find the rotation angles α and β that minimize the following displacement
error in the image domain:
Dynamic Projection with Perspective Mapping
51
< Servo Rotation Control > < Projection Viewpoint Control >
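The first step, locating the view center, is a ray-plane intersection between the user's line of sight and the projection surface. A sketch under the simplifying assumption that the local surface patch is planar (function and argument names are mine):

```python
import numpy as np

def view_center_on_plane(eye, gaze, plane_point, plane_normal):
    """Ray-cast the user's line of sight onto the projection surface to
    find the view center, all in the reference coordinate frame."""
    gaze = gaze / np.linalg.norm(gaze)
    n = plane_normal / np.linalg.norm(plane_normal)
    # Solve (eye + t * gaze - plane_point) . n = 0 for the ray length t.
    t = np.dot(plane_point - eye, n) / np.dot(gaze, n)
    return eye + t * gaze
```

The resulting 3D point is then transformed into the projector's image domain, and the pan/tilt angles are chosen to bring it to the projection center.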
User Experiments
• H1. The participants are able to perceive spatial representation of the virtual objects, projected by the proposed system.
• H2. The participants are able to rate the size and distance more correctly when they can view a wider range of the augmented object.
• H3. The further the virtual object is from the projected wall surface, the more negatively its presence, as perceived by the participants, is affected.
Main Hypotheses
52
< Size and Distance configuration > < Projection configuration – Static vs. Dynamic >
User Experiments
• Investigated other factors that can affect the experiment results: gender, gaming experience, driving experience
• Participants were asked to rate the size and distance of the virtual objects with
▪ three different sizes (30 cm, 40 cm and 50 cm radii)
▪ three different distances (1.6 m, 2.0 m and 2.4 m away)
▪ two different conditions (static projection vs dynamic projection)
Experiment Design
53
< Size and Distance configuration > < Projection configuration – Static vs. Dynamic >
User Experiments
• 11 participants aged from 25 to 31 (M = 27.1 years, SD = 1.8 years).
• 18 configurations (Size (3) x Distance (3) x Condition (2)) with 3 trials => Total 54 ratings per participant
• Within-subject design -> the presentation order of static and dynamic conditions counterbalanced
• 15 seconds time limit per configuration / once time lapsed, the projection was turned off
Experiment Procedure
54
< Size and Distance configuration > < Projection configuration – Static vs. Dynamic >
Results and Analysis
• The participants were able to correctly rate 61.1% of size variations and 68.7% of distance variations
• The participants were able to correctly rate 47.0% of both size & distance combinations (Overall correctness)
• A naïve guess -> 11.1% chance of being correct (1 out of 9 size*distance combinations)
• H1. The participants are able to perceive spatial representation of the virtual objects, projected by the proposed system.
=> Accepted
Hypothesis H1
55
Results and Analysis
• Only 4.7% of the total ratings missed by more than one option, such as mistaking a “near” distance as “far”.
• Encoded participants’ responses into a binary scale, either “correct” or “incorrect”
▪ allows us to analyze the performance results using binomial regression
▪ simplifies the interpretation and analysis of effective factors on spatial presence and perception (H2, H3)
• Within-subject design + categorical responses -> the repeated measures logistic regression
▪ analyzed using Generalized Estimating Equations (GEE) with IBM SPSS Statistics software
▪ used to estimate the parameters of a model with a possible unknown correlation between outcomes
• Wald Chi-Square test -> to find the correlation between categorical factors
• Three variables as predictors: Size, Distance and Condition w/ binary correctness
• Significance level as α = 0.05 (literature standard)
▪ “Assuming that the null hypothesis is true, we may reject the null only if the observed data are so unusual that
they would have occurred by chance at most 5 % of the time.“
▪ “α sets the standard for how extreme the data must be before we can reject the null hypothesis.”
▪ “The p-value indicates how extreme the data are.” -> reject the null hypothesis, if p <= 0.05.
Hypothesis H2 & H3
Dr. Janxin Leu, “What is the difference between an alpha level and a p-value”, Fundamentals of Psychological Research, University of Washington 56
Results and Analysis
• The condition (static or dynamic) was found to be highly significant (χ² = 12.431, df = 1, p < .001) over size/distance
▪ 34.3% correct in the static condition
▪ 59.6% correct in the dynamic condition
• H2. The participants are able to rate the size and distance more correctly when they can view a wider range of the
augmented object. => Accepted
• Participants reported that the increased viewing angle (150° for the dynamic and 102° for the static condition) helped them
feel more comfortable and immersed, making it easier to deduce the spatial relationship of the virtual object.
Hypothesis H2
57
Results and Analysis
• The distance was at first found to be statistically non-significant (𝛘2 = 5.213, df = 2, p = .074)
• This result seemed contradictory, as several studies have reported that
▪ the quality of the projection degrades as it is distanced from the projection surface
▪ and thus users find a virtual object less present
Hypothesis H3
58
Results and Analysis
• Analyses on the separated subsets found the distance effect significant (𝛘2 = 5.992, df = 2, p = .049, in the dynamic condition)
▪ the projector FOV is fixed, thus the visible area is limited depending on the projection distance
▪ the dynamic projection rotates the projector to match the user’s view, increasing the effective FOV of the projector
▪ Increased viewing angle -> more virtual parallax -> more immersion and presence in AR
• H3. The further the virtual object is from the projected wall surface, the more negatively the participants perceive its
presence. => Accepted.
Hypothesis H3
59
PAN-TILT POINT CLOUD REGISTRATION
< System diagram: R. camera -> user tracking -> user viewpoint; F. camera -> surface geometry; pan-tilt platform + projector;
stages: 1. rotation axis calibration, 2. distortion correction, 3. spatial projection AR (texture generation, projective
mapping), 4. scene reconstruction, 5. spatial manipulation (user interaction, AR scene rendering) >
“Fast and Accurate Reconstruction of Pan-Tilt RGB-D Scans via Axis Bound Registration”
In submission
Motivation
• Spatial Mapping for Pervasive AR
• SLAM/ICP-based registration
• Calibration-based registration
Spatial Mapping for Pervasive AR
Wilson, Andrew, et al. "Steerable augmented reality with the beamatron." Proceedings of the 25th annual ACM symposium on User interface software and technology. ACM, 2012.
Fender, Andreas Rene, Hrvoje Benko, and Andy Wilson. "MeetAlive: Room-Scale Omni-Directional Display System for Multi-User Content and Control Sharing." Proceedings of the 2017 ACM International Conference on Interactive Surfaces and Spaces. ACM, 2017.
https://msrstudio99.wordpress.com/2016/01/25/a-panorama-of-skies/ 61
Motivation
• Iterative Closest Point point-to-point
▪ a method that iteratively re-estimates correspondences, disregarding outliers, to improve upon the previous
estimates of the rotation and translation parameters.
• (a) obtains point correspondences by searching, for each source point, for the nearest neighbor in the target point cloud.
• (b) estimates the rotation R and translation t by minimizing the squared distance between the corresponding pairs
• ICP then iteratively solves (a) and (b) to improve upon the estimates of the previous iterations
General ICP Formulation
Bellekens, Ben, et al. "A survey of rigid 3D pointcloud registration algorithms." AMBIENT 2014: the Fourth International Conference on Ambient Computing, Applications, Services and Technologies, August 24-28, 2014, Rome, Italy. 2014.
62
Uncertain point correspondences induce the iterative nature, which leads to high computational cost
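The two-step loop above can be sketched in a few lines of numpy. This is a minimal illustration only (brute-force nearest-neighbour search, no outlier handling or convergence test), not the implementation compared against in the talk:

```python
import numpy as np

def best_fit_transform(src, dst):
    """Closed-form least-squares rotation R and translation t between
    corresponding point sets (Kabsch algorithm via SVD) -- step (b)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(src, dst, iters=30):
    """Point-to-point ICP: alternate (a) nearest-neighbour correspondence
    search and (b) SVD pose estimation, refining the previous estimate."""
    cur = src.copy()
    for _ in range(iters):
        # (a) brute-force nearest neighbour in the target cloud
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matches = dst[d2.argmin(axis=1)]
        # (b) minimise the squared distance between the corresponding pairs
        R, t = best_fit_transform(cur, matches)
        cur = cur @ R.T + t
    return cur
```

The O(N·M) distance matrix in step (a), repeated every iteration, is exactly the computational cost the slide refers to; production systems replace it with a k-d tree, but the iterative structure remains.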
Motivation
• Formulating a light-weight solution for indoor scene registration by …
1. Utilizing pan-tilt rotation trajectories as the visual odometry
▪ Initial rough estimation of the camera pose
▪ Rotation trajectory-based matching and rejection
2. Dividing the problem into sub-problems and optimizing them alternately
▪ Pairwise axis bound transform estimation
Rotation axis calibration
63
Tilt rotation
Pan rotation
Target Point
Axis Bound Registration
Axis Bound Registration
• Offline calibration
• Online registration
▪ Global registration
▪ Local registration
Overview of the pipeline
64
< Pipeline diagram >
• Offline process: rotation axis calibration, pulse-width mapping (RGB-D camera, pan-tilt servos)
• Online process: input RGB/depth frames -> 3D keypoint cloud generation -> descriptor-based feature matching
-> 1. initial global pose estimation -> 2. trajectory-based feature rejection -> 3. corresponding point pairs check
-> 4. pairwise axis bound transform -> final local registration matrix
Axis Bound Registration
• Servo motors …
▪ use potentiometers to orient themselves to certain directions
▪ rotate by an angle that is linearly proportional to the applied pulse width
• The camera pose can be roughly estimated based on the servo control
• Rotation axis calibration yields the global transform and pulse-width mapping
1. Initial global pose estimation
65
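A minimal sketch of this rough pose estimate follows. The pulse-width endpoints (500–2500 µs mapping to ±90°) and the axis directions are illustrative assumptions; the real slope, offset, and axes come from the offline rotation axis calibration and pulse-width mapping:

```python
import numpy as np

def pulse_to_angle(pulse_us, p_min=500.0, p_max=2500.0,
                   a_min=-90.0, a_max=90.0):
    """Linear pulse-width -> angle mapping of a hobby servo (assumed spec:
    500 us = -90 deg, 2500 us = +90 deg); the true slope/offset comes from
    the offline pulse-width mapping step."""
    return a_min + (pulse_us - p_min) * (a_max - a_min) / (p_max - p_min)

def axis_angle_rotation(axis, angle_deg):
    """Rodrigues' formula: rotation matrix about a calibrated unit axis."""
    k = np.asarray(axis, float)
    k = k / np.linalg.norm(k)
    a = np.radians(angle_deg)
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(a) * K + (1.0 - np.cos(a)) * (K @ K)

# Rough camera orientation from the commanded servo pulses (axes assumed to
# be z = pan and x = tilt here; calibration supplies the real axes).
R_rough = (axis_angle_rotation([0, 0, 1], pulse_to_angle(1800))
           @ axis_angle_rotation([1, 0, 0], pulse_to_angle(1500)))
```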
Axis Bound Registration
• Trajectory constraint is applied to reject outlier matching pairs between two proximate frames
▪ (a) Keypoints are extracted from the color images and matched using feature descriptors
▪ (b) Keypoint pairs are un-projected to 3D points in global space, represented with halos.
▪ (c) Keypoint pairs whose distance is below the threshold are accepted as correspondence pairs.
▪ (d) Keypoint pairs whose distance is above the threshold are rejected as outlier matches.
2. Trajectory-based feature rejection
66
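A minimal sketch of this rejection step, assuming the matched keypoints have already been un-projected into the shared global frame using the rough servo-based pose; the 100 mm default matches the example threshold used in the talk:

```python
import numpy as np

def reject_by_trajectory(src_pts, dst_pts, threshold_mm=100.0):
    """Keep only keypoint pairs whose 3D distance in the global frame is
    below the threshold; pairs that the estimated pan-tilt trajectory
    cannot explain are rejected as outlier matches.
    src_pts, dst_pts: Nx3 arrays of descriptor-matched keypoints, already
    transformed into the common global coordinate system (units: mm)."""
    dist = np.linalg.norm(src_pts - dst_pts, axis=1)
    keep = dist < threshold_mm
    return src_pts[keep], dst_pts[keep], keep
```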
Axis Bound Registration
• Outliers are rejected if the distance in global coordinates to estimated pairs exceeds a certain threshold (e.g., 100 mm).
• Notice the keypoint pairs removed …
▪ are matched in diagonal directions (opposed to the camera movement)
▪ have weak correspondences (erroneous depth data around edges and on black/reflective surfaces)
3. Corresponding point pairs check
67
Initial matching of 364 keypoint pairs (based on descriptor distance).
Outlier rejection of 97 keypoint pairs (based on trajectory distance).
Final correspondence result of 267 keypoint pairs.
Axis Bound Registration
• Rotation axes …
▪ reduce the 6-DoF [R|t] transform problem to two rotation variables, α and β
▪ serve as a constraint for efficient and robust local registration
4. Pairwise axis bound transform
68
< Diagram: sub-problem division and alternating optimization; each sub-problem is solved via SVD >
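The sub-problem division can be sketched as below: a closed-form optimum for a single rotation angle about a fixed axis, alternated between the pan and tilt axes. For brevity this sketch assumes both axes pass through the origin and correspondences are already known, and it uses an atan2 closed form in place of the paper's SVD sub-solves:

```python
import numpy as np

def rot_about_axis(axis, angle):
    """Rodrigues' rotation matrix about a unit axis (angle in radians)."""
    k = np.asarray(axis, float) / np.linalg.norm(axis)
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def best_angle_about_axis(src, dst, axis):
    """Closed-form angle about a fixed axis minimising the squared distance
    between corresponding Nx3 point sets -- the 1-DoF sub-problem."""
    k = np.asarray(axis, float) / np.linalg.norm(axis)
    # the rotation only moves the component perpendicular to the axis
    s_perp = src - np.outer(src @ k, k)
    d_perp = dst - np.outer(dst @ k, k)
    cos_term = (s_perp * d_perp).sum()
    sin_term = (np.cross(k, s_perp) * d_perp).sum()
    return np.arctan2(sin_term, cos_term)

def axis_bound_registration(src, dst, pan_axis, tilt_axis, iters=50):
    """Alternating optimisation of the pan angle (alpha) and tilt angle
    (beta), assuming dst ~= R_pan(alpha) @ R_tilt(beta) @ src."""
    alpha = beta = 0.0
    for _ in range(iters):
        Rb = rot_about_axis(tilt_axis, beta)
        alpha = best_angle_about_axis(src @ Rb.T, dst, pan_axis)  # hold beta
        Ra = rot_about_axis(pan_axis, alpha)
        beta = best_angle_about_axis(src, dst @ Ra, tilt_axis)    # hold alpha
    return alpha, beta
```

Because each sub-problem is solved globally in closed form, no nearest-neighbour search is needed inside the loop, which is where the speed-up over plain ICP comes from.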
Evaluation
• No public dataset is available for pan–tilt RGB-D scan registration that includes rotation axis calibration data
• The dataset was made in-house and used for the evaluation
Pan-tilt registration dataset
69
(a) Scene 0 and 10 (13.6°) (b) Scene 0 and 20 (27.1°) (c) Scene 0 and 30 (40.7°)
(d) Scene 39 and 45 (12.7°) (e) Scene 39 and 49 (21.2°) (f) Scene 39 and 55 (33.9°)
(g) Scene 0 and 47 (27.1°/8.5°)
(h) Scene 47 and 37 (23°/8.5°)
Pan rotation dataset Pan & tilt rotation dataset
Tilt rotation dataset
Evaluation
• Registration without ICP
• Consecutive scenes 0~9
• Pan rotation ~12.2°
• Fast – under 0.0002 sec
Qualitative Result
70
Evaluation
• RGBD-Calib [1], ORB-SLAM2 [2], FGR [3], and S4PCS [4] are compared with the proposed method.
• The RMS Euclidean distance error of N closest point pairs is measured [1].
▪ The shortest time to complete is highlighted in yellow.
▪ The lowest RMS error is highlighted in red.
▪ The algorithm with both the shortest time and the lowest RMSE is shown in bold.
• Out of 8 test cases, the proposed method was able to
▪ score the lowest error in 5 cases
▪ complete as the fastest in 5 cases
▪ and was both the fastest & most accurate in 3 cases
Quantitative Result
[1] Chi-Yi Tsai and Chih-Hung Huang. Indoor scene point cloud registration algorithm based on rgb-d camera calibration. Sensors, 17(8):1874, 2017.
[2] Raúl Mur-Artal and Juan D. Tardós. ORB-SLAM2: an open-source SLAM system for monocular, stereo and RGB-D cameras. IEEE Transactions on Robotics, 33(5):1255–1262, 2017.
[3] Qian-Yi Zhou, Jaesik Park, and Vladlen Koltun. Fast global registration. In European Conference on Computer Vision, pages 766–782. Springer, 2016.
[4] Nicolas Mellado, Dror Aiger, and Niloy J Mitra. Super 4pcs fast global pointcloud registration via smart indexing. In Computer Graphics Forum, volume 33, pages 205–215. Wiley Online Library, 2014. 71
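The evaluation metric can be sketched as follows; `n_pairs` is an illustrative parameter here, since [1] defines the metric over the N closest point pairs:

```python
import numpy as np

def closest_pair_rmse(registered, target, n_pairs=1000):
    """RMS Euclidean distance over the N closest point pairs between the
    registered source cloud and the target cloud (metric of [1]).
    Brute-force O(N*M) distances; fine for evaluation-sized clouds."""
    d2 = ((registered[:, None, :] - target[None, :, :]) ** 2).sum(-1)
    nearest = np.sqrt(d2.min(axis=1))     # each source point's closest target
    n = min(n_pairs, nearest.size)
    best = np.sort(nearest)[:n]           # keep only the N closest pairs
    return float(np.sqrt(np.mean(best ** 2)))
```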
PERVASIVE INTERFACE REGISTRATION
< System diagram: R. camera -> user tracking -> user viewpoint; F. camera -> surface geometry; pan-tilt platform + projector;
stages: 1. rotation axis calibration, 2. distortion correction, 3. spatial projection AR (texture generation, projective
mapping), 4. scene reconstruction, 5. spatial manipulation (user interaction, AR scene rendering) >
“PRISM: Interactive Projection System for Pervasive Registration of Interface with Spatial Manipulation”
Work in progress
Motivation
• Pervasive computing environment in MR / VR / AR
• Applications and widgets in the form of windows or grids
• Intuitive & Natural & Comfortable UI/UX for Pervasive Computing in Immersive AR
Pervasive Computing & Immersive AR
Microsoft HoloLens demo onstage at BUILD 2015, https://www.youtube.com/watch?v=3AADEqLIALk
Introducing Oculus Dash, https://www.youtube.com/watch?v=SvP_RI_S-bw
Fender, Andreas Rene, Hrvoje Benko, and Andy Wilson. "Meetalive: Room-scale omni-directional display system for multi-user content and control sharing." Proceedings of the 2017 ACM International Conference on Interactive Surfaces and Spaces. ACM, 2017. 73
< Oculus Dash > < Microsoft MeetAlive > < Microsoft HoloLens >
Project Overview
• Implementation of a pervasive projection AR environment and spatial interaction
▪ 3D reconstruction of an indoor scene
▪ Tracking of the user’s behavior and interaction
▪ Selection and proposal of regions appropriate for projection
▪ Window-based registration and mapping of applications and widgets
▪ Spatial interaction for manipulation and control in the AR environment
PRISM: Pervasive Registration of Interface with Spatial Manipulation
74
(a) Spatial Interaction – Tracking (b) Spatial Interaction – Selection (c) Mobile Interaction – Cursor
Implementation Detail
1. Axis Bound Registration for Indoor Scene Reconstruction
2. Hough Plane Detection for Projection Region Selection
3. Planar Grid Partitioning for Multi-Window Generation
4. Touch and Ray Casting based Window Manipulation
PRISM: Pervasive Registration of Interface with Spatial Manipulation
75
< Pipeline diagram >
• Scene Reconstruction: rotation axis calibration -> pan-tilt RGB-D scan -> axis-bound registration
• Point Accumulation: coordinate polarization -> distance check -> Hough transform
• Plane Formation: inlier check -> region clustering -> largest inner rectangle
• Window Generation: XY grid projection -> contour & boundary check -> multi-window partitioning
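The "coordinate polarization -> Hough transform" stage can be sketched as coarse voting over plane parameters (θ, φ, ρ), where each point votes for the planes it could lie on. The bin resolutions below are illustrative assumptions, not the values used in PRISM:

```python
import numpy as np

def hough_planes(points, n_theta=18, n_phi=36, rho_res=0.1, top_k=1):
    """Coarse Hough voting for planes n . p = rho, with the unit normal n
    parameterized by spherical angles (theta, phi) and rho binned at
    rho_res. Returns the top_k (normal, rho, votes) candidates."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    phis = np.linspace(0.0, 2.0 * np.pi, n_phi, endpoint=False)
    normals = np.array([(np.sin(t) * np.cos(p), np.sin(t) * np.sin(p), np.cos(t))
                        for t in thetas for p in phis])
    # signed distance of every point along every candidate normal, binned
    rho_bins = np.round((points @ normals.T) / rho_res).astype(int)
    acc = {}
    for ni in range(normals.shape[0]):
        bins, votes = np.unique(rho_bins[:, ni], return_counts=True)
        for rb, v in zip(bins, votes):
            acc[(ni, int(rb))] = int(v)
    best = sorted(acc.items(), key=lambda kv: -kv[1])[:top_k]
    return [(normals[ni], rb * rho_res, v) for (ni, rb), v in best]
```

The inlier-check and region-clustering stages would then operate on the points that voted for each winning cell.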
Implementation Detail
1. Axis Bound Registration for Indoor Scene Reconstruction
2. Hough Plane Detection for Projection Region Selection
3. Planar Grid Partitioning for Multi-Window Generation
4. Touch and Ray Casting based Window Manipulation
PRISM: Pervasive Registration of Interface with Spatial Manipulation
76
< Skeleton Ray Casting – Debug View> < Skeleton Ray Casting – User View>
FURTHER RESEARCH & FUTURE PATH
Summary
• I have …
1. presented a novel concept of Pervasive AR space and its implementation
2. proposed a geometric distortion correction method to realize projection on any surface
3. showed that enhanced presence can be delivered in monocular views with dynamic perspective projection
4. proposed an inverse kinematics servo control method in conjunction with rotation axis calibration
5. proposed a fast and efficient method to register point clouds with pan-tilt rotation priors
Projection Mapping and Augmented Reality for Pervasive AR Environment
78
Deep Learning for Mixed Reality
• Magic Leap Principal Engineer ‘Tomasz Malisiewicz’
▪ Deep learning is going to be the most important ingredient for this mixed reality …
▪ because it is the "master algorithm" for working with perceptual data
▪ Virtual characters are positioned behind actual surfaces, so
one has to reason about the geometry and the ordering of the surfaces around the characters
Further Research
Tomasz Malisiewicz - Deep Learning for Augmented Reality Applications https://vimeo.com/229184982 79
Deep Learning for Mixed Reality
• “Towards Pervasive Augmented Reality”, TVCG 2017
▪ Focusing on Context-awareness in Augmented Reality
▪ Proposed and defined the concept of Pervasive Augmented Reality in terms of context-awareness
▪ Comparison and contrast of various aspects of Conventional and Pervasive AR
Further Research
Grubert, Jens, et al. "Towards pervasive augmented reality: Context-awareness in augmented reality.” IEEE transactions on visualization and computer graphics 23.6 (2017): 1706-1724.
80
Aspect | Conventional Augmented Reality | Pervasive Augmented Reality
Use | Sporadic | Continuous
Control | User Controlled | Context-Controlled
Applications | Specific or Niche | Multi-Purpose
Hardware | General Purpose | Tailored/Specific
Context of Use | Specific/Restricted | Multi-Purpose/Adaptive/Aware
User Interface | Prototypical/No Standard/Obtrusive | Subtle/Disappearing/Unobtrusive
Mode of Use | Task- or Goal-Oriented | Context-Driven
Information Access | Information Overlay | Information Augmentation
Information Visualization | Added | Integrated/Embedded
Environment | Indoors OR Outdoors | Indoors AND Outdoors
Flow of Information | User Seeking Information | Information Seeking Users
Use of Device | One Size Fits All | Individualized
Context-aware Pervasive AR platform
• Pervasive AR continuum in conjunction with context-awareness
Overview
the National Research Foundation of Korea(NRF) grant funded by the Korea government(MSIP) (No. NRF-2018R1A2A1A10055673). 81
< Context-aware Pervasive AR concept diagram >
• Context-aware Pervasive AR builds on vision-based artificial intelligence:
▪ Scene Understanding [ImageNet Classification, Krizhevsky, NIPS 2012]
▪ Context Awareness [Modeling 3d environments, Jiang, TPAMI 2016]
▪ Geometry Learning [PointNet, Qi, CVPR 2017]
< Pervasive AR concept diagram >
• The Mixed World continuum from the Real World to the Virtual World [Mixed Reality, Milgram, ITIS 1994]:
▪ Tangible User Interface (TUI) [Tangible Bits, Hiroshi Ishii, SIGCHI 1997]
▪ Augmented Reality (AR) [A survey of augmented reality, Azuma, 1997]: Spatial AR (SAR) [Spatial AR, Ramesh Raskar,
ISMAR 2004] and Mobile AR (MAR) [Mobile AR, Hollerer, Telegeoinformatics, 2004]
▪ Augmented Virtuality (AV)
▪ Virtual Reality (VR) [Virtual Reality: CAVE, Cruz-Neira, SIGGRAPH, 1993]
• Related concepts converging toward Pervasive AR:
▪ Wall Display & Pervasive Display [Pervasive Display Applications, Strohbach, M. Martin, 2011]
▪ Immersive Display [Immersive displays, Lantz, Ed., ACM SIGGRAPH, 1996]
▪ Projection AR (stationary) [Illumiroom, H. Benko, ACM CHI 2013]
▪ Natural User Interface (NUI) [Natural User Interface, Petersen, ISMAR 2009]
Context-aware Pervasive AR platform
• Provide in-situ information/content/UI in accordance with user intent/situation by integrating a projection augmented
reality system with context-aware technology
• Provide multi-modal interactions in a context-aware Pervasive AR space, enabling seamless interaction with the system
• Design and implement applications that can be used instantly in smart environments without any prior knowledge
Overview
< System overview diagram >
• Input: RGB-D camera -> spatial mapping, user recognition (e.g., user #1, standing, position (x,y,z)), scene
understanding -> context-awareness of an environment w/o prior knowledge
• Output: projection AR providing an adaptive interface, contextual information, in-situ content, …
• The user interacts with the real world through a Pervasive AR environment with in-situ info/content/UI via context analysis
Consumer Robotics with Pervasive AR
• Personal assistant for smart home with AI & AR
▪ Clova AI device + projection AR system -> R2-D2?
Future Path
Gugenheimer, Jan, et al. "Ubibeam: exploring the interaction space for home deployed projector-camera systems." Human-Computer Interaction. Springer, Cham, 2015. 83
Consumer Robotics with Pervasive AR
• ADAS system with AR-enhanced situational awareness
▪ AHEAD device + projection AR system -> Mercedes-Maybach's Digital Light?
Future Path
https://newatlas.com/mercedes-maybach-digital-light-smart-headlights/53678/ March 6th, 2018
https://www.naverlabs.com/storyDetail/69 2018.11.09 84
THANK YOU FOR YOUR ATTENTION
Jung-Hyun Byun
junghyun.ian.byun@gmail.com

200820 NAVER TECH CONCERT 11_빠르게 성장하는 슈퍼루키로 거듭나기NAVER Engineering
 
200819 NAVER TECH CONCERT 07_신입 iOS 개발자 개발업무 적응기
200819 NAVER TECH CONCERT 07_신입 iOS 개발자 개발업무 적응기200819 NAVER TECH CONCERT 07_신입 iOS 개발자 개발업무 적응기
200819 NAVER TECH CONCERT 07_신입 iOS 개발자 개발업무 적응기NAVER Engineering
 

More from NAVER Engineering (20)

React vac pattern
React vac patternReact vac pattern
React vac pattern
 
디자인 시스템에 직방 ZUIX
디자인 시스템에 직방 ZUIX디자인 시스템에 직방 ZUIX
디자인 시스템에 직방 ZUIX
 
진화하는 디자인 시스템(걸음마 편)
진화하는 디자인 시스템(걸음마 편)진화하는 디자인 시스템(걸음마 편)
진화하는 디자인 시스템(걸음마 편)
 
서비스 운영을 위한 디자인시스템 프로젝트
서비스 운영을 위한 디자인시스템 프로젝트서비스 운영을 위한 디자인시스템 프로젝트
서비스 운영을 위한 디자인시스템 프로젝트
 
BPL(Banksalad Product Language) 무야호
BPL(Banksalad Product Language) 무야호BPL(Banksalad Product Language) 무야호
BPL(Banksalad Product Language) 무야호
 
이번 생에 디자인 시스템은 처음이라
이번 생에 디자인 시스템은 처음이라이번 생에 디자인 시스템은 처음이라
이번 생에 디자인 시스템은 처음이라
 
날고 있는 여러 비행기 넘나 들며 정비하기
날고 있는 여러 비행기 넘나 들며 정비하기날고 있는 여러 비행기 넘나 들며 정비하기
날고 있는 여러 비행기 넘나 들며 정비하기
 
쏘카프레임 구축 배경과 과정
 쏘카프레임 구축 배경과 과정 쏘카프레임 구축 배경과 과정
쏘카프레임 구축 배경과 과정
 
플랫폼 디자이너 없이 디자인 시스템을 구축하는 프로덕트 디자이너의 우당탕탕 고통 연대기
플랫폼 디자이너 없이 디자인 시스템을 구축하는 프로덕트 디자이너의 우당탕탕 고통 연대기플랫폼 디자이너 없이 디자인 시스템을 구축하는 프로덕트 디자이너의 우당탕탕 고통 연대기
플랫폼 디자이너 없이 디자인 시스템을 구축하는 프로덕트 디자이너의 우당탕탕 고통 연대기
 
200820 NAVER TECH CONCERT 15_Code Review is Horse(코드리뷰는 말이야)(feat.Latte)
200820 NAVER TECH CONCERT 15_Code Review is Horse(코드리뷰는 말이야)(feat.Latte)200820 NAVER TECH CONCERT 15_Code Review is Horse(코드리뷰는 말이야)(feat.Latte)
200820 NAVER TECH CONCERT 15_Code Review is Horse(코드리뷰는 말이야)(feat.Latte)
 
200819 NAVER TECH CONCERT 03_화려한 코루틴이 내 앱을 감싸네! 코루틴으로 작성해보는 깔끔한 비동기 코드
200819 NAVER TECH CONCERT 03_화려한 코루틴이 내 앱을 감싸네! 코루틴으로 작성해보는 깔끔한 비동기 코드200819 NAVER TECH CONCERT 03_화려한 코루틴이 내 앱을 감싸네! 코루틴으로 작성해보는 깔끔한 비동기 코드
200819 NAVER TECH CONCERT 03_화려한 코루틴이 내 앱을 감싸네! 코루틴으로 작성해보는 깔끔한 비동기 코드
 
200819 NAVER TECH CONCERT 10_맥북에서도 아이맥프로에서 빌드하는 것처럼 빌드 속도 빠르게 하기
200819 NAVER TECH CONCERT 10_맥북에서도 아이맥프로에서 빌드하는 것처럼 빌드 속도 빠르게 하기200819 NAVER TECH CONCERT 10_맥북에서도 아이맥프로에서 빌드하는 것처럼 빌드 속도 빠르게 하기
200819 NAVER TECH CONCERT 10_맥북에서도 아이맥프로에서 빌드하는 것처럼 빌드 속도 빠르게 하기
 
200819 NAVER TECH CONCERT 08_성능을 고민하는 슬기로운 개발자 생활
200819 NAVER TECH CONCERT 08_성능을 고민하는 슬기로운 개발자 생활200819 NAVER TECH CONCERT 08_성능을 고민하는 슬기로운 개발자 생활
200819 NAVER TECH CONCERT 08_성능을 고민하는 슬기로운 개발자 생활
 
200819 NAVER TECH CONCERT 05_모르면 손해보는 Android 디버깅/분석 꿀팁 대방출
200819 NAVER TECH CONCERT 05_모르면 손해보는 Android 디버깅/분석 꿀팁 대방출200819 NAVER TECH CONCERT 05_모르면 손해보는 Android 디버깅/분석 꿀팁 대방출
200819 NAVER TECH CONCERT 05_모르면 손해보는 Android 디버깅/분석 꿀팁 대방출
 
200819 NAVER TECH CONCERT 09_Case.xcodeproj - 좋은 동료로 거듭나기 위한 노하우
200819 NAVER TECH CONCERT 09_Case.xcodeproj - 좋은 동료로 거듭나기 위한 노하우200819 NAVER TECH CONCERT 09_Case.xcodeproj - 좋은 동료로 거듭나기 위한 노하우
200819 NAVER TECH CONCERT 09_Case.xcodeproj - 좋은 동료로 거듭나기 위한 노하우
 
200820 NAVER TECH CONCERT 14_야 너두 할 수 있어. 비전공자, COBOL 개발자를 거쳐 네이버에서 FE 개발하게 된...
200820 NAVER TECH CONCERT 14_야 너두 할 수 있어. 비전공자, COBOL 개발자를 거쳐 네이버에서 FE 개발하게 된...200820 NAVER TECH CONCERT 14_야 너두 할 수 있어. 비전공자, COBOL 개발자를 거쳐 네이버에서 FE 개발하게 된...
200820 NAVER TECH CONCERT 14_야 너두 할 수 있어. 비전공자, COBOL 개발자를 거쳐 네이버에서 FE 개발하게 된...
 
200820 NAVER TECH CONCERT 13_네이버에서 오픈 소스 개발을 통해 성장하는 방법
200820 NAVER TECH CONCERT 13_네이버에서 오픈 소스 개발을 통해 성장하는 방법200820 NAVER TECH CONCERT 13_네이버에서 오픈 소스 개발을 통해 성장하는 방법
200820 NAVER TECH CONCERT 13_네이버에서 오픈 소스 개발을 통해 성장하는 방법
 
200820 NAVER TECH CONCERT 12_상반기 네이버 인턴을 돌아보며
200820 NAVER TECH CONCERT 12_상반기 네이버 인턴을 돌아보며200820 NAVER TECH CONCERT 12_상반기 네이버 인턴을 돌아보며
200820 NAVER TECH CONCERT 12_상반기 네이버 인턴을 돌아보며
 
200820 NAVER TECH CONCERT 11_빠르게 성장하는 슈퍼루키로 거듭나기
200820 NAVER TECH CONCERT 11_빠르게 성장하는 슈퍼루키로 거듭나기200820 NAVER TECH CONCERT 11_빠르게 성장하는 슈퍼루키로 거듭나기
200820 NAVER TECH CONCERT 11_빠르게 성장하는 슈퍼루키로 거듭나기
 
200819 NAVER TECH CONCERT 07_신입 iOS 개발자 개발업무 적응기
200819 NAVER TECH CONCERT 07_신입 iOS 개발자 개발업무 적응기200819 NAVER TECH CONCERT 07_신입 iOS 개발자 개발업무 적응기
200819 NAVER TECH CONCERT 07_신입 iOS 개발자 개발업무 적응기
 

Recently uploaded

How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerThousandEyes
 
Partners Life - Insurer Innovation Award 2024
Partners Life - Insurer Innovation Award 2024Partners Life - Insurer Innovation Award 2024
Partners Life - Insurer Innovation Award 2024The Digital Insurer
 
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdfUnderstanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdfUK Journal
 
Scaling API-first – The story of a global engineering organization
Scaling API-first – The story of a global engineering organizationScaling API-first – The story of a global engineering organization
Scaling API-first – The story of a global engineering organizationRadu Cotescu
 
presentation ICT roal in 21st century education
presentation ICT roal in 21st century educationpresentation ICT roal in 21st century education
presentation ICT roal in 21st century educationjfdjdjcjdnsjd
 
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Drew Madelung
 
[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdfhans926745
 
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptxHampshireHUG
 
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Miguel Araújo
 
A Year of the Servo Reboot: Where Are We Now?
A Year of the Servo Reboot: Where Are We Now?A Year of the Servo Reboot: Where Are We Now?
A Year of the Servo Reboot: Where Are We Now?Igalia
 
TrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law DevelopmentsTrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law DevelopmentsTrustArc
 
Advantages of Hiring UIUX Design Service Providers for Your Business
Advantages of Hiring UIUX Design Service Providers for Your BusinessAdvantages of Hiring UIUX Design Service Providers for Your Business
Advantages of Hiring UIUX Design Service Providers for Your BusinessPixlogix Infotech
 
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...DianaGray10
 
HTML Injection Attacks: Impact and Mitigation Strategies
HTML Injection Attacks: Impact and Mitigation StrategiesHTML Injection Attacks: Impact and Mitigation Strategies
HTML Injection Attacks: Impact and Mitigation StrategiesBoston Institute of Analytics
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)Gabriella Davis
 
Developing An App To Navigate The Roads of Brazil
Developing An App To Navigate The Roads of BrazilDeveloping An App To Navigate The Roads of Brazil
Developing An App To Navigate The Roads of BrazilV3cube
 
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024The Digital Insurer
 
Tech Trends Report 2024 Future Today Institute.pdf
Tech Trends Report 2024 Future Today Institute.pdfTech Trends Report 2024 Future Today Institute.pdf
Tech Trends Report 2024 Future Today Institute.pdfhans926745
 
2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...Martijn de Jong
 
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...apidays
 

Recently uploaded (20)

How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected Worker
 
Partners Life - Insurer Innovation Award 2024
Partners Life - Insurer Innovation Award 2024Partners Life - Insurer Innovation Award 2024
Partners Life - Insurer Innovation Award 2024
 
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdfUnderstanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
 
Scaling API-first – The story of a global engineering organization
Scaling API-first – The story of a global engineering organizationScaling API-first – The story of a global engineering organization
Scaling API-first – The story of a global engineering organization
 
presentation ICT roal in 21st century education
presentation ICT roal in 21st century educationpresentation ICT roal in 21st century education
presentation ICT roal in 21st century education
 
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
 
[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf
 
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
 
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
 
A Year of the Servo Reboot: Where Are We Now?
A Year of the Servo Reboot: Where Are We Now?A Year of the Servo Reboot: Where Are We Now?
A Year of the Servo Reboot: Where Are We Now?
 
TrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law DevelopmentsTrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
 
Advantages of Hiring UIUX Design Service Providers for Your Business
Advantages of Hiring UIUX Design Service Providers for Your BusinessAdvantages of Hiring UIUX Design Service Providers for Your Business
Advantages of Hiring UIUX Design Service Providers for Your Business
 
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
 
HTML Injection Attacks: Impact and Mitigation Strategies
HTML Injection Attacks: Impact and Mitigation StrategiesHTML Injection Attacks: Impact and Mitigation Strategies
HTML Injection Attacks: Impact and Mitigation Strategies
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)
 
Developing An App To Navigate The Roads of Brazil
Developing An App To Navigate The Roads of BrazilDeveloping An App To Navigate The Roads of Brazil
Developing An App To Navigate The Roads of Brazil
 
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
 
Tech Trends Report 2024 Future Today Institute.pdf
Tech Trends Report 2024 Future Today Institute.pdfTech Trends Report 2024 Future Today Institute.pdf
Tech Trends Report 2024 Future Today Institute.pdf
 
2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...
 
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
 

Pervasive AR Environment

  • 6. Augmented Reality (Services & Hardware)
    • Head-Mounted Displays
      ▪ Microsoft HoloLens 2016
      ▪ Magic Leap 2018
    • Mobile Augmented Reality
      ▪ Apple ARKit 2017
      ▪ Google ARCore 2017
    • But there's one more: Spatial Augmented Reality
  • 7. Augmented Reality (The Continuum)
    • Reality–Virtuality continuum (Milgram, ITIS 1994): Real World – Mixed World – Virtual World
    Milgram, Paul, et al. "Augmented reality: A class of displays on the reality-virtuality continuum." Telemanipulator and Telepresence Technologies. Vol. 2351. International Society for Optics and Photonics, 1995.
    https://en.wikipedia.org/wiki/Reality%E2%80%93virtuality_continuum
  • 8. Augmented Reality (Definition and Categorization)
    • Definitionally, augmented reality systems ... [1]
      1. combine real elements and virtual images
      2. are capable of interacting with the user in real time
      3. are registered in the 3D real world
    • Spatial Augmented Reality [2]
      ▪ Virtual objects are directly superimposed onto the real world
      ▪ Free of instruments that cause users discomfort and fatigue
      ▪ Uses displays with spatial characteristics, usually projectors
    • Projection-based Augmented Reality (Projection AR) means ...
      ▪ technically, any augmented reality generated by wearable, mobile, or spatial projector(s)
      ▪ commonly, a spatial augmented reality generated by projector(s) that are detached from the user
    (Figure: Image generation for augmented reality displays [2])
    [1] Azuma, Ronald T. "A survey of augmented reality." Presence: Teleoperators & Virtual Environments 6.4 (1997): 355-385.
    [2] Bimber, Oliver, and Ramesh Raskar. "Modern approaches to augmented reality." ACM SIGGRAPH 2006 Courses. ACM, 2006.
  • 9. Augmented Reality (Characteristics and Comparison)
    • Head-worn displays: strong presence; freely movable/active; small (virtual) display; fatigue/discomfort; focus != convergence
    • Mobile devices: accessible by many; easily carried; limits user's sight; small (virtual) display
    • Projection-based: highly immersive; theoretically unlimited FOV/display; projection distortion; device perspective
    Bimber, Oliver, and Ramesh Raskar. Spatial Augmented Reality: Merging Real and Virtual Worlds. CRC Press, 2005.
  • 10. Projection Augmented Reality (Research Trend)
    • Steadily studied at Tokyo University, MIT Media Lab, and Microsoft Research
      ▪ 1999: Augmented Surfaces, Jun Rekimoto, Sony, CHI 1999
      ▪ 2001: The Everywhere Displays Projector, IBM, UbiComp 2001
      ▪ 2003: iLamps, Ramesh Raskar, MIT Media Lab, SIGGRAPH 2003
      ▪ 2009: SixthSense, Pranav Mistry, MIT Media Lab, CHI 2009
      ▪ 2010: LuminAR, Pattie Maes, MIT Media Lab, UIST 2010
      ▪ 2011: OmniTouch, Chris Harrison, Microsoft, UIST 2011
      ▪ 2012: Beamatron, Andrew Wilson, Microsoft, UIST 2012
      ▪ 2013: illumiRoom, Hrvoje Benko, Microsoft, CHI 2013
      ▪ 2014: Dyadic Projected SAR, Hrvoje Benko, Microsoft, UIST 2014; RoomAlive, Brett Jones, Microsoft, UIST 2014
      ▪ 2015: FoveAR, Hrvoje Benko, Microsoft, UIST 2015
      ▪ 2016: Room2Room, Tomislav Pejsa, Microsoft, CSCW 2016
      ▪ 2017: MeetAlive, Andreas Fender, Microsoft, ISS 2017
      ▪ 2018: ExtVision, Naoki Kimura, Tokyo University, CHI 2018
  • 11. ONGOING RESEARCH OF THE SPEAKER
  • 12. Pervasive AR Interaction Platform (Overview)
    • AR/VR continuum in conjunction with the Pervasive AR concept
    • Pervasive AR concept diagram: spans the Real World – Mixed World – Virtual World continuum [Mixed Reality, Milgram, ITIS 1994], covering Tangible User Interface (TUI) [Tangible Bits, Hiroshi Ishii, SIG CHI 1997], Natural User Interface (NUI) [Petersen, ISMAR 2009], Augmented Reality (AR) [A survey of augmented reality, Azuma, 1997], Mobile AR (MAR) [Hollerer, Telegeoinformatics, 2004], Spatial AR (SAR) [Ramesh Raskar, ISMAR 2004], stationary Projection AR [Illumiroom, H. Benko, ACM CHI 2013], Wall & Pervasive Displays [Pervasive Display Applications, Strohbach, M. Martin, 2011], Immersive Displays [Immersive displays, Lantz, Ed., ACM SIGGRAPH, 1996], Augmented Virtuality (AV), and Virtual Reality (VR) [CAVE, Cruz-Neira, SIGGRAPH, 1993]
    National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP) (No. NRF-2015R1A2A1A10055673).
  • 13. Pervasive AR Interaction Platform (Overview)
    • This work defines a new concept, Pervasive AR, and conducts the research that constitutes it
      ▪ Pervasive AR display environment: users can project the information they need onto any desired space and surface; overcomes the limitations of traditional, stationary projection technology by integrating an autonomous robot with augmented reality
      ▪ Seamless interaction functionality: provides AR-based interaction in real time in conjunction with varying spaces and situations; in addition to user–device interaction, provides device–device interaction
      ▪ Pervasive interface design: combines TUI and NUI technology to provide intuitive, natural interaction; provides an interface that combines motion sensors with computer vision
      ▪ Scalable Pervasive AR platform: develops a scalable, heterogeneous, integrated interaction platform that provides a consistent method to integrate a variety of technologies
  • 14. Motivation (Ongoing Research)
    • An ordinary meeting room
      ▪ Projection screen in front
      ▪ Maybe a whiteboard on the side
      ▪ Most of the space is just there
    • What can be done to make use of the remaining space? "The office of the future"
    (Figure: Engineering Hall D, Room 816: projection screen, whiteboard, vacant wall, projector)
    Raskar, Ramesh, et al. "The office of the future: A unified approach to image-based modeling and spatially immersive displays." Proceedings of the 25th Annual Conference on Computer Graphics and Interactive Techniques. ACM, 1998.
  • 15. Motivation (Ongoing Research)
    • A smart meeting room with Pervasive AR
      ▪ Projection screen in front -> projection on any surface
      ▪ Maybe a whiteboard on the side -> digital board
      ▪ Most of the space is just there -> interactive wall
    • Pervasive AR transforms a typical room into a smart space with
      ▪ Minimal hardware
      ▪ Immersive environment
      ▪ Responsive system
      ▪ Seamless interaction
    (Figure: Engineering Hall D, Room 816: projection on any surface, digital board, interactive wall, 360° steerable projector)
  • 16. Contribution (Immersive Projection Environment)
    • Pervasive AR key features x research aspects x literature contributions
    • Key features: minimal hardware, immersive environment, responsive system, seamless interaction
    • Research aspects
      ▪ Steerable platform with pan-tilt servos
      ▪ User-perspective 360° projection AR
      ▪ Geometric distortion-corrected projection
      ▪ Point cloud registration of indoor scene
      ▪ Spatial manipulation of virtual contents
    • Literature contributions
      ▪ Anywhere Immersive, EG 2017
      ▪ Accurate Control, EG 2018
      ▪ Dynamic Perspective*, 2019
      ▪ Axis Bound Registration*, 2019
      ▪ Pervasive Interface^, 2019
    (* in submission, ^ work in progress)
  • 18. Proposed System (System Environment)
    • The front camera captures the geometry of the surface that is to be projection-mapped
    • The rear camera tracks the user's position and interaction in an AR scene
    • The pan-tilt platform is steered to provide 360° projection with the user's correct perspective
    • Two designs, ceiling-type and table-type, each with:
      ▪ Frontal RGB-D camera: acquires geometry
      ▪ Projector: corrects distortions
      ▪ Rear RGB-D camera: tracks the user's viewpoint
      ▪ Pan/tilt servos: steer the platform pose
  • 19. Proposed System (System Prototype)
    • Key components and H/W configurations
      ▪ Front-facing camera: Microsoft Kinect v2
      ▪ Rear-facing camera: Microsoft Kinect 360
      ▪ Projector: Epson 1771W
      ▪ Steerable platform: HS-785HB servos + Arduino board
    • Prototype iterations: 360 PROJECTOR mini (Asus Xtion, pico projector, pan-tilt servos), 360 PROJECTOR V1, 360 PROJECTOR V2
  • 20. Proposed System (System Implementation)
    • The front camera captures the geometry of the surface that is to be projection-mapped
    • The rear camera tracks the user's position and interaction in an AR scene
    • The pan-tilt platform is steered to provide 360° projection with the user's correct perspective
    • The ceiling-type design was implemented in a 3.7 x 4.0 x 2.25 m (W x D x H) cubic space
  • 21. Proposed System (System Flow)
    • Overview: the proposed system ...
      ▪ tracks a user's position and interaction
      ▪ renders augmented virtual contents from the correct perspective
      ▪ performs projection mapping without geometric distortion
      ▪ acquires and reconstructs spatial information
    • Pipeline: (1) rotation axis calibration of the pan-tilt platform; (2) distortion correction (texture generation and projective mapping for the projector); (3) spatial projection AR driven by the rear camera's user tracking (user viewpoint, user interaction, AR scene rendering); (4) scene reconstruction from the front camera's surface geometry; (5) spatial manipulation
  • 22. PAN-TILT AXIS CALIBRATION AND CONTROL
    "Accurate Control of a Pan-tilt System Based on Parameterization of Rotational Motion," Eurographics 2018
  • 23. Motivation (Accurate Control of a Pan-tilt System)
    • We want to control a pan-tilt system in a way that ...
      ▪ can accurately "target" a point in 3D space
      ▪ can find the rotation so that the target point captured after rotation lies on the optical axis, i.e., the captured target point coincides with the optical center
      ▪ can model its rotation trajectories and plan its motion ahead
    • To accurately control a pan-tilt system, we implemented:
      ▪ a general pan-tilt assembly model without specifications
      ▪ a physics-consistent pan-tilt rotation model
      ▪ a calibration process to recover rotation parameters
      ▪ an inverse kinematics interpretation for manipulation
  • 24. Pan-Tilt Rotation Modeling (Rotation Parameters Acquisition)
    • Two-step process to recover the rotation parameters
      ▪ estimate the direction vector (normal) of the rotation plane
      ▪ estimate the center pivot of the circular trajectory
    • Points rotated about an axis form a closed circular trajectory on a 3-dimensional plane, where
      ▪ the rotation axis is the normal of the plane
      ▪ the circle center is the axis's intersection with the plane
    • Capture multiple frames of a large checkerboard with a known structure during rotation
      ▪ all trajectories can be represented with respect to the trajectory of the top-leftmost corner (x_00)
      ▪ thus a stable global optimization of the trajectory is possible
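The first step above, estimating the direction vector of the rotation plane from the tracked 3D positions of a checkerboard corner, can be sketched as a plain SVD plane fit. This is a minimal stand-in, not the paper's constrained global optimization:

```python
import numpy as np

def fit_rotation_plane(points):
    """Fit a plane to 3D points sampled along one circular rotation
    trajectory; the plane normal estimates the rotation axis direction n,
    and d completes the plane equation n.x + d = 0."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # The right singular vector with the smallest singular value is the
    # direction of least variance of the centered points, i.e. the normal.
    _, _, vt = np.linalg.svd(pts - centroid)
    n = vt[-1]
    d = -float(n @ centroid)
    return n, d
```

With points that genuinely lie on an arc, the recovered normal is the rotation axis direction up to sign.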
  • 25. Pan-Tilt Rotation Modeling (Rotation Parameters Acquisition)
    • For the rotation of the upper-left corner x_00, let us denote
      ▪ the rotation direction vector: n = (n_x, n_y, n_z)^T, ||n|| = 1
      ▪ the rotation circle center: p = (a, b, c)^T
      ▪ the rotation axis: (x - a)/n_x = (y - b)/n_y = (z - c)/n_z
    • Then the equations form as
      ▪ the rotation plane: n_x·x + n_y·y + n_z·z + d = 0
      ▪ the rotation trajectory (a circle of radius r): (x - a)^2 + (y - b)^2 + (z - c)^2 = r^2
  • 26. Pan-Tilt Rotation Modeling (Rotation Parameters Acquisition)
    • For the rotation of the corner at the i-th row and j-th column, let us denote
      ▪ the distance along the rotation axis between the feet of vertically adjacent corners: d_h
      ▪ the distance along the rotation axis between the feet of horizontally adjacent corners: d_w
    • Then the equations form as
      ▪ the rotation circle center: p_ij = (a_ij, b_ij, c_ij)^T, where
        a_ij = a - n_x(i·d_h + j·d_w)
        b_ij = b - n_y(i·d_h + j·d_w)
        c_ij = c - n_z(i·d_h + j·d_w)
      ▪ the rotation plane: n_x·x + n_y·y + n_z·z + d + i·d_h + j·d_w = 0
      ▪ the rotation trajectory: (x - a_ij)^2 + (y - b_ij)^2 + (z - c_ij)^2 = r_ij^2
  • 27. Pan-Tilt Rotation Modeling (Rotation Parameters Acquisition)
    • For the k-th captured checkerboard frame, let us denote the 3D coordinate of the corner at the i-th row and j-th column: v_ijk = (x_ijk, y_ijk, z_ijk)^T
    • Minimize the errors of two objective functions using the global least squares method
      ▪ find the parameters n_x, n_y, n_z, d, d_h, d_w that minimize the planar term, i.e., the sum over all frames and corners of the squared plane residuals (n_x·x_ijk + n_y·y_ijk + n_z·z_ijk + d + i·d_h + j·d_w)^2
      ▪ with the recovered parameters, find the parameters a, b, c, r_ij that minimize the circular term, i.e., the sum of the squared circle residuals ((x_ijk - a_ij)^2 + (y_ijk - b_ij)^2 + (z_ijk - c_ij)^2 - r_ij^2)^2
    Chen, Ping, et al. "Rotation axis calibration of a turntable using constrained global optimization." Optik 125.17 (2014): 4831-4836.
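For a single trajectory, the second step, recovering the circle center and radius once the axis direction is known, can be sketched with a linear (Kasa-style) fit inside the rotation plane. This is a simplified per-trajectory stand-in for the global optimization described above:

```python
import numpy as np

def fit_circle_center(points, n):
    """Given 3D points on one rotation trajectory and the unit rotation
    axis direction n, recover the circle center and radius via a linear
    least-squares (Kasa) fit in the plane of rotation."""
    pts = np.asarray(points, dtype=float)
    n = np.asarray(n, dtype=float)
    # Build an orthonormal basis (u, v) spanning the rotation plane.
    u = np.cross(n, [1.0, 0.0, 0.0])
    if np.linalg.norm(u) < 1e-8:            # n happened to be the x-axis
        u = np.cross(n, [0.0, 1.0, 0.0])
    u /= np.linalg.norm(u)
    v = np.cross(n, u)
    origin = pts.mean(axis=0)
    p2d = np.column_stack([(pts - origin) @ u, (pts - origin) @ v])
    # Kasa fit: |p|^2 = 2 p.c + (r^2 - |c|^2), linear in (c_x, c_y, k).
    A = np.column_stack([2.0 * p2d, np.ones(len(p2d))])
    b = (p2d ** 2).sum(axis=1)
    (cx, cy, k), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(k + cx**2 + cy**2)
    center = origin + cx * u + cy * v
    return center, r
```

The algebraic fit is exact for noiseless data; for noisy captures the paper's joint optimization over all corner trajectories is more stable.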
  • 28. Evaluation (System Configuration and Calibration)
    • Captured 28 frames in pan rotation (-25° to 15°) and 11 frames in tilt rotation (-17° to 16°)
    • Pan and tilt rotation trajectories were computed, and from those the visual odometry of the camera can also be tracked
    (Figure: point cloud accumulation from the leftmost to rightmost pan and the bottommost to topmost tilt poses)
  • 29. Evaluation (System Configuration and Calibration)
    • The rotation axis calibration results of the pan-tilt system
    • Pan and tilt trajectories were calculated (green circles)
    • The point cloud of the projector was captured from behind by the rear camera (Kinect 360); the front camera (Kinect v2) was thus occluded, and its positions and poses were estimated as the "pan-tilt camera" (white cubes)
  • 30. Evaluation (Servo Control with Inverse Kinematics)
    • The task at hand:
      ▪ orient the system so that it can accurately "target" a point in 3D space
      ▪ given the target point, the optical axis of the camera should pass through the target object
      ▪ in other words, the target should coincide with the optical center after the rotation
    • Interpretation in inverse kinematics terms
      ▪ think of the optical axis as a "linear actuator"
      ▪ and the target point on the axis as an "end effector"
    Buss, Samuel R. "Introduction to inverse kinematics with Jacobian transpose, pseudoinverse and damped least squares methods." IEEE Journal of Robotics and Automation 17.1-19 (2004): 16.
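To make the targeting task concrete: under an idealized gimbal (a hypothetical simplification in which pan rotates about the camera's +Y axis, tilt about +X, and both axes pass through the optical center) the required angles have a closed form. The paper's general model drops these assumptions, because the real axes are offset from the optical center, and solves the rotation with an inverse kinematics iteration instead:

```python
import numpy as np

def pan_tilt_to_target(target):
    """Idealized targeting sketch: return the (pan, tilt) angles in
    radians that bring `target` (in camera coordinates, +Z forward)
    onto the optical axis, assuming pan about +Y then tilt about the
    rotated +X, both through the optical center."""
    x, y, z = target
    pan = np.arctan2(x, z)                    # yaw toward the target
    tilt = np.arctan2(-y, np.hypot(x, z))     # then pitch onto the axis
    return pan, tilt
```

A target already on the optical axis needs no rotation, while a target at 45° to the side needs a pure pan.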
  • 31. Evaluation • Testing of the accurate targeting capability ▪ The system is tasked to adjust its attitude so that the target point is at the optical center of the camera ▪ The coordinates of the 70 checkerboard corners are used as target points ▪ Proposed inverse kinematics control method and Single Point Calibration Method (SPCM) are compared • Errors between the intended destination of a point and the actually captured point after the rotation are measured ▪ Root Mean Squared Errors (RMSE) of L2-norms in XY image pixels and real distances (mm) ▪ Mean Absolute Errors (MAE) of L2-norms in each X, Y (and Z) directions in both pixels and mm Experiment Result Li, Yunting, et al. "Method for pan-tilt camera calibration using single control point." JOSA A 32.1 (2015): 156-163. 31
  • 32. GEOMETRIC DISTORTION COMPENSATED PROJECTION R. Camera User tracking User viewpoint User interaction AR scene rendering Pan-tilt platform Projector F. Camera Surface geometry 4. Scene reconstruction 2. Distortion correction Texture generation Projective mapping 3. Spatial projection AR 1. Rotation axis calibration 5. Spatial manipulation “AIR: Anywhere Immersive Reality with User-Perspective Projection” Eurographics 2017
• 33. Motivation • Geometric calibration is required by most projector-camera applications [1] ▪ It geometrically registers the projectors to the surface and enables the generation of a consistent projection onto complex surface geometry • User-perspective rendering in tablet-based augmented reality [2] ▪ Comparison of device-perspective rendering, user-perspective rendering, and a ground truth User-Perspective Projection [1] Grundhöfer, Anselm, and Daisuke Iwai. "Recent advances in projection mapping algorithms, hardware and applications." Computer Graphics Forum. Vol. 37. No. 2. 2018. [2] Tomioka, Makoto, Sei Ikeda, and Kosuke Sato. "Approximated user-perspective rendering in tablet-based augmented reality." 2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). IEEE, 2013. 33
  • 34. Motivation • Distortion Correction and User Perspective for projection AR User-Perspective Projection Tomioka, Makoto, Sei Ikeda, and Kosuke Sato. "Approximated user-perspective rendering in tablet-based augmented reality." 2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). IEEE, 2013. 34 Adapt & substitute
• 35. Proposed System • Schematic model and process overview of the proposed system ▪ consists of a projector, two RGB-D cameras, pan-tilt servos, and a control board ▪ designed to be portable, so that it can be placed on a table-top with unknown surroundings System Overview 35 Front RGB-D Camera - Acquire geometry Projector - Correct distortion Rear RGB-D Camera - Track user’s view pt. Pan/tilting Servos - Steer platform pose RGB-D Camera Pan/tilt motors World Geometry User view pt. tracking View-depend. rendering Perspective project. mat. Distortion-corrected Projection Perspective Rendering Projective Texturing Texture map. on geometry Workflow
• 36. Proposed System • Calibrating between cameras and a projector with gray code patterns. • First, the pixel correspondences between the projector and both front and rear RGB cameras are computed. • Then, corresponding points in the color images are transformed to depth image points. • Finally, the depth image points are un-projected to 3D points, which are used to calibrate the projector and the transform between the front and rear cameras. Projector-Camera Calibration 36
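The gray-code correspondence step can be sketched as pattern generation plus per-pixel decoding. A minimal 1-D (column-only) sketch; a real pipeline also encodes rows and thresholds captured images against inverse patterns before decoding:

```python
import numpy as np

def gray_code_patterns(width):
    """Bit-plane patterns (MSB first) encoding each projector column in Gray code."""
    bits = max(1, (width - 1).bit_length())
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)                     # binary -> Gray code
    # patterns[p, x] == 1 means column x is lit in the p-th projected pattern
    return np.array([(gray >> b) & 1 for b in reversed(range(bits))])

def decode_columns(observed):
    """Recover the projector column seen at each camera pixel from the stack
    of observed bit values (same MSB-first layout as gray_code_patterns)."""
    gray = np.zeros(observed.shape[1:], dtype=int)
    for bitplane in observed:
        gray = (gray << 1) | bitplane
    binary = gray.copy()                          # Gray -> binary (prefix XOR)
    shift = 1
    while (binary >> shift).any():
        binary ^= binary >> shift
        shift <<= 1
    return binary
```

In the actual pipeline, the decoded projector coordinate at each camera pixel gives the pixel correspondence that is then lifted through the depth image to a 3D point.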
• 37. Proposed System • Calibration & Two-pass Rendering • Calibration: geometry construction ▪ Register poses of the cameras, the projector and the user • First-pass: perspective rendering ▪ Track the user and render scenes from the user's perspective • Second-pass: distortion correction ▪ Projectively map the texture onto the geometry • Finally, display the mapped rendering from the projector System Flow Camera-Projector-Servo Registration User-Perspective Projection Projection Distortion Correction Front Camera Rear Camera Projector/ Servos Track User View Perspective Matrix Offscreen Rendering Projective Matrix Texture Coordinate Projective Mapping
• 38. Proposed System • Register the separate coordinate systems (front, proj, rear) in one global system (world) • P_world = T_rear→world P_rear = R_pan(α) R_tilt(β) P_front • s · x_proj = A_proj T_front→proj P_front • Solve internal and external parameters as a reverse-camera model System Flow 38 Front Camera (front) - Changes w.r.t servos Rear Camera (rear) - Fixed in world Pan/tilt Servos (servo) - Rotate in world User Perspective (user) - Subordinate to rear Global world coordinate Camera-Projector-Servo Registration User-Perspective Projection Projection Distortion Correction Projector (proj) - Subordinate to front
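The registration chain above, written out numerically. A sketch assuming the rear/world frame as reference, pan about the y axis and tilt about the x axis (the actual axes come from the rotation axis calibration), and an intrinsic matrix K standing in for A_proj:

```python
import numpy as np

def rot_y(a):
    """Pan rotation R_pan(a) about the y (vertical) axis; axis choice is illustrative."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_x(b):
    """Tilt rotation R_tilt(b) about the x axis."""
    c, s = np.cos(b), np.sin(b)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def front_to_world(p_front, pan, tilt):
    """P_world = R_pan(a) R_tilt(b) P_front, in the rear/world reference frame."""
    return rot_y(pan) @ rot_x(tilt) @ p_front

def project_to_projector(p_front, K, T_front_to_proj):
    """s * x_proj = K [R|t] P_front: the projector modeled as a reverse camera."""
    p = T_front_to_proj[:3, :3] @ p_front + T_front_to_proj[:3, 3]
    s = K @ p
    return s[:2] / s[2]
```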
• 39. Proposed System • Rear camera tracks the user's viewpoint E_user ▪ Set up the perspective matrix A_user from E_user • Configure the perspective equation: A_upr = A_user T_world→rear • First-pass: render the off-screen scene to a texture, projTex, with a FrameBuffer Object System Flow TOMIOKA M., IKEDA S., SATO K.: Approximated user perspective rendering in tablet-based augmented reality. In Mixed and Augmented Reality (ISMAR), 2013 IEEE International Symposium on (2013), IEEE, pp. 21–28. 39 Camera-Projector-Servo Registration User-Perspective Projection Projection Distortion Correction Global world coordinate Rear Camera (rear) - Fixed in world A_upr User Perspective (user) - Subordinate to rear
• 40. Proposed System • Set up the projective matrix A_upr^-1 • Configure the projective equation: s · A_upr^-1 · projCoord = P_geometry • Second-pass: map projTex onto the geometry with projCoord ▪ Display the user-perspective rendering from the projector's viewpoint System Flow 40 Camera-Projector-Servo Registration User-Perspective Projection Projection Distortion Correction Global world coordinate Pan/tilt Servos (servo) - Rotate in world User Perspective (user) - Subordinate to rear Rear Camera (rear) - Fixed in world Front Camera (front) - Changes w.r.t servos Projector (proj) - Subordinate to front
• 41. Evaluation • Geometric distortion correction results of XY grid images, before & after Implementation Result 41 (a) Pan = 0°, Tilt = 0° (b) Pan = 0°, Tilt = 45° (c) Pan = -45°, Tilt = 45° Surface distortion / Distortion corrected
• 42. • To quantify the image degradation, the MirageTable [1] authors… ▪ computed Root Mean Square (RMS) errors of absolute intensity differences of corrected projections, pixel-by-pixel ▪ varying color conditions yield substantially greater, irrelevant RMS errors • Instead we… (1/N) Σ_{i=1..N} ‖P_base − P_i‖₂², where N is the number of corners ▪ compute differences in structural compositions, point-by-point ▪ set up the base image by projecting a checkerboard on a planar surface ▪ place different obstacles and correct the projection output accordingly ▪ match corners of the checkerboards with the base image and compute dislocations Quantitative Metric Evaluation [1] Benko, Hrvoje, Ricardo Jota, and Andrew Wilson. "MirageTable: freehand interaction on a projected augmented reality tabletop." Proceedings of the SIGCHI conference on human factors in computing systems. ACM, 2012.
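The structural metric follows directly from the formula. A minimal sketch; corner pairs are assumed to be already matched by their checkerboard position:

```python
def mean_sq_dislocation(base_corners, corners):
    """(1/N) * sum_i ||P_base_i - P_i||_2^2 over N matched checkerboard
    corners: a structural metric on corner positions, not pixel intensities."""
    n = len(base_corners)
    return sum((bx - x) ** 2 + (by - y) ** 2
               for (bx, by), (x, y) in zip(base_corners, corners)) / n
```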
• 43. • Distortion correction works well on ▪ (d), (e) complex geometry ▪ (f) a freely deformable object Results Evaluation Sarbolandi, Hamed, Damien Lefloch, and Andreas Kolb. "Kinect range sensing: Structured-light versus time-of-flight kinect." Computer vision and image understanding 139 (2015): 1-20. (a) Base (b) Drawer (14.98) (c) Heater (10.97) (d) Panels (6.28) (e) Box (6.45) (f) Plastic (5.21)
• 44. • Highest dislocation errors on ▪ (b) rigid geometry ▪ (c) curved geometry Results Evaluation Sarbolandi, Hamed, Damien Lefloch, and Andreas Kolb. "Kinect range sensing: Structured-light versus time-of-flight kinect." Computer vision and image understanding 139 (2015): 1-20. (a) Base (b) Drawer (14.98) (c) Heater (10.97) (d) Panels (6.28) (e) Box (6.45) (f) Plastic (5.21)
• 45. • Slopes in the red dotted region … ▪ are almost parallel to the depth camera ▪ result in inaccurate depth data acquisition ▪ the current system cannot cope with inherently invalid values • "KinectToF affected by a low incident angle (angles below 10°), detects the corrupted pixels resulting in extreme range errors up to 800 mm." • "For incident angles between 10° and 30° the KinectToF delivers up to 100% invalid pixels." Limitation Evaluation Sarbolandi, Hamed, Damien Lefloch, and Andreas Kolb. "Kinect range sensing: Structured-light versus time-of-flight kinect." Computer vision and image understanding 139 (2015): 1-20. (a) Base (b) Drawer (14.98) (c) Heater (10.97)
  • 46. DYNAMIC PERSPECTIVE PROJECTION MAPPING R. Camera User tracking User viewpoint User interaction AR scene rendering Pan-tilt platform Projector F. Camera Surface geometry 4. Scene reconstruction 2. Distortion correction Texture generation Projective mapping 3. Spatial projection AR 1. Rotation axis calibration 5. Spatial manipulation “PPAP: Perspective Projection Augment Platform with Dynamic Control of Steerable Pan-Tilt System” In submission
• 47. Motivation • Cues for Perceiving Spatial Relationships Spatial Perception in CGI Grossman, Tovi, and Ravin Balakrishnan. "An evaluation of depth perception on volumetric displays." Proceedings of the working conference on Advanced visual interfaces. ACM, 2006. Wanger, Leonard R., James Ferwerda, and Donald P. Greenberg. "Perceiving spatial relationships in computer-generated images." IEEE Computer Graphics and Applications 12.3 (1992): 44-58. Benko, Hrvoje, Andrew D. Wilson, and Federico Zannier. "Dyadic projected spatial augmented reality." Proceedings of the 27th annual ACM symposium on User interface software and technology. ACM, 2014. 47 • Perspective projection ▪ Render with projective geometry ▪ Gives a perspective • Shading and Shadow ▪ Shadow from object to surface ▪ Gives positional information ▪ Both are basics of graphics rendering; feel off when done wrong, but there's not much to add • Stereopsis ▪ Using the difference in vision ▪ Gives a sense of depth ▪ Degrades rendering quality; needs 3D glasses, so not instrument-free for the user • Motion parallax ▪ Change in position with motion ▪ Gives spatial relations ▪ Perception of 3D in far distance; works in monocular imagery; a good fit with the pan-tilt mechanism → Our focus in monocular projection
• 48. System Description • (a) The user's position and perspective are tracked by the rear camera (gray arrow). • (b) As the user inspects the virtual object (green circle), the user's line-of-sight (blue arrow) is ray-casted to locate its view center on the surface geometry (green rectangle). The view center is projected (yellow arrow) on the projector's image domain, and appropriate pan and tilt angles (α, β) are computed. • (c) The projector rotates to match its projection center and the user's view center to render more parts of the virtual object. Servo Control with User Perspective 48
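With the view center expressed in the projector's pre-rotation frame, the required pan and tilt follow from simple trigonometry. An illustrative closed form assuming z forward and pan about the vertical axis; the thesis formulates the same alignment as a displacement minimization in the image domain:

```python
import math

def pan_tilt_to_center(view_center):
    """Pan/tilt angles that bring `view_center` (x, y, z in the projector's
    pre-rotation frame, z forward) onto the projection center."""
    x, y, z = view_center
    pan = math.atan2(x, z)                     # rotate about the vertical axis first
    tilt = math.atan2(y, math.hypot(x, z))     # then about the horizontal axis
    return pan, tilt
```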
• 49. System Description • In dynamic perspective projection, the projector can rotate itself using the pan-tilt servo motors • The pan-tilt projector automatically rotates to keep focused on the perspective-mapped surface from the user's point of view • Therefore, the dynamic perspective projection system can greatly expand the viewing angle compared to static perspective projection, providing a more immersive augmented reality experience to the user Servo Control with User Perspective 49 < Static Perspective Projection > < Dynamic Perspective Projection >
  • 50. System Description • The user’s perspective is tracked by the rear camera and ray-casted to determine its view center on the surface in the reference coordinate. • The view center coordinate is sequentially transformed to the front camera and the projector coordinates. • Finally, the coordinate is projected onto the image domain of the projector to determine the corresponding point in the projection texture. Dynamic Projection with Perspective Mapping 50
• 51. System Description • The pan-tilt platform is rotated to match the view center and the projection center, so that the user can perceive the augmented content over a wider viewing angle. • Given the user's viewpoint, the goal is to find the rotation angles α and β that minimize the following displacement error in the image domain: Dynamic Projection with Perspective Mapping 51 Servo Rotation Control Projection Viewpoint Control
• 52. User Experiments • H1. The participants are able to perceive spatial representation of the virtual objects, projected by the proposed system. • H2. The participants are able to rate the size and distance more correctly when they can view a wider range of the augmented object. • H3. A greater distance between the virtual object and the projected wall surface negatively affects the presence perceived by the participants. Main Hypotheses 52 < Size and Distance configuration > < Projection configuration – Static vs. Dynamic >
  • 53. User Experiments • Investigated other factors that can affect the experiment results: Gender, Gaming, Driving • Participants were asked to rate the size and distance of the virtual objects with ▪ three different sizes (30 cm, 40 cm and 50 cm radii) ▪ three different distances (1.6 m, 2.0 m and 2.4 m away) ▪ two different conditions (static projection vs dynamic projection) Experiment Design 53 < Size and Distance configuration > < Projection configuration – Static vs. Dynamic >
• 54. User Experiments • 11 participants aged from 25 to 31 (M = 27.1 years, SD = 1.8 years). • 18 configurations (Size (3) x Distance (3) x Condition (2)) with 3 trials => Total 54 ratings per participant • Within-subject design -> the presentation order of static and dynamic conditions was counterbalanced • 15 seconds time limit per configuration / once the time elapsed, the projection was turned off Experiment Procedure 54 < Size and Distance configuration > < Projection configuration – Static vs. Dynamic >
  • 55. Results and Analysis • The participants were able to correctly rate 61.1% of size variations and 68.7% of distance variations • The participants were able to correctly rate 47.0% of both size & distance combinations (Overall correctness) • A naïve guess -> 11.1% chance of being correct (1 out of 9 size*distance combinations) • H1. The participants are able to perceive spatial representation of the virtual objects, projected by the proposed system. => Accepted Hypothesis H1 55
  • 56. Results and Analysis • Only 4.7% of the total ratings missed by more than one option, such as mistaking a “near” distance as “far”. • Encoded participants’ responses into a binary scale, either “correct” or “incorrect” ▪ allows us to analyze the performance results using binomial regression ▪ simplifies the interpretation and analysis of effective factors on spatial presence and perception (H2, H3) • Within-subject design + categorical responses -> the repeated measures logistic regression ▪ analyzed using Generalized Estimating Equations (GEE) with IBM SPSS Statistics software ▪ used to estimate the parameters of a model with a possible unknown correlation between outcomes • Wald Chi-Square test -> to find the correlation between categorical factors • Three variables as predictors: Size, Distance and Condition w/ binary correctness • Significance level as α = 0.05 (literature standard) ▪ “Assuming that the null hypothesis is true, we may reject the null only if the observed data are so unusual that they would have occurred by chance at most 5 % of the time.“ ▪ “α sets the standard for how extreme the data must be before we can reject the null hypothesis.” ▪ “The p-value indicates how extreme the data are.” -> reject the null hypothesis, if p <= 0.05. Hypothesis H2 & H3 Dr. Janxin Leu, “What is the difference between an alpha level and a p-value”, Fundamentals of Psychological Research, University of Washington 56
• 57. Results and Analysis • The condition (static or dynamic) was found to be highly significant (𝛘2 = 12.431, df = 1, p < .001) over size/distance ▪ 34.3% correct in the static condition ▪ 59.6% correct in the dynamic condition • H2. The participants are able to rate the size and distance more correctly when they can view a wider range of the augmented object. => Accepted • Participants reported that the increased viewing angle (150° for the dynamic and 102° for the static) helped them feel more comfortable and immersed, making it easier to deduce the spatial relationship of the virtual object. Hypothesis H2 57
• 58. Results and Analysis • The distance was found to be statistically not significant (𝛘2 = 5.213, df = 2, p = .074) at first • The result was quite contradictory, as several studies reported that ▪ the quality of the projection degrades as it is distanced from the projection surface ▪ and thus users find a virtual object less present Hypothesis H3 58
• 59. Results and Analysis • Analyses on the separate subsets were found significant (𝛘2 = 5.992, df = 2, p = .049, in the dynamic condition) ▪ the projector FOV is fixed, thus the visible area is limited depending on the projection distance ▪ the dynamic projection rotates the projector to match the user's view, increasing the effective FOV of the projector ▪ Increased viewing angle -> more virtual parallax -> more immersion and presence in AR • H3. A greater distance between the virtual object and the projected wall surface negatively affects the presence perceived by the participants. => Accepted. Hypothesis H3 59
  • 60. PAN-TILT POINT CLOUD REGISTRATION R. Camera User tracking User viewpoint User interaction AR scene rendering Pan-tilt platform Projector F. Camera Surface geometry 4. Scene reconstruction 2. Distortion correction Texture generation Projective mapping 3. Spatial projection AR 1. Rotation axis calibration 5. Spatial manipulation “Fast and Accurate Reconstruction of Pan-Tilt RGB-D Scans via Axis Bound Registration” In submission
  • 61. Motivation • Spatial Mapping for Pervasive AR • SLAM/ICP-based registration • Calibration-based registration Spatial Mapping for Pervasive AR Wilson, Andrew, et al. "Steerable augmented reality with the beamatron." Proceedings of the 25th annual ACM symposium on User interface software and technology. ACM, 2012. Fender, Andreas Rene, Hrvoje Benko, and Andy Wilson. "MeetAlive: Room-Scale Omni-Directional Display System for Multi-User Content and Control Sharing." Proceedings of the 2017 ACM International Conference on Interactive Surfaces and Spaces. ACM, 2017. https://msrstudio99.wordpress.com/2016/01/25/a-panorama-of-skies/ 61
• 62. Motivation • Iterative Closest Point (point-to-point) ▪ a method that iteratively disregards outliers in order to improve upon the previous estimate of the rotation and translation parameters • (a) obtains point correspondences by searching for the nearest neighbor target point for each source point • (b) estimates the rotation R and translation t by minimizing the squared distance between the corresponding pairs • ICP then iteratively solves (a) and (b) to improve upon the estimates of the previous iterations General ICP Formulation Bellekens, Ben, et al. "A survey of rigid 3D pointcloud registration algorithms." AMBIENT 2014: the Fourth International Conference on Ambient Computing, Applications, Services and Technologies, August 24-28, 2014, Rome, Italy. 2014. 62 (a) (b) Uncertain point correspondences induce the iterative nature, which leads to high computational cost
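Steps (a) and (b) can be made concrete as a minimal point-to-point ICP: brute-force nearest-neighbour matching plus the closed-form SVD (Kabsch) rigid transform, iterated. This sketch uses O(N·M) matching and no outlier handling; real systems add k-d trees and rejection, which is exactly the cost the slide points at:

```python
import numpy as np

def icp(src, dst, iters=20):
    """Minimal point-to-point ICP returning R, t with dst ~ src @ R.T + t."""
    R, t = np.eye(3), np.zeros(3)
    for _ in range(iters):
        cur = src @ R.T + t
        # (a) correspondences: nearest dst point for every source point
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        nn = dst[d2.argmin(axis=1)]
        # (b) best rigid transform between the matched, centred sets (Kabsch)
        mu_s, mu_d = cur.mean(0), nn.mean(0)
        U, _, Vt = np.linalg.svd((cur - mu_s).T @ (nn - mu_d))
        Ri = Vt.T @ U.T
        if np.linalg.det(Ri) < 0:               # guard against reflections
            Vt[-1] *= -1
            Ri = Vt.T @ U.T
        ti = mu_d - Ri @ mu_s
        R, t = Ri @ R, Ri @ t + ti              # compose with previous estimate
    return R, t
```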
• 63. Motivation • Formulating a light-weight solution for indoor scene registration by … 1. Utilizing pan-tilt rotation trajectories as the visual odometry ▪ Initial rough estimation of the camera pose ▪ Rotation trajectory-based matching and rejection 2. Dividing into and alternatingly optimizing sub-problems ▪ Pairwise axis-bound transform estimation Rotation axis calibration 63 Tilt rotation Pan rotation Target Point Axis Bound Registration
  • 64. Axis Bound Registration • Offline calibration • Online registration ▪ Global registration ▪ Local registration Overview of the pipeline 64 Rotation axis calibration* Pulse-width mapping* 1. Initial global pose estimation 2. Trajectory-based feature rejection Input RGB frame Input depth frame 3D keypoint cloud generation Descriptor-based feature matching 3. Corresponding point pairs check 4. Pairwise axis bound transform Final local registration matrix *offline process RGB-D camera Pan-tilt servos
  • 65. Axis Bound Registration • Servo motors … ▪ use potentiometers to orient themselves to certain directions ▪ rotate a certain angle that is linearly proportional to the applied pulse width • The camera pose can be roughly estimated based on the servo control • Rotation axis calibration yields the global transform and pulse-width mapping 1. Initial global pose estimation 65
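The (assumed) linear pulse-width-to-angle response can be recovered with an ordinary least-squares line fit over a few measured (pulse, angle) pairs. A stdlib-only sketch; the pulse values in the test are typical hobby-servo numbers, not the thesis calibration data:

```python
def fit_pulse_width_map(pulses, angles):
    """Least-squares fit of angle = m * pulse + b over measured pairs."""
    n = len(pulses)
    sp, sa = sum(pulses), sum(angles)
    spp = sum(p * p for p in pulses)
    spa = sum(p * a for p, a in zip(pulses, angles))
    m = (n * spa - sp * sa) / (n * spp - sp * sp)
    b = (sa - m * sp) / n
    return m, b

def angle_to_pulse(angle, m, b):
    """Invert the map to get the pulse width commanding a desired angle."""
    return (angle - b) / m
```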
  • 66. Axis Bound Registration • Trajectory constraint is applied to reject outlier matching pairs between two proximate frames ▪ (a) Keypoints are extracted from the color images and matched using feature descriptors ▪ (b) Keypoint pairs are un-projected to 3D points in global space, represented with halos. ▪ (c) Keypoint pairs whose distance is below the threshold are accepted as correspondence pairs. ▪ (d) Keypoint pairs whose distance is above the threshold are rejected as outlier matches. 2. Trajectory-based feature rejection 66
• 67. Axis Bound Registration • Outliers are rejected if the distance in global coordinates between the estimated pair points exceeds a certain threshold (e.g., 100 mm). • Notice that the removed keypoint pairs … ▪ are matched in diagonal directions (opposed to the camera movement) ▪ have weak correspondences (erroneous depth data around edges and on black/reflective surfaces) 3. Corresponding point pairs check 67 Initial matching of 364 keypoint pairs (based on descriptor distance). Outlier rejection of 97 keypoint pairs (based on trajectory distance). Final correspondence result of 267 keypoint pairs.
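The trajectory-based check reduces to a distance threshold in the global frame. A sketch where `pose_a`/`pose_b` stand in for the calibrated pan-tilt pose estimates of the two frames (any callable mapping a local 3D point to global coordinates); the 100 mm default mirrors the threshold quoted above:

```python
import math

def reject_by_trajectory(pairs, pose_a, pose_b, threshold=100.0):
    """Keep matched keypoint pairs whose mapped global-space distance is
    within `threshold` (mm); the rest are rejected as outlier matches."""
    kept = []
    for p, q in pairs:
        if math.dist(pose_a(p), pose_b(q)) <= threshold:
            kept.append((p, q))
    return kept
```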
• 68. Axis Bound Registration • Rotation axes … ▪ reduce the 6-DoF [R|t] transform estimation problem to two rotation variables α, β ▪ serve as a constraint for efficient and robust local registration 4. Pairwise axis bound transform 68 Alternating optimization Sub-problem division SVD SVD
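Once an axis is known, each sub-problem has a closed form: the best rotation angle about a fixed unit axis n aligning matched points maximizes Σ q·R(n,θ)p, giving θ = atan2(Σ n·(p⊥×q⊥), Σ p⊥·q⊥). A numpy sketch of this 1-DoF solve (axis through the origin, translation omitted); the alternating pan/tilt optimization would call it per axis:

```python
import numpy as np

def axis_bound_angle(src, dst, axis):
    """Best rotation angle about a known unit axis aligning src to dst."""
    n = np.asarray(axis, float)
    p_perp = src - (src @ n)[:, None] * n      # components along n are invariant
    q_perp = dst - (dst @ n)[:, None] * n
    s = np.sum(np.cross(p_perp, q_perp) @ n)   # sum of n . (p x q)
    c = np.sum(p_perp * q_perp)                # sum of p . q in the rotation plane
    return np.arctan2(s, c)

def rotate_about_axis(pts, axis, theta):
    """Rodrigues rotation of points about a unit axis through the origin."""
    n = np.asarray(axis, float)
    return (pts * np.cos(theta)
            + np.cross(n, pts) * np.sin(theta)
            + np.outer(pts @ n, n) * (1 - np.cos(theta)))
```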
• 69. Evaluation • No public dataset is available for pan-tilt RGB-D scan registration that includes rotation axis calibration data • Therefore the dataset was made in-house and used for the evaluation Pan-tilt registration dataset 69 (a) Scene 0 and 10 (13.6°) (b) Scene 0 and 20 (27.1°) (c) Scene 0 and 30 (40.7°) (d) Scene 39 and 45 (12.7°) (e) Scene 39 and 49 (21.2°) (f) Scene 39 and 55 (33.9°) (g) Scene 0 and 47 (27.1°/8.5°) (h) Scene 47 and 37 (23°/8.5°) Pan rotation dataset Pan & tilt rotation dataset Tilt rotation dataset
  • 70. Evaluation • Registration without ICP • Consecutive scenes 0~9 • Pan rotation ~12.2° • Fast – under 0.0002 sec Qualitative Result 70
• 71. Evaluation • RGBD-Calib [1], ORB-SLAM2 [2], FGR [3], and S4PCS [4] are compared with the proposed method. • The RMS Euclidean distance error of N closest point pairs is measured [1]. ▪ The shortest time to complete is highlighted in yellow. ▪ The lowest RMS error is highlighted in red. ▪ The algorithm with the shortest time and lowest RMSE is emboldened. • Out of 8 test cases, the proposed method was able to ▪ score the lowest error in 5 cases ▪ complete as the fastest in 5 cases ▪ and was both the fastest & most accurate in 3 cases Quantitative Result [1] Chi-Yi Tsai and Chih-Hung Huang. Indoor scene point cloud registration algorithm based on rgb-d camera calibration. Sensors, 17(8):1874, 2017. [2] Raúl Mur-Artal and Juan D. Tardós. ORB-SLAM2: an open-source SLAM system for monocular, stereo and RGB-D cameras. IEEE Transactions on Robotics, 33(5):1255–1262, 2017. [3] Qian-Yi Zhou, Jaesik Park, and Vladlen Koltun. Fast global registration. In European Conference on Computer Vision, pages 766–782. Springer, 2016. [4] Nicolas Mellado, Dror Aiger, and Niloy J Mitra. Super 4pcs fast global pointcloud registration via smart indexing. In Computer Graphics Forum, volume 33, pages 205–215. Wiley Online Library, 2014. 71
  • 72. PERVASIVE INTERFACE REGISTRATION R. Camera User tracking User viewpoint User interaction AR scene rendering Pan-tilt platform Projector F. Camera Surface geometry 4. Scene reconstruction 2. Distortion correction Texture generation Projective mapping 3. Spatial projection AR 1. Rotation axis calibration 5. Spatial manipulation “PRISM: Interactive Projection System for Pervasive Registration of Interface with Spatial Manipulation” Work in progress
  • 73. Motivation • Pervasive computing environment in MR / VR / AR • Applications and widgets in the form of windows or grids • Intuitive & Natural & Comfortable UI/UX for Pervasive Computing in Immersive AR Pervasive Computing & Immersive AR Microsoft HoloLens demo onstage at BUILD 2015, https://www.youtube.com/watch?v=3AADEqLIALk Introducing Oculus Dash, https://www.youtube.com/watch?v=SvP_RI_S-bw Fender, Andreas Rene, Hrvoje Benko, and Andy Wilson. "Meetalive: Room-scale omni-directional display system for multi-user content and control sharing." Proceedings of the 2017 ACM International Conference on Interactive Surfaces and Spaces. ACM, 2017. 73 < Oculus Dash > < Microsoft MeetAlive >< Microsoft HoloLens >
• 74. Project Overview • Implementation of a pervasive projection AR environment and spatial interaction ▪ 3D reconstruction of an indoor scene ▪ Tracking of the user's behavior and interaction ▪ Selection and proposal of projection-appropriate regions ▪ Window-based registration and mapping of applications and widgets ▪ Spatial interaction for manipulation and control in the AR environment PRISM: Pervasive Registration of Interface with Spatial Manipulation 74 (a) Spatial Interaction – Tracking (b) Spatial Interaction – Selection (c) Mobile Interaction – Cursor
  • 75. Implementation Detail 1. Axis Bound Registration for Indoor Scene Reconstruction 2. Hough Plane Detection for Projection Region Selection 3. Planar Grid Partitioning for Multi-Window Generation 4. Touch and Ray Casting based Window Manipulation PRISM: Pervasive Registration of Interface with Spatial Manipulation 75 Point Accumulation Coordinate polarization Distance check Hough transform Plane Formation Inlier check Region clustering Largest inner rectangle Scene Reconstruction Rotation axis calibration Pan-tilt RGB-D scan Axis-bound registration Window Generation XY grid projection Contour & boundary check Multi window partitioning
  • 76. Implementation Detail 1. Axis Bound Registration for Indoor Scene Reconstruction 2. Hough Plane Detection for Projection Region Selection 3. Planar Grid Partitioning for Multi-Window Generation 4. Touch and Ray Casting based Window Manipulation PRISM: Pervasive Registration of Interface with Spatial Manipulation 76 < Skeleton Ray Casting – Debug View> < Skeleton Ray Casting – User View>
  • 77. FURTHER RESEARCH & FUTURE PATH
  • 78. Summary • I have … 1. presented a novel concept of Pervasive AR space and its implementation 2. proposed a geometric distortion correction method to realize projection on any surface 3. showed that enhanced presence can be delivered in monocular views with dynamic perspective projection 4. proposed an inverse kinematics servo control method in conjunction with rotation axis calibration 5. proposed a fast and efficient method to register point clouds with pan-tilt rotation priors Projection Mapping and Augmented Reality for Pervasive AR Environment 78
  • 79. Deep Learning for Mixed Reality • Magic Leap Principal Engineer ’Tomasz Malisiewicz‘ ▪ Deep learning is going to be the most important ingredient for this mixed reality … ▪ because it is the "master algorithm" for working with perceptual data ▪ Virtual characters are positioned behind actual surfaces – one has to reason about the geometry, the ordering of the surfaces around the characters Further Research Tomasz Malisiewicz - Deep Learning for Augmented Reality Applications https://vimeo.com/229184982 79
• 80. Deep Learning for Mixed Reality • "Towards Pervasive Augmented Reality", TVCG 2017 ▪ Focusing on context-awareness in augmented reality ▪ Proposed and defined the concept of Pervasive Augmented Reality as context-awareness ▪ Comparison and contrast of various aspects of Conventional vs. Pervasive AR Further Research Grubert, Jens, et al. "Towards pervasive augmented reality: Context-awareness in augmented reality." IEEE transactions on visualization and computer graphics 23.6 (2017): 1706-1724. 80 Conventional AR -> Pervasive AR • Use: Sporadic -> Continuous • Control: User Controlled -> Context-Controlled • Applications: Specific or Niche -> Multi-Purpose • Hardware: General Purpose -> Tailored/Specific • Context of Use: Specific/Restricted -> Multi-Purpose/Adaptive/Aware • User Interface: Prototypical/No Standard/Obtrusive -> Subtle/Disappearing/Unobtrusive • Mode of Use: Task- or Goal-Oriented -> Context-Driven • Information Access: Information Overlay -> Information Augmentation • Information Visualization: Added -> Integrated/Embedded • Environment: Indoors OR Outdoors -> Indoors AND Outdoors • Flow of Information: User Seeking Information -> Information Seeking Users • Use of Device: One Size Fits All -> Individualized
  • 81. Context-aware Pervasive AR platform • Pervasive AR continuum in conjunction with context-awareness Overview the National Research Foundation of Korea(NRF) grant funded by the Korea government(MSIP) (No. NRF-2018R1A2A1A10055673). 81 Context-aware Pervasive AR Vision-based Artificial Intelligence Scene Understanding [ImageNet Classification, Krizhevsky, NIPS 2012] Context Awareness [Modeling 3d environments, Jiang, TPAMI 2016] Geometry Learning [Pointnet, Qi, CVPR 2017] < Context-aware Pervasive AR concept diagram > Real World Virtual World Tangible User Interface (TUI) Augmented Reality (AR) Augmented Virtuality (AV) Virtual Reality (VR) Spatial AR (SAR) Mobile AR (MAR) Mixed World [ Mixed Reality, Milgram, ITIS 1994 ] [ Spatial AR, Ramesh Raskar, ISMAR 2004 ] [ Tangible Bits, Hiroshi Ishii, SIG CHI 1997 ] [ A survey of augmented reality, Azuma, 1997 ] Wall Display & Pervasive Display [ Pervasive Display Applications , Strohbach, M. Martin, 2011 ] Immersive Display [ Immersive displays , Lantz, Ed., ACM SIGGRAPH, 1996 ] Projection AR (stationary) [ Illumiroom, H. Benko, ACM CHI 2013 ] Pervasive AR [ Mobile AR, Hollerer, Telegeoinformatics, 2004 ] [ Mixed Reality, Milgram, ITIS 1994 ] [ Virtual Reality : CAVE , Cruz-Neira, SIGGRAPH, 1993 ] Natural User Interface (NUI) [ Natural User Interface, Petersen, ISMAR 2009] < Pervasive AR concept diagram >
• 82. Context-aware Pervasive AR platform • Provide in-situ information/content/UI in accordance with user intent/situation by integrating a projection augmented reality system and context-aware technology • Provide multi-modal interactions in a context-aware Pervasive AR space, enabling seamless interaction with the system • Design and implement applications that can be used instantly in smart environments without any prior knowledge Overview the National Research Foundation of Korea(NRF) grant funded by the Korea government(MSIP) (No. NRF-2018R1A2A1A10055673). 82 Spatial mapping RGB-D camera (Input) Projection AR (Output) Adaptive Interface Contextual Information In-situ Content … User #1 Standing Position: (x,y,z) User recognition Scene understanding Context-awareness Environment w/o prior knowledge User Real-world Information/Content/UI User Pervasive AR environment with in-situ info/content/UI via context analysis Interaction
  • 83. Consumer Robotics with Pervasive AR • Personal assistant for smart home with AI & AR ▪ Clova AI device + projection AR system -> R2-D2? Future Path Gugenheimer, Jan, et al. "Ubibeam: exploring the interaction space for home deployed projector-camera systems." Human-Computer Interaction. Springer, Cham, 2015. 83
  • 84. Consumer Robotics with Pervasive AR • ADAS system with AR-enhanced situational awareness ▪ AHEAD device + projection AR system -> Mercedes-Maybach's Digital Light? Future Path https://newatlas.com/mercedes-maybach-digital-light-smart-headlights/53678/ March 6th, 2018 https://www.naverlabs.com/storyDetail/69 2018.11.09 84
  • 85. THANK YOU FOR YOUR ATTENTION Jung-Hyun Byun junghyun.ian.byun@gmail.com