Point Cloud Stream on Spatial Mixed Reality: Toward Telepresence in Architectural Field
1. Point Cloud Stream on
Spatial Mixed Reality
Graduate School of Engineering
Osaka University
Tomohiro Fukuda
Yuehan Zhu, Nobuyoshi Yabuki
Sep. 20, 2018
eCAADe2018, Łódź, Poland
Toward Telepresence in Architectural Field
2. Outline
1. Introduction
2. PcsMR: Development of Point Cloud Stream
on Mixed Reality
3. Experiments and Results
4. Conclusions and Future Work
3. 3D Remote Meeting
Designer @ Chiba | Design stakeholders @ Osaka | Design stakeholders @ Heidelberg
Sun, L., Fukuda, T. and Resch, B. (2014). A synchronous distributed cloud-based virtual reality meeting system for architectural and urban design, Frontiers of Architectural Research, 3(4), 348-357.
• Sharing 3D virtual models is as necessary as sharing
appearances and voices of meeting participants
• System development and pilot projects have been attempted
• Still insufficient compared with a face-to-face meeting
Sacrifice time
Incur cost
CO2 emissions
4. Mixed Reality
• Mixed Reality (MR) enables 3D presentation of otherwise invisible
information, merging the physical and virtual worlds.
• One MR method is telepresence, which is expected to give
people a way to communicate remotely as if face to face in
a realistic way.
Real
Env.
Augmented
Reality (AR)
Augmented
Virtuality (AV)
Virtual
Env. = VR
Mixed Reality (MR)
Milgram and Kishino (1994): Simplified representation of a "virtuality continuum"
Yuehan Zhu, Tomohiro Fukuda and Nobuyoshi Yabuki: SLAM-Based MR with Animated CFD for Building Design Simulation, CAADRIA 2018, 391-400, 2018.5.
5. Objective
• To explore the applicability of a spatial MR that displays point
cloud streams (PCS) to realize 3D remote meetings in the
architecture and urban fields.
• PCS: acquire point clouds in real time using an RGB camera and
depth sensor (RGB-D camera)
• Previous studies: state-of-the-art MR systems have attempted
telepresence remote meetings [Beck et al. 2013; Orts-Escolano et al.
2016].
• There is not yet sufficient research on MR applicability in the
architecture and urban fields.
Beck, S, Kunert, A, Kulik, A and Froehlich, B 2013, ’Immersive group-to-group telepresence’, Visualization and Computer Graphics, IEEE Transactions, 19(4),
616-625
Orts-Escolano, S, Rhemann, C, Fanello, S, Chang, W, Kowdle, A, Degtyarev, Y, Kim, D, Davidson, PL, Khamis, S, Dou, M, Tankovich, V, Loop, C, Cai, Q,
Chou, PA, Mennicken, S, Valentin, J, Pradeep, V, Wang, S, Kang, SB, Kohli, P, Lutchyn, Y, Keskin, C and Izadi, S 2016, ’Holoportation: Virtual 3D
Teleportation in Realtime’, Proc. of the 29th Annual Symposium on User Interface Software and Technology, 741-754.
6. Outline
1. Introduction
2. PcsMR: Development of Point Cloud Stream
on Mixed Reality
3. Experiments and Results
4. Conclusions and Future Work
7. PcsMR (Point Cloud Stream on Mixed Reality)
System configuration: Simple and inexpensive
Generating PCS
Kinect for Windows v2
• Range: 0.5-8.0m
• Angle: H 70°, V 60°
Transferring PCS
PC (Windows10)
Router
• IEEE802.11ac (867Mbps)
• IEEE802.11n (300Mbps)
RGB-D camera
PC
USB
W-LAN
Router
Optical See-through HMD
1000Base-T/TX
Place A Place B
Rendering PCS
HoloLens
8. PcsMR-2
Algorithm -1
RGB-D camera
USB
Router
1000Base-T/TX
Generating PCS
• 30fps
• Transferring to the
PC in real time
Transferring PCS
• Developed a system to receive the PCS using the Kinect SDK
• Unity3D adjusts the PCS to the correct place in the MR
scene: position, angle, scale
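The placement step above (position, angle, scale) amounts to a similarity transform applied to every point before rendering. A minimal sketch in Python, assuming a single rotation about the vertical axis; the function names are illustrative and not part of the PcsMR implementation:

```python
import math

def place_point(p, position, yaw_deg=0.0, scale=1.0):
    """Scale a point (x, y, z), rotate it about the vertical (y)
    axis by yaw_deg, then translate it to `position` -- the kind
    of adjustment used to anchor the stream in the MR scene."""
    x, y, z = (scale * c for c in p)
    t = math.radians(yaw_deg)
    cos_t, sin_t = math.cos(t), math.sin(t)
    # Rotation about y: x' = x*cos + z*sin, z' = -x*sin + z*cos
    xr = x * cos_t + z * sin_t
    zr = -x * sin_t + z * cos_t
    px, py, pz = position
    return (xr + px, y + py, zr + pz)

def place_points(points, position, yaw_deg=0.0, scale=1.0):
    """Apply the same placement to a whole point cloud frame."""
    return [place_point(p, position, yaw_deg, scale) for p in points]
```

In practice this would be done through Unity3D's Transform component; the sketch only illustrates the underlying math.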
Kinect SDK functions used for PcsMR
• GetDefaultKinectSensor(IKinectSensor): Get the current Kinect
• IKinectSensor::open(): Operate the Kinect
• IKinectSensor::get_CoordinateMapper: Get the coordinate converter
• IKinectSensor::get_*FrameSource: Get the 3D point source, frame by frame
• I*FrameSource::OpenReader(I*FrameReader *): Create a frame reader for each source
• I*FrameReader::AcquireLatestFrame(I*Frame *): Request a new data frame in the main loop
Rendering PCS
Optical See-through HMD
W-LAN
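The slides do not specify the wire format used between the PC and the HoloLens, but the "Transferring PCS" step implies serializing each colored point frame for the W-LAN. A hypothetical sketch, assuming float32 coordinates plus one byte per color channel:

```python
import struct

POINT_FMT = "<fffBBB"  # x, y, z as float32, then r, g, b bytes
POINT_SIZE = struct.calcsize(POINT_FMT)  # 15 bytes per point

def pack_frame(points):
    """Serialize (x, y, z, r, g, b) points into one frame:
    a 4-byte little-endian point count, then the packed points."""
    buf = bytearray(struct.pack("<I", len(points)))
    for p in points:
        buf += struct.pack(POINT_FMT, *p)
    return bytes(buf)

def unpack_frame(data):
    """Inverse of pack_frame: recover the list of points."""
    (n,) = struct.unpack_from("<I", data, 0)
    points, off = [], 4
    for _ in range(n):
        points.append(struct.unpack_from(POINT_FMT, data, off))
        off += POINT_SIZE
    return points
```

A compact per-point payload like this (15 bytes here) is what makes streaming at 30 fps over the W-LAN plausible.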
9. PcsMR-3
• If the HoloLens displays the entire PCS during the MR rendering process,
unnecessary point clouds appear as noise.
• A highly granular PCS also results in a large data volume, making
real-time rendering difficult.
• To reduce noise, we implemented an algorithm from Kowalski et al. (2015):
• Find the n neighboring points of each point (target point) in the PCS.
• If the distance between the target point and a neighboring point is
greater than the threshold t, the neighboring point is detected as an
outlier (noise).
Kowalski, M., Naruniec, J. and Daniluk, M. (2015). LiveScan3D: A Fast and Inexpensive 3D Data Acquisition System for Multiple Kinect v2 Sensors, In 3D Vision, 2015
International Conference on, Lyon, France.
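The neighbour test above can be prototyped in a few lines. This brute-force O(N²) version is only a sketch of one reading of the rule (dropping a point when its neighbourhood is too sparse); LiveScan3D uses an efficient nearest-neighbour search, and the default `n` and `t` values here are arbitrary:

```python
import math

def denoise(points, n=4, t=0.05):
    """Keep a point only if all of its n nearest neighbours lie
    within distance t; otherwise treat it as an outlier and drop
    it. Brute force O(N^2) -- fine for a sketch, too slow for a
    30 fps stream."""
    kept = []
    for i, p in enumerate(points):
        dists = sorted(math.dist(p, q) for j, q in enumerate(points) if j != i)
        if len(dists) >= n and dists[n - 1] <= t:
            kept.append(p)
    return kept
```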
Algorithm -2
RGB-D camera
USB
Router
1000Base-T/TX
Generating PCS | Transferring PCS | Rendering PCS
Optical See-through HMD
W-LAN
10. Outline
1. Introduction
2. PcsMR: Development of Point Cloud Stream
on Mixed Reality
3. Experiments and Results
• One Kinect and one HoloLens were used
• Audio (microphone and speaker): technically possible
but not implemented
• Router: IEEE 802.11n Wi-Fi standard
• PCS traffic between router and HoloLens: 4 MB/sec
4. Conclusions and Future Work
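A quick sanity check of the figures above (the conversion is my own arithmetic, not from the slides): 4 MB/sec is well within the nominal IEEE 802.11n rate, and at the Kinect's 30 fps it corresponds to roughly 140 KB per frame:

```python
TRAFFIC_MB_S = 4        # measured PCS traffic, router -> HoloLens
FPS = 30                # Kinect v2 frame rate
LINK_MBIT_S = 300       # nominal IEEE 802.11n throughput

traffic_mbit_s = TRAFFIC_MB_S * 8                     # 32 Mbit/s
bytes_per_frame = TRAFFIC_MB_S * 1024 * 1024 // FPS   # ~139,810 bytes
```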
11. PcsMR-1 MR with PCS of People and BIM
• Developed for an Osaka-U event for high school students, held on a
campus different from our laboratory's
• The students had never visited our laboratory; PcsMR-1 let them
experience the laboratory virtually and intuitively
• Generating the PCS of user A and synthesizing it with a laboratory BIM model
• User B experiences MR
PC
User A
(Creating PCS)
User B
(MR experience)
HoloLens
Kinect
12. PcsMR-1
Kinect
HoloLens
User A
(Creating PCS)
User B
(MR experience)
• User A sat in front of the Kinect and could move
• His appearance and actions were shown as if he were sitting in a chair
in the laboratory BIM model.
• User B could move freely with the HoloLens and explore the 3D spatial
MR laboratory where the tele-presented user A was seated
13. PcsMR-1: Observation
• User A reacted to user B, for example by waving his hands.
• User B experienced the laboratory environment, which felt more realistic
in MR because of the tele-presented user A in the unfamiliar laboratory
Real World
(Event Site)
Point Cloud Stream
(L: Staff, R: User A)
BIM Model
(Laboratory)
14. PcsMR-2 MR with PCS of People and Model
• PcsMR-2 generated both user A and a city model (1:500 scale) as a PCS.
• User B, who was in a remote place, wore a HoloLens to observe user A's
presentation and handling of the 3D model.
• Because the city model was placed on a desk, another desk was also placed
beside user B
• Positions were aligned as if the scale model were on this desk.
Kinect
PC
User A and scale model
(Creating point cloud stream)
Display
User B
(MR experience)
HoloLens
15. PcsMR-2 Setup
RGB-D image captured by Kinect
(inversion condition)
Point Cloud Stream on Game Engine
(Unity 3D)
16. PcsMR-2 View from MR user
• User B could move freely about the meeting room and confirm
user A's movement and the scale model through the PCS.
• When user B moved, the appearance of user A and the scale
model changed with user B's viewpoint in real time, which is
impossible with a 2D display such as a video conferencing system.
17. Found Problems
1. Adapting to WAN
• The two experiments could be carried out only over a LAN.
• We had planned for PcsMR-1 to be constructed over a WAN, with
the laboratory modeled both by BIM and by PCS, because PCS can
realistically represent features of the laboratory's current
state that are hard to express using BIM.
• However, PCS transfer experienced large latency over the WAN
environment.
2. Improving PCS granularity and expanding the capture
area
• Range: 0.5-8.0 m | Angle: H 70°, V 60°
3. Synchronizing multiple Kinects and PCs
• Only one Kinect can connect per PC.
• To acquire a PCS from many directions, synchronizing
multiple Kinects and PCs is necessary.
18. Outline
1. Introduction
2. PcsMR: Development of Point Cloud Stream
on Mixed Reality
3. Experiments and Results
4. Conclusions and Future Work
19. Conclusions
• Toward telepresence in the
architectural field, we
developed the PcsMR
system, which displays 3D
point cloud streams.
• PcsMR consists of a Kinect, a PC, a router, and a
HoloLens, and uses a noise-elimination algorithm for the
point cloud stream.
• Application possibilities include meetings and
communications that share 3D objects in real time,
including the movement of users and objects,
remotely and synchronously.
20. Future Tasks
• Adapting PcsMR to WAN and Internet
environments
• Improving point cloud stream granularity and
expanding the acquisition range.
Acknowledgement
• This research was partly supported by a research grant
from the Nohmura Foundation for Membrane Structure's
Technology and by JSPS KAKENHI Grant Number
JP16K00707.