This slide deck was presented at CAADRIA 2012 (The 17th International Conference on Computer-Aided Architectural Design Research in Asia).
Abstract. This research presents the development of a sensor-oriented mobile AR system for urban landscape assessment that realizes geometric consistency using the GPS, gyroscope, and video camera mounted in a smartphone. A low-cost AR system with high flexibility is realized. Consistency between the viewing angle of the video camera and the CG virtual camera, and geometric consistency between the video image and 3DCG, are verified. In conclusion, the proposed system was evaluated as feasible and effective.
SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT
1. CAADRIA2012, Chennai, India
SOAR
SENSOR ORIENTED MOBILE
AUGMENTED REALITY FOR URBAN
LANDSCAPE ASSESSMENT
TOMOHIRO FUKUDA, TIAN ZHANG, AYAKO SHIMIZU,
MASAHARU TAGUCHI, LEI SUN and NOBUYOSHI YABUKI
Division of Sustainable Energy and Environmental Engineering,
Graduate School of Engineering,
Osaka University, Japan
2. Contents
1. Introduction
2. System Development
 2.1 Development Environment of a System
 2.2 System Flow
3. Verification of System
 3.1 Consistency between the viewing angle of a video camera and a CG virtual camera
 3.2 Accuracy of geometric consistency between a video image and 3DCG
4. Conclusion
4. 1.1 Motivation -1 (1. Introduction)
In recent years, the need for landscape simulation has been growing. Review meetings on a future landscape are now carried out on the planned construction site, in addition to in a conference room.
It is difficult for stakeholders to concretely imagine a three-dimensional object that does not yet exist. Landscape visualization methods using Computer Graphics (CG) and Virtual Reality (VR) have therefore been developed.
However, these methods require much time and expense to build a 3D model. Moreover, since consistency with real space is not achieved when VR is used on a planned construction site, a reviewer cannot get an immersive experience.
Figures: a landscape study on site; VR capture of Kobe city.
5. 1.1 Motivation -2 (1. Introduction)
In this research, the authors focus on Augmented Reality (AR), which can superimpose 3DCG on an actual landscape acquired with a video camera. When AR is used, the landscape assessment object is placed directly within the present surroundings. Thereby, a drastic reduction in the time and expense of 3DCG modeling of the present surroundings can be expected.
Smartphones are now widely available on the consumer market.
Figures: Sekai Camera Web (http://sekaicamera.com/); smartphone market in Japan.
7. 1.2 Previous Study (1. Introduction)
2. Use of an artificial marker: since the artificial marker must always be visible to the AR camera, the movable range of the user is limited. Moreover, realizing high precision requires a large artificial marker.
Figure: artificial marker.
Yabuki, N., et al.: 2011, An invisible height evaluation system for building height regulation to preserve good landscapes using augmented reality, Automation in Construction, 20(3), 228-235.
8. 1.3 Aim (1. Introduction)
In this research, SOAR (Sensor Oriented Mobile AR), a system which realizes geometric consistency using the GPS, gyroscope, and video camera mounted in a smartphone, is developed. A low-cost AR system with high flexibility is thereby realizable.
10. 2. System Development
2.1 Development Environment of a System
Smartphone (standard spec): GALAPAGOS 003SH (Softbank Mobile Corp.)
Development Language: OpenGL ES (Ver. 2.0), Java (Ver. 1.6)
Development Environment: Eclipse Galileo (Ver. 3.5)
Location Estimation Technology: A-GPS (Assisted Global Positioning System)

Spec of 003SH:
- OS: Android 2.2
- CPU: Qualcomm MSM8255 Snapdragon 1 GHz
- Memory: ROM 1 GB, RAM 512 MB
- Weight: approx. 140 g
- Size: approx. W62 × H121 × D12 mm
- Display Size: 3.8 inch
- Resolution: 480 × 800 pixels
11. 2.2 System Flow -1 (2. System Development)
System flow:
1. Calibration of a video camera
2. Definition of landscape assessment 3DCG model
3. Activation of AR system
4. Selection of 3DCG model
5. Activation of GPS, gyroscope, and video camera
6. Position information acquisition; angle information acquisition; capture of live video image
7. Definition of position and angle information on the CG virtual camera
8. Superposition of the live video image and the 3DCG model
9. Display of AR image
10. Save of AR image

While the CG model realizes ideal rendering by the perspective drawing method, the rendering of a video camera produces distortion, so distortion calibration of the video camera is performed using Android NDK and OpenCV.
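The distortion that the calibration step removes is commonly described by the Brown radial distortion model, whose coefficients a camera calibration (e.g. with OpenCV) estimates. The sketch below is illustrative only, not the authors' implementation; the coefficients k1 and k2 are placeholder values.

```java
// Minimal sketch of the radial (Brown) distortion model that camera
// calibration typically estimates. Coefficients k1, k2 are illustrative
// placeholders, not values from the presented system.
public class RadialDistortion {
    // Map an undistorted normalized image point (xu, yu) to its
    // distorted position: x_d = x_u * (1 + k1*r^2 + k2*r^4).
    static double[] distort(double xu, double yu, double k1, double k2) {
        double r2 = xu * xu + yu * yu;
        double factor = 1.0 + k1 * r2 + k2 * r2 * r2;
        return new double[] { xu * factor, yu * factor };
    }

    public static void main(String[] args) {
        // A point at the image centre is unaffected by radial distortion.
        double[] centre = distort(0.0, 0.0, -0.2, 0.05);
        System.out.println(centre[0] + ", " + centre[1]); // 0.0, 0.0
        // Off-centre points are pulled inward for negative k1 (barrel
        // distortion), which is typical of small camera-phone lenses.
        double[] p = distort(0.5, 0.0, -0.2, 0.05);
        System.out.println(p[0]); // ~0.4766
    }
}
```

Undistorting a live frame applies the inverse of this mapping per pixel, which is what an OpenCV-based calibration pipeline does once the coefficients are known.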
12. 2.2 System Flow -2 (2. System Development)
(Flow diagram as in the previous slide.)
3DCG model: geometry, texture, unit.
3DCG model allocation file: 3DCG model name, file name, position data (longitude, latitude, orthometric height), degree of rotation angle, and zone number of the rectangular plane.
3DCG model arrangement information file: number of 3DCG model allocation files, and each file name.
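The fields of the allocation file can be modeled as a simple value class. The comma-separated layout below is an assumed example format for illustration; the slide lists only the fields, not the actual file syntax, and the model name, file name, and zone number in the example are hypothetical.

```java
// Hypothetical representation of one entry of the 3DCG model allocation
// file. The field list comes from the slide; the comma-separated layout
// and the example values are assumptions, not the system's actual format.
public class ModelAllocation {
    final String modelName, fileName;
    final double longitude, latitude, orthometricHeight, rotationDeg;
    final int zoneNumber; // zone of the rectangular plane coordinate system

    ModelAllocation(String csvLine) {
        String[] f = csvLine.split(",");
        modelName = f[0].trim();
        fileName = f[1].trim();
        longitude = Double.parseDouble(f[2].trim());
        latitude = Double.parseDouble(f[3].trim());
        orthometricHeight = Double.parseDouble(f[4].trim());
        rotationDeg = Double.parseDouble(f[5].trim());
        zoneNumber = Integer.parseInt(f[6].trim());
    }

    public static void main(String[] args) {
        ModelAllocation a = new ModelAllocation(
            "tower, tower.obj, 135.5207, 34.8230, 60.15, 0.0, 6");
        System.out.println(a.modelName + " in zone " + a.zoneNumber);
    }
}
```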
13. 2.2 System Flow -3 (2. System Development)
(Flow diagram as in the previous slides.)
Figure: GUI of the developed system.
14. 2.2 System Flow -4 (2. System Development)
(Flow diagram as in the previous slides.)
Figure: coordinate system of the developed AR system (yaw, pitch, roll).
15. 2.2 System Flow -4 (2. System Development)
(Flow diagram as in the previous slides.)
Position data is converted into the coordinates (x, y) of a rectangular plane. Orthometric height is computed by subtracting the geoid height from the ellipsoidal height.
The angle value of yaw points to magnetic north. In the AR system, in order to use a true-north value, the magnetic declination is acquired and corrected.
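The two corrections above can be sketched as follows. The geoid height and magnetic declination in the example are illustrative placeholders; in practice they come from geodetic survey data and a geomagnetic model rather than fixed constants.

```java
// Sketch of the two sensor corrections described on this slide:
// orthometric height from ellipsoidal height, and true-north yaw from
// magnetic-north yaw. Example geoid height and declination values are
// illustrative placeholders, not survey data.
public class SensorCorrection {
    // Orthometric height = ellipsoidal height - geoid height.
    static double orthometricHeight(double ellipsoidalHeightM, double geoidHeightM) {
        return ellipsoidalHeightM - geoidHeightM;
    }

    // Correct a magnetic-north yaw to true north by adding the magnetic
    // declination (positive east), normalized to [0, 360) degrees.
    static double trueNorthYaw(double magneticYawDeg, double declinationDeg) {
        double yaw = (magneticYawDeg + declinationDeg) % 360.0;
        return yaw < 0 ? yaw + 360.0 : yaw;
    }

    public static void main(String[] args) {
        System.out.println(orthometricHeight(90.0, 37.0)); // 53.0
        // Declination in western Japan is several degrees west (negative).
        System.out.println(trueNorthYaw(2.0, -7.0)); // 355.0
    }
}
```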
16. 2.2 System Flow -5 (2. System Development)
(Flow diagram as in the previous slides: superposition of the live video image and the 3DCG model, then display and save of the AR image.)
19. 3. Verification of System
3.2 Accuracy of geometric consistency with a video image and 3DCG
Known Building Target
▶ GSE Common East Building at Osaka University Suita Campus
▶ W 29.6 m, D 29.0 m, H 67.0 m
Figures: photo; outline drawing (labeled dimensions 29.6 m, 28.95 m, 64.8 m); outlined 3D model.
Position of the outlined 3D model (latitude, longitude, orthometric height): 34.823026944, 135.520751389, 60.15
20. 3. Verification of System
3.2 Accuracy of geometric consistency with a video image and 3DCG
Known Viewpoint Place
▶ No.14-563 reference point. The distance from the reference point to the center of the building was 203 m.
▶ The AR system was installed on a tripod at a height of 1.5 m.
Figure: building target with measuring points A, B, C, D of residual error, seen from the viewpoint (No.14-563 reference point) 203 m away.
Viewpoint position (latitude, longitude, orthometric height): 34.82145699, 135.519612, 53.1
21. 3. Verification of System
3.2 Accuracy of geometric consistency with a video image and 3DCG
Parameter Settings of Eight Experiments
(S: Static Value = known value; D: Dynamic Value = value acquired from a device)

Experiment | Latitude | Longitude | Altitude | yaw | pitch | roll
No.1       | S        | S         | S        | S   | S     | S
No.2       | D        | D         | D        | D   | D     | D
No.3       | D        | S         | S        | S   | S     | S
No.4       | S        | D         | S        | S   | S     | S
No.5       | S        | S         | D        | S   | S     | S
No.6       | S        | S         | S        | D   | S     | S
No.7       | S        | S         | S        | S   | D     | S
No.8       | S        | S         | S        | S   | S     | D

(Latitude/Longitude/Altitude: position information of the CG virtual camera; yaw/pitch/roll: angle information of the CG virtual camera.)
22. 3. Verification of System
3.2 Accuracy of geometric consistency with a video image and 3DCG
Calculation Procedure of Residual Error
1. Pixel Error: at each of the four measuring points, the difference between the live image and the CG model was measured in pixels in the horizontal and vertical directions (Δx, Δy).
Figure: calculation image of the residual error between the live video image and the CG model.
2. Distance Error: from the measured values (Δx, Δy), the differences in the horizontal and vertical directions were converted into meters by formulas (1) and (2), giving (ΔX, ΔY):
ΔX = Δx × W / x  (1)
ΔY = Δy × H / y  (2)
W: actual width of the object (m); H: actual height of the object (m); x: width of the 3DCG model on the AR image (px); y: height of the 3DCG model on the AR image (px)
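Formulas (1) and (2) scale a pixel offset by the ratio of the object's known real size to its on-screen size, converting the residual into meters. A minimal sketch, with illustrative on-screen sizes (the actual pixel dimensions of the building in the experiment are not given on the slide):

```java
// Sketch of formulas (1) and (2): a pixel offset between the live image
// and the CG model is scaled by the known object size over its on-screen
// size to obtain a distance error in metres. On-screen sizes below are
// illustrative, not measurements from the experiment.
public class ResidualError {
    // (1) Horizontal distance error: dX = dx * W / x.
    static double distanceErrorX(double dxPx, double widthM, double widthPx) {
        return dxPx * widthM / widthPx;
    }

    // (2) Vertical distance error: dY = dy * H / y.
    static double distanceErrorY(double dyPx, double heightM, double heightPx) {
        return dyPx * heightM / heightPx;
    }

    public static void main(String[] args) {
        // A 64.8 m tall building face rendered 540 px tall would give the
        // roughly 0.12 m/pixel scale quoted on the result slides.
        System.out.println(distanceErrorY(1.0, 64.8, 540.0));  // ~0.12
        System.out.println(distanceErrorX(20.0, 29.6, 240.0)); // ~2.47
    }
}
```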
23. 3. Verification of System
3.2 Accuracy of geometric consistency with a video image and 3DCG
Results: No.1 AR image (0.12 m/pixel). Settings of No.1: latitude S, longitude S, altitude S, yaw S, pitch S, roll S.
Charts: pixel error and distance error (max./mean/min.) across experiments No.1 to No.8.
24. 3. Verification of System
3.2 Accuracy of geometric consistency with a video image and 3DCG
Results: No.2 AR image (0.12 m/pixel). Settings of No.2: latitude D, longitude D, altitude D, yaw D, pitch D, roll D.
Charts: pixel error and distance error (max./mean/min.) across experiments No.1 to No.8.
25. 3. Verification of System
3.2 Accuracy of geometric consistency with a video image and 3DCG
Results: No.3 AR image (0.12 m/pixel). Settings of No.3: latitude D, longitude S, altitude S, yaw S, pitch S, roll S.
Charts: pixel error and distance error (max./mean/min.) across experiments No.1 to No.8.
26. 3. Verification of System
3.2 Accuracy of geometric consistency with a video image and 3DCG
Results: No.4 AR image (0.12 m/pixel). Settings of No.4: latitude S, longitude D, altitude S, yaw S, pitch S, roll S.
Charts: pixel error and distance error (max./mean/min.) across experiments No.1 to No.8.
27. 3. Verification of System
3.2 Accuracy of geometric consistency with a video image and 3DCG
Results: No.5 AR image (0.12 m/pixel). Settings of No.5: latitude S, longitude S, altitude D, yaw S, pitch S, roll S.
Charts: pixel error and distance error (max./mean/min.) across experiments No.1 to No.8.
28. 3. Verification of System
3.2 Accuracy of geometric consistency with a video image and 3DCG
Results: No.6 AR image (0.12 m/pixel). Settings of No.6: latitude S, longitude S, altitude S, yaw D, pitch S, roll S.
Charts: pixel error and distance error (max./mean/min.) across experiments No.1 to No.8.
29. 3. Verification of System
3.2 Accuracy of geometric consistency with a video image and 3DCG
Results: No.7 AR image (0.12 m/pixel). Settings of No.7: latitude S, longitude S, altitude S, yaw S, pitch D, roll S.
Charts: pixel error and distance error (max./mean/min.) across experiments No.1 to No.8.
30. 3. Verification of System
3.2 Accuracy of geometric consistency with a video image and 3DCG
Results: No.8 AR image (0.12 m/pixel). Settings of No.8: latitude S, longitude S, altitude S, yaw S, pitch S, roll D.
Charts: pixel error and distance error (max./mean/min.) across experiments No.1 to No.8.
31. 3. Verification of System
3.2 Accuracy of geometric consistency with a video image and 3DCG
Results in No.1 (all known static values used)
▶ Pixel error: less than 1.5 pixels
▶ Mean distance error: less than 0.15 m
▶ The accuracy of the AR system was found to be high.
▶ An object 200 m away can be evaluated when known static values are used.
Results in No.2 (general SOAR)
▶ Pixel error: less than 20 pixels (horizontal), 55 pixels (vertical)
▶ Mean distance error: less than 2.3 m (horizontal), 6.3 m (vertical)
Results through No.1 to No.8
▶ Although No.2 uses all dynamic inputs, its horizontal (X) residual error is smaller than that of No.6. This is because, in No.2, the X residual error of No.6 (positive angle offset) and the X residual error of No.4 (negative angle offset) cancel each other out.
Chart: distance error (max./mean/min.) for experiments No.1 to No.8.
33. 4. Conclusion
4.1 Conclusion
When known values were used for the position and angle information of the CG virtual camera and the distance was 200 m, the pixel error was less than 1.5 pixels and the mean distance error was less than 0.15 m. This is within the tolerance level of landscape assessment.
When dynamic values acquired with the GPS and gyroscope were used, the pixel error was less than 20 horizontal pixels and 55 vertical pixels, and the mean distance error was less than 2.3 m horizontally and 6.3 m vertically.
The developed SOAR achieves geometric consistency using the GPS and gyroscope with which the smartphone is equipped.
34. 4. Conclusion
4.2 Future Work
Future work should attempt to reduce the residual error included in the dynamic values acquired with the GPS and gyroscope.
It is also necessary to verify the accuracy of the residual error for objects further than 200 m away, as well as the system's usability.
35. Thank you for your attention!
E-mail: fukuda@see.eng.osaka-u.ac.jp
Twitter: fukudatweet
Facebook: Tomohiro Fukuda
LinkedIn: Tomohiro Fukuda