11. Benchmark Indicators

Reliability
  Off-site:
  • PEVO
  • Reprojection error of image features
  • Position and posture errors of a camera
  On-site:
  • PEVO
  • Reprojection error of image features
  • Position and posture errors of a camera
  • Completeness of a trial (a kind of robustness?)

Temporality
  Off-site:
  • Latency
  • Frequency
  On-site:
  • Time for trial completion

Variety
  Off-site:
  • Number of datasets used for benchmarking
  • Variety in properties of datasets used for benchmarking
  On-site:
  • Number of trials conducted for benchmarking
  • Variety in properties of datasets used for benchmarking

PEVO: Projection error of virtual objects, the most direct and intuitive indicator for vSRT methods for MAR
vSRT: Vision-based spatial registration and tracking
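To make the PEVO indicator concrete, here is a minimal sketch: project the virtual object's 3D points with the estimated camera pose and with the ground-truth pose, and take the mean pixel distance between the two projections. This assumes a simple pinhole camera model; the function and variable names are illustrative, not from the standard.

```python
import numpy as np

def project(points_3d, K, R, t):
    """Project 3D world points (N x 3) into the image with intrinsics K and pose (R, t)."""
    cam = points_3d @ R.T + t          # world -> camera coordinates
    uv = cam @ K.T                     # apply pinhole intrinsics
    return uv[:, :2] / uv[:, 2:3]      # perspective divide -> pixel coordinates

def pevo(points_3d, K, R_est, t_est, R_gt, t_gt):
    """Projection error of virtual objects for one frame: mean pixel distance
    between projections under the estimated and the ground-truth pose."""
    diff = project(points_3d, K, R_est, t_est) - project(points_3d, K, R_gt, t_gt)
    return float(np.mean(np.linalg.norm(diff, axis=1)))
```

A per-sequence score would then aggregate this per-frame value (e.g. mean or RMS over all frames); the reprojection error of image features is computed analogously, but against observed 2D feature locations instead of ground-truth projections.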
12. Benchmark Indicators: ISMAR 2015 Tracking competition
(indicator table repeated from slide 11)
13. Trial set for benchmarking
14. Trial set for benchmarking

Dataset contents
  Off-site:
  • Image sequences
  • Ground truth of intrinsic/extrinsic parameters of one or more cameras
  • Optional contents: 3D model data for the target objects in the image sequences, 3D model data for the virtual objects in the image sequences, depth images, self-contained sensor data, etc.
  On-site:
  • Ground truth of challenge points
  • 3D models for the target objects
  • 3D models for the virtual objects overlaid in benchmarking

Metadata
  Off-site:
  • Scenario
  • Camera motion type
  • Camera configuration
  • Image quality
  On-site:
  • Scenario

Physical object instances
  Off-site:
  • Easily available or deliverable physical objects
  • Information on how to find the physical objects
  On-site:
  • Physical objects
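As a rough illustration of what an off-site trial set bundles together, here is a minimal container sketch. All field names are hypothetical illustrations of the list above, not identifiers from ISO/IEC WD 18520.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class OffsiteTrialSet:
    """One off-site trial set: dataset contents plus metadata (names illustrative)."""
    # dataset contents (required)
    image_sequences: list          # paths to the image sequences
    camera_parameters_gt: dict     # ground truth of intrinsic/extrinsic camera parameters
    # dataset contents (optional)
    target_object_models: Optional[list] = None   # 3D model data for target objects
    virtual_object_models: Optional[list] = None  # 3D model data for virtual objects
    depth_images: Optional[list] = None
    sensor_data: Optional[dict] = None            # self-contained sensor data, etc.
    # metadata
    scenario: str = ""
    camera_motion_type: str = ""
    camera_configuration: str = ""
    image_quality: str = ""
```

An on-site trial set would differ as in the table: ground truth of challenge points and physical object instances replace the pre-recorded sequences.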
15. Trial set for benchmarking: TrakMark
(trial-set table repeated from slide 14)
16. Trial set for benchmarking: Metaio
(trial-set table repeated from slide 14)
17. Trial set for benchmarking: The City of Sights: An Augmented Reality Stage Set
(trial-set table repeated from slide 14)
18. Trial set for benchmarking: ISMAR 2015 Tracking competition
(trial-set table repeated from slide 14)
19. Trial set for benchmarking: ISMAR 2014 Tracking competition
(trial-set table repeated from slide 14)
20. Trial set for benchmarking: ISMAR 2015 Tracking competition
(trial-set table repeated from slide 14)
21. Venn diagram on the conceptual relationship between the ISO/IEC 29155 series, ISO/IEC CD 18039, and ISO/IEC WD 18520
• IT project performance benchmarking framework (ISO/IEC 29155 series)
• MAR Reference Model (ISO/IEC CD 18039), covering benchmarking for MAR
• Benchmarking of vision-based geometric registration and tracking methods for MAR (ISO/IEC WD 18520)
22. Layered structure between the ISO/IEC 29155 series and ISO/IEC WD 18520
• ISO/IEC 29155 series (IT project performance benchmarking framework): conducts IT project performance benchmarking; the target of benchmarking is IT project performance. The IT project, in turn, creates and improves IT products, systems, and services.
• ISO/IEC WD 18520 (Benchmarking of vision-based geometric registration and tracking methods for MAR): the target of benchmarking is the performance of vSRT methods for MAR.
23. Thank you!
• AIST is now hiring for tenure-track, postdoc, and RA (PhD) positions at Tsukuba, Japan.
• Target research fields are shown below.