
Benchmarking of vision-based registration and tracking for MAR

Presented at the ISMAR 2016 Workshop: Standards for Mixed and Augmented Reality.


  1. Benchmarking of Vision-based Spatial Registration and Tracking Methods for MAR (ISO/IEC NP 18520). Takeshi Kurata (1,2), Koji Makita (1,3), Takafumi Taketomi (4), Hideaki Uchiyama (5), Shohei Mori (6), Tomotsugu Kondo (7), Fumihisa Shibata (8). Affiliations: (1) AIST, (2) Univ. of Tsukuba, (3) Canon, (4) NAIST, (5) Kyushu Univ., (6) Keio Univ., (7) The Open Univ. of Japan, (8) Ritsumeikan Univ. ISMAR 2016 Workshop: Standards for Mixed and Augmented Reality (2016/9/23)
  2. ISO/IEC WD (Working Draft) 18520
     • Main body
       – Terms and definitions
       – Benchmarking framework
       – Benchmark indicators
       – Trial set for benchmarking
       (= Benchmarking framework + Benchmark indicators + Trial set (dataset))
     • Annex A: Benchmarking organizations and activities
     • Annex B: Tracking competitions in ISMAR
  3. Benchmarking framework (vSRT: vision-based spatial registration and tracking)
  4. Example of stakeholders and their roles: Technology developer
  5. Example of stakeholders and their roles: Technology supplier
  6. Example of stakeholders and their roles: Benchmarking service provider
  7. Example of stakeholders and their roles: Benchmark provider
  8. Example of stakeholders and their roles: all five together (Benchmark provider, Technology supplier, Benchmarking service provider, Technology developer, Technology user)
  10. Benchmark indicators (vSRT: vision-based spatial registration and tracking)
  11. Benchmark indicators (PEVO: projection error of virtual objects, the most direct and intuitive indicator for vSRT methods for MAR)
      • Reliability
        – Off-site: PEVO; reprojection error of image features; position and posture errors of a camera
        – On-site: PEVO; reprojection error of image features; position and posture errors of a camera; completeness of a trial (a kind of robustness?)
      • Temporality
        – Off-site: latency; frequency
        – On-site: time for trial completion
      • Variety
        – Off-site: number of datasets used for benchmarking; variety in the properties of the datasets used
        – On-site: number of trials conducted for benchmarking; variety in the properties of the datasets used
      (A code sketch of the reliability and temporality indicators follows slide 12.)
  12. Benchmark indicators: the same table, illustrated with the ISMAR 2015 Tracking competition
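As a concrete reading of the indicators above, here is a minimal Python sketch of PEVO (reliability) and of latency/frequency measurement (temporality). It assumes a simple pinhole camera model, NumPy, and an illustrative tracker.process(frame) interface; all helper names are hypothetical and not defined by the draft standard.

import numpy as np

def project(K, R, t, X):
    # Pinhole projection: world points X (N x 3) -> pixel coords (N x 2),
    # using intrinsics K (3 x 3) and extrinsics R (3 x 3), t (3,).
    Xc = X @ R.T + t
    uv = Xc @ K.T
    return uv[:, :2] / uv[:, 2:3]

def pevo(K, R_est, t_est, R_gt, t_gt, virtual_pts):
    # PEVO: mean 2D distance between virtual-object vertices projected
    # with the estimated pose and with the ground-truth pose.
    p_est = project(K, R_est, t_est, virtual_pts)
    p_gt = project(K, R_gt, t_gt, virtual_pts)
    return float(np.linalg.norm(p_est - p_gt, axis=1).mean())

For the temporality indicators, a rough per-frame timing loop (again with a hypothetical tracker interface):

import time

def measure_temporality(tracker, frames):
    # Per-frame latency and overall processing frequency of a vSRT method.
    latencies = []
    start = time.perf_counter()
    for frame in frames:
        t0 = time.perf_counter()
        tracker.process(frame)  # hypothetical tracker API
        latencies.append(time.perf_counter() - t0)
    elapsed = time.perf_counter() - start
    return {"mean_latency_s": sum(latencies) / len(latencies),
            "frequency_hz": len(frames) / elapsed}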
  13. Trial set for benchmarking (vSRT: vision-based spatial registration and tracking)
  14. Trial set for benchmarking
      • Dataset contents
        – Off-site: image sequences; ground truth of the intrinsic/extrinsic parameters of one or more cameras; optional contents (3D model data for the target objects in the image sequences, 3D model data for the virtual objects in the image sequences, depth images, self-contained sensor data, etc.)
        – On-site: ground truth of challenge points; 3D models for the target objects; 3D models for the virtual objects overlaid during benchmarking
      • Metadata
        – Off-site: scenario; camera motion type; camera configuration; image quality
        – On-site: scenario
      • Physical object instances
        – Off-site: easily available or deliverable physical objects; information on how to find the physical objects
        – On-site: physical objects
      (A data-structure sketch of an off-site trial set follows slide 20.)
  15. The same trial-set table, illustrated with TrakMark
  16. The same trial-set table, illustrated with Metaio
  17. The same trial-set table, illustrated with "The City of Sights: An Augmented Reality Stage Set"
  18. The same trial-set table, illustrated with the ISMAR 2015 Tracking competition
  19. The same trial-set table, illustrated with the ISMAR 2014 Tracking competition
  20. The same trial-set table, illustrated with the ISMAR 2015 Tracking competition
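To make the off-site dataset contents and metadata concrete, here is a minimal Python data structure for one trial-set entry; every field name and type is an illustrative assumption, not something defined in ISO/IEC WD 18520.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class OffSiteTrialSet:
    # Dataset contents
    image_paths: List[str]                            # image sequence
    intrinsics: List[List[float]]                     # ground-truth 3x3 camera matrix K
    extrinsics: List[List[List[float]]]               # ground-truth 3x4 [R|t] per frame
    target_model_paths: Optional[List[str]] = None    # optional: 3D models of target objects
    virtual_model_paths: Optional[List[str]] = None   # optional: 3D models of virtual objects
    depth_image_paths: Optional[List[str]] = None     # optional: depth images
    sensor_log_paths: Optional[List[str]] = None      # optional: self-contained sensor data
    # Metadata
    scenario: str = ""
    camera_motion_type: str = ""
    camera_configuration: str = ""
    image_quality: str = ""

A benchmarking service provider could then, for example, load such a record, run a tracker over image_paths, and score the result with the PEVO sketch shown earlier.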
  21. [Venn diagram on the conceptual relationship between the IT project performance benchmarking framework (ISO/IEC 29155 series), the MAR Reference Model (ISO/IEC CD 18039), and ISO/IEC WD 18520 (benchmarking of vision-based geometric registration and tracking methods for MAR), with their overlap labeled "Benchmarking for MAR"]
  22. [Diagram of the layered structure between the ISO/IEC 29155 series and ISO/IEC WD 18520: ISO/IEC WD 18520 benchmarks the performance of vSRT methods for MAR, which belong to the IT product, system, and service layer that an IT project creates and improves, while the ISO/IEC 29155 series framework conducts benchmarking of IT project performance]
  23. Thank you!
      • AIST is now hiring for tenure-track, postdoc, and RA (PhD) positions at Tsukuba, Japan.
      • Target research fields are ↓ ↓ ↓
