Slide 1: Odometry and Mapping Inside Pipes
Visual, Inertial and Kinematic Sensor Fusion
Elena Morara, Master Thesis Defense, Autonomous Systems Lab, 22.09.2016
Supervised by Professor Roland Siegwart and Professor Howie Choset (Carnegie Mellon University, Pittsburgh, USA)
Slide 2: Goal
 Estimate the motion of the robot and a map of the pipe network
 Why? Pipe inspection
Slide 3: Overview
 Snake robots
 Odometry techniques: existing and novel
 Proposed framework
 Results
 Conclusion and future work
Slide 4: Snake robots in pipes
 Visual, inertial and kinematic feedback
 Complies with the traversed pipe
(SEA Snake Robot, BioRobotics Lab)
Slide 5: Framework
Slide 6: Complementary Filter
 Low-pass accelerometers: gravity direction estimate
 High-pass gyros
 Orientation of the robot
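The complementary filter idea on this slide (low-pass the accelerometer's gravity direction, high-pass the integrated gyro rate) can be sketched for a single axis as follows; this is a generic illustration with assumed variable names, not the thesis implementation:

```python
import numpy as np

def complementary_filter(acc, gyro_rate, dt, alpha=0.98):
    """Single-axis complementary filter (illustrative sketch).

    acc       : (N, 3) accelerometer samples [m/s^2]; their low-frequency
                content encodes the gravity direction
    gyro_rate : (N,) angular-rate samples [rad/s] about the same axis
    """
    theta = np.zeros(len(gyro_rate))
    for k in range(1, len(gyro_rate)):
        # gravity direction from the accelerometer (noisy, but drift-free)
        theta_acc = np.arctan2(acc[k, 0], acc[k, 2])
        # integrated gyro (smooth, but drifts over time)
        theta_gyro = theta[k - 1] + gyro_rate[k] * dt
        # complementary blend: high-pass the gyro, low-pass the accelerometer
        theta[k] = alpha * theta_gyro + (1.0 - alpha) * theta_acc
    return theta
```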
Slide 7: Feature-Based Visual Odometry
 Extract and match visual features
 Two-view geometry
 Rotation of the head
 Translation of the head, up to scale
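A minimal two-view sketch of this pipeline in OpenCV terms (the camera matrix K, the ORB feature choice and all parameters are assumptions for illustration, not the thesis setup):

```python
import cv2
import numpy as np

def two_view_odometry(img_prev, img_curr, K):
    """Relative rotation R and unit-scale translation t between two frames."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img_prev, None)
    kp2, des2 = orb.detectAndCompute(img_curr, None)

    # match features between consecutive frames
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # two-view (epipolar) geometry with RANSAC outlier rejection
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t  # t is a unit vector: translation is known only up to scale
```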
Slide 8: Visual Joint Tracker
Slide 9: Visual Joint Tracker: circle tracker
 Probabilistic Data Association Filters (PDAF)
 Clutter vs. target-generated measurements
(Figure: original frame, detected circle, circle filtered by the PDAF)
Slide 10: Pipe Estimator
 When in a straight pipe: orientation
 When in a bend, also: bend angle, position
Slide 11: Framework: recap
Slide 12: Framework: recap (orientation)
Slide 13: Framework: recap (rotation; translation up to scale)
Slide 14: Framework: recap (position, if a joint is visible)
Slide 15: Framework: recap (orientation)
Slide 16: Framework: recap (position, if inside a bend)
Slide 17: Sensor fusion: EKF
 State: current and previous pose, pose of the current pipe, drift of the Complementary Filter
 Prediction step: driven by Visual Odometry for the current pose, zero velocity for the others
 Measurement update step: combination of the other estimates
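One EKF cycle as described on this slide could look like the following simplified sketch; the state layout, Jacobians and noise matrices here are placeholders, not the thesis parameterization:

```python
import numpy as np

def ekf_step(x, P, u_vo, F, Q, z, h, H, R):
    """One EKF cycle: VO-driven prediction, then a measurement update.

    x, P : state mean and covariance (e.g. current/previous pose,
           pose of the current pipe, Complementary Filter drift)
    u_vo : relative-motion increment from Visual Odometry, zero-padded so
           that it only moves the current-pose block of the state
    z    : stacked measurements from the other estimators
    h, H : measurement function and its Jacobian; R its noise covariance
    """
    # prediction: propagate the current pose with the VO increment,
    # keep the remaining state blocks at zero velocity
    x_pred = F @ x + u_vo
    P_pred = F @ P @ F.T + Q

    # measurement update: fuse the other estimates
    y = z - h(x_pred)                      # innovation
    S = H @ P_pred @ H.T + R               # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```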
Slide 18: Sample results
 GoPro Hero Session 4
 Pipe diameter: 10 cm
 Snake of 18 modules (~120 cm)
Slide 19: Results: Pipe orientation
(Plot: pipe angle [°] over time [s]; bends at 45° and 90°)
Slide 20: Results: Map + odometry
(Plot, top view, x [m] vs. y [m]: ground truth vs. EKF estimate; two 45° bends)
Slide 21: Results: Map
(Plot, top view, x [m] vs. y [m])
Slide 22: Contributions of this work
Exploiting the problem structure (joints, pipe segments, bends) together with standard odometry techniques for an improved:
 6-DoF pose estimate of the robot
 Map of the traversed pipe
Slide 23: Future work
 Investigate linearization of the Pipe Estimator's dependency on the Complementary Filter
 Include temporal filtering of variables inside the EKF
 Use a no-slip motion model in the prediction step of the EKF
 Exploit the motion estimate for a more robust Visual Joint Tracker
Slide 24: Acknowledgements
 Professor Siegwart
 Professor Choset
 Biorobotics Lab at Carnegie Mellon University
 ZKS Foundation


Editor's notes (speaker notes)

  1. Thank you, everyone. My name is Elena Morara; I am a student at ETH, where I am supervised by Professor Siegwart. I have been at Carnegie Mellon University to do my master thesis, which involved odometry and mapping in pipe networks.
  2. Let me introduce the goal of my work. Given a pipe network (drains, sewage piping, etc.) and a robot able to navigate it, we estimate the robot's pose and reconstruct a map. This is useful for pipe inspection, where robots are used to assess pipe conditions: it localizes faults detected in the robot's video footage.
  3. Here is an overview of my presentation. First, I briefly introduce the motion estimation techniques considered (existing ones and the novel ones of this work). Then I go into the methodology, explaining the details of my solution, followed by empirical results. Finally, I conclude with future work.
  4. Snake robots are hyper-redundant mechanisms with serial-chain kinematics. The robot I have been using was developed at the Biorobotics Lab. It has a modular architecture, each module having one degree of freedom with an encoder and an IMU, plus a front monocular camera. It is suitable because its flexible body and deformable shape allow it to comply with the environment: it adapts to different piping characteristics and diameters and manages bends. Our snake robot provides visual feedback from a frontal monocular camera and has IMUs along its body. Now let me introduce some motion estimation techniques which are suitable for our snake robot.
  5. Here is a high-level view of my proposed solution. The inputs to the system are fed to individual pose/motion estimators. In yellow are the novel contributions of this work: the first, the Pipe Estimator, exploits kinematic constraints; the last is the tracking of the pipe joints. In the middle are two standard odometry techniques, inertial and visual. The estimates are fused together with an EKF to produce an estimate of the robot pose over time and a map of the traversed network. Now I would like to explain the details of these conceptual blocks.
  6. Let me now delve into the components of my framework. The Complementary Filter is an estimation technique which uses readings from IMUs to estimate the orientation of a body. It relies on the fact that accelerometer noise is high-frequency, while integrating the gyros leads to a slow drift over time. On the snake robot we have several IMUs along the body, whose readings can be combined to estimate orientation.
  7. For the other standard block, I have implemented a well-established technique in visual odometry, called feature-based odometry. It takes as input subsequent frames from the camera video, extracts and matches visual features, and exploits epipolar geometry to retrieve the local rotation of the camera between the frames and its translation up to scale, by triangulating the matched visual features.
  8. Now let’s move on one of the customized estimator of my solution Visual joint tracker It takes as input the current video frame and in case a pipe joint is visible it returns its position with respect to the camera Hence we can infer head position I will now go throught the work flow of this block
  9. The circle tracker contains several filters which are candidate trackers of the circle generated by a pipe joint. The filters are PDAFs, a variant of the Kalman filter; the key difference is that a PDAF estimates whether the measurements are generated by the desired target or by noise. The PDAFs are initialized and updated with the circles detected by the Hough detector. Here you see an example highlighting the power of the PDAF. Here is the original frame; in blue is the detected circle, which is an outlier and does not reflect the position of the pipe joint. While a plain Kalman filter would follow it, the PDAF labels it as a spurious measurement and does not update based on it.
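The PDAF idea described here, weighting each candidate measurement by the probability that it is target-generated rather than clutter, can be sketched as a single linear-Gaussian update step; the detection probability and clutter model below are illustrative assumptions:

```python
import numpy as np
from scipy.stats import multivariate_normal

def pdaf_update(x, P, H, R, measurements, p_detect=0.9, clutter=1e-4):
    """PDAF measurement update: soft association over all candidate circles."""
    if len(measurements) == 0:
        return x, P                       # no detections: keep the prediction

    z_pred = H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)

    # likelihood of each measurement under the predicted measurement density
    likes = np.array([multivariate_normal.pdf(z, mean=z_pred, cov=S)
                      for z in measurements])

    # association probabilities: target-generated vs. clutter
    b = clutter * (1.0 - p_detect)
    denom = b + p_detect * likes.sum()
    beta = p_detect * likes / denom       # one weight per measurement
    beta0 = b / denom                     # "all measurements are clutter"

    # combined innovation and state update
    innov = np.array([z - z_pred for z in measurements])
    nu = (beta[:, None] * innov).sum(axis=0)
    x_new = x + K @ nu

    # covariance: usual update plus a spread-of-innovations term
    P_c = P - K @ S @ K.T
    spread = K @ ((beta[:, None, None]
                   * np.einsum('ni,nj->nij', innov, innov)).sum(axis=0)
                  - np.outer(nu, nu)) @ K.T
    P_new = beta0 * P + (1.0 - beta0) * P_c + spread
    return x_new, P_new
```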
  10. Finally, here is an overview of the Pipe Estimator block. It uses the kinematics of the snake and its current pose estimate. In case the robot is inside a straight pipe, it just provides the orientation of the snake; when it is going through a bend, it is also able to estimate forward motion.
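One simple way to recover a pipe-axis orientation from the snake's kinematics, purely as an illustration and not the estimator used in the thesis, is to fit the dominant direction through the module positions obtained from forward kinematics:

```python
import numpy as np

def pipe_axis_from_kinematics(module_positions):
    """Estimate the pipe-axis direction from snake module positions (N x 3).

    In a straight pipe the compliant snake conforms to the pipe, so the
    principal direction of its backbone approximates the pipe orientation.
    """
    pts = np.asarray(module_positions, dtype=float)
    centered = pts - pts.mean(axis=0)
    # first principal component = best-fit line direction
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    axis = vt[0]
    return axis / np.linalg.norm(axis)
```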
  11. Let me now recap what I have explained so far about the individual motion estimates
  12. Now I will give a high-level view of how this final stage, the sensor fusion, is performed.
  13. I run the EKF every time all the estimates are updated, which hence depends on the camera frame rate. I have adopted a solution which is common in visual-inertial odometry, which consists in having the motion model driven by observations. Let me remind you that all the estimates are independent, except for the Pipe Estimator, which relies on the Complementary Filter output. While this is neglected in the current solution, it should be addressed in the future.
  14. Now I will show some empirical results (runs of two to three body lengths).
  15. I would like to start by comparing the estimate of the pipe orientation from two blocks of my setup. At the top right you can see the setup the snake was going through, consisting of three pipes with bends of 45° and 90°. In the graph we have three signals over time. In black is the orientation of the pipe the head is going through: 45°, then 45° + 90°. In blue is the snake heading as computed by the Complementary Filter; since the head points forward in the pipe, it should follow the black line. In red is the pipe orientation as estimated from the kinematics by the Pipe Estimator block. It shows lower error; we tested on 45°, 90° and 180° bends and its performance was always superior, on average by one order of magnitude.
  16. Here I would like to show a sample result from the whole algorithm, i.e. the pose estimate and the piping map from the EKF. On the top right is the real setup the snake was going through: three segments, with bends of 45° and -45°; the snake stopped after 70 cm in this pipe. In black is the real map, in red the estimated map. The overall error is ~10 cm over 2.5 meters. To put this in perspective, here I show the real size of the pipes. Finally, let me show the estimated robot odometry; let me highlight that we need full 6-DoF information for this.
  17. Finally, I would like to show one last result, which makes two important points: the main cause of bad estimates, and a comparison with standard visual-inertial odometry. We see a setup of three segments, with bends of 45° and 45°, shown in this top view in black. In red is the estimated map from the EKF; it is built from left to right. The error mainly builds up in the initial part: when the robot is in a straight pipe and no joint is visible, forward motion can be estimated only by visual odometry, and we discussed how its performance is impaired by the appearance of the pipe. Furthermore, during this initial part of the run the snake briefly got stuck in the pipe, leading to a jerky recovery motion in which it bumped against the pipe walls; this made the Complementary Filter orientation estimate unreliable. These two considerations are also the basis of the poor performance of the visual + inertial estimates alone: while the map builds up, the robot bumps, the CF orientation estimate fails and then recovers as shown before, and the drift of the CF worsens the estimated angles. Hence, to wrap up, this shows how the novel contributions improve the result.
  18. These constitute the contributions of my work; here is what I achieved.
  19. I am concluding by mentioning directions for future work. As I mentioned, the Pipe Estimator depends on the CF and involves a heavy filtering of signals over time; we should investigate how this can be integrated into the EKF. Furthermore, in order to improve the performance in straight pipes, improvement is expected by predicting the robot motion with the no-slip model by Enner and Rollinson. Finally, the Visual Joint Tracker performance can be improved by retrieving a circle motion model from the estimate of the camera motion.