1. ||Autonomous Systems Lab
Elena Morara
Master Thesis Defense
Supervised by Professor Roland Siegwart
and Professor Howie Choset (Carnegie Mellon University, Pittsburgh, USA)
22.09.2016, Elena Morara
Odometry and Mapping Inside Pipes
Visual, Inertial and Kinematic Sensor Fusion
6. ||Autonomous Systems Lab
Low-pass accelerometers: gravity direction estimate
High-pass gyros: angular rate
Combined: orientation of the robot
Complementary Filter
7. ||Autonomous Systems Lab
Extract and match visual features
Two-view geometry:
Rotation of the head
Translation of the head (up to scale)
Feature-Based
Visual Odometry
9. ||Autonomous Systems Lab
Probabilistic Data Association Filters
Clutter vs. target-generated measurements
Visual Joint Tracker: circle tracker (PDAF)
[Figure: original frame with detected circle and filtered circle]
10. ||Autonomous Systems Lab
Pipe Estimator
When in a straight pipe:
• Orientation
When in a bend, also:
• Bend angle
• Position
17. ||Autonomous Systems Lab
State:
Current and previous pose
Pose of current pipe
Drift of the Complementary Filter
Prediction step:
Driven by Visual Odometry for the current pose
Zero velocity for the others
Measurement update step:
Combination of the other estimates
Sensor fusion: EKF
18. ||Autonomous Systems Lab
GoPro HERO4 Session
Pipe diameter: 10 cm
Snake of 18 modules
(~120 cm)
Sample results
19. ||Autonomous Systems Lab
Results: Pipe orientation
[Plot: pipe angle [°] over time [s]; reference levels at 45° and 90°]
20. ||Autonomous Systems Lab
Results: Map + odometry
[Top-view plot, x [m] vs. y [m]: ground-truth map vs. EKF estimate; two 45° bends]
22. ||Autonomous Systems Lab
Exploiting the problem structure
(joints, pipe segments, bends)
together with standard odometry techniques
for an improved:
6-DoF pose estimate of the robot
Map of the traversed pipe
Contributions of this work
23. ||Autonomous Systems Lab
Investigate linearization of the Pipe Estimator's dependency on the Complementary Filter
Include temporal filtering of variables inside the EKF
Use a no-slip motion model in the prediction step of the EKF
Exploit the motion estimate for a more robust Visual Joint Tracker
Future work
24. ||Autonomous Systems Lab
Professor Siegwart
Professor Choset
Biorobotics Lab at Carnegie Mellon University
ZKS Foundation
Acknowledgements
Editor's Notes
Thank you, everyone. My name is Elena Morara.
I am a student at ETH, where I am supervised by Professor Siegwart, and I have been at Carnegie Mellon University to do my master thesis, which involved odometry and mapping in pipe networks.
Let me introduce the goal of my work. Given a pipe network (drains, sewage piping, etc.), we want a robot able to navigate it, estimate its pose, and reconstruct a map. This is useful for pipe inspection, where robots are used to assess pipe conditions: it localizes faults detected in the robot's video footage.
Here is an overview of my presentation. First, I briefly introduce the motion estimation techniques considered (both existing ones and the novel contributions of this work). Then I go into the methodology, explaining the details of my solution, followed by empirical results. Finally, I conclude with future work.
The robot I have been using is a hyper-redundant mechanism with serial-chain kinematics, developed at the Biorobotics Lab. It has a modular architecture: each module has one degree of freedom, with an encoder and an IMU, and there is also a front-facing monocular camera.
It is suitable because its flexible body and deformable shape allow it to comply with the environment and adapt to different piping characteristics: diameters and bends. Our snake robot provides visual feedback from the frontal monocular camera and has IMUs along its body.
NOW LET ME INTRODUCE SOME MOTION ESTIMATION TECHNIQUES WHICH ARE SUITABLE FOR OUR SNAKE ROBOT.
Here is a high-level view of my proposed solution. The inputs to the system are fed to individual pose/motion estimators. In yellow are the novel contributions of this work: the Pipe Estimator, which exploits kinematic constraints, and the tracking of the pipe joints. In the middle are two standard odometry techniques, inertial and visual. The estimates are fused together with an EKF to produce an estimate of the robot pose over time and a map of the traversed network.
NOW I WOULD LIKE TO EXPLAIN THE DETAILS OF THESE CONCEPTUAL BLOCKS.
Let me now delve into the components of my framework. The Complementary Filter is an estimation technique which uses readings from IMUs to estimate the orientation of a body. It relies on low-pass filtering the accelerometers, which suffer from high-frequency noise, to obtain the gravity direction, while integrating the gyros, which leads to a slow drift over time. On our snake robot we have several IMUs along the body, whose readings can be combined to estimate the orientation.
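The blending described above can be sketched as a scalar complementary filter on a single tilt angle. This is a minimal illustration, not the thesis implementation: it assumes one IMU, a planar tilt angle, and a made-up blending gain `alpha`.

```python
import math

def complementary_filter(angle, gyro_rate, accel, dt, alpha=0.98):
    """Blend the integrated gyro rate (high-pass behaviour) with the
    accelerometer gravity direction (low-pass behaviour) into one
    tilt-angle estimate."""
    ax, az = accel
    angle_accel = math.atan2(ax, az)      # gravity direction from accelerometer
    angle_gyro = angle + gyro_rate * dt   # integrate angular rate (drifts slowly)
    return alpha * angle_gyro + (1.0 - alpha) * angle_accel

# Static body: the gyro reads 0 and the accelerometer sees gravity along z,
# so the true tilt is 0 rad; we start from a wrong initial guess.
angle = 0.5  # [rad]
for _ in range(2000):
    angle = complementary_filter(angle, 0.0, (0.0, 9.81), dt=0.01)
# The low-pass accelerometer term slowly pulls the estimate toward 0 rad,
# which is exactly how the filter cancels gyro drift.
```

The gain `alpha` trades gyro drift against accelerometer noise: closer to 1 trusts the gyros more between accelerometer corrections.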
As the other standard block, I have implemented a well-established technique in visual odometry, called feature-based odometry. It takes as input subsequent frames from the camera video, extracts and matches visual features, and exploits epipolar geometry to retrieve the local rotation of the camera between the frames and its translation, up to scale, by triangulating the matched visual features.
BREATHE
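The two-view relation used here can be checked numerically on synthetic data. This is a pure-Python sketch (not the thesis code, which works on real features): for a rotation R and translation t between camera frames, corresponding normalized image points satisfy the epipolar constraint x2ᵀ E x1 = 0 with E = [t]× R, and since the constraint is homogeneous, t (and hence the motion) is recovered only up to scale.

```python
import math

def skew(t):
    """Cross-product matrix [t]x such that skew(t) @ v == t x v."""
    tx, ty, tz = t
    return [[0, -tz, ty], [tz, 0, -tx], [-ty, tx, 0]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def matvec(A, v):
    return [sum(A[i][k] * v[k] for k in range(3)) for i in range(3)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Camera 2 is rotated 10 deg about y and translated relative to camera 1.
c, s = math.cos(math.radians(10)), math.sin(math.radians(10))
R = [[c, 0, s], [0, 1, 0], [-s, 0, c]]
t = [0.3, 0.0, 0.05]        # rescaling t leaves the constraint at zero: scale ambiguity
E = matmul(skew(t), R)      # essential matrix

X1 = [0.2, -0.1, 2.0]       # a 3D point in camera-1 coordinates
X2 = [xi + ti for xi, ti in zip(matvec(R, X1), t)]  # same point in camera-2 frame
x1 = [v / X1[2] for v in X1]                        # normalized image coordinates
x2 = [v / X2[2] for v in X2]
residual = dot(x2, matvec(E, x1))  # epipolar constraint: should be ~0
```

In practice E is estimated from the matched features (e.g. with a five-point solver inside RANSAC) and then decomposed into R and the unit-norm t.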
Now let's move on to one of the customized estimators of my solution, the Visual Joint Tracker. It takes as input the current video frame and, in case a pipe joint is visible, it returns the joint's position with respect to the camera; hence we can infer the head position. I will now go through the workflow of this block.
The circle tracker contains several filters which are candidate trackers of the circle generated by a pipe joint. The filters are PDAFs (Probabilistic Data Association Filters), a variant of the Kalman filter. The key difference is that a PDAF estimates whether the measurements are generated by the desired target or by noise. The PDAFs are initialized and updated with the circles detected by a Hough detector.
Here you see an example highlighting the power of the PDAF. In blue is the detected circle, which is an outlier: it does not reflect the position of the pipe joint. While a plain Kalman filter follows it, the PDAF labels it as a spurious measurement and does not update based on it.
BREATHE
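The clutter-vs-target reasoning can be sketched with a scalar PDAF-style update. This is a toy version, not the thesis tracker: the `clutter` weight, noise values, and the simplified covariance update are all illustrative.

```python
import math

def pdaf_update(x, P, measurements, R=0.04, clutter=0.5):
    """Scalar PDAF-style update: weight each candidate measurement by its
    likelihood under the predicted state; `clutter` is the weight of the
    'all measurements are clutter' hypothesis."""
    S = P + R                                    # innovation covariance
    lik = [math.exp(-0.5 * (z - x) ** 2 / S) / math.sqrt(2 * math.pi * S)
           for z in measurements]
    norm = clutter + sum(lik)
    betas = [l / norm for l in lik]              # association probabilities
    K = P / S                                    # Kalman gain
    innov = sum(b * (z - x) for b, z in zip(betas, measurements))
    x_new = x + K * innov
    P_new = P * (1 - K * sum(betas))             # simplified covariance update
    return x_new, P_new

# Target near 1.0; the 5.0 measurement is an outlier (a spurious circle).
x, P = 1.0, 0.1
x_pdaf, _ = pdaf_update(x, P, [1.05, 5.0])
x_kf = x + P / (P + 0.04) * (5.0 - x)  # a plain KF would swallow the outlier
```

The association probabilities `betas` are what lets the filter label the far-off detection as clutter and nearly ignore it, while the plain Kalman update is dragged toward it.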
Finally, here is an overview of the Pipe Estimator block. It uses the kinematics of the snake and its current pose estimate. In case the robot is inside a straight pipe, it provides the orientation of the snake; when it is going through a bend, it is also able to estimate the forward motion.
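The straight-pipe case can be illustrated with a toy planar version. This is only a sketch under strong assumptions (planar chain, hypothetical helper names); the actual estimator works in 3D with the full snake kinematics.

```python
import math

def module_headings(base_heading, joint_angles):
    """Planar forward kinematics: the heading of each module is the base
    heading plus the cumulative joint angles along the chain."""
    headings, h = [], base_heading
    for q in joint_angles:
        h += q
        headings.append(h)
    return headings

def pipe_orientation(headings):
    """If the snake lies in a straight pipe, every module is constrained to
    the pipe axis direction, so averaging the headings suppresses
    encoder noise."""
    return sum(headings) / len(headings)

# Snake lying in a pipe oriented at 45 deg, with small joint-angle noise.
joints = [0.01, -0.02, 0.015, -0.005, 0.0]
est = pipe_orientation(module_headings(math.radians(45), joints))
```

The same idea extends to a bend: modules before and after the bend cluster around two axis directions, and the difference between the two cluster means gives the bend angle.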
Let me now recap what I have explained so far about the individual motion estimates.
NOW I WILL GIVE A HIGH-LEVEL VIEW OF HOW THIS FINAL STAGE, THE SENSOR FUSION, IS PERFORMED.
I run the EKF every time all the estimates are updated, so it depends on the camera frame rate. I have adopted a solution which is common in visual-inertial odometry, which consists in having the motion model driven by observations.
LET ME REMIND YOU that all the estimates are independent, except for the Pipe Estimator, which relies on the Complementary Filter output. While this is neglected in the current solution, it should be addressed in the future.
BREATHE
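A toy one-dimensional version of this fusion scheme, assuming made-up noise values and scalar states; it is only meant to show the observation-driven prediction (visual odometry drives the pose), the zero-velocity prediction for the drift state, and a pipe-style measurement closing the loop.

```python
# State x = [pose p, Complementary-Filter drift d]; all values illustrative.

def predict(x, P, vo_delta, q_pos=0.01, q_drift=1e-4):
    """Prediction driven by the visual-odometry displacement; the drift
    state gets a zero-velocity (random-walk) prediction."""
    p, d = x
    return [p + vo_delta, d], [[P[0][0] + q_pos, P[0][1]],
                               [P[1][0], P[1][1] + q_drift]]

def update(x, P, z, r=0.05):
    """Measurement model z = p + d: a pipe-style position measurement
    offset by the drift."""
    S = P[0][0] + P[0][1] + P[1][0] + P[1][1] + r   # innovation covariance
    K = [(P[0][0] + P[0][1]) / S, (P[1][0] + P[1][1]) / S]  # Kalman gain
    y = z - (x[0] + x[1])                            # innovation
    x_new = [x[0] + K[0] * y, x[1] + K[1] * y]
    I_KH = [[1 - K[0], -K[0]], [-K[1], 1 - K[1]]]
    P_new = [[sum(I_KH[i][k] * P[k][j] for k in range(2)) for j in range(2)]
             for i in range(2)]
    return x_new, P_new

x, P = [0.0, 0.0], [[1.0, 0.0], [0.0, 1.0]]
for step in range(50):
    x, P = predict(x, P, vo_delta=0.1)       # VO says we moved 10 cm
    x, P = update(x, P, z=0.1 * (step + 1))  # pipe measurement agrees
```

The real filter carries the full 6-DoF current and previous poses plus the pipe pose, but the predict/update structure is the same.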
Now I will show some empirical results. The runs cover two to three body lengths.
I would like to start by comparing the estimates of the pipe orientation from two blocks of my setup. Top right you can see the setup the snake was going through, consisting of three pipes with bends of 45° and 90°. In the graph we have three signals over time. In black is the orientation of the pipe the head is going through: 45°, then 45° + 90°. In blue is the snake heading as computed by the Complementary Filter; since the head points forward in the pipe, it should follow the black line. In red is the pipe orientation as estimated from the kinematics by the Pipe Estimator block. It shows a lower error; we have tested on 45°, 90° and 180° bends, and its performance was always superior, on average by one order of magnitude.
Here I would like to show a sample result from the whole algorithm, that is, the pose estimate and the piping map from the EKF. On the top right is the real setup the snake was going through: three segments, with bends of 45° and -45°; the snake stopped after 70 cm in the last pipe. In black is the real map, in red the estimated map. The overall error is about 10 cm over 2.5 meters. To put this in perspective, I also show the real size of the pipes. Finally, let me show the estimated robot odometry; note that we need full 6-DoF information for this.
Finally, I would like to show one last result, which illustrates two important points: the main cause of bad estimates, and a comparison with standard visual-inertial odometry. We see a setup of three bends (45°, 45°), shown in this top view in black. In red is the estimated map from the EKF, built from left to right.
The error mainly builds up in the initial part. When the robot is in a straight pipe and no joint is visible, the forward motion can be estimated only by visual odometry, and we discussed how its performance is impaired by the appearance of the pipe. Furthermore, during this initial part of the run the snake briefly got stuck in the pipe, leading to a jerky recovery motion in which it bumped against the pipe walls; this made the Complementary Filter estimate unreliable.
These two considerations are also the basis of the poor performance of the visual-inertial estimates alone. While the map builds up, the robot bumps and the Complementary Filter orientation estimate fails, then it recovers; as shown before, the drift of the Complementary Filter worsens the estimated angles. Hence, to wrap up, this shows how the novel contributions improve the estimate.
These constitute the contributions of my work and what I achieved.
I am concluding by mentioning directions for future work. As I mentioned, the Pipe Estimator depends on the Complementary Filter and involves heavy filtering of signals over time; we should investigate how this can be integrated into the EKF. Furthermore, in order to improve the performance in straight pipes, an improvement is expected from predicting the robot motion with the no-slip model by Enner and Rollinson. Finally, the Visual Joint Tracker performance can be improved by deriving the circle motion model from the estimate of the camera motion.