For the full video of this presentation, please visit:
http://www.embedded-vision.com/platinum-members/embedded-vision-alliance/embedded-vision-training/videos/pages/may-2016-embedded-vision-summit-nasa-keynote
For more information about embedded vision, please visit:
http://www.embedded-vision.com
Larry Matthies, senior research scientist at the NASA Jet Propulsion Laboratory, presents the "Using Vision to Enable Autonomous Land, Sea and Air Vehicles" keynote at the May 2016 Embedded Vision Summit.
Say you’re an autonomous rover and you’ve just landed on Mars. Vexing questions now confront you: “Where am I and how am I moving?” “What obstacles are around me?” “Are the obstacles moving?” “What other objects are around me that matter to my mission?” As it turns out, Earth isn’t that different from Mars in this regard. If you’re an autonomous car or drone, you face similar challenges. You’ve got to find combinations of sensors that work across different illumination, weather, temperature, and vehicle dynamics; processors that fit the size, weight, and power constraints of the system; and algorithms that can answer the questions given the sensors and processors available. In this talk, Matthies gives an overview of autonomous vehicle computer vision applications, explores successful approaches, and illustrates concepts with examples from applications on Earth and in planetary exploration.
"Using Vision to Enable Autonomous Land, Sea and Air Vehicles," a Keynote Presentation from NASA JPL
1. Using Vision to Enable Autonomous Land, Sea, Air, and Space Vehicles
Larry Matthies, Computer Vision Group, Jet Propulsion Laboratory, California Institute of Technology (lhm@jpl.nasa.gov)
[Background image: Gale Crater]
Copyright 2016 California Institute of Technology. U.S. Government sponsorship acknowledged.
2. Application Domains and Main JPL Themes
• Land: all-terrain autonomous mobility; mobile manipulation
• Sea: USV escort teams; UUVs for subsurface oceanography
• Space: assembling large structures in Earth orbit
• Air: Mars precision landing; rotorcraft for Mars and Titan; drone autonomy on Earth
3. Basic Taxonomy of Perception Capabilities and Challenges
Capabilities:
• Localization
  – Absolute, relative
• Obstacle detection
  – Stationary, moving
  – Obstacle type
  – Terrain trafficability
• Other scene semantics
  – Landmarks, signs, destinations, etc.
  – Perceiving people and their activities
• Perception for grasping
Challenges:
• Sensors for observability
• Fast motion
• Lighting conditions
  – Low light, no light
  – Very wide dynamic range
• Atmospheric conditions
  – Haze, fog, smoke
  – Precipitation
• Difficult object/terrain types
  – Featureless, specular, transparent
  – Obstacles in grass; water, snow, ice; mud
• Computational cost vs. available processing power
4. Localization
• Relevant use cases:
  – Wheeled, tracked, and legged vehicles indoors and outdoors
  – Drones
  – Mars rovers
  – Mars landers
• Key challenges:
  – Appearance variability: lighting, weather, season
  – Moving objects
  – Fail-safe performance
• Examples I’ll describe:
  – Day/night relative localization with IMU, leg odometry, and NIR active stereo for the DARPA LS3 program
  – Map-relative localization for Mars precision landing
[Figure: landing ellipse comparison; the 1976 Viking ellipse measured 174 x 62 mi]
• Some central themes (see the sketch after this list):
  – Fusion of an IMU with vision or lidar is now commonplace for relative localization
  – Map matching is a very active topic for absolute localization
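To make the first theme concrete, here is a toy 1-D sketch of IMU/vision fusion for relative localization. It is my illustration, not JPL's filter: the complementary-filter structure, gains, and rates are all assumptions, standing in for the error-state Kalman filters typically used.

```python
import numpy as np

# Toy 1-D illustration of IMU + vision fusion for relative localization.
# Integrate high-rate IMU acceleration for smooth short-term motion, and
# correct the accumulated drift with lower-rate vision position fixes.
DT_IMU = 1.0 / 200.0     # 200 Hz IMU (illustrative rate)
K_POS, K_VEL = 0.3, 0.1  # correction gains (illustrative)

pos, vel = 0.0, 0.0

def imu_update(accel):
    """Propagate position/velocity from one IMU acceleration sample."""
    global pos, vel
    vel += accel * DT_IMU
    pos += vel * DT_IMU

def vision_update(pos_meas):
    """Blend in an intermittent visual-odometry position fix."""
    global pos, vel
    residual = pos_meas - pos
    pos += K_POS * residual
    vel += K_VEL * residual  # also damp velocity drift

# Simulate constant 0.1 m/s^2 acceleration; vision fix every 20 IMU steps.
rng = np.random.default_rng(0)
for k in range(1000):
    imu_update(0.1 + rng.normal(0.0, 0.05))            # noisy IMU sample
    if k % 20 == 0:
        true_pos = 0.5 * 0.1 * (k * DT_IMU) ** 2
        vision_update(true_pos + rng.normal(0.0, 0.01))  # noisy vision fix

print(f"fused position estimate: {pos:.3f} m")
```

The point of the structure: the IMU carries the estimate between fixes, and the vision residual bounds long-term drift; real systems replace the fixed gains with covariance-weighted Kalman updates.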
5. Day/Night LS3 Relative Navigation with Visual, Inertial, and Leg Odometry
• Array of NIR LEDs with different lenses/power to achieve more uniform illumination, enabling dense stereo out to ~4.5 m
Sensors:
1) Bumblebee stereo camera (1024x768)
2) Tactical-grade IMU (600 Hz)
3) Optional: navigation-grade IMU (600 Hz)
4) Optional: leg odometry (200 Hz)
6. Day/Night LS3 Relative Navigation with Visual, Inertial, and Leg Odometry
[Test terrains: asphalt, dirt road, snow, forest]
• Position error measured over a moving window of 50 m
• Position error < 0.5 m on 95% of runs
• Position error < 0.75 m on 100% of runs
7. Day/Night LS3 Relative Navigation with Visual, Inertial, and Leg Odometry
• How far can the lookahead distance scale with active illuminators?
• Can it work at night with thermal images (stereo or monocular), given their higher noise, motion blur, and rolling-shutter readout?
8. Mars Airbag-Based Landers (1997, 2004): Horizontal Velocity at Impact
[Figure: impact-velocity envelopes for three configurations: Pathfinder alone, Pathfinder with TIRS, and Pathfinder with TIRS and DIMES]
9. Descent Image Motion Estimation System (DIMES) for the 2004 Mars Landers
• Horizontal velocity estimation during the last 2 km of descent
[Diagram: three descent images I1, I2, I3 with attitude estimates relative to the ground frame G; pairwise image matching yields horizontal velocity estimates]
• Runs in ~20 s using 20% of the 20 MHz RAD6000 flight computer
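The core idea behind DIMES is to match image patches between successive descent images and convert the pixel shift into metric horizontal velocity using altitude and camera geometry. Below is a toy sketch of that idea, not the flight algorithm: the function name, template size, and parameters are illustrative, and the IMU-based attitude compensation the real system performs is omitted.

```python
import cv2
import numpy as np

def horizontal_velocity(img1, img2, altitude_m, focal_px, dt_s):
    """Toy DIMES-style estimate: match a template from img1 in img2,
    then convert the pixel shift to horizontal velocity. Assumes a
    nadir-pointed camera and flat terrain."""
    h, w = img1.shape
    ts = 64  # template size in pixels (illustrative)
    ty, tx = h // 2 - ts // 2, w // 2 - ts // 2
    template = img1[ty:ty + ts, tx:tx + ts]

    # Normalized cross-correlation search over the second image.
    scores = cv2.matchTemplate(img2, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(scores)

    # Pixel shift of the template between frames.
    dx_px, dy_px = max_loc[0] - tx, max_loc[1] - ty

    # Ground-sample distance: meters per pixel at this altitude.
    gsd = altitude_m / focal_px
    return dx_px * gsd / dt_s, dy_px * gsd / dt_s

# Synthetic check: shift an image 5 px right, expect positive x velocity.
rng = np.random.default_rng(1)
img1 = rng.integers(0, 255, (480, 640), dtype=np.uint8)
img2 = np.roll(img1, 5, axis=1)
vx, vy = horizontal_velocity(img1, img2, altitude_m=1500.0,
                             focal_px=700.0, dt_s=4.0)
print(f"vx = {vx:.2f} m/s, vy = {vy:.2f} m/s")
```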
10. Map-Relative Localization for Mars Precision Landing
[Diagram: entry, descent, and landing sequence showing the Lander Vision System, radar data collection, prime MLEs, safe target selection, divert maneuver, backshell separation, powered descent, sky crane, and flyaway phases]
• Terrain-relative navigation (TRN) increases the probability of safe landing
[Maps: hazards in the landing ellipse without TRN vs. with TRN]
11. Map-Relative Localization for Mars Precision Landing
• Lander Vision System (LVS) compute element: a Virtex-5 FPGA for image processing and sensor interfaces, memory for the onboard map, and a processor running the navigation filter; data flows between the LVS camera, the IMU, and the spacecraft flight computer
• Processing sequence over a stream of descent images and IMU samples (sketched below):
  – Coarse landmark matching: removes the initial position error (~3 km)
  – Fine landmark matching: improves accuracy (to ~40 m)
  – State estimation: fuses inertial measurements with landmark matches, completing in 10 seconds
• Inputs: spacecraft attitude and altitude; output: map-relative position
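A minimal sketch of the coarse-then-fine matching step follows. It is an assumption-laden stand-in, not the LVS landmark matcher: plain normalized cross-correlation replaces the actual landmark detection and matching, and the decimation factor and search-window radius are illustrative.

```python
import cv2
import numpy as np

def coarse_to_fine_match(descent_img, map_img, coarse_scale=0.125):
    """Two-stage match in the spirit of the LVS pipeline: a coarse
    full-map search at reduced resolution bounds the large initial
    position error, then a fine full-resolution search in a small
    window refines the estimate. Illustrative only."""
    # Coarse stage: search the whole (downsampled) map.
    small_map = cv2.resize(map_img, None, fx=coarse_scale, fy=coarse_scale)
    small_tpl = cv2.resize(descent_img, None, fx=coarse_scale, fy=coarse_scale)
    scores = cv2.matchTemplate(small_map, small_tpl, cv2.TM_CCOEFF_NORMED)
    _, _, _, loc = cv2.minMaxLoc(scores)
    cx, cy = int(loc[0] / coarse_scale), int(loc[1] / coarse_scale)

    # Fine stage: full-resolution search in a window around the coarse
    # answer (window radius is an illustrative parameter).
    r = 64
    th, tw = descent_img.shape
    y0, x0 = max(cy - r, 0), max(cx - r, 0)
    window = map_img[y0:y0 + th + 2 * r, x0:x0 + tw + 2 * r]
    scores = cv2.matchTemplate(window, descent_img, cv2.TM_CCOEFF_NORMED)
    _, _, _, loc = cv2.minMaxLoc(scores)
    return x0 + loc[0], y0 + loc[1]  # map-relative pixel position

# Synthetic check: cut a patch out of a random "orbital map".
rng = np.random.default_rng(2)
map_img = rng.integers(0, 255, (2048, 2048), dtype=np.uint8)
patch = map_img[1000:1256, 704:960].copy()
print(coarse_to_fine_match(patch, map_img))  # expect ~(704, 1000)
```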
12. Obstacle Detection
• Relevant use cases:
  – Land vehicles indoors and outdoors, on-road and off-road
  – Drones: flying and landing
  – Boats on and under water
  – Robot manipulators
• Key challenges:
  – Appearance variability: lighting, weather, season; surface reflectance, transparency
  – Terrain variability
  – Moving objects
  – Fail-safe performance
[Diagram: USV escort scenario with one high-value unit (HVU), 8 teleoperated HSMSTs, and 5 autonomous USVs]
• Issues I’ll discuss:
  – Sensors and phenomenology
  – 3-D representations
  – Land, air, and sea examples
18. COLREGS Illustration
[Diagram: four encounter geometries between your boat and a traffic boat, at speeds from 1 to 30 knots: crossing from left, crossing from right, overtaking, and head-on]
• Almost crossed → “crossing” rule not applied; the USV must give way
• Need more than the geometry to determine COLREGS situations (see the sketch below)
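To illustrate what geometry alone gives you, here is a sketch of a purely geometric encounter classifier. The sector thresholds (overtaking when approaching from more than 22.5 degrees abaft the other vessel's beam, head-on when courses are nearly reciprocal) follow conventional COLREGS readings, but the exact numbers and the function itself are my assumptions; the slide's point is precisely that such geometry is not sufficient on its own.

```python
def classify_encounter(rel_bearing_deg, traffic_heading_deg, own_heading_deg):
    """Purely geometric COLREGS encounter classification (illustrative).
    rel_bearing_deg: bearing of the traffic boat from own bow, in
    [-180, 180), positive to starboard. Thresholds are assumptions."""
    # Heading difference between the two boats, wrapped to [-180, 180).
    dh = (traffic_heading_deg - own_heading_deg + 180.0) % 360.0 - 180.0

    # Bearing of own boat as seen from the traffic boat's bow.
    bearing_from_traffic = (rel_bearing_deg + 180.0 - dh + 180.0) % 360.0 - 180.0

    if abs(bearing_from_traffic) > 112.5:
        return "overtaking (own boat overtaking traffic)"
    if abs(dh) > 174.0 and abs(rel_bearing_deg) < 6.0:
        return "head-on (both give way, alter to starboard)"
    if rel_bearing_deg > 0:
        return "crossing from right (own boat gives way)"
    return "crossing from left (own boat stands on)"

# Traffic boat 45 deg to starboard, heading west while we head north:
print(classify_encounter(rel_bearing_deg=45.0,
                         traffic_heading_deg=270.0,
                         own_heading_deg=0.0))
```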
24. Heat Transfer Characteristics: Negative Obstacle Detection
[Images: color crosswise view, color lengthwise view, and MWIR image 1 hr after sundown; weatherproof sensor enclosure]
25. Heat Transfer Characteristics: After Sundown, Holes Cool More Slowly than the Surface
• Heat transfer mechanisms modeled: radiation, convection, conduction; evapotranspiration is ignored here
[Diagram: heat fluxes q_terrain and q_negobs, and temperatures T_terrain, T_negobs, T_side1, T_side2, T_sky, and T_air for a negative obstacle; diurnal temperature swing ΔT_diurnal ≈ 20–50 °C]
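The slide's labels suggest a surface energy balance of the following generic form; this is my formulation of the standard balance, not necessarily the exact model used at JPL, with evapotranspiration dropped as the slide indicates:

```latex
q_{\mathrm{net}} \;=\;
  \underbrace{\epsilon\,\sigma\,\bigl(T_{\mathrm{sky}}^{4}-T_{s}^{4}\bigr)}_{\text{radiation}}
\;+\; \underbrace{h\,\bigl(T_{\mathrm{air}}-T_{s}\bigr)}_{\text{convection}}
\;+\; \underbrace{k\,\frac{\partial T}{\partial z}\Big|_{s}}_{\text{conduction}}
```

Here $T_s$ is the surface temperature of the patch in question ($T_{\mathrm{terrain}}$ or $T_{\mathrm{negobs}}$). The interior walls of a hole exchange radiation mostly with each other and see only a small patch of cold sky, so after sundown $T_{\mathrm{negobs}}$ stays warmer than $T_{\mathrm{terrain}}$, producing the MWIR contrast exploited for detection.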
27. Detection Results Using Thermal Signature
Processing pipeline (sketched in code below):
1) Rectified thermal infrared intensity image
2) After intensity-difference thresholding
3) Closed contours overlaid on the intensity image
4) After geometry-based filtering
• Trench 3 pixels wide at first detection; trench first detected at 18.2 m range
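Here is a minimal OpenCV sketch of the pipeline the slide lists: threshold the intensity difference, extract closed contours, then filter by geometry. The thresholds and the elongation test are illustrative assumptions, not the fielded values.

```python
import cv2
import numpy as np

def detect_negative_obstacles(thermal_img, background, delta=8,
                              min_area=20, min_elongation=3.0):
    """Threshold the intensity difference, extract closed contours,
    then filter detections by geometry. Thresholds are illustrative."""
    # 1) Intensity-difference thresholding: holes stay warmer after
    #    sundown, so keep pixels hotter than the background estimate.
    diff = cv2.subtract(thermal_img, background)
    _, mask = cv2.threshold(diff, delta, 255, cv2.THRESH_BINARY)

    # 2) Closed contours of the thresholded regions.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)

    # 3) Geometry-based filtering: a trench seen obliquely projects to
    #    a thin elongated band, so keep large, elongated regions.
    detections = []
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if w * h < min_area:
            continue
        if max(w, h) / max(1, min(w, h)) >= min_elongation:
            detections.append((x, y, w, h))
    return detections

# Synthetic check: a warm 3-pixel-wide band on a cooler background.
img = np.full((120, 160), 100, dtype=np.uint8)
img[60:63, 40:120] = 130
print(detect_negative_obstacles(img, np.full_like(img, 100)))
```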
29. Water Body Detection with Reflections in Stereo: Works with Visible and Thermal Images
[Panels: left rectified image, stereo range image, and 100 m map; captured at 15:00]
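The cue named in the title works because a still water surface acts like a mirror: stereo triangulates the reflected object, so the 3-D point lands below the local ground plane. A minimal numpy sketch of that test follows, assuming a flat, known ground plane; the plane fitting, clustering, and map-building steps of a real system are omitted.

```python
import numpy as np

def water_candidates(points_xyz, ground_z=0.0, margin=0.2):
    """Flag stereo points more than `margin` meters below a flat,
    known ground plane as water candidates (reflection cue).
    The margin and flat-plane assumption are illustrative."""
    z = points_xyz[:, 2]  # up axis (assumed convention)
    return points_xyz[z < ground_z - margin]

# Synthetic check: ground points near z=0 plus reflected points.
pts = np.array([[5.0, 0.0, 0.02],
                [6.0, 1.0, -0.05],
                [7.0, 0.5, -1.40],   # reflection of a tree in the water
                [7.2, 0.6, -2.10]])
print(water_candidates(pts))
```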
36. MAV Obstacle Avoidance: Test Results
[Panels: C-space-like obstacle expansion of the disparity map (sketched below); obstacle points and planned path projected on the ground plane; upright view]
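The "C-space-like obstacle expansion of the disparity map" grows each obstacle pixel by the vehicle radius expressed in pixels, which scales with disparity, so nearer obstacles expand more and paths can be collision-checked directly in image space. A sketch under those assumptions (binning strategy and parameters are mine):

```python
import numpy as np

def expand_disparity(disp, radius_m, baseline_m, n_bins=8):
    """C-space-like expansion of a disparity map: with depth Z = f*B/d,
    the vehicle radius in pixels is r = f*R/Z = R*d/B, so dilation
    grows with disparity. Binning keeps the passes few. Illustrative."""
    out = disp.copy()
    d_max = float(disp.max())
    if d_max <= 0:
        return out
    edges = np.linspace(0.0, d_max, n_bins + 1)
    for lo, hi in zip(edges[:-1], edges[1:]):
        ys, xs = np.nonzero((disp > lo) & (disp <= hi))
        r_px = int(np.ceil(radius_m * hi / baseline_m))
        for y, x in zip(ys, xs):
            y0, x0 = max(y - r_px, 0), max(x - r_px, 0)
            patch = out[y0:y + r_px + 1, x0:x + r_px + 1]
            np.maximum(patch, disp[y, x], out=patch)  # keep nearest
    return out

# Synthetic check: one near obstacle grows into a large square region.
disp = np.zeros((60, 80), dtype=np.float32)
disp[30, 40] = 16.0
out = expand_disparity(disp, radius_m=0.5, baseline_m=0.25)
print(int((out > 0).sum()), "pixels occupied after expansion")
```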
38. Stereo-Optical Flow Fusion: the “Egocylinder”
• Range from stereo and optical flow (OF)
• Scale propagation from stereo in the overlap region
• Projection into a common cylindrical representation (sketched below)
• C-space expansion in image space
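A minimal sketch of the cylindrical projection: azimuth indexes the columns, elevation angle the rows, and each cell keeps the nearest range seen. The axis conventions, resolution, and elevation window are illustrative choices, not the published parameters.

```python
import numpy as np

def egocylinder(points_xyz, width=360, height=90,
                elev_min=-0.5, elev_max=0.5):
    """Project 3-D points onto a cylinder around the vehicle: columns
    index azimuth, rows index elevation angle, each cell stores the
    nearest range. Resolution and elevation window are illustrative."""
    x, y, z = points_xyz[:, 0], points_xyz[:, 1], points_xyz[:, 2]
    dist = np.sqrt(x**2 + y**2 + z**2)
    azim = np.arctan2(y, x)                     # [-pi, pi)
    elev = np.arctan2(z, np.sqrt(x**2 + y**2))  # radians

    cols = ((azim + np.pi) / (2 * np.pi) * width).astype(int) % width
    rows = ((elev - elev_min) / (elev_max - elev_min) * height).astype(int)

    img = np.full((height, width), np.inf)
    ok = (rows >= 0) & (rows < height)
    for r, c, d in zip(rows[ok], cols[ok], dist[ok]):
        if d < img[r, c]:
            img[r, c] = d  # keep the nearest return per cell
    return img

pts = np.array([[4.0, 0.0, 0.0], [0.0, 3.0, 0.5], [-2.0, -2.0, -0.3]])
print(int(np.isfinite(egocylinder(pts)).sum()), "occupied cells")
```

The appeal of the representation is that the C-space expansion of the previous slides can then run directly on this one compact range image covering the full field of regard.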
42. Autonomous Landing: Problems and Solution Characteristics
[Image: Gale Crater, Mars]
• Problem characteristics:
  – Variable, potentially complex 3-D structure
  – Variable appearance
  – Variable approach altitude
  – Need for very lightweight hardware
• Solution characteristics:
  – Desire dense 3-D perception
  – Must work from variable altitude
  – Must work in direct sunlight
  – Hardware as light as possible: just a camera?
46. Real-Time Onboard Mapping for Safe Landing in Unknown Terrain
[Panels: raw camera image at ~15 m height; computed elevation map; landing confidence map (dark blue: highest confidence), with a river bed edge visible]
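One common way to turn an elevation map into a landing confidence map is to score local slope and residual roughness; the sketch below does exactly that. The thresholds (about 15 degrees of slope, 5 cm of roughness) and the scoring form are my assumptions, not the actual flight criteria.

```python
import numpy as np

def landing_confidence(elev, cell_m=0.1, max_slope=0.26, max_rough=0.05):
    """Score a landing map from elevation: low local slope and low
    roughness give high confidence (1 = safest, 0 = unsafe).
    Thresholds and scoring form are illustrative."""
    gy, gx = np.gradient(elev, cell_m)
    slope = np.sqrt(gx**2 + gy**2)  # rise over run

    # Roughness: deviation from a local 3x3 mean (crude proxy for
    # residuals against a fitted local plane).
    k = np.ones((3, 3)) / 9.0
    pad = np.pad(elev, 1, mode='edge')
    local_mean = sum(pad[i:i + elev.shape[0], j:j + elev.shape[1]] * k[i, j]
                     for i in range(3) for j in range(3))
    rough = np.abs(elev - local_mean)

    return (1.0 - np.clip(slope / max_slope, 0, 1)) * \
           (1.0 - np.clip(rough / max_rough, 0, 1))

elev = np.random.default_rng(3).normal(0.0, 0.01, (64, 64))
elev[20:30, 20:30] += 0.5  # a rock-like raised block
print(landing_confidence(elev).round(2)[20, 25])  # low confidence at the edge
```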
48. Notional Future Directions for Space Exploration
• Mars sample return
• Accessing recurring slope lineae, caves, and vertical/microgravity environments
Pre-Decisional Information – For Planning and Discussion Purposes Only
49. Notional Future Directions for Space Exploration
• Mars sample return
• Accessing recurring slope lineae
• Comet sample return; Ocean Worlds; Titan
Pre-Decisional Information – For Planning and Discussion Purposes Only
50. Perception and Planning for Robots in Human Environments, Interacting with People
• Perception and planning for mobile manipulation: base reachability, arm reachability, and a desired end-effector goal
• Deep learning-based object class labeling and pose estimation; human articulated body pose estimation
• Grasp planning: power grasp and pinch grasp opportunities in the scene
51. Back on Earth: Some Thoughts About Better Perception for Human-Robot Interaction
• Integrate facial expressions, head pose, and body pose into robot perception of people for more intelligent human-robot interaction
• Electromyography sleeve with forearm IMU and magnetometer for recognizing arm and hand gestures
[Figure: electrode sites S_1 through S_17 on the EMG sleeve]