Copyright © 2019 Carnegie Robotics
How to Choose a
3D Vision Technology
Chris Osterwood
Capable Robot Components
May 2019
Carnegie Robotics, LLC
Carnegie Robotics is a leading provider of rugged sensors,
autonomy software and platforms for defense, agriculture,
mining, infrastructure, and energy applications.
The company’s 5-acre campus collocates engineering,
product testing and a 9,000 square foot production facility.
Certified to ISO 9001:2015, the company prides itself on
integrated systems of record for engineering, production,
finance, and quality.
• Criteria & Definitions
• Overview of 3D Sensor Modalities: LIDAR, ToF Camera, Stereo
• How do they work?
• What are their general advantages and disadvantages?
• Sample Data
• 3D Sensor Testing & Edge Cases
• Not covered: Laser Triangulation, Structured Light, & Computed Tomography
Outline
• Field of View (FOV): Defines the angular extent of the sensor's perceivable field
• Density:
• Angular step size between sample points
• Can be different horizontally and vertically
• Resolution: Generally FOV x Density
• Depth Accuracy: Difference between measured range and actual range
• Depth Resolution: Step size between possible measurement ranges (along the measurement axis)
• Minimum and Maximum Range: Defines distances that are perceivable to the sensor.
May vary by object material, brightness, reflectivity, etc.
• Rate: Specified in “frames” or “points” per second, depending on modality
Evaluation Criteria – Performance
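The FOV x Density relation above can be sketched quickly; the sensor numbers below are made up for illustration, not from any specific product.

```python
def grid_resolution(hfov_deg, vfov_deg, h_step_deg, v_step_deg):
    """Resolution is roughly FOV divided by angular step size,
    computed per axis since density can differ horizontally and
    vertically. round() guards against inexact float step sizes."""
    return round(hfov_deg / h_step_deg), round(vfov_deg / v_step_deg)

# Hypothetical spinning LIDAR: 360 deg HFOV at 0.2 deg steps,
# 16 beams spread evenly over a 30 deg VFOV.
cols, rows = grid_resolution(360.0, 30.0, 0.2, 30.0 / 16)
print(cols, rows)  # 1800 16
```

Note the asymmetry: 1800 samples horizontally but only 16 vertically, which is why the same "point count" can describe very differently shaped fields of view.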
• Size, Weight, & Power
• Cost
• Sealing: Dust proof, Water splash, Water streams, Immersion
• Shock & Vibration Rating
• Communication Interface: Ethernet, USB, Serial, CANBus, Firewire
• Synchronization: None, Hardware, Software (broadcast trigger or network time sync)
• Software Interface:
• Published wire protocol?
• Drivers. Platform? Language? License?
• Example usage software?
• Other: Trigger, GPIO, Lighting, IMU, Temperature & Health Reporting
Evaluation Criteria – Non-Performance
How does end-use change criteria weighting?
[Table: weighting (moderate vs. high) of HFOV, VFOV, Density, Accuracy, Min range, Max range, Rate, SWaP, Cost, Sealing, Shock/Vibe, and Sync across UAV (Mapping), UGV (Mapping, Safety), and Automotive (Mapping, Safety) applications]
Weightings are affected by the
platform and end-use of the
sensor
There are many other uses for
3D sensors
The table here is an example of
how there are unique drivers:
• Mapping: Max Range
• Safety: Rate
• UAV: Weight
• UGV: Vertical FOV / Min
Range
• Automotive: Cost
3D Sensor Technologies
3D LIDAR: How do they work?
[Diagram: single-beam LIDAR. The transmitter emits a beam; range is recovered by phase measurement between the transmitted and reflected beams]
The building block is the single-beam system
shown in the diagram here.
3D LIDARs can have:
• 1 beam steered mechanically in 2 DOF
• Multiple beams steered mechanically in 1 DOF
Steering can be done with a rotating mirror,
a galvanometer, or by moving the emitter
& detector directly
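For the phase-measurement variant in the diagram, the range equation can be sketched as follows. This is a simplified illustration: pulsed systems time the round trip instead, and the modulation frequency below is made up.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def range_from_phase(phase_rad, mod_freq_hz):
    """The reflected beam returns phase-shifted relative to the
    transmitted beam. The round trip covers twice the range, so one
    full 2*pi cycle of phase corresponds to c / (2 * f_mod) of range."""
    return (phase_rad / (2.0 * math.pi)) * C / (2.0 * mod_freq_hz)

# A quarter-cycle phase shift at a 10 MHz modulation frequency:
print(range_from_phase(math.pi / 2, 10e6))  # ~3.75 m
```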
3D LIDAR

Advantages
• Constant error model — 1 sigma can be +/- 1 cm
• Long range: 10 m to 2 km possible depending on laser power and pulse duration
• Generally wide horizontal FOV due to rotating mirror or rotating emitters and detectors
• Can have large vertical FOV, dependent on laser and mechanism details

Downsides
• Cost: trends are encouraging here, but low-cost systems are not yet available
• Scan time — no instantaneous capture of data; the system has to compensate for platform (or mechanism) motion during the scan
• Limited angular density & asymmetric density
• Moving parts & non-planar windows are difficult to seal
• Affected by obscurants in the environment
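The scan-time downside is usually handled by de-skewing: tagging each return with a timestamp and removing the platform's motion during the scan. A minimal sketch, assuming constant forward velocity; real systems interpolate a full 6-DOF pose from IMU/odometry, and the function and numbers here are my own illustration.

```python
def deskew(points, velocity_mps):
    """points: list of (x, y, z, t_offset_s) in the sensor frame, where
    t_offset_s is time since scan start. Shifts each point back to
    where it would have been measured at scan start, assuming the
    platform moves at constant velocity along +x."""
    return [(x - velocity_mps * t, y, z) for x, y, z, t in points]

# Two returns from the same wall, 0.1 s apart, platform at 5 m/s:
pts = [(10.0, 0.0, 0.0, 0.0), (10.0, 0.0, 0.0, 0.1)]
print(deskew(pts, 5.0))  # [(10.0, 0.0, 0.0), (9.5, 0.0, 0.0)]
```

Without this correction, a wall scanned over 0.1 s at 5 m/s would smear by half a meter.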
ToF Cameras: How do they work?
• Like LIDAR, but every pixel is a measurement point.
• This results in limited range or a need for higher transmission power.
• Accuracy increases with modulation frequency.
• Maximum range decreases with modulation frequency.
• Systems generally use multiple frequencies to allow for both long range and non-ambiguous range.
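The frequency trade-off above can be made concrete: phase is only known modulo 2*pi, so a single modulation frequency is unambiguous only out to c/(2f), and combining two frequencies extends that. The frequencies below are illustrative, not from any specific sensor.

```python
from math import gcd

C = 299_792_458.0  # speed of light, m/s

def ambiguity_range_m(mod_freq_hz):
    """Ranges beyond c / (2 * f) wrap around and alias to shorter
    readings, since the measured phase repeats every cycle."""
    return C / (2.0 * mod_freq_hz)

f_hi, f_lo = 80_000_000, 60_000_000
print(ambiguity_range_m(f_hi))  # ~1.87 m: accurate, but wraps quickly
print(ambiguity_range_m(f_lo))  # ~2.50 m: coarser, still short reach
# The combined phase pattern of both frequencies repeats only every
# c / (2 * gcd(f_hi, f_lo)), giving a longer non-ambiguous range while
# keeping the accuracy of the higher frequency.
print(ambiguity_range_m(gcd(f_hi, f_lo)))  # ~7.49 m
```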
ToF Cameras

Advantages
• Dense data regardless of scene texture
• Instantaneous horizontal & vertical FOV
• Linear error with respect to distance
• Narrower shadow behind positive obstacles
• Solid state — no moving parts
• Fewer “camera” parameters

Downsides
• Low resolution
• Long integration time (to increase SNR and resolve range ambiguity) causes motion blur if the platform or objects are moving
• Susceptible to multi-echo returns distorting data
• Affected by obscurants in the environment
• Limited FOV — generally both horizontal and vertical
• May not have an “image” output
Stereo Camera: How do they work?
Feature matches are found in the left and
right cameras. The difference in lateral
position (disparity) is inversely
proportional to distance.

The matching process (the correspondence
problem) is critical to data accuracy and
density.

“Disparity search range” = the number of
horizontal pixels in the right camera searched
before moving on to the next left pixel. A
larger search range allows seeing objects
closer to the camera, but generally
reduces 3D throughput.
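The inverse relation, and the way the disparity search range sets minimum range, can be sketched with a hypothetical camera; the focal length and baseline below are made-up values.

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Z = f * B / d: depth is inversely proportional to disparity."""
    return focal_px * baseline_m / disparity_px

def min_range_m(search_range_px, focal_px, baseline_m):
    """The disparity search range caps the largest matchable disparity,
    and the largest disparity sets the closest perceivable object."""
    return focal_px * baseline_m / search_range_px

f_px, baseline = 1000.0, 0.07  # hypothetical: 1000 px focal, 7 cm baseline
print(depth_from_disparity(70.0, f_px, baseline))   # ~1.0 m
print(depth_from_disparity(140.0, f_px, baseline))  # ~0.5 m: double disparity, half depth
print(min_range_m(256, f_px, baseline))             # ~0.27 m with a 256 px search
```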
Active & Passive Stereo
• Active Stereo: Correspondence problem aided through projection of
arbitrary texture into the scene.
• Advantages:
• Very high resolution and density
• Instantaneous horizontal & vertical FOV
• Flexible FOV options (from 130° to 10°)
• Monochrome or color images are co-registered with 3D data
• Many “camera” parameters
• A passive solution is immune to obscurants in the environment
• Solid state — no moving parts
• Downsides:
• Data density dependent on scene texture or pattern projection
• Non-linear error model with respect to distance
• Double shadow behind positive obstacles
• Computationally expensive process increases BOM cost
Sensor Error Models
Understanding the stereo error
model is key to effective use.
It is VERY different from every
other sensor.

Stereo accuracy here assumes
feature matches accurate to
1/2 pixel. That is common in
real-world environments, but
not guaranteed. Accuracy is
affected by scene texture.
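The shape of the stereo curve follows from differentiating Z = f*B/d: a fixed matching error dd produces a depth error dZ = Z^2 * dd / (f * B). A sketch with the 1/2 pixel assumption; the camera parameters are made up.

```python
def stereo_depth_error_m(range_m, focal_px, baseline_m, match_err_px=0.5):
    """dZ = Z^2 * dd / (f * B): stereo error grows with the SQUARE of
    range, unlike LIDAR's roughly constant error model or a ToF
    camera's roughly linear one."""
    return range_m ** 2 * match_err_px / (focal_px * baseline_m)

# Hypothetical 7 cm baseline, 1000 px focal length:
for z in (1.0, 2.0, 4.0):
    print(z, round(stereo_depth_error_m(z, 1000.0, 0.07), 4))
# doubling the range roughly quadruples the error
# (~0.7 cm at 1 m, ~2.9 cm at 2 m, ~11.4 cm at 4 m)
```

This is why stereo compares so favorably at short range and so poorly at long range, and why baseline and resolution choices matter so much in stereo system design.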
LIDAR & Stereo for Mobile Footstep Planning
Rotating LIDAR Stereo
Videos courtesy of MIT’s DARPA Robotics Challenge Team.
3D data is being used to plan foot placement and robot movement over the steps.
Both videos are sped up from real-time.
3D LIDAR Sample Data
1: 8 seconds of persisted data
2: 1 second of persisted data,
high rotation rate
Notice:
• Wide FOV
• Difficulty in seeing/
identifying person
• Density differences
horizontally & vertically
• Mixed pixel returns
ToF Camera Sample Data: Low Cost
1: Axis Colored Scene
2: Intensity Colored Scene
Notice:
• Low resolution / Low Rate
• Narrow FOV
• Invisible / bent cart
handle.
• “Flashlight effect”
• Mid-air returns
• 1/2 of floor missing
ToF Camera Sample Data: Industrial
1 & 2: Axis colored scene
Notice:
• High resolution / High
rate
• Distortion with near-field
object
• Distortion to rear wall
with foreground object
• Multi-path distortion in
floor
• Mixed-pixel returns
Stereo Sample Data
1: Overview of scene
Notice:
• High density — can see
pieces, can see cart
handle
• High Rate & Wider FOV
• Correlated scene color
• Data below ground
caused by specular
reflections
• False “spike” in rear wall
caused by cart handle
2: Close-up of chess board
• Post-matching cost
threshold starts high (all
data), is lowered (only
high confidence data),
and is raised again
Stereo Sample Data
IR Pattern Projector turned on
and off throughout scene.
Scene has RGB colormap of
monochrome intensity.
Areas of low confidence become
high confidence with the projector
on. This results in lower noise
and more accurate data.
Stereo from Passive Thermal
[Side-by-side: thermal stereo vs. optical stereo point clouds]
General Sweet Spots
[Table: relative strengths of LIDAR, ToF, and Stereo across HFOV, VFOV, Density, Range accuracy, Min range, Max range, Data rate, Obscurant tolerance, Cost, Sealing, and Shock/Vibe]
LIDAR
• High speed vehicles — well served by their long
range and high accuracy at range
• Vehicles with high turning rates — require wide
horizontal FOV
ToF Camera
• Indoor environments
• Short range object scanning & gesture
recognition
Stereo Camera
• Outdoor applications with high levels of ambient
obscurants
• Multi-view applications (overlapping FOVs)
• Applications which require a non-emitting solution
Testing 3D Sensors
ToF & Stereo: Outdoors
Significant horizontal and vertical field of view
(FOV) advantage
Much denser data allows smaller objects to be
detected
No mixed-pixel returns
[Side-by-side stereo and ToF point clouds, both viewed nearly top-down, with an RGB image of the scene]
ToF & Stereo: Indoors
Stereo allows near and far objects to be detected
Wider vertical and horizontal field of view (FOV) allows more
understanding of the scene
ToF Camera has dense returns on visually featureless drywall
No far-range data returned
Both 3D views (stereo and ToF) are from the same perspective: above the sensor, angled down slightly
ToF Camera: Multi-echo Distortions
Reflective surfaces distort ToF camera 3D
data: IR energy reflects, and the scattered
returns bias range readings longer than
the actual geometry
Causes ruts in floors and bows in vertical
walls
Polarization can help mitigate this, but
severely limits maximum range
RGB Image of Scene
Stereo & LIDAR in Obscurant
Stereo & Laser Point Clouds
Viewed from above
Colorized Stereo Point Cloud
50 m Tunnel viewed from near camera
Colorized Stereo
Point Cloud
3D Laser
3D LIDAR: Lighting: Vendor A
[Scans of a materials board (Mirror, Lexan, Polished Brass, Mesh, Acrylic, Black Tile) under 300 lux fluorescent and 5000 lux halogen lighting]
Std. dev. of first row (mm): 8, 3, 6, 9, 7, 5 under both lighting conditions
White areas shifted 2.5 cm closer to sensor
3D LIDAR: Lighting: Vendor B
[Scans of the same materials board under 300 lux fluorescent and 5000 lux halogen lighting]
Std. dev. of first row (mm): 10, 7, 7, 9, 10, 8 (fluorescent); 11, 9, 10, 12, 13, 12 (halogen)
High IR content of halogen increases noise an additional 40%
50% more noise than Vendor A
ToF Camera: Lighting
[Scans of the materials board (Mirror, Lexan, Polished Brass, Mesh, Acrylic, Black Tile) under 300 lux fluorescent and 5000 lux halogen lighting]
High IR content of halogen increases noise by 2x to 5x
Higher range noise on dark objects
Stereo Camera: 7 cm Baseline Range Accuracy
[Scans of the materials board at 0.4 meter and 1.2 meter range]
Std. dev. of first row (mm): 4, 4, 4, 4, 4, 4 (0.4 m); 9, 9, 12, 10, 8, 10 (1.2 m)
Higher range noise on low-texture objects
3D LIDAR: Cost vs Performance
[Two error-colorized static scans: a LIDAR at price X vs. one at price 3X]
• Test is a static scan of a flat surface, 5 cm to 140 cm in 5 cm increments
• Datasets are from the same vendor, different models of LIDAR
• Very different scan density — can be known from the datasheet
• Range noise, distortions, & localized errors cannot be known from the datasheet
Error-to-color mapping: 0.00 cm blue, 0.50 cm green, 1.00 cm yellow, 1.25 cm orange, 1.50+ cm red
3D LIDAR: Error Changes Over Time
• Every laser (even same SKU) is different
• Lasers change over time
• Error can vary by emission angle & range
[Scans: Unit A in January, Unit A in July, and Unit B]
Error-to-color mapping: 0.00 cm blue, 0.50 cm green, 1.00 cm yellow, 1.25 cm orange, 1.50+ cm red
Stereo: Lens Flare
• Lens flare can cause loss of
3D data due to over-exposure
of that region of the scene.
• It is possible for flare to be
similar enough between the left
and right cameras to result in
false 3D matches. This is more
likely with Block Matching and
with lower-resolution stereo
systems.
• Lens flare can also affect ToF
camera systems.
3D Resolution Testing
In some applications XYZ resolution
matters as much as depth accuracy.
How do you measure 3D resolution?
Take cues from 2D camera testing.
• Raised Relief 3D Resolution Target
• Derived from USAF 1951
resolution target
• THE standard image quality test
for 60 years
3D Resolution Testing Process
Capture Image
Identify and Isolate Target
Outputs:
• Colorized Depth Image
• Colorized Depth Error Image
• RMS error on features and
between features for each
grouping size
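One plausible way to produce the "RMS error on features" output is to compare measured depths on a grouping's raised bars against the target's known relief. This is my own sketch under that assumption, not necessarily the presenter's exact pipeline.

```python
import math

def rms_error_mm(measured_mm, expected_mm):
    """Root-mean-square of the per-point depth errors between
    measured depths and the target's known geometry."""
    errs = [m - e for m, e in zip(measured_mm, expected_mm)]
    return math.sqrt(sum(e * e for e in errs) / len(errs))

# Depths measured on the raised bars of one grouping vs. known truth:
print(rms_error_mm([501.0, 499.0, 502.0, 498.0], [500.0] * 4))  # ~1.58
```

Plotting this RMS figure against grouping size shows where a sensor stops resolving fine 3D structure, much as the 2D USAF chart shows where a camera stops resolving fine line pairs.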
3D Resolution: Stereo
Error Image Range Image
3D Resolution: Industrial ToF Camera
Error Image Range Image
3D Resolution: Low Cost ToF Camera
Error Image Range Image
Takeaways
There is no perfect sensor.
They all have limitations and edge cases.
Look for published test data. If none is available:
• Ask the sensor manufacturer,
• Collect it yourself, or
• Turn to those who have past experience in integration and downstream software.
It is critical to understand the performance & non-performance criteria that matter for
your application.
• Those weights must drive your sensor selection process.
• You may have to sacrifice one or more lesser characteristics to meet the ones which
really matter to you.
Resources
http://carnegierobotics.com/evs/
http://carnegierobotics.com/support/
Companies & Products
• LIDAR: SICK TiM / LMS LIDARs, Velodyne 3D LIDARs, Hokuyo 2D & 3D LIDARs
• ToF Camera: IFM O3D, SICK 3vistor-T, pmd pico
• Stereo Camera: Carnegie Robotics MultiSense, nerian Karmin2, SICK PLB500
• Also pictured: Intel RealSense, ZED Camera, Orbbec Persee, Structure Sensor, iDS Ensenso, Roboception rc_visard
“Combating Bias in Production Computer Vision Systems,” a Presentation from R...
 
“Developing an Embedded Vision AI-powered Fitness System,” a Presentation fro...
“Developing an Embedded Vision AI-powered Fitness System,” a Presentation fro...“Developing an Embedded Vision AI-powered Fitness System,” a Presentation fro...
“Developing an Embedded Vision AI-powered Fitness System,” a Presentation fro...
 
“Navigating the Evolving Venture Capital Landscape for Edge AI Start-ups,” a ...
“Navigating the Evolving Venture Capital Landscape for Edge AI Start-ups,” a ...“Navigating the Evolving Venture Capital Landscape for Edge AI Start-ups,” a ...
“Navigating the Evolving Venture Capital Landscape for Edge AI Start-ups,” a ...
 
“Advanced Presence Sensing: What It Means for the Smart Home,” a Presentation...
“Advanced Presence Sensing: What It Means for the Smart Home,” a Presentation...“Advanced Presence Sensing: What It Means for the Smart Home,” a Presentation...
“Advanced Presence Sensing: What It Means for the Smart Home,” a Presentation...
 

Recently uploaded

Why Teams call analytics are critical to your entire business
Why Teams call analytics are critical to your entire businessWhy Teams call analytics are critical to your entire business
Why Teams call analytics are critical to your entire business
panagenda
 

Recently uploaded (20)

Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
 
Boost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfBoost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdf
 
2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...
 
Powerful Google developer tools for immediate impact! (2023-24 C)
Powerful Google developer tools for immediate impact! (2023-24 C)Powerful Google developer tools for immediate impact! (2023-24 C)
Powerful Google developer tools for immediate impact! (2023-24 C)
 
Strategies for Landing an Oracle DBA Job as a Fresher
Strategies for Landing an Oracle DBA Job as a FresherStrategies for Landing an Oracle DBA Job as a Fresher
Strategies for Landing an Oracle DBA Job as a Fresher
 
Apidays Singapore 2024 - Modernizing Securities Finance by Madhu Subbu
Apidays Singapore 2024 - Modernizing Securities Finance by Madhu SubbuApidays Singapore 2024 - Modernizing Securities Finance by Madhu Subbu
Apidays Singapore 2024 - Modernizing Securities Finance by Madhu Subbu
 
ICT role in 21st century education and its challenges
ICT role in 21st century education and its challengesICT role in 21st century education and its challenges
ICT role in 21st century education and its challenges
 
TrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
TrustArc Webinar - Unlock the Power of AI-Driven Data DiscoveryTrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
TrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
 
MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024
 
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
 
presentation ICT roal in 21st century education
presentation ICT roal in 21st century educationpresentation ICT roal in 21st century education
presentation ICT roal in 21st century education
 
Why Teams call analytics are critical to your entire business
Why Teams call analytics are critical to your entire businessWhy Teams call analytics are critical to your entire business
Why Teams call analytics are critical to your entire business
 
Real Time Object Detection Using Open CV
Real Time Object Detection Using Open CVReal Time Object Detection Using Open CV
Real Time Object Detection Using Open CV
 
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
 
A Beginners Guide to Building a RAG App Using Open Source Milvus
A Beginners Guide to Building a RAG App Using Open Source MilvusA Beginners Guide to Building a RAG App Using Open Source Milvus
A Beginners Guide to Building a RAG App Using Open Source Milvus
 
Ransomware_Q4_2023. The report. [EN].pdf
Ransomware_Q4_2023. The report. [EN].pdfRansomware_Q4_2023. The report. [EN].pdf
Ransomware_Q4_2023. The report. [EN].pdf
 
AWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of TerraformAWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of Terraform
 
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
 
Corporate and higher education May webinar.pptx
Corporate and higher education May webinar.pptxCorporate and higher education May webinar.pptx
Corporate and higher education May webinar.pptx
 
Strategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
Strategize a Smooth Tenant-to-tenant Migration and Copilot TakeoffStrategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
Strategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
 

"How to Choose a 3D Vision Sensor," a Presentation from Capable Robot Components

  • 5. Evaluation Criteria – Non-Performance
    • Size, Weight, & Power
    • Cost
    • Sealing: Dust proof, Water splash, Water streams, Immersion
    • Shock & Vibration Rating
    • Communication Interface: Ethernet, USB, Serial, CANBus, Firewire
    • Synchronization: None, Hardware, Software (broadcast trigger or network time sync)
    • Software Interface:
      • Published wire protocol?
      • Drivers: Platform? Language? License?
      • Example usage software?
    • Other: Trigger, GPIO, Lighting, IMU, Temperature & Health Reporting
  • 6. How does end-use change criteria weighting?
    Weightings are affected by the platform and end-use of the sensor. There are many other uses for 3D sensors; the table here is an example of how there are unique drivers:
    • Mapping: Max Range
    • Safety: Rate
    • UAV: Weight
    • UGV: Vertical FOV / Min Range
    • Automotive: Cost
    (Table: UAV, UGV, and Automotive platforms, split into Mapping and Safety uses, rated moderate or high weighting across HFOV, VFOV, Density, Accuracy, Min range, Max range, Rate, SWaP, Cost, Sealing, Shock/Vibe, and Sync.)
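The weighting idea above can be sketched as a simple weighted-sum score. Everything here is illustrative: the criteria names, weights, and per-sensor scores are made-up examples, not values from the deck.

```python
def score(sensor_scores: dict, weights: dict) -> float:
    """Weighted sum of per-criterion scores (higher is better)."""
    return sum(weights.get(k, 0.0) * v for k, v in sensor_scores.items())

# Hypothetical weights for a mapping UGV: max range matters most.
weights = {"max_range": 3.0, "rate": 1.0, "cost": 2.0}

# Hypothetical normalized scores (0..1) for two candidate sensors.
lidar  = {"max_range": 0.9, "rate": 0.5, "cost": 0.3}
stereo = {"max_range": 0.5, "rate": 0.9, "cost": 0.8}

best = max([("lidar", lidar), ("stereo", stereo)],
           key=lambda kv: score(kv[1], weights))
```

A different platform (say, a UAV weighting SWaP heavily) would rank the same sensors differently, which is the point of the slide.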
  • 7. 3D Sensor Technologies
  • 8. 3D LIDAR: How do they work?
    (Diagram: transmitter and phase measurement, with transmitted and reflected beams.)
    The building block is the single-beam system shown in the diagram. 3D LIDARs can have:
    • 1 beam steered mechanically in 2 DOF
    • Multiple beams steered mechanically in 1 DOF
    Steering can be done with a rotating mirror, a galvanometer, or directly upon the emitter & detector.
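The phase measurement in the diagram can be sketched numerically. This is a minimal model of continuous-wave phase-shift ranging; the function name and the 10 MHz example are ours, not from any vendor.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def range_from_phase(phase_rad: float, mod_freq_hz: float) -> float:
    """Range implied by the phase shift between transmitted and reflected beams.

    The beam travels out and back, so a full 2*pi of phase corresponds
    to only half of the modulation wavelength in range.
    """
    wavelength_m = C / mod_freq_hz
    return (phase_rad / (2 * math.pi)) * (wavelength_m / 2)

# A 10 MHz modulation wraps roughly every 15 m; a half-cycle (pi) phase
# shift therefore reads as roughly 7.5 m of range.
```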
  • 9. 3D LIDAR
    Advantages:
    • Constant error model — 1 sigma can be +/- 1 cm
    • Long range: 10 m to 2 km possible depending on laser power and pulse duration
    • Generally wide horizontal FOV due to rotating mirror or rotating emitters and detectors
    • Can have large vertical FOV, dependent on laser and mechanism details
    Downsides:
    • Cost. Trends are encouraging here, but low-cost systems are not yet available.
    • Scan time — no instantaneous capture of data; the system has to compensate for platform (or mechanism) motion during the scan
    • Limited angular density & asymmetric density
    • Moving parts & non-planar windows are difficult to seal
    • Affected by obscurants in the environment
  • 10. ToF Cameras: How do they work?
    • Like LIDAR, but every pixel is a measurement point. This results in limited range or more transmission power.
    • Accuracy increases with modulation frequency.
    • Maximum range decreases with modulation frequency.
    • Systems generally use multiple frequencies to allow for both long range and non-ambiguous range.
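The modulation-frequency trade-off above can be made concrete with a small sketch. The unambiguous-range relation R = c / (2f) is the standard one for continuous-wave ToF; the specific frequencies and the dual-frequency helper below are illustrative, not taken from any particular sensor.

```python
C = 299_792_458.0  # speed of light, m/s

def unambiguous_range_m(mod_freq_hz: float) -> float:
    """Maximum non-ambiguous range for a continuous-wave ToF sensor.

    Beyond c / (2 * f) the measured phase wraps and range is ambiguous.
    """
    return C / (2 * mod_freq_hz)

def combined_range_m(f1_hz: float, f2_hz: float) -> float:
    """Dual-frequency operation resolves wraps out to (roughly) the
    unambiguous range of the two frequencies' difference."""
    return C / (2 * abs(f1_hz - f2_hz))

# Higher frequency -> better accuracy but a shorter window:
#   100 MHz wraps at ~1.5 m, 10 MHz at ~15 m.
# Pairing 100 MHz with 90 MHz keeps high-frequency accuracy while
# extending the non-ambiguous window to that of a 10 MHz tone.
```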
  • 11. ToF Cameras
    Advantages:
    • Dense data regardless of scene texture
    • Instantaneous horizontal & vertical FOV
    • Linear error with respect to distance
    • Narrower shadow behind positive obstacles
    • Solid state — no moving parts
    • Fewer “camera” parameters
    Downsides:
    • Low resolution
    • Long integration time (to increase SNR and resolve range ambiguity) causes motion blur if the platform or objects are moving
    • Susceptible to multi-echo returns distorting data
    • Affected by obscurants in the environment
    • Limited FOV — generally both horizontal and vertical
    • May not have an “image” output
  • 12. Stereo Camera: How do they work?
    Feature matches are found in the left and right cameras. The difference in lateral position (disparity) is inversely proportional to distance. The matching process (the correspondence problem) is critical to data accuracy and density.
    “Disparity search range” = the number of horizontal pixels in the right camera searched before moving on to the next left pixel. A larger search range allows seeing objects closer to the camera, but generally reduces 3D throughput.
  • 13. Active & Passive Stereo
    Active Stereo: the correspondence problem is aided through projection of arbitrary texture into the scene.
    Advantages:
    • Very high resolution and density
    • Instantaneous horizontal & vertical FOV
    • Flexible FOV options (from 130° to 10°)
    • Monochrome or color images are co-registered with 3D data
    • Many “camera” parameters
    • Passive solution is immune to obscurants in the environment
    • Solid state — no moving parts
    Downsides:
    • Data density dependent on scene texture or pattern projection
    • Non-linear error model with respect to distance
    • Double shadow behind positive obstacles
    • Computationally expensive processing increases BOM cost
  • 14. Sensor Error Models
    Understanding the stereo error model is key to effective use. It is VERY different than every other sensor.
    The stereo accuracy shown here assumes matches with 1/2 pixel of accuracy. That is common in real-world environments, but not guaranteed; accuracy is affected by scene texture.
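The difference between the error models can be sketched directly: differentiating Z = f·B/d with respect to disparity gives dZ ≈ Z²·dd / (f·B), so stereo range noise grows with the square of distance, unlike the roughly constant LIDAR error cited earlier. The 0.5 px disparity error matches the slide's 1/2-pixel assumption; the focal length and baseline values are illustrative.

```python
def stereo_range_error_m(z_m: float, focal_px: float, baseline_m: float,
                         disparity_err_px: float = 0.5) -> float:
    """Approximate 1-sigma range error of a stereo pair at distance z.

    From Z = f*B/d, dZ = Z**2 * dd / (f * B): doubling the distance
    quadruples the expected range error.
    """
    return (z_m ** 2) * disparity_err_px / (focal_px * baseline_m)

# e.g. f = 800 px, B = 0.07 m: ~9 mm error at 1 m, ~36 mm at 2 m.
```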
  • 15. LIDAR & Stereo for Mobile Footstep Planning
    Rotating LIDAR vs. stereo. Videos courtesy of MIT’s DARPA Robotics Challenge team. 3D data is being used to plan foot placement and robot movement over the steps. Both videos are sped up from real time.
  • 16. 3D LIDAR Sample Data
    1: 8 seconds of persisted data. 2: 1 second of persisted data, high rotation rate.
    Notice:
    • Wide FOV
    • Difficulty in seeing/identifying a person
    • Density differences horizontally & vertically
    • Mixed-pixel returns

  • 17. ToF Camera Sample Data: Low Cost
    1: Axis-colored scene. 2: Intensity-colored scene.
    Notice:
    • Low resolution / low rate
    • Narrow FOV
    • Invisible / bent cart handle
    • “Flashlight effect”
    • Mid-air returns
    • Half of the floor missing

  • 18. ToF Camera Sample Data: Industrial
    1 & 2: Axis-colored scene.
    Notice:
    • High resolution / high rate
    • Distortion with near-field object
    • Distortion to rear wall with foreground object
    • Multi-path distortion in floor
    • Mixed-pixel returns
  • 19. Stereo Sample Data
    1: Overview of scene. Notice:
    • High density — can see pieces, can see cart handle
    • High rate & wider FOV
    • Correlated scene color
    • Data below ground caused by specular reflections
    • False “spike” in rear wall caused by cart handle
    2: Close-up of chess board. The post-matching cost threshold starts high (all data), is lowered (only high-confidence data), and is raised again.

  • 20. Stereo Sample Data
    The IR pattern projector is turned on and off throughout the scene. The scene has an RGB colormap of monochrome intensity. Areas of low confidence have higher confidence with the projector on, which results in lower-noise and more accurate data.

  • 21. Stereo from Passive Thermal
    (Thermal stereo vs. optical stereo point clouds.)
  • 22. General Sweet Spots
    (Table: LIDAR, ToF, and Stereo rated across HFOV, VFOV, Density, Range accuracy, Min range, Max range, Data rate, Obscurant, Cost, Sealing, Shock/Vibe.)
    LIDAR:
    • High-speed vehicles — well served by their long range and high accuracy at range
    • Vehicles with high turning rates — require wide horizontal FOV
    ToF Camera:
    • Indoor environments
    • Short-range object scanning & gesture recognition
    Stereo Camera:
    • Outdoor applications with high levels of ambient obscurants
    • Multi-view applications (overlapping FOVs)
    • Applications which require a non-emitting solution
  • 23. Testing 3D Sensors

  • 24. ToF & Stereo: Outdoors
    Stereo shows a significant horizontal and vertical field of view (FOV) advantage; its much denser data allows smaller objects to be detected, with no mixed-pixel returns. Both views are nearly top-down. (RGB image of scene shown alongside stereo and ToF point clouds.)

  • 25. ToF & Stereo: Indoors
    Stereo allows near and far objects to be detected; its wider vertical and horizontal field of view (FOV) allows more understanding of the scene. The ToF camera has dense returns on visually featureless drywall, but no far-range data is returned. Both 3D views are from the same perspective, above the sensor and angled down slightly.
  • 26. ToF Camera: Multi-echo Distortions
    Reflective surfaces distort ToF camera 3D data: IR energy reflects, and the scattered returns bias the range reading longer than the actual geometry. This causes ruts in floors and bows in vertical walls. Polarization can help mitigate the effect but severely limits maximum range. (RGB image of scene.)
  • 27. Stereo & LIDAR in Obscurant
    Stereo & laser point clouds viewed from above; colorized stereo point cloud of a 50 m tunnel viewed from near the camera. (Panels: colorized stereo point cloud vs. 3D laser.)
  • 28. 3D LIDAR: Lighting: Vendor A
    Conditions: 300 lux fluorescent vs. 5000 lux halogen. Targets: Lexan, Mirror, Mesh, Black Tile, Acrylic.
    Std. dev. of first row (mm): 8, 3, 6, 9, 7, 5 — the same under both lighting conditions. White areas shifted 2.5 cm closer to the sensor.

  • 29. 3D LIDAR: Lighting: Vendor B
    Conditions: 300 lux fluorescent vs. 5000 lux halogen. Targets: Mirror, Lexan, Polished Brass, Mesh, Acrylic, Black Tile.
    Std. dev. of first row (mm): 10, 7, 7, 9, 10, 8 (fluorescent) and 11, 9, 10, 12, 13, 12 (halogen). The high IR content of halogen increases noise an additional 40%; overall, 50% more noise than Vendor A.

  • 30. ToF Camera: Lighting
    Conditions: 300 lux fluorescent vs. 5000 lux halogen. Targets: Mirror, Lexan, Polished Brass, Mesh, Acrylic, Black Tile.
    The high IR content of halogen increases noise by 2x to 5x. Higher range noise on dark objects.

  • 31. Stereo Camera: 7 cm Baseline Range Accuracy
    Targets: Mirror, Lexan, Polished Brass, Mesh, Acrylic, Black Tile, measured at 0.4 m and 1.2 m.
    Std. dev. of first row (mm): 4, 4, 4, 4, 4, 4 (0.4 m) and 9, 9, 12, 10, 8, 10 (1.2 m). Higher range noise on low-texture objects.
  • 32. 3D LIDAR: Cost vs Performance
    • Test is a static scan of a flat surface, 5 cm to 140 cm in 5 cm increments
    • Datasets are from the same vendor, different models of LIDAR (price X vs. price 3X)
    • Very different scan density — can be known from the datasheet
    • Range noise, distortions, & localized errors cannot be known from the datasheet
    Error-to-color mapping: 0.00 cm blue, 0.50 cm green, 1.00 cm yellow, 1.25 cm orange, 1.50+ cm red.

  • 33. 3D LIDAR: Error Changes Over Time
    • Every laser (even the same SKU) is different
    • Lasers change over time
    • Error can vary by emission angle & range
    Panels: Unit A (January), Unit A (July), Unit B. Error-to-color mapping: 0.00 cm blue, 0.50 cm green, 1.00 cm yellow, 1.25 cm orange, 1.50+ cm red.
  • 34. Stereo: Lens Flare
    • Lens flare can cause loss of 3D data due to over-exposure of that region of the scene.
    • It is possible for flare to be similar enough between the left and right cameras to result in 3D matches. This is more likely with block matching and with lower-resolution stereo systems.
    • Lens flare can also affect ToF camera systems.
  • 35. 3D Resolution Testing
    In some applications XYZ resolution matters as much as depth accuracy. How do you measure 3D resolution? Take cues from 2D camera testing:
    • Raised-relief 3D resolution target
    • Derived from the USAF 1951 resolution target — THE standard image-quality test for 60 years

  • 36. 3D Resolution Testing Process
    Capture image, then identify and isolate the target. Outputs:
    • Colorized depth image
    • Colorized depth-error image
    • RMS error on features and between features for each grouping size
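A minimal sketch of the RMS-error output listed above: given measured depth samples over a target region and the corresponding known relief depths, report the root-mean-square error. The function name, units, and sample values are ours, for illustration only.

```python
import math

def rms_error_mm(measured_mm, truth_mm):
    """Root-mean-square depth error between measured and ground-truth samples."""
    residuals = [(m - t) ** 2 for m, t in zip(measured_mm, truth_mm)]
    return math.sqrt(sum(residuals) / len(residuals))

# e.g. per-feature depth samples vs. the target's known relief height:
# rms_error_mm([10.4, 9.8, 10.1], [10.0, 10.0, 10.0])
```

Computed once per feature-grouping size, this yields the "RMS error on features and between features" curve the slide describes.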
  • 37. 3D Resolution: Stereo
    (Error image and range image.)

  • 38. 3D Resolution: Industrial ToF Camera
    (Error image and range image.)

  • 39. 3D Resolution: Low Cost ToF Camera
    (Error image and range image.)
  • 40. Take-Aways
    There is no perfect sensor. They all have limitations and edge cases. Look for published test data. If none is available:
    • Ask the sensor manufacturer,
    • Collect it yourself, or
    • Turn to those who have past experience in integration and downstream software.
    It is critical to understand the performance & non-performance criteria that matter for your application.
    • Those weights must drive your sensor selection process.
    • You may have to sacrifice one or more lesser characteristics to meet the ones which really matter to you.
  • 41. Resources
    http://carnegierobotics.com/evs/
    http://carnegierobotics.com/support/
    Companies & Products:
    • LIDAR: SICK TiM / LMS LIDARs; Velodyne 3D LIDARs; Hokuyo 2D & 3D LIDARs
    • ToF Camera: IFM O3D; SICK 3vistor-T; pmd pico; Orbbec Persee; Structure Sensor
    • Stereo: Carnegie Robotics MultiSense; ZED Camera; Intel RealSense; nerian Karmin2; iDS Ensenso; Roboception rc_visard; SICK PLB500