© 2020 GoPro, Inc.
CMOS Image Sensors: A Guide
to Building the Eyes of
a Vision System
Jon Stern, Ph.D.
Director, Optical Systems
Sept 2020
Light Basics
What is Light?
• Light is represented as both a particle and an electromagnetic wave
• When discussing sensors we tend to rather casually switch between the two
• A photon = a light particle
• The energy determines wavelength (higher energy = shorter wavelength)
• Photon energy: E = hc/λ, where h = Planck’s constant (J·s), c = speed of light (m·s⁻¹), λ = wavelength (m)
• When visible (to human beings), we perceive wavelength as color
• Intensity of light = number of photons
3
[Spectrum diagram: visible range 380 nm – 740 nm, with ultraviolet below and infrared above]
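The energy–wavelength relation above can be checked numerically. A minimal Python sketch (constants hard-coded; the function name is illustrative):

```python
# E = h*c / wavelength
H = 6.626e-34  # Planck's constant (J*s)
C = 2.998e8    # speed of light (m/s)

def photon_energy_joules(wavelength_nm: float) -> float:
    """Energy of a single photon at the given wavelength."""
    return H * C / (wavelength_nm * 1e-9)

# ~555 nm green light (peak of photopic vision) is ~3.6e-19 J per photon;
# blue photons carry more energy than red photons.
print(photon_energy_joules(555))
```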
Bands of Light
Visible: 380-740nm
• Sensors: CCDs and CMOS image sensors (CIS)
Near Infrared (NIR): 0.74µm – 1µm
• Sensors: CCDs and NIR-enhanced CIS
SWIR: 1-2.5µm
• Sensors: InGaAs sensors
MWIR: 3-5µm
• Sensors: Indium Antimonide (InSb), Mercury Cadmium Telluride (HgCdTe), III-V semiconductor superlattices
• Thermal imaging
LWIR: 8-14µm
• Sensors: Microbolometers, HgCdTe
• Thermal imaging
By Darth Kule - Own work, Public Domain,
https://commons.wikimedia.org/w/index.php?curid=10555337
Black body radiation as governed by Planck’s equation
Photoelectric Effect
• Image sensors make use of the photoelectric effect
• Electromagnetic radiation hits a material
and dislodges electrons
• The electrons can be collected and
“counted”
• Interesting aside: Einstein was awarded a
Nobel Prize for his work on the
Photoelectric Effect, not for his theory of
relativity
• The number of electrons is dependent on
both light intensity and wavelength
By Ponor - Own work, CC BY-SA 4.0,
https://commons.wikimedia.org/w/index.php?curid=91353714
Quantum Efficiency
• Quantum Efficiency (QE) is the ratio of
number of electrons collected to the
number of photons incident
• QE is sensor specific, and is frequently
normalized (“Relative QE”) to obscure IP
• The peak QE of silicon aligns nicely with the peak response of human vision at 555nm
[Plot: quantum efficiency (%) vs wavelength (nm), 320–1100 nm, overlaying the sea-level solar spectrum and a typical silicon sensor QE curve]
CMOS Image Sensors
CMOS Image Sensor (CIS)
• Leverages Complementary Metal Oxide Semiconductor (CMOS) manufacturing processes
• Each pixel contains a photodiode and a current amplifier (“active pixel sensor”)
• Very high level of integration (SOC)
• On-chip ADCs, timing control, voltage conversion, corrections (black level, defective pixels, etc.)
[Block diagram: array of active pixels (photodiode + amplifier), row drivers and access, per-column CDS and ADC, digital corrections, clock & timing generation, phase-locked loop, bias generation, additional digital features, bits out]
Seeing in Color
• Apart from some specialist sensors (like Foveon), capturing color from a single image sensor requires spatial sub-sampling
• Place a color filter over each
photodetector
• Demosaic algorithm interpolates missing
colors to generate RGB for each location
• Add an IR cut-off filter if you want to see “natural” (photopic eye) color
• Novel filter patterns can be used for specialist applications, like CCCR (C = clear), RGB+IR (IR = infrared pass filter)
[Diagram: microlens array over color filter array (CFA) over photodiodes; Bayer color filter pattern; quantum efficiency curves for a color sensor]
Noise Sources in CMOS Image Sensors
Shot noise
• Noise that is intrinsic to the quantum nature of photons
• Uncertainty in measurement
• Shot noise = SQRT (number of photons)
Read noise
• RMS noise of sensor circuitry in electrons
Fixed pattern noise
• Pixel FPN – pixel to pixel dark offsets
• Includes dark signal non-uniformity, which changes with integration
time and temperature
• Column FPN - per column dark offsets
• Visible well below the pixel temporal noise floor (e.g., visible down to ~1/20th
read noise)
• Row FPN – per row dark offsets
Photo Response Non-Uniformity
• Pixel to pixel variation in responsivity (gain error)
Row noise
• Per-row variation in black level, dominated by temporal noise (row FPN is generally not an issue in modern CIS)
[Images: column fixed pattern noise; column fixed pattern noise + row noise]
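To see how these noise terms behave, here is a simplified SNR model combining shot noise and read noise in quadrature (a sketch only; it assumes QE = 1 so photons map one-to-one to electrons, and ignores FPN and dark current):

```python
import math

def snr(photons: float, read_noise_e: float = 0.0) -> float:
    """Signal-to-noise ratio with shot noise sqrt(N) and read noise
    combined in quadrature (assumes one electron per photon)."""
    shot = math.sqrt(photons)
    total_noise = math.sqrt(shot**2 + read_noise_e**2)
    return photons / total_noise

print(snr(10_000))    # shot-noise limited: SNR = sqrt(10000) = 100
print(snr(100, 2.0))  # low light: read noise pulls SNR below sqrt(100) = 10
```

Because shot noise grows as the square root of the signal, capturing 4× the light only doubles the SNR.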
Hot Pixels
• Pixels with high leakage
• Bright pixels in images captured in the
dark with long exposure and/or high
temperature
• Need correcting (really hiding) by the
image processing pipeline
• Static map programmed at factory
• Dynamic detection and correction
• Replace with average of neighboring
pixel values (in the same color plane)
[Histogram: pixel counts vs pixel value]
Global vs Rolling Shutter
Global shutter: all sensor rows capture light at the same time
Rolling shutter:
• Each row reset one at a time, then read out n row periods later
• Creates rolling shutter artifacts (moving object distortions, “jello
effect” from camera vibration)
[Figures: “jello effect” from a vibrating camera with rolling shutter; rolling-shutter temporal aliasing with a spinning propeller; timing diagram — rows reset and read sequentially, row time ~µs, integration time ~µs – ms]
Front-Side Illumination vs Back-Side Illumination
• Front-side illumination (FSI) sensors have
the photodiode below the metal wiring
• Impacts sensitivity, acceptance angle,
optical cross talk (photons being
captured by the wrong photodiode)
• In back-side illumination (BSI) sensors the
wafer is flipped and bonded to a blank
wafer, or an ASIC layer, and thinned
• The color filter array and microlenses
are built on top of the thinned
photodiode layer
In FSI (left) metal impedes the optical path to the photodiode. In BSI (right),
the optical stack height is minimized
Key Sensor Specifications Impacting Image Quality
• Full well capacity (really linear full-well capacity)
• Small pixels: 5,000 electrons
• Medium pixels: 10,000 electrons
• Large pixels: 50,000 electrons
• Sensitivity
• Voltage swing per lux-s
• Read noise
• The noise floor measured in electrons
• Small-pixel, low-noise sensors have read noise ~2 electrons
• Resolution
• Resolution is really a measure of linear resolving power, but colloquially it has come to mean number of
pixels
• Higher resolution means higher spatial sampling of the optical modulation transfer function (MTF)
Image Quality Metrics
• Signal to noise ratio
• Ratio of the signal from photons collected to unwanted noise
• SNR10: the number of lux (measurement of luminous flux per unit area) to achieve
an SNR of 10
• Dynamic range
• Ratio between the max. output signal level and noise floor
• Sensor companies typically misrepresent this as:
• Max. output signal at minimum analog gain / noise floor at max. analog gain
• This is unachievable within a single frame
• dB = 20 × log₁₀(Brightest Lux / Darkest Lux)
• Resolution
• Not just the number of pixels; more formally, a measure of the smallest features that can be resolved. Described in more detail later
Illuminance (lux) | Scene
0.2 – 1 | Full moon
100 | Dim indoor
500 – 1,000 | Office
10,000 – 30,000 | Outdoors, cloudy
100,000 | Direct sunlight

Ratio of Brightest/Darkest | Dynamic Range (dB)
10 | 20
100 | 40
1,000 | 60
10,000 | 80
100,000 | 100
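The dB column follows directly from the formula above; a quick check in Python (function name illustrative):

```python
import math

def dynamic_range_db(brightest_lux: float, darkest_lux: float) -> float:
    """Dynamic range in dB = 20 * log10(brightest / darkest)."""
    return 20 * math.log10(brightest_lux / darkest_lux)

print(dynamic_range_db(1_000, 1))    # 60 dB
print(dynamic_range_db(100_000, 1))  # 100 dB: direct sun vs ~1 lux shadows
```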
Image Sensor Interfaces
• MIPI
• D-PHY
• Dominant interface in phones for a decade
• Up to 4 differential paired lanes
• Separate clock lane
• C-PHY
• Higher performance interface with embedded clock
• 3-wire lanes (aka “trios”)
• Starting to gain traction in high performance phones
• Likely to gain broader adoption over time
• A-PHY
• New MIPI standard for automotive adoption
• Various proprietary LVDS, subLVDS, and SLVS interfaces
• It is important to either select a sensor that will directly interface with your SOC, or accept the added cost and power of an
external interface bridge chip
High Dynamic Range (HDR) Sensors
Typical commercial sensors have a dynamic range around 70dB
HDR sensors use a multitude of techniques to extend the dynamic range that can be captured
• Multi-frame HDR: capture sequential frames with long and short exposure times
• Long exposure captures the shadows
• Short exposure captures the highlights
• Stitch the frames together to create a single higher dynamic range frame with higher
bit depth
• For scenes with motion, this technique can result in significant motion artifacts
(ghosting)
• Overlapping multi-frame exposure
• Multiple exposures are produced on a line-by-line basis, rather than frame by frame
• Greatly reduces the motion artifacts by minimizing the time delta between the
different exposure read outs
• Spatially interleaved methods
• Multiple exposure times are captured simultaneously by varying the exposure time of
different groups of pixels (alternate lines, zig-zag pattern, etc.)
• Almost eliminates HDR motion artifacts, but reduces resolution and creates aliasing
artifacts on high contrast edges
[Diagram: Sony SME-HDR spatially interleaved Bayer pattern, with groups of short-exposure and long-exposure pixels]
High Dynamic Range (HDR) Sensors - Continued
• Dual conversion gains
• Technology that allows the pixel gain (µV/e-) to be switched between two values
• Can be used in a multi-frame, or spatially interleaved manner with similar characteristics, but improved noise (as there is
no short exposure throwing away photons)
• Logarithmic pixels
• Pixel has a logarithmic response, or a logarithmic response region, that greatly extends dynamic range compared to a
linear mode pixel
• Mismatches between the pixel response curves and changes with temperature can create high fixed pattern noise and
high photo-response non-uniformity
• Overflow pixels
• Pixels that contain an overflow gate allowing charge (electrons) to flow to a secondary overflow node
• The additional per-pixel storage node and control line mean the technology only works for larger pixels (~6µm)
Optics for Image Sensors
Matching Optics with a Sensor: Overview
• Optical format
• Chief ray angle
• Focal length
• Magnification
• Field of view
• F-number
• Depth of field
• Wavelength range
• Modulation transfer function
• Relative illumination
• Stray light
Optical Format
• Optical format is probably not what you think!
• A 1/2.3-inch format sensor does not have a diagonal of 1/2.3”
(11.04mm). It’s actually 1/3.2” (7.83mm)
• Convention is legacy of imaging tube technology, where the stated
format was the mechanical diameter of the tube
• Optical format ≈ 3/2 × sensor diagonal
• From film cameras, “35mm format” is 36mm x 24mm (43mm diag.)
• 35mm is the pitch between the feed sprocket holes
• The lens and sensor optical formats must be matched
• Smaller sensor optical format is okay, but this will crop the field of view,
and may create a CRA mismatch issue (more on this later)
By Sphl - Own work, CC BY-SA 3.0,
https://commons.wikimedia.org/w/index.php?curid=809979
2/3-inch format Vidicon tube
[Diagram: lens image circles vs sensor areas by optical format — 1-inch ≈ 16mm, 2/3-inch ≈ 11mm, 1/1.8-inch ≈ 9mm, 1/2.3-inch ≈ 7.8mm diagonals, with 35mm format for scale]
Chief Ray Angle (CRA)
• The angle of light rays that pass through the center of the exit pupil of the lens relative to the optical axis
• Often quoted as a number (for the maximum CRA), but it’s actually a curve
• The microlenses on a sensor are (typically) shifted to compensate for the anticipated change in CRA across the array
• Mismatched lens and sensor CRA can lead to luma and chroma shading
• Match for 1µm pitch pixels should be within ~2°
• Match for 2µm pitch pixels should be within ~5°
[Diagrams: chief ray angle at the sensor relative to the marginal rays through the lens aperture; plot of CRA (°) vs normalized image height, rising from 0° toward ~8°]
Focal Length & Magnification
• Focal length is a measure of how strongly the
system converges (or diverges) light
• Thin lens approximation: 1/o + 1/i = 1/f
• Magnification: M = −i/o = h′/h
[Diagram: thin-lens geometry — object distance o, image distance i, focal length f, object height h, image height h′]
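The thin-lens and magnification relations can be combined into a small solver (a sketch; all lengths in mm, example values arbitrary):

```python
def image_distance(o_mm: float, f_mm: float) -> float:
    """Solve 1/o + 1/i = 1/f for the image distance i."""
    return 1.0 / (1.0 / f_mm - 1.0 / o_mm)

def magnification(o_mm: float, f_mm: float) -> float:
    """M = -i/o; the sign is negative because the image is inverted."""
    return -image_distance(o_mm, f_mm) / o_mm

# A 50 mm lens focused on an object 200 mm away:
print(image_distance(200, 50))  # image forms ~66.7 mm behind the lens
print(magnification(200, 50))   # M = -1/3: inverted, one-third scale
```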
Field of View (FOV)
• FOV is determined by focal length and distortion
• For a rectilinear lens (one where straight features form straight lines on the sensor)
the field of view, ⍺, is:
α = 2 arctan(d / 2f), where d = sensor size and f = focal length
• This is true so long as the object distance >> the lens focal length
• For lenses with non-rectilinear distortion (like GoPro lenses) the calculation gets more
complex
• FOV is quoted as the diagonal FOV (DFOV), horizontal FOV (HFOV), or vertical FOV
(VFOV)
• Sometimes without defining which is being used
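A minimal sketch of the rectilinear FOV formula (the 7.8 mm diagonal is the 1/2.3-inch figure quoted earlier; the 3 mm focal length is an arbitrary example):

```python
import math

def rectilinear_fov_deg(sensor_dim_mm: float, focal_length_mm: float) -> float:
    """FOV = 2*arctan(d / 2f); assumes object distance >> focal length."""
    return math.degrees(2 * math.atan(sensor_dim_mm / (2 * focal_length_mm)))

print(rectilinear_fov_deg(7.8, 3))  # ~105 deg diagonal FOV
print(rectilinear_fov_deg(7.8, 6))  # doubling f narrows the FOV to ~66 deg
```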
By Dicklyon at English Wikipedia - Transferred from en.wikipedia to Commons, Public Domain, https://commons.wikimedia.org/w/index.php?curid=10783200
F-Number
• Ratio of focal length to the diameter of the entrance pupil
• Entrance pupil is the image of the physical aperture, as “seen” from the front (the object side) of a lens
N = f / D
• The smaller N, the more light reaches the sensor
• A lens with a smaller f-number is often referred to as a “faster” lens, because it permits a faster shutter
speed
• “Stop” scale for f-number, with each stop halving the amount of light passing through the optical system:
N: 1.0, 1.4, 2, 2.8, 4, 5.6, 8, 11, 16, 22, 32
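The stop scale is successive powers of √2 (each step halves the light, since light gathered scales as 1/N²). A quick sketch; note the conventional markings round 5.66 to 5.6, 11.3 to 11, and 22.6 to 22:

```python
import math

stops = [math.sqrt(2) ** i for i in range(11)]
print([round(n, 1) for n in stops])
# [1.0, 1.4, 2.0, 2.8, 4.0, 5.7, 8.0, 11.3, 16.0, 22.6, 32.0]

# Each full stop passes half the light of the previous one:
light = [1 / n**2 for n in stops]
```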
F-Number & Resolution Limits
• Diffraction of light by the aperture creates a lower “diffraction limit” to the resolution of the camera
• This is a consequence of the wave nature of light
• A circular aperture creates an “Airy disc” pattern of light on the image sensor
• The smaller the aperture, the larger the Airy disc
• For small pixels, or lenses with high f-numbers, real world performance can be limited by the
diffraction limit
• Diameter of Airy disc to first null: d = 2.44λN, where λ = wavelength and N = f-number
• Table on right shows minimum resolvable pixel size vs f-number
• This is for edge detection. For imaging points (like stars) the limits are higher
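A quick check of the Airy disc formula (wavelength in nm, N dimensionless, result in µm; function name illustrative):

```python
def airy_disc_diameter_um(wavelength_nm: float, f_number: float) -> float:
    """Airy disc diameter to first null: d = 2.44 * lambda * N."""
    return 2.44 * (wavelength_nm / 1000.0) * f_number

# Green light at f/2: the diffraction blur is ~2.7 um across,
# so sub-micron pixels gain little resolution at this aperture.
print(airy_disc_diameter_um(550, 2.0))
```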
Simulated Airy Disc for white light
By SiriusB - Own work, CC0,
https://commons.wikimedia.org/w/index.php?curid=68302545
F-Number Blue (430nm) Green (550nm) Red (620nm)
1.0 0.44 µm 0.54 µm 0.61 µm
1.2 0.53 µm 0.64 µm 0.73 µm
1.4 0.61 µm 0.75 µm 0.85 µm
1.6 0.70 µm 0.86 µm 0.97 µm
1.8 0.79 µm 0.97 µm 1.09 µm
2 0.88 µm 1.07 µm 1.21 µm
2.2 0.97 µm 1.18 µm 1.33 µm
2.4 1.05 µm 1.29 µm 1.45 µm
2.6 1.14 µm 1.40 µm 1.57 µm
2.8 1.23 µm 1.50 µm 1.69 µm
3 1.32 µm 1.61 µm 1.82 µm
3.2 1.41 µm 1.72 µm 1.94 µm
3.4 1.49 µm 1.83 µm 2.06 µm
3.6 1.58 µm 1.93 µm 2.18 µm
3.8 1.67 µm 2.04 µm 2.30 µm
4 1.76 µm 2.15 µm 2.42 µm
Depth of Field (DoF)
• Depth of field is the distance between the nearest and farthest objects that are acceptably in focus
DoF ≈ 2o²Nc / f²
• Where o = object distance, N = f-number, c = circle of confusion (acceptable blur circle), f = focal length
• DoF increases with:
• Focus distance (squared)
• Circle of confusion
• F-number
• At the cost of light reaching the sensor, and the diffraction limit
• DoF decreases with:
• Focal length (squared)
• A shorter focal length lens has a larger depth of field. So either a wider FOV, or a smaller sensor will increase the DoF
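The DoF approximation as code (all lengths in mm; the machine-vision scenario is hypothetical):

```python
def depth_of_field_mm(o_mm: float, n: float, c_mm: float, f_mm: float) -> float:
    """DoF ~ 2*o^2*N*c / f^2. Approximation only; it breaks down near
    the hyperfocal distance, where the far limit goes to infinity."""
    return 2 * o_mm**2 * n * c_mm / f_mm**2

# Object at 2 m, f/2.8, 5 um circle of confusion, 6 mm focal length:
print(depth_of_field_mm(2000, 2.8, 0.005, 6))  # ~3111 mm of acceptable focus
```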
Wavelength Range
• A lens is designed for a specific wavelength range
• The wider the wavelength range the more difficult the design becomes
• Leading to higher cost/performance ratio
• A lens designed for visible light will have poor performance in the NIR, and vice versa
• So a lens should be selected to match the application
• A camera for NIR only, should use a NIR lens
• If using visible + NIR, a lens designed for the full wavelength range should be selected
Modulation Transfer Function (MTF)
• MTF is a measurement of the sharpness and contrast of a
lens over a range of spatial frequencies (starting at 0)
• MTF plots allow a lens to be evaluated for a particular
application, and lenses to be compared
• An MTF score of 30% means that an edge will be well resolved
• Horizontal axis is line pairs per mm. Higher lp/mm -> smaller
features
• The highest spatial frequency that can be sampled by the
image sensor is:
𝑙𝑝
𝑚𝑚
=
1000
µ𝑚
𝑚𝑚
2 𝑥 𝑃𝑖𝑥𝑒𝑙 𝑆𝑖𝑧𝑒 (µ𝑚)
[Plot: MTF vs lp/mm, with the resolving limit for 3.45µm pixels marked]
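The sampling-limit formula as a one-liner, reproducing the 3.45 µm pixel case called out on the slide:

```python
def nyquist_lp_per_mm(pixel_size_um: float) -> float:
    """Highest spatial frequency the sensor can sample: 1000 / (2 * pixel)."""
    return 1000.0 / (2.0 * pixel_size_um)

print(nyquist_lp_per_mm(3.45))  # ~145 lp/mm resolving limit
print(nyquist_lp_per_mm(1.0))   # 500 lp/mm for 1 um pixels
```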
MTF (Continued)
• This type of MTF plot typically contains lines for various image heights
• For example 0% IH, 60%IH, 90% IH
• This allows MTF in the center, middle, and corners of the image to be
assessed
• Plots typically show different lines for sagittal and tangential directions
• Alternative plots show MTF vs image height for specific spatial frequencies
• Warning on MTF:
• Unless stated, quoted MTF is for an “as designed” lens, without
manufacturing tolerances
• Actual “as built” MTF will always be worse than this
• This can be simulated using Monte Carlo simulations, and measured
on optical test equipment
[MTF plot: sagittal and tangential lines at 0%, 60%, 80%, and 90% image height]
Relative Illumination (RI)
• Relative illumination (RI) is a measure of the roll-off in light intensity from
the center to the edge of the lens image circle
• Often stated as a single number, being the RI at 100% IH
• The plot on the right is for a lens with an 88% RI
• This is very good; a typical mobile phone lens, by contrast, will have an RI <30%
• This shading effect means corners will be noisier (lower SNR) than the
center of the image
• The sensor will also have some roll-off in sensitivity with CRA, so camera
RI is the product of the lens RI and the sensor roll off
• RI can be corrected by applying a spatially-varying normalization function,
but this amplifies noise in the corners
Stray Light & Ghosts
• Stray light is light scattered within the optical system that reduces contrast
• Ghosts are spatially localized reflections that can obscure details or confuse machine vision
• Sources include:
• Imperfect anti-reflective coatings
• Typical coatings on glass lens elements reflect <0.3% of normal incident light
• Typical coating on plastic lens elements reflect <0.4% of normal incident light
• Edges of lens elements (sometimes inking helps)
• Mechanical surfaces (can be blackened or textured)
• Diffraction off the image sensor
• Dirt and oils on optical surfaces
• Can be simulated using the full lens design and specialist tools (Fred, LightTools, ASAP)
• Can be measured in the lab using a point light source (fiber bundle) and a robotic stage to
rotate the camera
Faux stray light rendered in Adobe Photoshop.
Subjectively good for art, bad for object
recognition
Vignetting
• Vignetting is unintended blocking of light rays towards the edge of the image
• Check all mechanical features for possible blocking
• Filter/window holders
• Main product housing
• Have an optical engineer generate a mechanical FOV for importing into CAD
• Include mechanical tolerances in determining clear window sizing
• Lens to sensor centration
• Lens to sensor tip/tilt
• Camera module to housing alignment tolerances (x, y, z, tip/tilt)
Other Design Factors
Thermal Considerations
• Dark current doubles about every 6°C
• Dark signal non-uniformity increases with temperature
• Dark signal shot noise (= SQRT number of dark signal electrons) can become a factor at high temperatures
• Hot pixels increase with temperature (defect thermal activation)
• Black level corrections get worse with temperature
• Thermal design of a camera should be considered up front
• Design a low thermal resistance path from the sensor die to ambient
[Diagrams: thermal stack-ups — sensor package → PCB with thermal vias → thermal pad → heatsink/chassis; and sensor package → PCB with hole → thermal pad → heatsink]
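The 6°C doubling rule compounds quickly; a quick sketch (the 6°C figure is from the slide, the example temperature rise is hypothetical):

```python
def dark_current_scale(delta_t_c: float, doubling_c: float = 6.0) -> float:
    """Relative dark current after a temperature rise of delta_t_c degrees C,
    assuming it doubles every `doubling_c` degrees."""
    return 2.0 ** (delta_t_c / doubling_c)

# A 30 C rise in die temperature means 2^5 = 32x the dark current:
print(dark_current_scale(30))  # 32.0
```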
Electrical Factors (Keep it Quiet)
• Image sensors are mixed signal devices
• Care must be taken to ensure low power supply noise, particularly on analog supply rails
• Dedicated supply (LDO) for sensor VANA is a good idea
• Careful placement and low-impedance routing of decoupling capacitors is essential
• Ask sensor supplier for a module design reference guide
• Prioritize placement of analog rail decoupling capacitors (place as close to the sensor as possible) with
minimal vias
• Analyze the noise
• Capture dark raw images (no image processing) and extract noise components
• Analog supply noise is particularly prone to creating temporal row noise
• Compare with sensor noise on the manufacturer’s evaluation system, or other reference design
Selecting a Sensor & Lens
Selecting a Sensor - Determine Key Requirements
1. Determine resolution requirements:
• Object space resolution
• Field of view
• Max. object distance
• Min. linear pixels for feature recognition
Pixel Size (µm) = Object Space Resolution (µm) × Sensor Size (mm) / FOV (mm)
2. Wavelength of light to be imaged: visible or NIR-enhanced sensor
3. Color (full RGB, mono + R, etc.), or monochrome
4. Dynamic range requirements
5. High-speed motion (camera or scene): rolling/global shutter
6. Interface requirements for SOC
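A sanity check of the pixel-size formula in step 1 (assuming magnification = sensor size / FOV along the same axis; the example numbers are hypothetical):

```python
def max_pixel_size_um(object_res_um: float, sensor_size_mm: float,
                      fov_mm: float) -> float:
    """Largest pixel that still places one pixel on each object-space
    resolution element: pixel = object resolution * (sensor / FOV)."""
    return object_res_um * sensor_size_mm / fov_mm

# Resolve 500 um features across a 1 m wide scene on a 5.6 mm wide sensor:
print(max_pixel_size_um(500, 5.6, 1000))  # ~2.8 um
```

If the application needs several pixels per feature (e.g. 2 for Nyquist sampling), divide the result by that count.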
Selecting a Sensor – Evaluate using Sensor Evaluation Kit
• Use requirements from previous slide to select a
sensor evaluation kit (EVK)
• Look for sensors intended for similar
applications
• Many sensor companies have online
product selection tools, or can support at
the sales/application engineering level
• If possible, pair the EVK with a lens that’s a close fit for the final application (see following section)
• Test EVK in your application including with
minimum illumination conditions
• If lens FOV does not match, adjust object
distance, or chart size to compensate
• If low light SNR is acceptable proceed to lens selection
• If low light SNR is not acceptable:
• Can required SNR be achieved by adjusting the lens
aperture?
• Trade-off = depth of field, lens size/cost
• Can required SNR be achieved by increasing exposure
time?
• Trade-off = motion blur
• If SNR requirements cannot be achieved, select a larger
sensor (more light collection) and repeat process
• Trade-off = depth of field, sensor cost, lens size/cost
• Evaluate dynamic range
• This is one of the more difficult tests to perform correctly,
but online resources exist like Imatest test software
tutorials
Selecting a Lens
1. Determine optical format (from sensor size)
2. Select field of view → focal length
3. Determine target f-number
4. Check depth of field
5. MTF
• From resolution study, determine line pairs/mm resolution needed
• An MTF ≥ 30% at the target lp/mm is a good rule of thumb for the detection of edges
• Check MTF across field: center, mid, corner regions
6. Pick a lens with a CRA that closely matches the sensor CRA specification
7. Check the impact of relative illumination on performance in corners (SNR decrease)
8. Determine if the lens includes a filter (IR-cut, IR passband, etc.), if one is needed
9. Check if the lens is suitable for the dynamic range requirements (HDR requires very good anti-reflective coatings)
Resources
Image Sensor Manufacturers
ON Semiconductor: ON Semi Image Sensors
Sony: Sony image sensors
Omnivision: https://www.ovt.com/
Samsung: Samsung image sensors
Fairchild Imaging (BAE Systems):
https://www.fairchildimaging.com/
ams: https://ams.com/cmos-imaging-sensors
Interfaces, Optics, and Imaging
MIPI Alliance: https://www.mipi.org/
Edmund Optics Knowledge Center:
• Optics: https://www.edmundoptics.com/knowledge-center/#!&CategoryId=114
• Imaging: https://www.edmundoptics.com/knowledge-center/#!&CategoryId=175
Cambridge in Colour imaging tutorials:
https://www.cambridgeincolour.com/tutorials.htm
Imaging Test Software and Charts
Imatest: https://www.imatest.com/
HDR and WCG Video Broadcasting ConsiderationsHDR and WCG Video Broadcasting Considerations
HDR and WCG Video Broadcasting Considerations
 
“Next-generation Computer Vision Methods for Automated Navigation of Unmanned...
“Next-generation Computer Vision Methods for Automated Navigation of Unmanned...“Next-generation Computer Vision Methods for Automated Navigation of Unmanned...
“Next-generation Computer Vision Methods for Automated Navigation of Unmanned...
 
Sony α7 III (ILCE-7M3) Full Frame Mirror-less Camera
Sony α7 III (ILCE-7M3) Full Frame Mirror-less CameraSony α7 III (ILCE-7M3) Full Frame Mirror-less Camera
Sony α7 III (ILCE-7M3) Full Frame Mirror-less Camera
 
Step Into Security Webinar - IP Security Camera Techniques for Video Surveill...
Step Into Security Webinar - IP Security Camera Techniques for Video Surveill...Step Into Security Webinar - IP Security Camera Techniques for Video Surveill...
Step Into Security Webinar - IP Security Camera Techniques for Video Surveill...
 
“Building the Eyes of a Vision System: From Photons to Bits,” a Presentation ...
“Building the Eyes of a Vision System: From Photons to Bits,” a Presentation ...“Building the Eyes of a Vision System: From Photons to Bits,” a Presentation ...
“Building the Eyes of a Vision System: From Photons to Bits,” a Presentation ...
 
Image Sensing & Acquisition
Image Sensing & AcquisitionImage Sensing & Acquisition
Image Sensing & Acquisition
 
Creating next-gen VR and MR experiences using Varjo VR-1 and XR-1 - Unite Cop...
Creating next-gen VR and MR experiences using Varjo VR-1 and XR-1 - Unite Cop...Creating next-gen VR and MR experiences using Varjo VR-1 and XR-1 - Unite Cop...
Creating next-gen VR and MR experiences using Varjo VR-1 and XR-1 - Unite Cop...
 
Computed radiography &digital radiography
Computed radiography &digital radiographyComputed radiography &digital radiography
Computed radiography &digital radiography
 
20200509 sid china digital optics and digital modulation_v5.0
20200509 sid china digital optics and digital modulation_v5.020200509 sid china digital optics and digital modulation_v5.0
20200509 sid china digital optics and digital modulation_v5.0
 
Fujifilm - e2v L3 Vision
Fujifilm - e2v L3 VisionFujifilm - e2v L3 Vision
Fujifilm - e2v L3 Vision
 
Dr,system abhishek
Dr,system abhishekDr,system abhishek
Dr,system abhishek
 
Alex_Vlachos_Advanced_VR_Rendering_GDC2015
Alex_Vlachos_Advanced_VR_Rendering_GDC2015Alex_Vlachos_Advanced_VR_Rendering_GDC2015
Alex_Vlachos_Advanced_VR_Rendering_GDC2015
 
Camera , Visual , Imaging Technology : A Walk-through
Camera , Visual ,  Imaging Technology : A Walk-through Camera , Visual ,  Imaging Technology : A Walk-through
Camera , Visual , Imaging Technology : A Walk-through
 
From STC (Stereo Camera onboard on Bepi Colombo ESA Mission) to Blender
From STC (Stereo Camera onboard on Bepi Colombo ESA Mission) to BlenderFrom STC (Stereo Camera onboard on Bepi Colombo ESA Mission) to Blender
From STC (Stereo Camera onboard on Bepi Colombo ESA Mission) to Blender
 
Modern broadcast camera techniques, set up & operation
Modern broadcast camera techniques, set up & operationModern broadcast camera techniques, set up & operation
Modern broadcast camera techniques, set up & operation
 
Brian Elliott's "Camera Basics" Lecture
Brian Elliott's "Camera Basics" LectureBrian Elliott's "Camera Basics" Lecture
Brian Elliott's "Camera Basics" Lecture
 
Canon eos c300 cinema camera body ef mount
Canon  eos c300 cinema camera body ef mountCanon  eos c300 cinema camera body ef mount
Canon eos c300 cinema camera body ef mount
 

Mehr von Edge AI and Vision Alliance

“Learning Compact DNN Models for Embedded Vision,” a Presentation from the Un...
“Learning Compact DNN Models for Embedded Vision,” a Presentation from the Un...“Learning Compact DNN Models for Embedded Vision,” a Presentation from the Un...
“Learning Compact DNN Models for Embedded Vision,” a Presentation from the Un...Edge AI and Vision Alliance
 
“Introduction to Computer Vision with CNNs,” a Presentation from Mohammad Hag...
“Introduction to Computer Vision with CNNs,” a Presentation from Mohammad Hag...“Introduction to Computer Vision with CNNs,” a Presentation from Mohammad Hag...
“Introduction to Computer Vision with CNNs,” a Presentation from Mohammad Hag...Edge AI and Vision Alliance
 
“Selecting Tools for Developing, Monitoring and Maintaining ML Models,” a Pre...
“Selecting Tools for Developing, Monitoring and Maintaining ML Models,” a Pre...“Selecting Tools for Developing, Monitoring and Maintaining ML Models,” a Pre...
“Selecting Tools for Developing, Monitoring and Maintaining ML Models,” a Pre...Edge AI and Vision Alliance
 
“Building Accelerated GStreamer Applications for Video and Audio AI,” a Prese...
“Building Accelerated GStreamer Applications for Video and Audio AI,” a Prese...“Building Accelerated GStreamer Applications for Video and Audio AI,” a Prese...
“Building Accelerated GStreamer Applications for Video and Audio AI,” a Prese...Edge AI and Vision Alliance
 
“Understanding, Selecting and Optimizing Object Detectors for Edge Applicatio...
“Understanding, Selecting and Optimizing Object Detectors for Edge Applicatio...“Understanding, Selecting and Optimizing Object Detectors for Edge Applicatio...
“Understanding, Selecting and Optimizing Object Detectors for Edge Applicatio...Edge AI and Vision Alliance
 
“Introduction to Modern LiDAR for Machine Perception,” a Presentation from th...
“Introduction to Modern LiDAR for Machine Perception,” a Presentation from th...“Introduction to Modern LiDAR for Machine Perception,” a Presentation from th...
“Introduction to Modern LiDAR for Machine Perception,” a Presentation from th...Edge AI and Vision Alliance
 
“Vision-language Representations for Robotics,” a Presentation from the Unive...
“Vision-language Representations for Robotics,” a Presentation from the Unive...“Vision-language Representations for Robotics,” a Presentation from the Unive...
“Vision-language Representations for Robotics,” a Presentation from the Unive...Edge AI and Vision Alliance
 
“ADAS and AV Sensors: What’s Winning and Why?,” a Presentation from TechInsights
“ADAS and AV Sensors: What’s Winning and Why?,” a Presentation from TechInsights“ADAS and AV Sensors: What’s Winning and Why?,” a Presentation from TechInsights
“ADAS and AV Sensors: What’s Winning and Why?,” a Presentation from TechInsightsEdge AI and Vision Alliance
 
“Computer Vision in Sports: Scalable Solutions for Downmarkets,” a Presentati...
“Computer Vision in Sports: Scalable Solutions for Downmarkets,” a Presentati...“Computer Vision in Sports: Scalable Solutions for Downmarkets,” a Presentati...
“Computer Vision in Sports: Scalable Solutions for Downmarkets,” a Presentati...Edge AI and Vision Alliance
 
“Detecting Data Drift in Image Classification Neural Networks,” a Presentatio...
“Detecting Data Drift in Image Classification Neural Networks,” a Presentatio...“Detecting Data Drift in Image Classification Neural Networks,” a Presentatio...
“Detecting Data Drift in Image Classification Neural Networks,” a Presentatio...Edge AI and Vision Alliance
 
“Deep Neural Network Training: Diagnosing Problems and Implementing Solutions...
“Deep Neural Network Training: Diagnosing Problems and Implementing Solutions...“Deep Neural Network Training: Diagnosing Problems and Implementing Solutions...
“Deep Neural Network Training: Diagnosing Problems and Implementing Solutions...Edge AI and Vision Alliance
 
“AI Start-ups: The Perils of Fishing for Whales (War Stories from the Entrepr...
“AI Start-ups: The Perils of Fishing for Whales (War Stories from the Entrepr...“AI Start-ups: The Perils of Fishing for Whales (War Stories from the Entrepr...
“AI Start-ups: The Perils of Fishing for Whales (War Stories from the Entrepr...Edge AI and Vision Alliance
 
“A Computer Vision System for Autonomous Satellite Maneuvering,” a Presentati...
“A Computer Vision System for Autonomous Satellite Maneuvering,” a Presentati...“A Computer Vision System for Autonomous Satellite Maneuvering,” a Presentati...
“A Computer Vision System for Autonomous Satellite Maneuvering,” a Presentati...Edge AI and Vision Alliance
 
“Bias in Computer Vision—It’s Bigger Than Facial Recognition!,” a Presentatio...
“Bias in Computer Vision—It’s Bigger Than Facial Recognition!,” a Presentatio...“Bias in Computer Vision—It’s Bigger Than Facial Recognition!,” a Presentatio...
“Bias in Computer Vision—It’s Bigger Than Facial Recognition!,” a Presentatio...Edge AI and Vision Alliance
 
“Sensor Fusion Techniques for Accurate Perception of Objects in the Environme...
“Sensor Fusion Techniques for Accurate Perception of Objects in the Environme...“Sensor Fusion Techniques for Accurate Perception of Objects in the Environme...
“Sensor Fusion Techniques for Accurate Perception of Objects in the Environme...Edge AI and Vision Alliance
 
“Updating the Edge ML Development Process,” a Presentation from Samsara
“Updating the Edge ML Development Process,” a Presentation from Samsara“Updating the Edge ML Development Process,” a Presentation from Samsara
“Updating the Edge ML Development Process,” a Presentation from SamsaraEdge AI and Vision Alliance
 
“Combating Bias in Production Computer Vision Systems,” a Presentation from R...
“Combating Bias in Production Computer Vision Systems,” a Presentation from R...“Combating Bias in Production Computer Vision Systems,” a Presentation from R...
“Combating Bias in Production Computer Vision Systems,” a Presentation from R...Edge AI and Vision Alliance
 
“Developing an Embedded Vision AI-powered Fitness System,” a Presentation fro...
“Developing an Embedded Vision AI-powered Fitness System,” a Presentation fro...“Developing an Embedded Vision AI-powered Fitness System,” a Presentation fro...
“Developing an Embedded Vision AI-powered Fitness System,” a Presentation fro...Edge AI and Vision Alliance
 
“Navigating the Evolving Venture Capital Landscape for Edge AI Start-ups,” a ...
“Navigating the Evolving Venture Capital Landscape for Edge AI Start-ups,” a ...“Navigating the Evolving Venture Capital Landscape for Edge AI Start-ups,” a ...
“Navigating the Evolving Venture Capital Landscape for Edge AI Start-ups,” a ...Edge AI and Vision Alliance
 
“Advanced Presence Sensing: What It Means for the Smart Home,” a Presentation...
“Advanced Presence Sensing: What It Means for the Smart Home,” a Presentation...“Advanced Presence Sensing: What It Means for the Smart Home,” a Presentation...
“Advanced Presence Sensing: What It Means for the Smart Home,” a Presentation...Edge AI and Vision Alliance
 

Mehr von Edge AI and Vision Alliance (20)

“Learning Compact DNN Models for Embedded Vision,” a Presentation from the Un...
“Learning Compact DNN Models for Embedded Vision,” a Presentation from the Un...“Learning Compact DNN Models for Embedded Vision,” a Presentation from the Un...
“Learning Compact DNN Models for Embedded Vision,” a Presentation from the Un...
 
“Introduction to Computer Vision with CNNs,” a Presentation from Mohammad Hag...
“Introduction to Computer Vision with CNNs,” a Presentation from Mohammad Hag...“Introduction to Computer Vision with CNNs,” a Presentation from Mohammad Hag...
“Introduction to Computer Vision with CNNs,” a Presentation from Mohammad Hag...
 
“Selecting Tools for Developing, Monitoring and Maintaining ML Models,” a Pre...
“Selecting Tools for Developing, Monitoring and Maintaining ML Models,” a Pre...“Selecting Tools for Developing, Monitoring and Maintaining ML Models,” a Pre...
“Selecting Tools for Developing, Monitoring and Maintaining ML Models,” a Pre...
 
“Building Accelerated GStreamer Applications for Video and Audio AI,” a Prese...
“Building Accelerated GStreamer Applications for Video and Audio AI,” a Prese...“Building Accelerated GStreamer Applications for Video and Audio AI,” a Prese...
“Building Accelerated GStreamer Applications for Video and Audio AI,” a Prese...
 
“Understanding, Selecting and Optimizing Object Detectors for Edge Applicatio...
“Understanding, Selecting and Optimizing Object Detectors for Edge Applicatio...“Understanding, Selecting and Optimizing Object Detectors for Edge Applicatio...
“Understanding, Selecting and Optimizing Object Detectors for Edge Applicatio...
 
“Introduction to Modern LiDAR for Machine Perception,” a Presentation from th...
“Introduction to Modern LiDAR for Machine Perception,” a Presentation from th...“Introduction to Modern LiDAR for Machine Perception,” a Presentation from th...
“Introduction to Modern LiDAR for Machine Perception,” a Presentation from th...
 
“Vision-language Representations for Robotics,” a Presentation from the Unive...
“Vision-language Representations for Robotics,” a Presentation from the Unive...“Vision-language Representations for Robotics,” a Presentation from the Unive...
“Vision-language Representations for Robotics,” a Presentation from the Unive...
 
“ADAS and AV Sensors: What’s Winning and Why?,” a Presentation from TechInsights
“ADAS and AV Sensors: What’s Winning and Why?,” a Presentation from TechInsights“ADAS and AV Sensors: What’s Winning and Why?,” a Presentation from TechInsights
“ADAS and AV Sensors: What’s Winning and Why?,” a Presentation from TechInsights
 
“Computer Vision in Sports: Scalable Solutions for Downmarkets,” a Presentati...
“Computer Vision in Sports: Scalable Solutions for Downmarkets,” a Presentati...“Computer Vision in Sports: Scalable Solutions for Downmarkets,” a Presentati...
“Computer Vision in Sports: Scalable Solutions for Downmarkets,” a Presentati...
 
“Detecting Data Drift in Image Classification Neural Networks,” a Presentatio...
“Detecting Data Drift in Image Classification Neural Networks,” a Presentatio...“Detecting Data Drift in Image Classification Neural Networks,” a Presentatio...
“Detecting Data Drift in Image Classification Neural Networks,” a Presentatio...
 
“Deep Neural Network Training: Diagnosing Problems and Implementing Solutions...
“Deep Neural Network Training: Diagnosing Problems and Implementing Solutions...“Deep Neural Network Training: Diagnosing Problems and Implementing Solutions...
“Deep Neural Network Training: Diagnosing Problems and Implementing Solutions...
 
“AI Start-ups: The Perils of Fishing for Whales (War Stories from the Entrepr...
“AI Start-ups: The Perils of Fishing for Whales (War Stories from the Entrepr...“AI Start-ups: The Perils of Fishing for Whales (War Stories from the Entrepr...
“AI Start-ups: The Perils of Fishing for Whales (War Stories from the Entrepr...
 
“A Computer Vision System for Autonomous Satellite Maneuvering,” a Presentati...
“A Computer Vision System for Autonomous Satellite Maneuvering,” a Presentati...“A Computer Vision System for Autonomous Satellite Maneuvering,” a Presentati...
“A Computer Vision System for Autonomous Satellite Maneuvering,” a Presentati...
 
“Bias in Computer Vision—It’s Bigger Than Facial Recognition!,” a Presentatio...
“Bias in Computer Vision—It’s Bigger Than Facial Recognition!,” a Presentatio...“Bias in Computer Vision—It’s Bigger Than Facial Recognition!,” a Presentatio...
“Bias in Computer Vision—It’s Bigger Than Facial Recognition!,” a Presentatio...
 
“Sensor Fusion Techniques for Accurate Perception of Objects in the Environme...
“Sensor Fusion Techniques for Accurate Perception of Objects in the Environme...“Sensor Fusion Techniques for Accurate Perception of Objects in the Environme...
“Sensor Fusion Techniques for Accurate Perception of Objects in the Environme...
 
“Updating the Edge ML Development Process,” a Presentation from Samsara
“Updating the Edge ML Development Process,” a Presentation from Samsara“Updating the Edge ML Development Process,” a Presentation from Samsara
“Updating the Edge ML Development Process,” a Presentation from Samsara
 
“Combating Bias in Production Computer Vision Systems,” a Presentation from R...
“Combating Bias in Production Computer Vision Systems,” a Presentation from R...“Combating Bias in Production Computer Vision Systems,” a Presentation from R...
“Combating Bias in Production Computer Vision Systems,” a Presentation from R...
 
“Developing an Embedded Vision AI-powered Fitness System,” a Presentation fro...
“Developing an Embedded Vision AI-powered Fitness System,” a Presentation fro...“Developing an Embedded Vision AI-powered Fitness System,” a Presentation fro...
“Developing an Embedded Vision AI-powered Fitness System,” a Presentation fro...
 
“Navigating the Evolving Venture Capital Landscape for Edge AI Start-ups,” a ...
“Navigating the Evolving Venture Capital Landscape for Edge AI Start-ups,” a ...“Navigating the Evolving Venture Capital Landscape for Edge AI Start-ups,” a ...
“Navigating the Evolving Venture Capital Landscape for Edge AI Start-ups,” a ...
 
“Advanced Presence Sensing: What It Means for the Smart Home,” a Presentation...
“Advanced Presence Sensing: What It Means for the Smart Home,” a Presentation...“Advanced Presence Sensing: What It Means for the Smart Home,” a Presentation...
“Advanced Presence Sensing: What It Means for the Smart Home,” a Presentation...
 

Kürzlich hochgeladen

Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 3652toLead Limited
 
Boost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityBoost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityPrincipled Technologies
 
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptxHampshireHUG
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreternaman860154
 
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...HostedbyConfluent
 
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersEnhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersThousandEyes
 
Histor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slideHistor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slidevu2urc
 
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024BookNet Canada
 
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | DelhiFULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhisoniya singh
 
Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Allon Mureinik
 
IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsEnterprise Knowledge
 
Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...
Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...
Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...gurkirankumar98700
 
Handwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsHandwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsMaria Levchenko
 
The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxMalak Abu Hammad
 
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Miguel Araújo
 
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...Neo4j
 
A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024Results
 
Breaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountBreaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountPuma Security, LLC
 
SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024Scott Keck-Warren
 
Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024The Digital Insurer
 

Kürzlich hochgeladen (20)

Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
 
Boost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityBoost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivity
 
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreter
 
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
 
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersEnhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
 
Histor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slideHistor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slide
 
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
 
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | DelhiFULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
 
Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)
 
IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI Solutions
 
Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...
Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...
Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...
 
Handwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsHandwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed texts
 
The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptx
 
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
 
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
 
A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024
 
Breaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountBreaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path Mount
 
SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024
 
Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024
 

“CMOS Image Sensors: A Guide to Building the Eyes of a Vision System,” a Presentation from GoPro

  • 1. © 2020 GoPro, Inc. CMOS Image Sensors: A Guide to Building the Eyes of a Vision System Jon Stern, Ph.D. Director, Optical Systems Sept 2020
  • 2. © 2020 GoPro, Inc. Light Basics 2
  • 3. © 2020 GoPro, Inc. What is Light? • Light is represented as both a particle and an electromagnetic wave • When discussing sensors, we tend to switch rather casually between the two • A photon = a light particle • Its energy determines its wavelength (higher energy = shorter wavelength) • Energy (J) = Planck’s constant (J·s) × speed of light (m·s⁻¹) / wavelength (m) • When visible (to human beings), we perceive wavelength as color • Intensity of light = number of photons 3 380nm 740nm Ultraviolet ← → Infrared
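The energy formula on this slide can be sketched numerically. A minimal illustration (not from the presentation) using the SI values of Planck's constant and the speed of light:

```python
# Photon energy from wavelength: E = h * c / wavelength.
# Illustrative sketch; constants are CODATA values in SI units.
PLANCK = 6.62607015e-34   # Planck's constant, J*s
C_LIGHT = 2.99792458e8    # speed of light, m/s

def photon_energy_joules(wavelength_m: float) -> float:
    """Energy of a single photon at the given wavelength (meters)."""
    return PLANCK * C_LIGHT / wavelength_m

# Shorter wavelength -> higher energy, as the slide states:
blue = photon_energy_joules(450e-9)   # blue light, ~4.4e-19 J
red = photon_energy_joules(650e-9)    # red light, ~3.1e-19 J
assert blue > red
```

Note that intensity (photon count) and per-photon energy are independent quantities, which is why both figure into sensor response.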
  • 4. © 2020 GoPro, Inc. Bands of Light Visible: 380-740nm • Sensors: CCDs and CMOS image sensors (CIS) Near-Infrared (NIR): 0.74µm – 1µm • Sensors: CCDs and NIR-enhanced CIS SWIR: 1-2.5µm • Sensors: InGaAs sensors MWIR: 3-5µm • Sensors: Indium Antimonide (InSb), Mercury Cadmium Telluride (HgCdTe), III-V semiconductor superlattices • Thermal imaging LWIR: 8-14µm • Sensors: Microbolometers, HgCdTe • Thermal imaging 4 By Darth Kule - Own work, Public Domain, https://commons.wikimedia.org/w/index.php?curid=10555337 Black-body radiation as governed by Planck’s law
  • 5. © 2020 GoPro, Inc. Photoelectric Effect • Image sensors make use of the photoelectric effect • Electromagnetic radiation hits a material and dislodges electrons • The electrons can be collected and “counted” • Interesting aside: Einstein was awarded a Nobel Prize for his work on the photoelectric effect, not for his theory of relativity • The number of electrons dislodged depends on both light intensity and wavelength 5 By Ponor - Own work, CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=91353714
  • 6. © 2020 GoPro, Inc. Quantum Efficiency • Quantum Efficiency (QE) is the ratio of the number of electrons collected to the number of incident photons • QE is sensor specific, and is frequently normalized (“Relative QE”) to obscure IP • The peak QE of silicon aligns nicely with the peak response of human vision at 555nm 6 [Plot: Quantum Efficiency (%) vs. wavelength (nm), 320–1100nm: sea-level solar spectrum and typical silicon sensor QE]
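The QE definition above is a simple ratio. A minimal sketch with hypothetical measurement numbers (not taken from any real sensor datasheet):

```python
# QE = electrons collected / photons incident, expressed as a fraction.
# The numbers below are made up purely for illustration.
def quantum_efficiency(electrons_collected: float, photons_incident: float) -> float:
    """Return QE as a fraction in [0, 1] for an ideal detector."""
    if photons_incident <= 0:
        raise ValueError("photons_incident must be positive")
    return electrons_collected / photons_incident

# Hypothetical: 6,000 electrons collected from 10,000 incident photons.
qe = quantum_efficiency(electrons_collected=6000, photons_incident=10000)
print(f"QE = {qe:.0%}")  # QE = 60%
```

"Relative QE" plots normalize this curve to its peak, which preserves the spectral shape while hiding the absolute value.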
  • 7. © 2020 GoPro, Inc. CMOS Image Sensors 7
  • 8. © 2020 GoPro, Inc. CMOS Image Sensor (CIS) • Leverages Complementary Metal Oxide Semiconductor (CMOS) manufacturing processes • Each pixel contains a photodiode and a current amplifier (“active pixel sensor”) • Very high level of integration (SOC) • On-chip ADCs, timing control, voltage conversion, corrections (black level, defective pixels, etc.) 8 [Block diagram: pixel array with per-pixel amplifiers; row drivers and access; CDS + column ADCs; digital corrections; clock & timing generation; phase-locked loop; bias generation; additional digital features; bits out]
  • 9. © 2020 GoPro, Inc. Seeing in Color • Apart from some specialist sensors (like Foveon), capturing color from a single image sensor requires spatial sub-sampling • Place a color filter over each photodetector • A demosaic algorithm interpolates the missing colors to generate RGB for each location • Add an IR cut-off filter if you want to see “natural” (photopic eye) color • Novel filter patterns can be used for specialist applications, like CCCR (C = clear), RGB+IR (IR = infrared pass filter) 9 [Diagrams: microlens array over color filter array (CFA) over photodiodes; Bayer color filter pattern; quantum efficiency curves for a color sensor]
  • 10. © 2020 GoPro, Inc. Noise Sources in CMOS Image Sensors Shot noise • Noise that is intrinsic to the quantum nature of photons • Uncertainty in measurement • Shot noise = SQRT(number of photons) Read noise • RMS noise of sensor circuitry in electrons Fixed pattern noise • Pixel FPN – pixel-to-pixel dark offsets • Includes dark signal non-uniformity, which changes with integration time and temperature • Column FPN – per-column dark offsets • Visible well below the pixel temporal noise floor (e.g., visible down to ~1/20th read noise) • Row FPN – per-row dark offsets Photo Response Non-Uniformity • Pixel-to-pixel variation in responsivity (gain error) Row noise • Per-row variation in the black level of each row. Dominated by temporal noise (row FPN generally not an issue in modern CIS) 10 [Images: column fixed pattern noise; column fixed pattern noise + row noise]
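The shot-noise and read-noise terms above combine in quadrature into a per-pixel SNR. A minimal sketch, assuming an illustrative QE of 0.6 and a 2-electron read noise (the slide's "small-pixel, low-noise" figure); these defaults are assumptions, not a specific sensor's spec:

```python
import math

def pixel_snr(photons: float, qe: float = 0.6, read_noise_e: float = 2.0) -> float:
    """SNR for one pixel: signal electrons over shot noise and read
    noise added in quadrature. Shot noise = sqrt(collected electrons)."""
    signal = photons * qe
    shot = math.sqrt(signal)
    total_noise = math.sqrt(shot**2 + read_noise_e**2)
    return signal / total_noise
```

Note how SNR grows roughly with the square root of the light level once shot noise dominates, which is why larger pixels (more collected photons) win in low light.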
  • 11. © 2020 GoPro, Inc. Hot Pixels • Pixels with high leakage • Bright pixels in images captured in the dark with long exposure and/or high temperature • Need correcting (really hiding) by the image processing pipeline • Static map programmed at factory • Dynamic detection and correction • Replace with average of neighboring pixel values (in the same color plane) 11 [Histogram: pixel counts vs pixel value, showing hot-pixel outliers]
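The "average of same-color-plane neighbors" correction above can be sketched for a Bayer mosaic, where same-color neighbors sit two pixels away. A simplified illustration (real pipelines use more neighbors and defect-detection logic):

```python
def correct_hot_pixel(raw, row, col):
    """Replace a flagged hot pixel with the mean of its nearest
    same-color neighbors (two pixels away in a Bayer mosaic).
    `raw` is a 2D list of pixel values."""
    h, w = len(raw), len(raw[0])
    vals = [raw[row + dr][col + dc]
            for dr, dc in ((-2, 0), (2, 0), (0, -2), (0, 2))
            if 0 <= row + dr < h and 0 <= col + dc < w]
    return sum(vals) / len(vals)

# Usage: a flat dark frame with one hot pixel at (2, 2)
frame = [[10] * 5 for _ in range(5)]
frame[2][2] = 255
corrected = correct_hot_pixel(frame, 2, 2)  # back to the neighborhood level
```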
  • 12. © 2020 GoPro, Inc. Global vs Rolling Shutter Global shutter: all sensor rows capture light at the same time Rolling shutter: • Each row is reset one at a time, then read out n row periods later • Creates rolling shutter artifacts (moving object distortions, “jello effect” from camera vibration) 12 [Images: vibrating camera “jello effect” from rolling shutter; rolling shutter temporal aliasing with a spinning propeller. Timing diagram: 1st row, 2nd row, …, last row; integration time (~µs – ms); row time (~µs)]
  • 13. © 2020 GoPro, Inc. Front-Side Illumination vs Back-Side Illumination • Front-side illumination (FSI) sensors have the photodiode below the metal wiring • Impacts sensitivity, acceptance angle, optical cross talk (photons being captured by the wrong photodiode) • In back-side illumination (BSI) sensors the wafer is flipped and bonded to a blank wafer, or an ASIC layer, and thinned • The color filter array and microlenses are built on top of the thinned photodiode layer In FSI (left) metal impedes the optical path to the photodiode. In BSI (right), the optical stack height is minimized 13
  • 14. © 2020 GoPro, Inc. Key Sensor Specifications Impacting Image Quality • Full well capacity (really linear full-well capacity) • Small pixels: 5,000 electrons • Medium pixels: 10,000 electrons • Large pixels: 50,000 electrons • Sensitivity • Voltage swing per lux-s • Read noise • The noise floor measured in electrons • Small-pixel, low-noise sensors have read noise of ~2 electrons • Resolution • Resolution is really a measure of linear resolving power, but colloquially it has come to mean number of pixels • Higher resolution means higher spatial sampling of the optical modulation transfer function (MTF) 14
  • 15. © 2020 GoPro, Inc. Image Quality Metrics • Signal to noise ratio • Ratio of the signal from photons collected to unwanted noise • SNR10: the number of lux (measurement of luminous flux per unit area) to achieve an SNR of 10 • Dynamic range • Ratio between the max. output signal level and noise floor • Sensor companies typically misrepresent this as: • Max. output signal at minimum analog gain / noise floor at max. analog gain • This is unachievable within a single frame • dB = 20 × log10(Brightest Lux / Darkest Lux) • Resolution • Not just about number of pixels. More formally it is a measure of the smallest features that can be resolved. Described in more detail later 15 [Table: illuminance (lux) by scene — 0.2–1: full moon; 100: dim indoor; 500–1,000: office; 10,000–30,000: outdoors, cloudy; 100,000: direct sunlight. Table: brightest/darkest ratio vs dynamic range — 10: 20dB; 100: 40dB; 1,000: 60dB; 10,000: 80dB; 100,000: 100dB]
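The dynamic-range formula on this slide is a one-liner; a minimal sketch reproducing the ratio-to-dB table:

```python
import math

def dynamic_range_db(brightest_lux: float, darkest_lux: float) -> float:
    """dB = 20 * log10(brightest / darkest), per the slide."""
    return 20.0 * math.log10(brightest_lux / darkest_lux)

# A scene spanning 1,000:1 is 60 dB; 100,000:1 is 100 dB
dr_office_window = dynamic_range_db(1000.0, 1.0)
```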
  • 16. © 2020 GoPro, Inc. Image Sensor Interfaces • MIPI • D-PHY • Dominant interface in phones for a decade • Up to 4 differential paired lanes • Separate clock lane • C-PHY • Higher performance interface with embedded clock • 3-wire lanes (aka “trios”) • Starting to gain traction in high performance phones • Likely to gain broader adoption over time • A-PHY • New MIPI standard for automotive adoption • Various proprietary LVDS, subLVDS, and SLVS interfaces • It is important to either select a sensor that will directly interface with your SOC, or accept the added cost and power of an external interface bridge chip 16
  • 17. © 2020 GoPro, Inc. High Dynamic Range (HDR) Sensors Typical commercial sensors have a dynamic range around 70dB HDR sensors use a multitude of techniques to extend the dynamic range that can be captured • Multi-frame HDR: capture sequential frames with long and short exposure times • Long exposure captures the shadows • Short exposure captures the highlights • Stitch the frames together to create a single higher dynamic range frame with higher bit depth • For scenes with motion, this technique can result in significant motion artifacts (ghosting) • Overlapping multi-frame exposure • Multiple exposures are produced on a line-by-line basis, rather than frame by frame • Greatly reduces the motion artifacts by minimizing the time delta between the different exposure read outs • Spatially interleaved methods • Multiple exposure times are captured simultaneously by varying the exposure time of different groups of pixels (alternate lines, zig-zag pattern, etc.) • Almost eliminates HDR motion artifacts, but reduces resolution and creates aliasing artifacts on high contrast edges 17 [Diagram: Bayer mosaic with interleaved short- and long-exposure pixel groups, Sony SME HDR]
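The multi-frame stitch described above reduces, per pixel, to "trust the long exposure unless it clipped, otherwise scale up the short one." A simplified sketch; the 16x exposure ratio and 12-bit saturation level are assumed example values, not from the deck:

```python
def hdr_merge_pixel(long_val: int, short_val: int,
                    exposure_ratio: int = 16, saturation: int = 4095) -> int:
    """Merge one long- and one short-exposure sample of the same pixel.
    Use the long exposure (best SNR) unless it has clipped; then
    reconstruct the highlight from the short exposure, scaled by the
    exposure ratio. Output needs higher bit depth than either input."""
    if long_val < saturation:
        return long_val
    return short_val * exposure_ratio
```

Real pipelines blend smoothly near the clip point rather than switching hard, to avoid visible seams.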
  • 18. © 2020 GoPro, Inc. High Dynamic Range (HDR) Sensors - Continued • Dual conversion gain • Technology that allows the pixel gain (µV/e-) to be switched between two values • Can be used in a multi-frame, or spatially interleaved manner with similar characteristics, but improved noise (as there is no short exposure throwing away photons) • Logarithmic pixels • Pixel has a logarithmic response, or a logarithmic response region, that greatly extends dynamic range compared to a linear mode pixel • Mismatches between the pixel response curves, and changes with temperature, can create high fixed pattern noise and high photo-response non-uniformity • Overflow pixels • Pixels contain an overflow gate that allows charge (electrons) to flow to a secondary overflow node • The required additional per-pixel storage node and control line mean that the technology only works for larger pixels (~6µm) 18
  • 19. © 2020 GoPro, Inc. Optics for Image Sensors 19
  • 20. © 2020 GoPro, Inc. Matching Optics with a Sensor: Overview • Optical format • Chief ray angle • Focal length • Magnification • Field of view • F-number • Depth of field • Wavelength range • Modulation transfer function • Relative illumination • Stray light 20
  • 21. © 2020 GoPro, Inc. Optical Format • Optical format is probably not what you think! • A 1/2.3-inch format sensor does not have a diagonal of 1/2.3” (11.04mm). It’s actually 1/3.2” (7.83mm) • Convention is a legacy of imaging tube technology, where the stated format was the mechanical diameter of the tube • Optical format ≈ 3/2 × sensor diagonal • From film cameras, “35mm format” is 36mm x 24mm (43mm diag.) • 35mm is the pitch between the feed sprocket holes • The lens and sensor optical formats must be matched • Smaller sensor optical format is okay, but this will crop the field of view, and may create a CRA mismatch issue (more on this later) By Sphl - Own work, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=809979 2/3-inch format Vidicon tube [Diagram: 1-inch lens image circle vs sensor area; format diagonals — 1-inch: 16mm, 2/3-inch: 11mm, 1/1.8-inch: 9mm, 1/2.3-inch: 7.8mm; 35mm] 21
  • 22. © 2020 GoPro, Inc. Chief Ray Angle (CRA) • The angle of light rays that pass through the center of the exit pupil of the lens, relative to the optical axis • Often quoted as a single number (the maximum CRA), but it’s actually a curve • The microlenses on a sensor are (typically) shifted to compensate for the anticipated change in CRA across the array • Mismatched lens and sensor CRA can lead to luma and chroma shading • Match for 1µm pitch pixels should be within ~2° • Match for 2µm pitch pixels should be within ~5° 22 [Diagram: chief ray through the lens aperture to the sensor, with marginal rays. Graph: CRA (°) vs normalized image height (0–1), rising from 0° to ~8°]
  • 23. © 2020 GoPro, Inc. Focal Length & Magnification • Focal length is a measure of how strongly the system converges (or diverges) light • Thin lens approximation: 1/o + 1/i = 1/f • Magnification: M = −i/o = h′/h 23 [Diagram: thin lens with object distance o, image distance i, focal length f, object height h, image height h′]
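The thin-lens equation and magnification above can be checked with a short sketch (a simple rearrangement of the slide's formulas, nothing more):

```python
def image_distance(o: float, f: float) -> float:
    """Solve 1/o + 1/i = 1/f for the image distance i."""
    return 1.0 / (1.0 / f - 1.0 / o)

def magnification(o: float, f: float) -> float:
    """M = -i/o (negative sign: the image is inverted)."""
    return -image_distance(o, f) / o
```

The classic sanity check: an object at 2f images at 2f with magnification -1 (unit size, inverted).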
  • 24. © 2020 GoPro, Inc. Field of View (FOV) • FOV is determined by focal length and distortion • For a rectilinear lens (one where straight features form straight lines on the sensor) the field of view, ⍺, is: α = 2 arctan(d / 2f), where d = sensor size and f = focal length • This is true so long as the object distance >> the lens focal length • For lenses with non-rectilinear distortion (like GoPro lenses) the calculation gets more complex • FOV is quoted as the diagonal FOV (DFOV), horizontal FOV (HFOV), or vertical FOV (VFOV) • Sometimes without defining which is being used 24 By Dicklyon at English Wikipedia - Transferred from en.wikipedia to Commons., Public Domain, https://commons.wikimedia.org/w/index.php?curid=10783200
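The rectilinear FOV formula above, expressed directly in code (valid under the slide's assumption that the object distance is much larger than the focal length):

```python
import math

def rectilinear_fov_deg(sensor_dim_mm: float, focal_length_mm: float) -> float:
    """alpha = 2 * arctan(d / (2f)) for a distortion-free (rectilinear) lens.
    Pass the sensor diagonal, width, or height to get DFOV, HFOV, or VFOV."""
    return math.degrees(2.0 * math.atan(sensor_dim_mm / (2.0 * focal_length_mm)))

# 35mm-format width (36mm) with an 18mm lens gives a 90 degree HFOV
hfov = rectilinear_fov_deg(36.0, 18.0)
```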
  • 25. © 2020 GoPro, Inc. F-Number • Ratio of focal length to the diameter of the entrance pupil • Entrance pupil is the image of the physical aperture, as “seen” from the front (the object side) of a lens: N = f / D • The smaller N, the more light reaches the sensor • A lens with a smaller f-number is often referred to as a “faster” lens, because it permits a faster shutter speed • “Stop” scale for f-number, with each stop passing 1/2 the amount of light through the optical system: N = 1.0, 1.4, 2, 2.8, 4, 5.6, 8, 11, 16, 22, 32 25
  • 26. © 2020 GoPro, Inc. F-Number & Resolution Limits • Diffraction of light by the aperture creates a lower “diffraction limit” to the resolution of the camera • This is a consequence of the wave nature of light • A circular aperture creates an “Airy disc” pattern of light on the image sensor • The smaller the aperture, the larger the Airy disc • For small pixels, or lenses with high f-numbers, real world performance can be limited by the diffraction limit • Diameter of Airy disc to first null: d = 2.44λN, where λ = wavelength and N = f-number • Table below shows minimum resolvable pixel size vs f-number • This is for edge detection. For imaging points (like stars) the limits are higher 26 Simulated Airy disc for white light By SiriusB - Own work, CC0, https://commons.wikimedia.org/w/index.php?curid=68302545
Minimum resolvable pixel size vs f-number — Blue (430nm) / Green (550nm) / Red (620nm):
1.0: 0.44 / 0.54 / 0.61 µm
1.2: 0.53 / 0.64 / 0.73 µm
1.4: 0.61 / 0.75 / 0.85 µm
1.6: 0.70 / 0.86 / 0.97 µm
1.8: 0.79 / 0.97 / 1.09 µm
2.0: 0.88 / 1.07 / 1.21 µm
2.2: 0.97 / 1.18 / 1.33 µm
2.4: 1.05 / 1.29 / 1.45 µm
2.6: 1.14 / 1.40 / 1.57 µm
2.8: 1.23 / 1.50 / 1.69 µm
3.0: 1.32 / 1.61 / 1.82 µm
3.2: 1.41 / 1.72 / 1.94 µm
3.4: 1.49 / 1.83 / 2.06 µm
3.6: 1.58 / 1.93 / 2.18 µm
3.8: 1.67 / 2.04 / 2.30 µm
4.0: 1.76 / 2.15 / 2.42 µm
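The first-null Airy disc diameter from this slide is straightforward to compute (this implements only the d = 2.44λN formula, not the table's edge-detection derating):

```python
def airy_disc_diameter_um(wavelength_nm: float, f_number: float) -> float:
    """First-null Airy disc diameter d = 2.44 * lambda * N, in microns."""
    return 2.44 * (wavelength_nm / 1000.0) * f_number

# Green light at f/2: the diffraction blur spot is already ~2.7 microns
# across, comparable to several small mobile-sensor pixels
d_green_f2 = airy_disc_diameter_um(550, 2.0)
```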
  • 27. © 2020 GoPro, Inc. Depth of Field (DoF) • Depth of field is the distance between the nearest and farthest objects that are acceptably in focus: DoF ≈ 2o²Nc / f² • Where o = object distance, N = f-number, c = circle of confusion (acceptable blur circle), f = focal length • DoF increases with: • Focus distance (squared) • Circle of confusion • F-number • At the cost of light reaching the sensor, and the diffraction limit • DoF decreases with: • Focal length (squared) • A shorter focal length lens has a larger depth of field. So either a wider FOV, or a smaller sensor, will increase the DoF 27
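The approximate DoF formula above in code form (valid, as the formula itself is, only when the object distance is much larger than the focal length; all inputs in consistent units):

```python
def depth_of_field(o: float, n: float, c: float, f: float) -> float:
    """DoF ~= 2 * o^2 * N * c / f^2.
    o = object distance, n = f-number, c = circle of confusion,
    f = focal length (all in the same length unit, e.g. mm)."""
    return 2.0 * o**2 * n * c / f**2

# 1m focus, f/2, 10um blur circle, 10mm lens -> ~400mm of depth of field
dof_mm = depth_of_field(1000.0, 2.0, 0.01, 10.0)
```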
  • 28. © 2020 GoPro, Inc. Wavelength Range • A lens is designed for a specific wavelength range • The wider the wavelength range, the more difficult the design becomes • Leading to a higher cost/performance ratio • A lens designed for visible light will have poor performance in the NIR, and vice versa • So a lens should be selected to match the application • A camera for NIR only should use a NIR lens • If using visible + NIR, a lens designed for the full wavelength range should be selected 28
  • 29. © 2020 GoPro, Inc. Modulation Transfer Function (MTF) • MTF is a measurement of the sharpness and contrast of a lens over a range of spatial frequencies (starting at 0) • MTF plots allow a lens to be evaluated for a particular application, and lenses to be compared • An MTF score of 30% means that an edge will be well resolved • Horizontal axis is line pairs per mm. Higher lp/mm -> smaller features • The highest spatial frequency that can be sampled by the image sensor is: lp/mm = 1000 (µm/mm) / (2 × pixel size (µm)) 29 Resolving limit for 3.45µm pixels
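The sensor sampling limit above is the Nyquist frequency of the pixel grid; a one-line sketch reproducing the slide's 3.45µm example:

```python
def nyquist_lp_per_mm(pixel_size_um: float) -> float:
    """Highest spatial frequency the sensor can sample:
    1000 (um per mm) / (2 * pixel pitch in um)."""
    return 1000.0 / (2.0 * pixel_size_um)

# 3.45um pixels sample up to ~145 lp/mm; check lens MTF at (and below) this
limit = nyquist_lp_per_mm(3.45)
```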
  • 30. © 2020 GoPro, Inc. MTF (Continued) • This type of MTF plot typically contains lines for various image heights • For example 0% IH, 60% IH, 90% IH • This allows MTF in the center, middle, and corners of the image to be assessed • Plots typically show different lines for sagittal and tangential directions • Alternative plots show MTF vs image height for specific spatial frequencies • Warning on MTF: • Unless stated, quoted MTF is for an “as designed” lens, without manufacturing tolerances • Actual “as built” MTF will always be worse than this • This can be simulated using Monte Carlo simulations, and measured on optical test equipment 30 [Plot annotations: 0% IH, 60% IH, 80% IH, 90% IH; sagittal and tangential lines]
  • 31. © 2020 GoPro, Inc. Relative Illumination (RI) • Relative illumination (RI) is a measure of the roll-off in light intensity from the center to the edge of the lens image circle • Often stated as a single number, being the RI at 100% IH • Plot on right is for a lens with an 88% RI • This is very good; a typical mobile phone lens, by contrast, will have an RI <30% • This shading effect means corners will be noisier (lower SNR) than the center of the image • The sensor will also have some roll-off in sensitivity with CRA, so camera RI is the product of the lens RI and the sensor roll-off • RI can be corrected by applying a spatially-varying normalization function, but this amplifies noise in the corners 31
  • 32. © 2020 GoPro, Inc. Stray Light & Ghosts • Stray light is light scattered within the optical system that reduces contrast • Ghosts are spatially localized reflections that can obscure details or confuse machine vision • Sources include: • Imperfect anti-reflective coatings • Typical coatings on glass lens elements reflect <0.3% of normal incident light • Typical coatings on plastic lens elements reflect <0.4% of normal incident light • Edges of lens elements (sometimes inking helps) • Mechanical surfaces (can be blackened or textured) • Diffraction off the image sensor • Dirt and oils on optical surfaces • Can be simulated using the full lens design and specialist tools (FRED, LightTools, ASAP) • Can be measured in the lab using a point light source (fiber bundle) and a robotic stage to rotate the camera 32 Faux stray light rendered in Adobe Photoshop. Subjectively good for art, bad for object recognition
  • 33. © 2020 GoPro, Inc. Vignetting • Vignetting is unintended blocking of light rays towards the edge of the image • Check all mechanical features for possible blocking • Filter/window holders • Main product housing • Have an optical engineer generate a mechanical FOV for import into CAD • Include mechanical tolerances in determining clear window sizing • Lens to sensor centration • Lens to sensor tip/tilt • Camera module to housing alignment tolerances (x, y, z, tip/tilt) 33
  • 34. © 2020 GoPro, Inc. Other Design Factors 34
  • 35. © 2020 GoPro, Inc. Thermal Considerations • Dark current doubles about every 6°C • Dark signal non-uniformity increases with temperature • Dark signal shot noise (= SQRT of the number of dark signal electrons) can become a factor at high temperatures • Hot pixels increase with temperature (defect thermal activation) • Black level corrections get worse with temperature • Thermal design of a camera should be considered up front • Design a low thermal resistance path from the sensor die to ambient [Diagrams: sensor package on a PCB with thermal vias, coupled through a thermal pad to a heatsink/chassis; sensor package on a PCB with a hole, coupled through a thermal pad to a heatsink] 35
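The "doubles about every 6°C" rule of thumb above gives an exponential scaling that is easy to sketch (the 6°C doubling constant is the slide's rule of thumb; real sensors vary):

```python
def dark_current_multiplier(delta_t_c: float, doubling_c: float = 6.0) -> float:
    """Relative increase in dark current after a temperature rise of
    delta_t_c degrees C, assuming doubling every `doubling_c` degrees."""
    return 2.0 ** (delta_t_c / doubling_c)

# A 12 degree C rise roughly quadruples dark current, which is why a low
# thermal-resistance path from die to ambient matters
scale = dark_current_multiplier(12.0)
```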
  • 36. © 2020 GoPro, Inc. Electrical Factors (Keep it Quiet) • Image sensors are mixed signal devices • Care must be taken to ensure low power supply noise, particularly on analog supply rails • Dedicated supply (LDO) for sensor VANA is a good idea • Careful placement and low-impedance routing of decoupling capacitors is essential • Ask sensor supplier for a module design reference guide • Prioritize placement of analog rail decoupling capacitors (place as close to the sensor as possible) with minimal vias • Analyze the noise • Capture dark raw images (no image processing) and extract noise components • Analog supply noise is particularly prone to creating temporal row noise • Compare with sensor noise on the manufacturer's evaluation system, or other reference design 36
  • 37. © 2020 GoPro, Inc. Selecting a Sensor & Lens 37
  • 38. © 2020 GoPro, Inc. Selecting a Sensor - Determine Key Requirements 1. Determine resolution requirements: • Object space resolution • Field of view • Max. object distance • Min. linear pixels for feature recognition • Pixel Size (µm) = Object Space Resolution (µm) × Sensor Size (mm) / FOV (mm) 2. Wavelength of light to be imaged: visible or NIR-enhanced sensor 3. Color (full RGB, mono + R, etc.), or monochrome 4. Dynamic range requirements 5. High-speed motion (camera or scene): rolling/global shutter 6. Interface requirements for SOC 38
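The pixel-size requirement above scales the object-space resolution by the optical magnification (sensor size over object-space FOV). A sketch under that reading, with no Nyquist oversampling factor applied (add a factor of 2 if you want two pixels per resolution element):

```python
def max_pixel_size_um(object_res_um: float, sensor_size_mm: float,
                      fov_mm: float) -> float:
    """Largest pixel pitch that maps one object-space resolution element
    onto one pixel. sensor_size_mm / fov_mm is the optical magnification."""
    return object_res_um * sensor_size_mm / fov_mm

# Resolving 100um features across a 500mm object-space FOV on a 5mm
# sensor needs pixels no larger than 1um
p = max_pixel_size_um(100.0, 5.0, 500.0)
```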
  • 39. © 2020 GoPro, Inc. Selecting a Sensor – Evaluate using Sensor Evaluation Kit • Use requirements from previous slide to select a sensor evaluation kit (EVK) • Look for sensors intended for similar applications • Many sensor companies have online product selection tools, or can support at the sales/application engineering level • If possible, pair the EVK with a lens that’s a close fit for the final application (see following section) • Test the EVK in your application, including minimum illumination conditions • If lens FOV does not match, adjust object distance, or chart size, to compensate • If low light SNR is acceptable proceed to lens selection • If low light SNR is not acceptable: • Can required SNR be achieved by adjusting the lens aperture? • Trade-off = depth of field, lens size/cost • Can required SNR be achieved by increasing exposure time? • Trade-off = motion blur • If SNR requirements cannot be achieved, select a larger sensor (more light collection) and repeat process • Trade-off = depth of field, sensor cost, lens size/cost • Evaluate dynamic range • This is one of the more difficult tests to perform correctly, but online resources exist like Imatest test software tutorials 39
  • 40. © 2020 GoPro, Inc. Selecting a Lens 1. Determine optical format (from sensor size) 2. Select field of view → focal length 3. Determine target f-number 4. Check depth of field 5. MTF • From resolution study, determine line pairs/mm resolution needed • An MTF ≥ 30% at the target lp/mm is a good rule of thumb for the detection of edges • Check MTF across field: center, mid, corner regions 6. Pick a lens with a CRA that closely matches the sensor CRA specification 7. Check the impact of relative illumination on performance in corners (SNR decrease) 8. Determine if the lens includes a filter (IR-cut, IR passband, etc.), if one is needed 9. Check if the lens is suitable for the dynamic range requirements (HDR requires very good anti-reflective coatings) 40
  • 41. © 2020 GoPro, Inc. Resources Image Sensor Manufacturers ON Semiconductor: ON Semi Image Sensors Sony: Sony image sensors Omnivision: https://www.ovt.com/ Samsung: Samsung image sensors Fairchild Imaging (BAE Systems): https://www.fairchildimaging.com/ ams: https://ams.com/cmos-imaging-sensors Interfaces, Optics, and Imaging MIPI Alliance: https://www.mipi.org/ Edmund Optics Knowledge Center: • Optics: https://www.edmundoptics.com/knowledge- center/#!&CategoryId=114 • Imaging: https://www.edmundoptics.com/knowledge- center/#!&CategoryId=175 Cambridge in Colour imaging tutorials: https://www.cambridgeincolour.com/tutorials.htm Imaging Test Software and Charts Imatest: https://www.imatest.com/ 41