Dr. Mohieddin Moradi
mohieddinmoradi@gmail.com
Dream
Idea
Plan
Implementation
1
https://www.slideshare.net/mohieddin.moradi/presentations
2
− Vertical and Horizontal Fields of View
− F-Stop, F-Number, T-Number, Minimum Illumination and Sensitivity
− Color Temperature Adjustment and Color Conversion in Camera
− Camera Beam Splitter Structure and Related Issues
− Depth of Field, Depth of Focus & Permissible Circle of Confusion
− Broadcast Zoom Lens Technology
− 4K Lens Critical Performance Parameters
− Optical Accessories and Optical Filters
Outline
3
4
5
Types of Visible Perception Possible
− As we move further from the fovea, vision becomes more limited
− Colour vision is only possible in the central visual field
(Left eye)
6
Vertical and Horizontal Fields of View
[Figure: horizontal fields of view — the monocular visual limit of each eye (94° shown for the right eye), the overlapping binocular-vision region, the normal viewing field, the horizontal sight line, and the zone of word/pattern recognition.]
7
Horizontal field of view
• The central field of vision for most people covers an angle of
between 50° and 60° (objects are recognized).
• Within this angle, both eyes observe an object simultaneously.
• This creates a central field of greater magnitude than that
possible by each eye separately.
• This central field of vision is termed the 'binocular field' and within
this field
 images are sharp
 depth perception occurs
 colour discrimination is possible
Vertical and Horizontal Fields of View
[Figure: the visual limits of the left and right eyes overlap to form the central field of vision.]
Vertical Field of View
• The typical line of sight is considered horizontal, or 0°.
• A person’s natural or normal line of sight is a cone of view about 10° below the horizontal when standing and approximately 15° below when seated.
[Figure: vertical field of view — visual limits of the eye, limits of colour discrimination, and the normal sight line whilst standing and whilst seated, shown relative to the normal (horizontal) line of sight.]
Angle Of View
8
− The range of the scene that is captured by the camera and displayed on the picture monitor.
• The angle of view is measured as the angle between the center axis of the lens and the edges of the image in the horizontal, vertical, and diagonal directions. Respectively, these are called the horizontal angle of view, vertical angle of view, and diagonal angle of view.
Angle Of View
9
− Angle of view can be calculated from the following equation:
𝑤: Angle of view
𝑦: Image size on imager sensor (in horizontal, vertical and diagonal directions)
𝑓: lens focal length
𝑤 = 2 tan⁻¹(𝑦 / 2𝑓)
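As a quick numerical sketch of this relation (Python); the 2/3-inch type sensor dimensions of roughly 9.6 × 5.4 mm and the 25 mm focal length are illustrative assumptions, not values from the slides:

import math

def angle_of_view_deg(image_size_mm, focal_length_mm):
    # w = 2 * arctan(y / (2 * f)), returned in degrees
    return math.degrees(2 * math.atan(image_size_mm / (2 * focal_length_mm)))

# Assumed 2/3-inch type active image area: about 9.6 mm x 5.4 mm (11 mm diagonal)
f = 25.0  # focal length in mm (example value)
for label, y in (("horizontal", 9.6), ("vertical", 5.4), ("diagonal", 11.0)):
    print(label, "angle of view:", round(angle_of_view_deg(y, f), 1), "deg")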
Angle Of View
10
[Figure: angle-of-view geometry — image size y behind the lens at focal length f, with half-angles w/2 on either side of the optical axis.]
Calculating Y from W (angle of view) and L (object distance)
Calculating Y from L (object distance), f (focal length) and Y′ (image size)
Calculation of the Object Dimensions to Fill the Image format
11
𝑌 = 𝑌′ × (𝐿 / 𝑓)
𝑌 = 2𝐿 tan(𝑊 / 2)
Y: Object Dimension, Y′: Image Size, W: Angle of View, L: Object Distance, Image Distance ≈ f
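A minimal sketch of both calculations (Python); the distance, angle and focal-length values are made-up examples:

import math

def object_dimension_from_angle(L_m, W_deg):
    # Y = 2 * L * tan(W / 2)
    return 2 * L_m * math.tan(math.radians(W_deg) / 2)

def object_dimension_from_image(L_m, Y_image_mm, f_mm):
    # Y = Y' * L / f (valid because the image distance is approximately f)
    return Y_image_mm * L_m / f_mm  # result in metres, since the mm units cancel

L = 10.0  # object distance in metres (assumed)
print(round(object_dimension_from_angle(L, 30.0), 2), "m fills a 30 deg horizontal angle of view")
print(round(object_dimension_from_image(L, 9.6, 25.0), 2), "m fills a 9.6 mm wide imager with a 25 mm lens")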
I. Angle of view becomes narrow when a telephoto lens is used.
II. In contrast, it becomes wider with a wide-angle lens.
• Consequently, the wider the angle of view, the wider the area of the image captured.
III. A camera’s angle of view also varies depending on the size of the imager.
• This means that 2/3-inch type CCD cameras and 1/2-inch type CCD cameras offer different angles of
view for lenses with the same focal lengths.
𝑤 = 2 tan⁻¹(𝑦 / 2𝑓)
Angle Of View, Image Circle and Image Size
Image Sizes for Television and Film (Actual Size) 12
13
https://qdownloader.io/download?url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DjRvkpa-9Djs
Human Visual System
(middle layer of the eye)
Aperture
− In general, the word aperture refers to an opening, a hole, or any other type of narrow opening.
• When used in relation to the mechanism of a lens, it stands for the size of the lens’s opening that
determines the amount of light directed to the camera’s imager.
• The diameter of the lens aperture can be controlled by the lens iris.
• The iris consists of a combination of several thin diaphragms.
15
Pupil of the eye
Iris
Diaphragm
Aperture
Iris
− The amount of light captured and directed to a camera’s imager is adjusted by a combination of
diaphragms integrated in the lens (This mechanism is called the lens iris).
• The Iris works just like the pupil of the human eye.
• By opening and closing these diaphragms, the diameter of the opening (also called aperture)
changes, thus controlling the amount of light that passes through it.
The amount of the iris opening is expressed by its F-stop.
16
Pupil of the eye
Iris
Iris
17
Auto Iris
− Auto iris is a convenient function that detects the amount of light entering the lens and automatically
opens or closes the iris to maintain appropriate exposure.
• Auto iris is especially useful in situations where manual iris adjustment can be difficult, such as in ENG
applications.
• Auto iris lenses control the iris aperture by detecting and analyzing the amplitude of the video signal
produced in the camera.
• An iris control signal is generated according to the amplitude of this video signal, to either open or
close the iris for correct exposure.
18
Focal Length
– The focal length describes the distance between a lens and the point where light passing through it
converges on the optical axis. This point is where images captured by the lens are in focus and is called
the focal point.
19
Single Lens Compound Lens
Focal Length Focal Length
Focal Point
Principal Point
A lens with a short focal length:
– Captures a large area of the subject to provide a wide-angle view.
– The light entering the lens is the light reflected from a large area of the subject.
A lens with a long focal length:
– Captures only a small area of the subject to provide a magnified or close-up view of the subject.
– Only the light reflected from a small area of the subject enters the lens, resulting in a darker image.
The longer the focal length, the less light that enters the lens.
Focal Length
20
− It describes how bright a lens is, or, more simply,
The maximum amount of light a lens can direct to the camera’s image sensor.
F-number
F-number = f (Focal Length) / D (Maximum Iris Opening Diameter)
21
This amount of light is determined by two factors:
I. The widest iris opening that the lens allows or its maximum aperture
– A wider iris opening (aperture diameter) simply means more light passing through the Lens (bigger 𝑫).
II. The focal length of the lens
– The longer the focal length, the less light that enters the lens (smaller 𝒇).
F-number
F-number = f (Focal Length) / D (Maximum Iris Opening Diameter)
22
– Interestingly, in this definition, brighter lenses are described with smaller F-numbers. This can be
understood by substituting a shorter focal length and larger maximum iris opening in the equation.
– A lens’s F-number is usually labeled on its front.
– Since zoom lenses offer a variable focal length, these are described with an F-number range across the
entire zoom range (e.g., F2.8 - F4.0).
– While F-number is strictly used to describe a lens’s brightness performance, a parameter often mixed up
with this is F-stop.
F-number
F-number = f (Focal Length) / D (Maximum Iris Opening Diameter)
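A hedged sketch of the definition (Python); the focal lengths and the maximum iris diameter below are made-up example values:

def f_number(focal_length_mm, max_iris_diameter_mm):
    # F-number = f / D (maximum iris opening diameter)
    return focal_length_mm / max_iris_diameter_mm

# A shorter focal length or a larger maximum iris opening gives a smaller
# (brighter) F-number, as stated above:
print(f_number(35.0, 25.0))   # F1.4 -- brighter
print(f_number(50.0, 25.0))   # F2.0
print(f_number(100.0, 25.0))  # F4.0 -- darker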
23
F-stop
− F-number indicates the maximum amount of incident light with the lens iris fully opened.
− F-stop, in contrast, indicates:
The amount of incident light at smaller iris openings.
Notes:
• F-stops are calibrated from the lens’s widest iris opening to its smallest using the same equation as the F-number, but with the diameter (D) being that of the given iris opening.
• The most important difference to note is that F-stops are a global reference for judging the amount of light
that should be allowed through the lens during a camera shoot.
F-stop = f (Focal Length) / D (Aperture Diameter)
A higher F-stop stops more light (less light transmission).
24
Diaphragm aperture at the standard full stops: F1.4, F2, F2.8, F4, F5.6, F8, F11, F16, F22
Some lenses are Faster than others
Zoom lenses are Slower than Prime lenses
• Smaller numbers let in more light.
• The lower the number the “FASTER” the lens (Recall Shutter)
• Bigger numbers let in less light.
• The higher the number the “Slower” the lens (Recall Shutter)
Each stop lets in half as much light as the one before it.
F-stop and Optical Speed
F-stop = f (Focal Length) / D (Aperture Diameter)
25
Main wide camera
• f/1.8 aperture
• 26mm equivalent focal length
• 12-megapixel resolution
• 100% autofocus pixels
• Optical image stabilisation
F-stop
26
Ultra wide camera
• f/2.4 aperture
• 13mm equivalent focal length
• 0.5x, 120-degree field-of-view
• 12-megapixel resolution
• 120° field of view
Telephoto
• f/2.0 aperture
• 56mm equivalent focal length
• 2x optical zoom
• 12-megapixel resolution
• Optical image stabilisation (OIS)
iPhone 11 Pro Aperture differences
F-stop and Depth of Field
− It is also important to note that F-stop is a key factor that affects depth of field.
The smaller the F-stop, the shallower the depth of field, and vice versa.
27
Wide Aperture Small Aperture
Shallow Depth of Field Deep Depth of Field
More Light Reaching Image Sensor Less Light Reaching Image Sensor
FOCUS
28
Depth of Field
F-stop and Depth of Field
29
Aperture and Depth of Field
[Figure: a dot of light from the subject imaged onto the sensor — a large aperture (F1.4) gives a narrow depth-of-field range around the focus distance; smaller apertures (F5.6, F22) give progressively deeper depth-of-field ranges.]
− From the viewpoint of lens characteristics, shooting with the aperture set in the range of F4 to F8 is generally recommended for a good quality picture.
− Set the FILTER control to bring the aperture setting into that range.
− However, this may not apply when a special composition is desired.
F-stop
F-stop = f (Focal Length) / D (Aperture Diameter)
30
− F-stops are a global reference for judging the amount of light that should be allowed through the lens
during a camera shoot.
− F-stop calibrations increase by a factor of root 2, such as 1.4, 2, 2.8, 4, 5.6, 8, 11, 16, and 22.
− As the value of the F-stop increments by one step (e.g., 5.6 to 8.0), the amount of light passing through the
lens decreases by one half.
− This relation is due to the fact that F-stop is a function of the iris diameter, while incident light is a function
of the square of the diameter.
F-stop
F-stop = f (Focal Length) / D (Aperture Diameter)
31
To calculate the steps in a full stop (1 EV, Exposure Value) one could use
2^(0×0.5), 2^(1×0.5), 2^(2×0.5), 2^(3×0.5), 2^(4×0.5), etc.  (AV = 0, 1, 2, 3, 4, … or AV = K)
The steps in a half stop (1/2 EV) series would be
2^((0/2)×0.5), 2^((1/2)×0.5), 2^((2/2)×0.5), 2^((3/2)×0.5), 2^((4/2)×0.5), etc.  (AV = 0, 0.5, 1, 1.5, 2, … or AV = K/2)
The steps in a third stop (1/3 EV) series would be
2^((0/3)×0.5), 2^((1/3)×0.5), 2^((2/3)×0.5), 2^((3/3)×0.5), 2^((4/3)×0.5), etc.  (AV = 0, 1/3, 2/3, 1, 4/3, … or AV = K/3)
The steps in a quarter stop (1/4 EV) series would be
2^((0/4)×0.5), 2^((1/4)×0.5), 2^((2/4)×0.5), 2^((3/4)×0.5), 2^((4/4)×0.5), etc.  (AV = 0, 0.25, 0.5, 0.75, 1, … or AV = K/4)
Fractional Stops
AV (Aperture Value):   0      0.25   0.5    0.75   1      1.25   1.5    1.75   2
F-stop (marked):       1.0    1.1    1.2    1.3    1.4    1.5    1.7    1.8    2
F-stop (calculated):   1.00   1.09   1.18   1.29   1.41   1.54   1.68   1.83   2.00

f-stop = (√2)^AV = 2^(AV × 0.5),   AV: Aperture Value
The one-stop unit is also known as the EV (Exposure Value) unit.
Full-stop, one-half-stop and one-quarter-stop values are shown; each full stop reduces the light to 1/2 (two stops to 1/4, three stops to 1/8).
32
Fractional Stops
AV (Aperture Value):   0      0.25   0.3    0.5    0.7    0.75   1      1.25   1.3    1.5    1.7    1.75   2
F-stop (marked):       1.0    1.1    1.1    1.2    1.2    1.3    1.4    1.5    1.6    1.7    1.8    1.8    2
F-stop (calculated):   1.00   1.09   1.10   1.18   1.27   1.29   1.41   1.54   1.56   1.68   1.80   1.83   2.00

f-stop = (√2)^AV = 2^(AV × 0.5),   AV: Aperture Value
Full-stop, one-half-stop, one-third-stop and one-quarter-stop values are shown; the one-third-stop AVs 0.3, 0.7, 1.3 and 1.7 are rounded from 0.33, 0.66, 1.33 and 1.66. Each full stop reduces the light to 1/2 (two stops to 1/4, three stops to 1/8).
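A small sketch (Python) that reproduces the calculated F-stop rows above, up to rounding, together with the light-halving rule:

from fractions import Fraction

def f_stop_from_av(av):
    # f-stop = (sqrt(2))^AV = 2^(AV * 0.5)
    return 2 ** (av * 0.5)

def relative_light(av):
    # each full stop (1 EV) halves the light passing through the lens
    return 0.5 ** av

for step in (1, Fraction(1, 2), Fraction(1, 3), Fraction(1, 4)):  # full, 1/2, 1/3, 1/4 stop series
    series = [round(f_stop_from_av(float(k * step)), 2) for k in range(5)]
    print(f"{step}-stop series:", series)

print("light at AV = 3 relative to AV = 0:", relative_light(3))  # 0.125 -> one eighth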
Fractional Stops
34
[Figure: real-world luminance levels — from direct sunlight (~1.6 billion cd/m²) down through sky, interior lighting, moonlight and starlight (~10⁻⁵ to 10⁻⁶ cd/m²) to absolute darkness (0 cd/m²). The adjustment range of the human eye is flexible across this span, while cinema, SDR TV and HDR TV each reproduce only part of it. The dynamic range of a typical camera and lens system is typically about 10⁵ with a fixed iris.]
35
Real-world Luminance Levels and the High-level Functionality of the HVS
Light Levels in Stop
36
− As many people know, movie camera lenses are rated by a T-number instead of an F-number.
− The F-number expresses the speed of the lens on the assumption that the lens transmits 100% of the incident light.
− In reality, different lenses have different transmittance, so two lenses with the same F-number may actually have different speeds.
− The T-number solves this problem by taking both the diaphragm diameter and transmittance into account.
− Two lenses with the same T-number will always give the same image brightness.
T-number
T-number = (F-number / √Transmittance(%)) × 10
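A small sketch of the relation (Python); the transmittance values are assumptions for illustration:

import math

def t_number(f_number, transmittance_percent):
    # T-number = F-number / sqrt(transmittance in %) * 10
    return f_number / math.sqrt(transmittance_percent) * 10

# Two lenses with the same F-number but different (assumed) transmittance:
print(round(t_number(2.0, 80.0), 2))  # ~T2.24
print(round(t_number(2.0, 95.0), 2))  # ~T2.05 -- effectively the faster lens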
37
− If you have zoomed with a zoom lens open to full aperture, you may have noted a drop in video level at
the telephoto end. This is called the F drop.
− F drop is a major determinant of the value of zoom lenses used in live on-site sports broadcasts, which
require a long focal length and must frequently contend with twilight or inadequate artificial illumination.
F-Drop
38
− The entrance pupil of a zoom lens changes in diameter as the focal length is changed.
− As you zoom toward the telephoto end, the entrance pupil gradually enlarges. When the entrance pupil diameter is equal to the diameter of the focusing lens group, it cannot become any larger, so the F-number drops. That is the reason for the F drop.
− If entrance pupil (effective aperture) diameter > front lens diameter, then F-Number drops.
− To eliminate F drop completely, the focusing lens group has to be larger than the entrance pupil at the
telephoto end of the zoom. It has to be at least equal to the focal length at the telephoto end divided by
the F-number.
− To reduce the size and weight of a zoom lens, it is common to allow a certain amount of F drop. In some studio zoom lenses, however, the focusing group is made large enough that no F drop occurs.
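A back-of-the-envelope sketch of that condition (Python); the 9.3–186 mm F1.8 zoom used here is an assumed example, not a specific product:

def min_focusing_group_diameter_mm(telephoto_focal_length_mm, f_number):
    # To avoid F drop completely, the focusing lens group must be at least as
    # large as the entrance pupil at the telephoto end: D >= f_tele / F-number
    return telephoto_focal_length_mm / f_number

print(round(min_focusing_group_diameter_mm(186.0, 1.8), 1), "mm focusing group needed for no F drop")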
F-Drop
39
Pedestal/Master Black
Pedestal or Master Black
Set-up Level
Absolute black level or
the darkest black that can
be reproduced by the
camera.
40
Black Level
Blanking Level
− Pedestal, also called master black, refers to the absolute black level or the darkest black that can be
reproduced by the camera.
− The pedestal can be adjusted as an offset to the set-up level.
− Since pedestal represents the lowest signal level available, it is used as the base reference for all other
signal levels.
 If the pedestal level is set too low due to improper adjustment, the entire image will appear darker
than it should be (the image will appear blackish and heavier).
 If the pedestal level is set too high, the image will look lighter than it should be (the image will look
foggy with less contrast).
− By lowering the pedestal level, it is possible to intentionally increase the clearness of an image
• when shooting a foggy scene
• when shooting subjects through a window
Pedestal/Master Black
41
Dynamic Range
− In general, dynamic range indicates the difference or ratio of the smallest and largest amount of
information an electrical device can handle.
− For a camera, “Dynamic Range” indicates:
− The range between the smallest and largest amount of light that can be handled.
• The native dynamic range of high performance SDR (Standard Dynamic Range) video cameras is still
in the range of merely 600% (in HDR (High Dynamic Range) video it is more than 1000%).
• This 600% dynamic range implies that the camera’s CCD can generate a video signal six times larger
in amplitude than the 1.0 V video standard.
42
− Many methods have been developed to get around this and enable more effective light handling.
− These include :
• Automatic Gain Control
• The electronic shutter
• ND filters
Dynamic Range
43
Minimum Illumination
− Minimum illumination indicates the minimum amount of light required for shooting with a camera,
particularly in the dark. It is expressed in Lux.
− When comparing minimum illumination specifications, it is important to consider the conditions they were
measured at.
• Cameras provide a gain up function that amplifies the signal when a sufficient level is not obtained.
Although convenient for shooting under low light, gain up also boosts the signal’s noise level.
− Minimum illumination is usually measured with the highest gain up setting provided by the camera, and
therefore does not represent the true sensitivity of the camera.
• Simply keep in mind that minimum illumination specifications should be evaluated together with this gain-up setting.
44
Sensitivity
− The sensitivity of a camera indicates its ability to shoot in low-light areas without noise being introduced.
(It defines a camera’s raw response to light)
− Sensitivity is sometimes confused with minimum illumination, but there is a significant difference between
the two.
• Minimum illumination describes the lowest light level in which a camera can capture images without
taking noise factors into account.
− For this reason, to determine a camera’s true performance in low-light shooting, it is better to refer to
sensitivity specifications first.
45
− The camera’s sensitivity is measured by opening the camera’s lens iris from its closed position until the
white area of the grayscale chart reaches 100% video level (on a waveform monitor).
The lens’s F-stop reading at this state is the camera’s sensitivity.
− In CCD cameras, sensitivity is largely governed by:
• The aperture ratio (size) of the photosensitive sites
• On- Chip Lens structure
− The more light gathered onto each photo-sensor, the larger the CCD output and the higher the sensitivity.
Sensitivity
46
− Sensitivity is described using the camera lens F-stop number
• Camera A: Sensitivity: f11 at 2000 lx (3200K, 89.9% reflectance)
• Camera B: Sensitivity: f8 at 2000 lx (3200K, 89.9% reflectance)
− The larger the F-number indication (Camera A), the higher the sensitivity
• To make a fair comparison between cameras, sensitivity specifications are indicated with the
conditions that were used to measure them.
• In the above two cases, a 2000 lx/3200K illuminant was used to light a grayscale chart that reflects
89.9% of the light hitting its surface.
Sensitivity
47
Sensitivity, Grayscale Chart
48
Frequency Response
49
[Figure: multiburst test signal with packets at 0.5, 1, 2, 4, 4.8 and 5.8 MHz, used for measurement of frequency response.]
50
– On the Kelvin scale, zero degrees K (0 K) is defined as “absolute zero” temperature.
– This is the temperature at which molecular energy or molecular motion no longer exists.
– Since heat is a result of molecular motion, temperatures lower than 0 K do not exist.
Kelvin is calculated as:
K = Temperature in °C + 273.15
Color Temperature
51
Color Temperature
– The spectral distribution of light emitted from a piece of carbon (a
black body that absorbs all radiation without transmission and
reflection) is determined only by its temperature.
– When heated above a certain temperature, carbon will start
glowing and emit a color spectrum particular to that temperature.
– This discovery led researchers to use the temperature of heated
carbon as a reference to describe different spectrums of light.
– This is called Color temperature.
52
Transmission, Reflection, Absorption
Color Temperature, Recall
53
[Figure: black-body spectral distribution versus wavelength (300–800 nm).]
Color Temperature
54
55
Cool & Warm Colors, Recall
Natural light Artificial light
Color Temperature
56
– Our eyes are adaptive to changes in light source colors – i.e., the color of a particular object will always
look the same under all light sources: sunlight, halogen lamps, candlelight, etc.
– However, with color video cameras this is not the case, bringing us to the definition of “color temperature.”
– When shooting images with a color video camera, it is important for the camera to be color balanced
according to the type of light source (or the illuminant) used.
Color Temperature
57
This is because different light source types emit different colors of light
(known as color spectrums) and video cameras capture this difference.
The camera color temperature is lower than the environment color temperature
The camera color temperature is higher than the environment color temperature
Color Temperature
58
– In video technology, color temperature is used to describe the spectral distribution of light emitted from a
light source.
– The cameras do not automatically adapt to the different spectrums of light emitted from different light
source types.
– In such cases, color temperature is used as a reference to adjust the camera’s color balance to match the
light source used.
• For example, if a 3200K (Kelvin) light source is used, the camera must also be color balanced at
3200K.
Color Temperature
59
Color Temperature Conversion
– All color cameras are designed to operate at a certain color temperature .
– For example, Sony professional video cameras are designed to be color balanced at 3200K, meaning that
the camera will reproduce colors correctly provided that a 3200K illuminant is used.
– This is the color temperature for indoor shooting when using common halogen lamps.
60
Cameras must also provide the ability to shoot under illuminants with color temperatures other than 3200K.
– For this reason, video cameras have a number of selectable color conversion filters placed before the
prism system.
– These filters optically convert the spectrum distribution of the ambient color temperature (illuminant) to
that of 3200K, the camera’s operating temperature.
– For example, when shooting under an illuminant of 5600K, a 5600K color conversion filter is used to convert
the incoming light’s spectrum distribution to that of approximately 3200K.
Color Temperature Conversion
61
Color Temperature Conversion
62
– When only one optical filter wheel is available within the camera, this allows all filters to be Neutral Density
types providing flexible exposure control.
– The cameras also allow color temperature conversion via electronic means.
– The electronic color conversion filter typically allows the operator to change the color temperature from 2,000K to 20,000K.
Color Temperature Conversion
63
Color Temperature Conversion
64
− “Why do we need color conversion filters if we can correct the change of color temperature electrically
(white balance)?".
• White balance electrically adjusts the amplitudes of the red (R) and blue (B) signals to be equally
balanced to the green (G) by use of video amplifiers.
• We must keep in mind that using electrical amplification will result in degradation of signal-to-noise ratio.
• Although it may be possible to balance the camera for all color temperatures using the R/G/B amplifier
gains, this is not practical from a signal-to-noise ratio point of view, especially when large gain up is
required.
The color conversion filters reduce the gain adjustments required to achieve correct white balance.
Color Temperature Conversion
65
Variable Color Temperature
− The Variable Color Temp. Function allows the operator to change the color temperature from 20,000K to
2,000K
66
Preset Matrix Function
– Presets for 3 matrices can be set.
– The matrix levels can be preset for different lighting conditions.
– The settings can be easily controlled from the control panel.
67
White Balance & Color Temperature
68
The different light source types emit different colors of light (known as color spectrums) and video cameras capture this difference.
White Balance & Color Temperature
69
[Figure: spectral distributions (intensity versus wavelength in nm) of daylight, incandescent, fluorescent, halogen, cool white LED and warm white LED sources.]
− Video cameras are not adaptive to the different spectral distributions of each light source type.
• In order to obtain the same color reproduction under different light sources, color temperature variations must be compensated by converting the ambient color temperature to the camera’s operating color temperature (optically or electrically).
• However, this conversion alone does not complete the color balancing of the camera; a more precise color balancing adjustment must still be made.
White Balance
70
A second adjustment must be made to precisely match the incoming light’s color temperature to
that of the camera known as “white balance”
White Balance
White balance refers to shooting a pure white object, or a grayscale chart, and adjusting the camera’s video amplifiers so
the Red, Green, and Blue channels all output the same video level.
71
More Precise
Color Balancing
White Balance
72
White Balance
− Why does performing this adjustment for the given light
source ensure that the color “white” and all other
colors are correctly reproduced?
• The color “white” is reproduced by combining Red,
Green, and Blue with an equal 1:1:1 ratio.
• White Balance adjusts the gains of the R/G/B video
amplifiers to provide this output ratio for a white
object shot under the given light source type.
• Once these gains are correctly set for that light
source, other colors are also output with the correct
Red, Green, and Blue ratios.
73
(SDTV)
Y=0.11B+0.3R+0.59G
White Balance
– For example, when a pure yellow object is shot, the
outputs from the Red, Green, and Blue video amplifiers
will have a 1:1:0 ratio (yellow is combined by equally
adding Red and Green).
– In contrast, if the White Balance is not adjusted, and
the video amplifiers have incorrect gains for that light
source type, the yellow color would be output
incorrectly with, for example, a Red, Green, and Blue
channel ratio of 1:0.9:0.1.
– Note: White balance must be readjusted after
changing lens.
74
(SDTV)
Y=0.11B+0.3R+0.59G
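A simplified sketch of that idea (Python): gains are derived from a white reference so that R:G:B becomes 1:1:1, and the same gains are then applied to every other color, as in the yellow example above. The channel values below are assumptions, not measurements.

def white_balance_gains(white_r, white_g, white_b):
    # Set the R and B amplifier gains so that a white object outputs R:G:B = 1:1:1,
    # using the green channel as the reference.
    return white_g / white_r, 1.0, white_g / white_b

def apply_gains(rgb, gains):
    return tuple(round(c * g, 2) for c, g in zip(rgb, gains))

# Assumed camera output for a white card under a warm (low color temperature) source:
white = (1.3, 1.0, 0.7)
gains = white_balance_gains(*white)

print(apply_gains(white, gains))            # -> (1.0, 1.0, 1.0): white reproduced correctly
print(apply_gains((1.3, 1.0, 0.0), gains))  # a pure yellow object -> about (1.0, 1.0, 0.0)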
75
White Balance – Camera Shading
– Even brightness white source
• Ambi-Illuminator
– Often the center can be brighter than the edges
– Measure light output with a luminance spot meter
– Set camera gain to 0dB & camera controls to zero
– Set camera F-stop between f4 to f5.6
• Adjust distance of camera to source
– Defocus Camera
76
White Balance – Camera Shading
– Select WFM display and configure for RGB parade.
– No color hue should be present
– Red, green, blue channels must be balanced
– Ideally RGB should be at same level and flat
Original RGB parade waveform After white shading adjustment
77
White Balance with the Vector Display
– Monochrome image should be centered tightly on the vector graticule
– Off-center ovular shape indicates shading error
– Use gain controls on the vector display to confirm correct white balance
78
− A neutral gray scale with the color balance
skewed toward warm light.
• Notice how the trace on the vectorscope
is pulled toward red/orange.
− The same chart with color balance skewed toward blue.
• Notice how the trace on the vectorscope is pulled toward blue.
− The gray scale with neutral color balance —
the vectorscope shows a small dot right in the
center, indicating that there is no color at all:
zero saturation.
Color Balancing with Vectorscope
79
− Parade view on the waveform monitor clearly shows the incorrect color balance of what should be a
neutral gray chart.
− On the waveform, the Red channel is high, while Green is a bit lower and Blue is very low (top end of
each channel is circled in this illustration).
− This is why so many colorists and DITs say that they “live and die by parade view.”
Color Balancing with Waveform Monitor
Preset White
– Preset White is a white-balance selection used in shooting scenarios
• When the white balance cannot be adjusted
• Or when the color temperature of the shooting environment is already known (3200K or 5600K
for instance).
– This means that by simply choosing the correct color conversion filter, optical or electronic, the
approximate white balance can be achieved.
– It must be noted, however, that this method is not as accurate as taking a white balance.
o By selecting Preset White, the
R/G/B amplifiers used for white-
balance correction are set to
their center values.
Center Values
80
AWB (Auto White Balance)
− Unlike the human eye, cameras are not adaptive to different color temperatures of different light source
types or environments.
• This means that the camera must be adjusted each time a different light source is used, otherwise the
color of an object will not look the same when the light source changes.
• This is achieved by adjusting the camera’s white balance to make a ‘white’ object always appear
white.
• Once the camera is adjusted to reproduce white correctly, all other colors are also reproduced as
they should be.
81
AWB (Auto White Balance)
− The AWB is achieved by framing the camera on a white object – typically a piece of white paper/cloth
or a grayscale chart – so that it occupies more than 70% of the display.
− Then pressing the AWB button on the camera body instantly adjusts the camera white balance to match
the lighting environment.
Macbeth Chart
82
ATW (Auto Tracing White Balance)
– The AWB is used to set the correct color balance for one particular shooting environment or color
temperature.
– The ATW continuously adjusts camera color balance in accordance with any change in color
Temperature.
• For example, imagine shooting a scene that moves from indoors to outdoors. Since the color
temperature of the indoor lighting and outdoor sunlight are very different, the white balance must be
changed in real time in accordance with the ambient color temperature.
83
Black Balance
− To ensure accurate color reproduction throughout all
video levels, it is important that the red, green, and
blue channels are also in correct balance when there
is no incoming light.
− When there is no incoming light, the camera’s red,
green, and blue outputs represent the “signal floors”
of the red, green, and blue signals, and unless these
signal floors are matched, the color balance of other
signal levels will not match either.
84
Black Balance
− It is necessary when:
• Using the camera for the first time
• Using the camera after a significant period out of use
• There is a sudden change in ambient temperature
– Without this adjustment, the red, green, and blue color
balance cannot be precisely matched even with
correct white balance adjustments.
85
86
Basic Composition of Beam Splitter and Image Sensor
87
Aperture
Requirements for using a zoom lens correctly:
• Flange back adjustment
• White balance adjustment, White shading adjustment
• Cleaning
88
Basic Composition of Beam Splitter and Image Sensor
89
Prism
Red Sensor
Green Sensor
Blue Sensor
Red Filter
Blue Filter
Green Filter
Total Reflection Layer
Dichroic Surface
Incident
Light
Prism and Dichroic Layers
90
Green cast
Magenta cast
Prism and Dichroic Layers
91
Transmittance of dichroic coating
Spectral characteristic of blue-reflecting
dichroic coating
Spectral characteristic of an entire
color separation system
Prism and Dichroic Layers
92
– The dichroic layer is used to reflect one specific color
while passing other colors through itself.
– The three-color prisms use a combination of total
reflection layers and color selective reflection layers to
confine a certain color.
– For example, the blue prism will confine only the blue
light, and will direct this to the blue imager.
– White shading is seen in cameras that adopt a dichroic
layer in their color separation system.
Prism and Dichroic Layers
93
Characteristic Variations due to Polarization
94
– Light can be thought of as a mixture of transverse waves, some oscillating
perpendicular to the plane of incidence (S components) and some
oscillating parallel to it (P components).
– Natural light contains an equal mixture of S and P components, but light
reflected from a glossy surface is polarized, because the S components
are reflected more strongly than the P components.
– A dichroic coating has different characteristics for S polarized light and P
polarized light. The color of polarized light is therefore different from its
original.
– This effect can be prevented by placing a quarter-wave plate in front of
the prism to change the plane polarization of incident light to circular
polarization.
• A quartz filter used as a quarter-wave plate can almost
completely eliminate the polarization effect.
• A disadvantage is the high cost of the filter material.
Polarization characteristics of color separation prism
Correction of polarization by a quartz filter
Quarter-Wave Plate
95
– A quarter-wave plate has an internal optic axis.
– It generates a quarter-wave phase difference between light polarized in the plane parallel to the optic axis
and light polarized in the plane perpendicular to the optic axis.
– Circularly polarized light can be thought of as a composition of two components that are polarized in
perpendicular planes and are one-quarter wavelength out of phase.
– The quarter-wave plate therefore has the following
properties:
• It changes circularly polarized light to light
polarized in a plane 45 degree to its optic
axis.
• It changes light polarized in a plane 45
degree to its optic axis into circularly
polarized light.
Quarter-Wave Plate
96
– A quartz plate is double-refractive, with different indices of refraction for ordinary rays and extraordinary
rays.
– If the refractive index for ordinary rays is 𝒏𝒐, the refractive index for extraordinary rays is 𝒏𝒆, and the
thickness of the quartz plate is 𝒅, then the plate is a quarter-wave plate for wavelengths 𝜆 satisfying the
equation:
(N + 1/4) λ = |n_o − n_e| d
N: integer
n_o = 1.5443
n_e = 1.5534
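A quick numeric check of this relation (Python), using the refractive indices above and an assumed design wavelength of 550 nm:

# (N + 1/4) * wavelength = |n_o - n_e| * d  ->  d = (N + 1/4) * wavelength / |n_o - n_e|
n_o, n_e = 1.5443, 1.5534
wavelength_nm = 550.0  # assumed design wavelength (green light)

for N in range(3):
    d_um = (N + 0.25) * wavelength_nm / abs(n_o - n_e) / 1000.0  # thickness in micrometres
    print("N =", N, "-> quartz plate thickness d =", round(d_um, 1), "um")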
Flange-Back/Back Focal Length
Ff (flange focal length) ring lock screw.
97
98
Flange-Back/Back Focal Length
– Flange-back is an important specification to keep in mind when choosing a lens.
– Flange-back describes the distance from the camera’s lens-mount plane (ring surface or flange) to the
imager’s surface.
– In other words, flange-back is the distance at which the mounted lens correctly focuses images onto the
camera’s image sensor.
– Therefore, it is necessary to select a lens that matches the flange-back specifications of the given
camera.
Back Focal Length
– Similar to flange-back is back focal length, which describes the distance from the very end of the lens (the
end of the cylinder that fits into the camera mount opening) to the imager’s surface.
– The back focal length of the camera is slightly shorter than its flange-back.
Flange-Back/Back Focal Length
99
Flange-back is measured differently depending on whether the camera uses a three-chip or one-chip
imaging system
– The flange-back of a one-chip camera is simply:
The distance between the lens mount plane and the imager’s surface.
– The flange-back of a three-chip camera additionally includes:
• The distance that light travels through the prism system used to separate it into R, G, and B color
components.
• The distance that light travels through this glass material is converted to the equivalent distance if it
had traveled through air.
− If a glass block of thickness d (mm) and refractive index n is inserted behind the lens, the flange-back is
affected according to the formula:
Flange-Back
100
FB(in air) = FB(actual) − (1 − 1/n) × d
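A minimal sketch of this formula (Python); the 48.00 mm flange-back, 30 mm prism path and n = 1.6 are illustrative assumptions:

def flange_back_in_air_mm(fb_actual_mm, glass_thickness_mm, refractive_index):
    # FB(in air) = FB(actual) - (1 - 1/n) * d
    return fb_actual_mm - (1.0 - 1.0 / refractive_index) * glass_thickness_mm

# e.g. a 48.00 mm bayonet-mount flange-back with 30 mm of prism glass (n = 1.6):
print(flange_back_in_air_mm(48.00, 30.0, 1.6), "mm equivalent distance in air")  # 36.75 mm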
Flange-Back
− In today’s cameras, flange-back is determined by the lens-mount system that the camera uses.
• Three-chip cameras use the bayonet mount system
• One-chip security cameras use either the C-Mount or CS-Mount system.
• The flange-back of the C-Mount and CS-Mount systems is standardized as 17.526 mm and 12.5 mm,
respectively.
• There are three flange-back standards for the bayonet mount system: 35.74 mm, 38.00 mm, and 48.00
mm.
101
Flange-Back Adjustment
F.B adjustment
“To fit the flange back of the zoom lens to the flange back of the camera”
• Without it, focus is not maintained while zooming.
Tracking Adjustment
“F.B adjustment for the R, G, B channels”
• Tracking adjustment is not needed in CCD/CMOS cameras because the mounting positions of the CCD/CMOS
sensors are standardized in accordance with the longitudinal chromatic aberration of the lens.
102
103
Flange-Back Adjustment Procedure
Flange-Back Adjustment Procedure
Sony Instruction
1. Set the iris control to manual, and open the iris fully.
2. Place a flange focal length adjustment chart approximately 3 meters from
the camera and adjust the lighting to get an appropriate video output
level.
3. Loosen the Ff (flange focal length) ring lock screw.
4. With either manual or power zoom, set the zoom ring to telephoto.
5. Aim at the flange focal length adjustment chart and bring it into focus by turning the distance (focus) ring.
6. Set the zoom ring to wide angle.
7. Turn the Ff ring to bring the chart into focus. Take care not to move the
distance ring.
8. Repeat steps 4 through 7 until the image is in focus at both telephoto and
wide angle.
9. Tighten the Ff ring lock screw.
Place a Siemens star chart at about 3 m for a studio
or ENG lens, and 5 to 7 m for an outdoor lens
104
Flange-Back Adjustment Procedure
105
Canon Instruction (Back Focus Adjustment)
− If the relation between the image plane of the lens and the image plane of the camera is incorrect, the
object goes out of focus at the time of zooming operation. Follow the procedure below to adjust the back
focus of the lens.
1. Select an object at an appropriate distance (1.6 to 3m recommended).
Use any object with sharp contrast to facilitate the adjustment work.
2. Set the iris fully open.
3. Set the lens to the telephoto angle by turning the zoom ring.
4. Bring the object into focus by turning the focus ring.
5. Set the lens to the widest angle by turning the zoom ring.
6. Loosen the flange back lock screw, and turn the flange back adjusting
ring to bring the object into focus.
7. Repeat steps 3 to 6 a few times until the object is brought into focus at
both the widest angle and telephoto ends.
8. Tighten the flange back lock screw.
Flare
– Flare is caused by numerous diffused (scattered) reflections of the incoming light within the camera lens.
– This results in the black level of each red, green, and blue channel being raised, and/or inaccurate color
balance between the three channels.
106
R channel G channel B channel
Inaccuracy of color in darker regions of the grayscale
Pedestal level balance incorrect due to the flare effect (B
channel pedestal higher than R channel and G channel)
Volt
Volt
H
H
CCD Imager WF Monitor
Iris
Ideal Lens
Real Lens
Flare
107
CCD Imager WF Monitor
Iris
H
H
Ideal Lens
Real Lens
Volt
Volt
Flare
108
– On a video monitor, flare causes the picture to appear as a misty (foggy) image, sometimes with a color
shade.
– In order to minimize the flare effect:
A flare adjustment function is provided, which optimizes the pedestal level and corrects the balance
between the three channels electronically.
Test card for overall flare measurement Test card for localized flare measurement
109
Flare
Master Flare Function
− The master flare function enables one VR (control) to adjust the master flare level while keeping the
tracking of all R/G/B channels.
− This makes it possible to adjust flare during operation, since the color balance is never thrown off.
110
111
Lens Flare
− Lens flare is the light scattered in lens systems.
− Flare manifests itself as a shift in black levels with a change in light level.
Camera Alignment with Diamond Display
112
Blacks Lifted
Slightly Cool
Green-Blue White Points slightly Blue
Green-Red White Points slightly Green
Green-Blue
White Point
Green-Red
White Point
Camera Alignment with Diamond Display
Flare Adjustment
• Iris down the camera
• Set the black level to 0 mV
• Adjust the iris so the white chip is 1 to 2 f-stops above 700 mV
• Adjust the flares to bring the black chip to 0 mV
Black
Lift
Chip Chart
White Shading
Shading: Any horizontal or vertical non-linearity introduced during the image capture.
White shading: It is a phenomenon in which a green or magenta cast appears on the upper and lower parts
of the screen, even when white balance is correctly adjusted in the screen center.
113
– Due to differences in the angle of incidence of light on the dichroic coatings, when the white balance is
correct at the center of the image, the upper and lower edges may have a green or magenta cast.
– A dichroic coating exploits the interference of light. Different angles of incidence result in different light
paths in a multilayer coating, causing variations in the color separation characteristic. As a general rule,
the larger the angle of incidence, the more the characteristic is shifted in the short-wavelength direction.
White Shading
114
Incidence characteristic of a blue-reflecting dichroic coating
Relation between Exit Pupil and White Shading
– The exit pupil refers to the (virtual) image of the diaphragm formed by the lenses behind the diaphragm.
– A pencil of rays exiting from a zoom lens diverges from a point on the exit pupil, so the rays directed
toward the upper and lower edges of the image strike the dichroic coating at different angles, as can be
seen in Figure. The resulting differences in characteristics shade the upper and lower edges of the image
toward magenta or green.
White Shading
115
Entrance
Pupil
Exit
Pupil
Diaphragm
White Shading
116
Relation between Exit Pupil and White Shading
– Due to vignetting, when the lens is zoomed or stopped down, the exit pupil changes slightly, causing
changes in the shading.
– Use of an extender also causes shading effects by changing the exit pupil.
– The amount of shading is related to the exit pupil of the lens, so white shading has to be readjusted when
a lens is replaced by a lens with a different exit pupil distance.
Vignetting
Color Shading of Defocused Images
– This effect is not present when the image is in focus, but when the subject has depth, so that part of it is
defocused, the colors of the defocused part are shaded in the vertical direction.
– As with white shading, the cause is the difference in spectral characteristics at different angles of
incidence on the dichroic coating.
White Shading
117
– Because rays a and b strike the dichroic
coating at different angles, ray a is
transmitted as magenta light and ray b as
closer to green.
• When the image is in focus, both rays arrive at
the same point, and their colors average out
so that no shading occurs.
• When the image is out of focus, however, part
of it looks magenta and part of it looks green.
This effect is difficult to correct electronically.
– The color-filtering characteristics of each prism slightly change according to the angle that the light enters
each reflection layer (incident angle).
– Different incident angles cause different light paths in the multilayer-structured dichroic coating layer,
resulting in a change of the prism’s spectral characteristics.
– This effect is seen as the upper and lower parts of the screen having a green or magenta cast, even with
the white balance correctly adjusted in the center.
White Shading, Type 1
118
− Another type of white shading is also caused by a lens’s uneven transmission characteristics.
• In this case, it is observed as the center of the image being brighter than the edges.
• This can be corrected by applying a parabolic correction signal to the video amplifiers used for white
balance.
− Another cause of White shading is uneven sensitivity of the photo sensor in the imager array.
• In this case, the white shading phenomenon is not confined in the upper and lower parts of the
screen.
White Shading, Type 2 and 3
119
[Figure: with an ideal light box and an ideal lens, the prism outputs are flat in both the horizontal (52 µs) and vertical (20 ms) waveform directions.]
120
[Figure: with an ideal light box but a real lens, the horizontal (52 µs) and vertical (20 ms) waveforms fall off toward the picture edges (A, B, C, D) — the lens’s uneven transmission characteristics (Type 2).]
121
[Figure: shading correction signals — ± sawtooth and ± parabola components in the horizontal and vertical directions are applied to the video amplifiers used for white balance, producing a flat, corrected signal.]
122
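A rough sketch of how sawtooth and parabolic components can be combined into such a correction (Python); the gain values are arbitrary illustration numbers, not camera settings:

def shading_correction(position, saw_gain, para_gain):
    # position runs from -1 (left/top edge) to +1 (right/bottom edge).
    # The sawtooth term corrects a left-right (or top-bottom) tilt; the parabola
    # term corrects centre-versus-edge fall-off. The same form is used in H and V.
    return saw_gain * position + para_gain * position ** 2

for x in (-1.0, -0.5, 0.0, 0.5, 1.0):
    print(x, round(shading_correction(x, saw_gain=0.02, para_gain=0.06), 3))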
– The exit pupil refers to the (virtual) image of the diaphragm formed by the lenses behind the diaphragm.
– The amount of shading is related to the exit pupil of the lens, so white shading has to be readjusted when
a lens is replaced by a lens with a different exit pupil distance.
– An extender also changes the exit pupil, hence the shading.
White Shading Adjustment Note
123
Entrance
Pupil
Exit
Pupil
Diaphragm
V modulation is a type of white shading that occurs when there is a vertical disparity in the center of the lens
and prism optical axis.
– This causes the red and blue light components to be projected ‘off center’ of their associated imagers,
which results in green and magenta casts to appear on the top and bottom of the picture frame.
– V modulation is caused by
• the different characteristics of each lens and/or
• the different optical axis of each zoom position
– It can be compensated for in the camera.
• Since this compensation data directly relates to the lens, it is automatically stored/recalled as part of the Lens File.
V Modulation
124
Off Center Projection on R and B Imagers
When the red and blue light components are projected 'off center' on their associated image sensors,
green and magenta casts appear at the top and bottom of the picture frame.
off center
on center
Green Imager Blue Imager Red Imager
125
Black Shading
– Black shading is a phenomenon observed as unevenness in dark areas of the image due to dark current
noise of the imaging device.
– A black shading adjustment function is available to suppress this phenomenon to a negligible level.
Dark Current Noise:
− The noise induced in an imager by unwanted electric currents generated by various secondary factors,
such as heat accumulated within the imaging device.
126
Registration
127
– Registration means aligning the three images formed on the red, green, and blue channels so that they
overlap precisely.
– With a three pick-up-tube camera it is necessary before using the camera.
– In a CCD camera registration is so stable that the adjustment is not necessary.
Registration Examination
128
Depth of Field
129
Deep Depth of Field Shallow Depth of Field
130
Circle of Confusion and Permissible Circle of Confusion
Circle of Confusion
– Since all lenses contain a certain amount of spherical aberration and astigmatism, they cannot perfectly
converge rays from a subject point to form a true image point (i.e., an infinitely small dot with zero area).
– In other words, images are formed from a composite of dots (not points) having a certain area, or size.
– Since the image becomes less sharp as the size of these dots
increases, the dots are called “circles of confusion.”
– Thus, one way of indicating the quality of a lens is by the
smallest dot it can form, or its “minimum circle of confusion.”
Permissible Circle of Confusion
– The maximum allowable dot size in an image is called the
“permissible circle of confusion.” (The largest circle of
confusion which still appears as a “point” in the image)
Permissible Circle of Confusion & Effect of the Image Sensor
– The Permissible Circle of Confusion is re-defined by the sampling of the image sensor.
– The permissible Circle of Confusion is the distance between two sampling lines.
– For the Super 35mm lens, the vertical height is 13.8 mm.
– For the 2/3” lens, the vertical height is 5.4 mm.
131
Super 35 mm 4K: 13.8 mm / 2160 vertical pixels = 0.0064 mm (pixel size 6.4 × 6.4 µm)
2/3-inch 4K: 5.4 mm / 2160 vertical pixels = 0.0025 mm (pixel size 2.5 × 2.5 µm)
The permissible CoC is constrained by the pixel pitch.
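The same figures, as a tiny sketch (Python):

def permissible_coc_mm(sensor_height_mm, vertical_pixels=2160):
    # one vertical sampling-line spacing = sensor height / number of vertical pixels
    return sensor_height_mm / vertical_pixels

print(round(permissible_coc_mm(13.8), 4), "mm  (Super 35 mm 4K, ~6.4 um pixel)")
print(round(permissible_coc_mm(5.4), 4), "mm  (2/3-inch 4K, ~2.5 um pixel)")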
132
Focal
Plane
Image
Sensor
Depth of Field, Depth of Focus & Permissible Circle of Confusion
133
Image
Sensor
Focal
Plane
Depth of Field
Depth of Focus
Permissible
Circle
of
Confusion
Depth of Field, Depth of Focus & Permissible Circle of Confusion
134
Image
Sensor
Focal
Plane
Depth of Field
Depth of Focus
Circle of Confusion
Circle of Confusion
Depth of Field, Depth of Focus & Permissible Circle of Confusion
Permissible
Circle
of
Confusion
– In optics, a circle of confusion is an optical spot caused by a cone of light rays from a lens not coming to a
perfect focus when imaging a point source.
– If an image is out of focus by less than the “Permissible Circle of Confusion”, the out-of-focus is
undetectable.
Depth of Field, Depth of Focus & Permissible Circle of Confusion
135
Maximum non-convergence
allowed to be in focus
Permissible Circle of
Confusion (CoC)
Film/Sensor
Where the light is recorded
Depth of Field
Range that is focus
Focus Point
Near limit of Focus
Far limit of Focus
Permissible Circle of Confusion
136
Perfect Focus
Acceptable
Focus
Unacceptable
Focus
Assumption: Permissible
Circle of Confusion
Permissible Circle of Confusion
137
Permissible Circle
of Confusion
Depth of Field (DoF)
Focused Plane
Depth of Field
138
Depth of field is greater behind the subject than in front.

d₁ = (δ × F_NO × l²) / (f² − δ × F_NO × l)   … depth of field behind the subject (far side)
d₂ = (δ × F_NO × l²) / (f² + δ × F_NO × l)   … depth of field in front of the subject (near side)

f: focal length
F_NO: F-number
δ: permissible circle of confusion (CoC) diameter
l: subject distance (distance from the first principal point to the subject)
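A small sketch of these two formulas (Python); the 25 mm F4 lens, 3 m subject distance and 0.0025 mm CoC are assumed example values:

def depth_of_field_mm(f_mm, f_number, subject_distance_mm, coc_mm):
    # d1 = (CoC * F_NO * l^2) / (f^2 - CoC * F_NO * l)  -- behind the subject
    # d2 = (CoC * F_NO * l^2) / (f^2 + CoC * F_NO * l)  -- in front of the subject
    k = coc_mm * f_number * subject_distance_mm
    d1 = k * subject_distance_mm / (f_mm ** 2 - k)
    d2 = k * subject_distance_mm / (f_mm ** 2 + k)
    return d1, d2

behind, in_front = depth_of_field_mm(25.0, 4.0, 3000.0, 0.0025)
print(round(in_front), "mm in front of the subject,", round(behind), "mm behind it")
# As stated above, depth of field is greater behind the subject than in front.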
Depth of Field
− When focusing a lens on an object, there is a certain distance range in front of and behind the focused
object that also comes into focus.
− Depth of field indicates the distance between the closest and furthest objects that are in focus.
• When this distance is long, the depth of field is deep.
• When this distance is short, the depth of field is shallow.
139
It is governed by the three following factors:
I. The smaller the iris opening (bigger F-number/F-stop, i.e. smaller aperture), the deeper the depth of field.
II. The shorter the lens’s focal length, the deeper the depth of field.
III. The further the distance between the camera and the subject, the deeper the depth of field.
– Depth of field can therefore be controlled by changing these factors, allowing camera operators
creative expression.
– For example: A shallow depth of field is used for shooting portraits, where the subject is highlighted and
the entire background is blurred.
Depth of Field
140
Focal Length and Depth of Field
141
1
Aperture and Depth of Field
142
2
Focus Distance and Depth of Field
143
3
144
Depth of Field Is Influenced by the Aperture Setting
145
Aperture and Depth of Field
[Figure: a dot of light from the subject imaged onto the sensor — with a large aperture the permissible circle of confusion (C₀) is exceeded over a narrow range around the focus plane (shallow DOF); with a small aperture the rays converge more gradually, giving a deep DOF.]
146
Depth of Field Is Influenced by the Aperture Setting
[Figure: ray diagrams for a large and a small aperture — the circle of least confusion lies at the focal plane; the permissible circle of confusion defines the near- and far-focus limits along the optical axis, giving the depth of field in front of the sensor and the corresponding depth of focus at the sensor. The smaller aperture yields wider limits on both sides.]
147
Depth of Field Is Influenced by the Aperture Setting
[Figure: depth of focus at three different aperture settings.]
148
Depth of Field Is Influenced by the Aperture Setting
Depth of Field Is Influenced by the Focal Length of the Lens
149
[Figure: two lenses with different focal lengths f and the same permissible circle of confusion — a longer focal length means a smaller depth-of-field range.]
Depth of Field Is Influenced by the Focal Length of the Lens
150
[Figure: lenses set for sharpest focus on a scene — the permissible circle of confusion at the image sensor defines the near and far limits of the depth of field and the corresponding depth of focus.]
Depth of Field Is Influenced by the Focal Length of the Lens
151
[Figure: two focal lengths f₁ and f₂ with corresponding depth-of-field ranges D₁ and D₂, each producing an image on the sensor.]
Depth of Field is influenced by the Subject to Camera Distance
152
[Figure: the same lens and permissible circle of confusion at two subject distances — longer subject distances mean a larger depth-of-field range.]
153
Depth of Field Calculators Apps
Questions??
Discussion!!
Suggestions!!
Criticism!!
154
Basic parts of camera and camera exposure
 
Ef lens work_book_7_en
Ef lens work_book_7_enEf lens work_book_7_en
Ef lens work_book_7_en
 
Ef lens work_book_7_en
Ef lens work_book_7_enEf lens work_book_7_en
Ef lens work_book_7_en
 
01.intro
01.intro01.intro
01.intro
 
Chapter Light: Grade 10 Physics
Chapter Light: Grade 10 PhysicsChapter Light: Grade 10 Physics
Chapter Light: Grade 10 Physics
 
Art of Photography by Vivek Desai
Art of Photography by Vivek DesaiArt of Photography by Vivek Desai
Art of Photography by Vivek Desai
 
CAMERA BSCRIM ).ppt
CAMERA BSCRIM ).pptCAMERA BSCRIM ).ppt
CAMERA BSCRIM ).ppt
 
Basic camera controls
Basic camera controlsBasic camera controls
Basic camera controls
 
2-Camera-Properties.pdf
2-Camera-Properties.pdf2-Camera-Properties.pdf
2-Camera-Properties.pdf
 
Basic principles of photography
Basic principles of photographyBasic principles of photography
Basic principles of photography
 

Mehr von Dr. Mohieddin Moradi

An Introduction to HDTV Principles-Part 3
An Introduction to HDTV Principles-Part 3An Introduction to HDTV Principles-Part 3
An Introduction to HDTV Principles-Part 3Dr. Mohieddin Moradi
 
An Introduction to HDTV Principles-Part 2
An Introduction to HDTV Principles-Part 2An Introduction to HDTV Principles-Part 2
An Introduction to HDTV Principles-Part 2Dr. Mohieddin Moradi
 
An Introduction to Audio Principles
An Introduction to Audio Principles An Introduction to Audio Principles
An Introduction to Audio Principles Dr. Mohieddin Moradi
 
Video Compression, Part 4 Section 1, Video Quality Assessment
Video Compression, Part 4 Section 1,  Video Quality Assessment Video Compression, Part 4 Section 1,  Video Quality Assessment
Video Compression, Part 4 Section 1, Video Quality Assessment Dr. Mohieddin Moradi
 
Video Compression, Part 4 Section 2, Video Quality Assessment
Video Compression, Part 4 Section 2,  Video Quality Assessment Video Compression, Part 4 Section 2,  Video Quality Assessment
Video Compression, Part 4 Section 2, Video Quality Assessment Dr. Mohieddin Moradi
 
Video Compression, Part 3-Section 1, Some Standard Video Codecs
Video Compression, Part 3-Section 1, Some Standard Video CodecsVideo Compression, Part 3-Section 1, Some Standard Video Codecs
Video Compression, Part 3-Section 1, Some Standard Video CodecsDr. Mohieddin Moradi
 
Video Compression, Part 2-Section 1, Video Coding Concepts
Video Compression, Part 2-Section 1, Video Coding Concepts Video Compression, Part 2-Section 1, Video Coding Concepts
Video Compression, Part 2-Section 1, Video Coding Concepts Dr. Mohieddin Moradi
 

Mehr von Dr. Mohieddin Moradi (9)

HDR and WCG Principles-Part 3
HDR and WCG Principles-Part 3HDR and WCG Principles-Part 3
HDR and WCG Principles-Part 3
 
HDR and WCG Principles-Part 2
HDR and WCG Principles-Part 2HDR and WCG Principles-Part 2
HDR and WCG Principles-Part 2
 
An Introduction to HDTV Principles-Part 3
An Introduction to HDTV Principles-Part 3An Introduction to HDTV Principles-Part 3
An Introduction to HDTV Principles-Part 3
 
An Introduction to HDTV Principles-Part 2
An Introduction to HDTV Principles-Part 2An Introduction to HDTV Principles-Part 2
An Introduction to HDTV Principles-Part 2
 
An Introduction to Audio Principles
An Introduction to Audio Principles An Introduction to Audio Principles
An Introduction to Audio Principles
 
Video Compression, Part 4 Section 1, Video Quality Assessment
Video Compression, Part 4 Section 1,  Video Quality Assessment Video Compression, Part 4 Section 1,  Video Quality Assessment
Video Compression, Part 4 Section 1, Video Quality Assessment
 
Video Compression, Part 4 Section 2, Video Quality Assessment
Video Compression, Part 4 Section 2,  Video Quality Assessment Video Compression, Part 4 Section 2,  Video Quality Assessment
Video Compression, Part 4 Section 2, Video Quality Assessment
 
Video Compression, Part 3-Section 1, Some Standard Video Codecs
Video Compression, Part 3-Section 1, Some Standard Video CodecsVideo Compression, Part 3-Section 1, Some Standard Video Codecs
Video Compression, Part 3-Section 1, Some Standard Video Codecs
 
Video Compression, Part 2-Section 1, Video Coding Concepts
Video Compression, Part 2-Section 1, Video Coding Concepts Video Compression, Part 2-Section 1, Video Coding Concepts
Video Compression, Part 2-Section 1, Video Coding Concepts
 

Kürzlich hochgeladen

Glass Ceramics: Processing and Properties
Glass Ceramics: Processing and PropertiesGlass Ceramics: Processing and Properties
Glass Ceramics: Processing and PropertiesPrabhanshu Chaturvedi
 
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete RecordCCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete RecordAsst.prof M.Gokilavani
 
Online banking management system project.pdf
Online banking management system project.pdfOnline banking management system project.pdf
Online banking management system project.pdfKamal Acharya
 
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...Dr.Costas Sachpazis
 
Introduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptxIntroduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptxupamatechverse
 
Porous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writingPorous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writingrakeshbaidya232001
 
Booking open Available Pune Call Girls Koregaon Park 6297143586 Call Hot Ind...
Booking open Available Pune Call Girls Koregaon Park  6297143586 Call Hot Ind...Booking open Available Pune Call Girls Koregaon Park  6297143586 Call Hot Ind...
Booking open Available Pune Call Girls Koregaon Park 6297143586 Call Hot Ind...Call Girls in Nagpur High Profile
 
BSides Seattle 2024 - Stopping Ethan Hunt From Taking Your Data.pptx
BSides Seattle 2024 - Stopping Ethan Hunt From Taking Your Data.pptxBSides Seattle 2024 - Stopping Ethan Hunt From Taking Your Data.pptx
BSides Seattle 2024 - Stopping Ethan Hunt From Taking Your Data.pptxfenichawla
 
UNIT-II FMM-Flow Through Circular Conduits
UNIT-II FMM-Flow Through Circular ConduitsUNIT-II FMM-Flow Through Circular Conduits
UNIT-II FMM-Flow Through Circular Conduitsrknatarajan
 
Introduction and different types of Ethernet.pptx
Introduction and different types of Ethernet.pptxIntroduction and different types of Ethernet.pptx
Introduction and different types of Ethernet.pptxupamatechverse
 
Processing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptxProcessing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptxpranjaldaimarysona
 
ONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdf
ONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdfONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdf
ONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdfKamal Acharya
 
Coefficient of Thermal Expansion and their Importance.pptx
Coefficient of Thermal Expansion and their Importance.pptxCoefficient of Thermal Expansion and their Importance.pptx
Coefficient of Thermal Expansion and their Importance.pptxAsutosh Ranjan
 
Russian Call Girls in Nagpur Grishma Call 7001035870 Meet With Nagpur Escorts
Russian Call Girls in Nagpur Grishma Call 7001035870 Meet With Nagpur EscortsRussian Call Girls in Nagpur Grishma Call 7001035870 Meet With Nagpur Escorts
Russian Call Girls in Nagpur Grishma Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur High Profile
 
Java Programming :Event Handling(Types of Events)
Java Programming :Event Handling(Types of Events)Java Programming :Event Handling(Types of Events)
Java Programming :Event Handling(Types of Events)simmis5
 
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur High Profile
 
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...Christo Ananth
 
Introduction to IEEE STANDARDS and its different types.pptx
Introduction to IEEE STANDARDS and its different types.pptxIntroduction to IEEE STANDARDS and its different types.pptx
Introduction to IEEE STANDARDS and its different types.pptxupamatechverse
 
result management system report for college project
result management system report for college projectresult management system report for college project
result management system report for college projectTonystark477637
 

Kürzlich hochgeladen (20)

Glass Ceramics: Processing and Properties
Glass Ceramics: Processing and PropertiesGlass Ceramics: Processing and Properties
Glass Ceramics: Processing and Properties
 
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete RecordCCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
 
Online banking management system project.pdf
Online banking management system project.pdfOnline banking management system project.pdf
Online banking management system project.pdf
 
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
 
Introduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptxIntroduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptx
 
Porous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writingPorous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writing
 
Booking open Available Pune Call Girls Koregaon Park 6297143586 Call Hot Ind...
Booking open Available Pune Call Girls Koregaon Park  6297143586 Call Hot Ind...Booking open Available Pune Call Girls Koregaon Park  6297143586 Call Hot Ind...
Booking open Available Pune Call Girls Koregaon Park 6297143586 Call Hot Ind...
 
BSides Seattle 2024 - Stopping Ethan Hunt From Taking Your Data.pptx
BSides Seattle 2024 - Stopping Ethan Hunt From Taking Your Data.pptxBSides Seattle 2024 - Stopping Ethan Hunt From Taking Your Data.pptx
BSides Seattle 2024 - Stopping Ethan Hunt From Taking Your Data.pptx
 
UNIT-II FMM-Flow Through Circular Conduits
UNIT-II FMM-Flow Through Circular ConduitsUNIT-II FMM-Flow Through Circular Conduits
UNIT-II FMM-Flow Through Circular Conduits
 
Introduction and different types of Ethernet.pptx
Introduction and different types of Ethernet.pptxIntroduction and different types of Ethernet.pptx
Introduction and different types of Ethernet.pptx
 
Processing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptxProcessing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptx
 
ONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdf
ONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdfONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdf
ONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdf
 
Coefficient of Thermal Expansion and their Importance.pptx
Coefficient of Thermal Expansion and their Importance.pptxCoefficient of Thermal Expansion and their Importance.pptx
Coefficient of Thermal Expansion and their Importance.pptx
 
Russian Call Girls in Nagpur Grishma Call 7001035870 Meet With Nagpur Escorts
Russian Call Girls in Nagpur Grishma Call 7001035870 Meet With Nagpur EscortsRussian Call Girls in Nagpur Grishma Call 7001035870 Meet With Nagpur Escorts
Russian Call Girls in Nagpur Grishma Call 7001035870 Meet With Nagpur Escorts
 
Java Programming :Event Handling(Types of Events)
Java Programming :Event Handling(Types of Events)Java Programming :Event Handling(Types of Events)
Java Programming :Event Handling(Types of Events)
 
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
 
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
 
Introduction to IEEE STANDARDS and its different types.pptx
Introduction to IEEE STANDARDS and its different types.pptxIntroduction to IEEE STANDARDS and its different types.pptx
Introduction to IEEE STANDARDS and its different types.pptx
 
result management system report for college project
result management system report for college projectresult management system report for college project
result management system report for college project
 
(INDIRA) Call Girl Aurangabad Call Now 8617697112 Aurangabad Escorts 24x7
(INDIRA) Call Girl Aurangabad Call Now 8617697112 Aurangabad Escorts 24x7(INDIRA) Call Girl Aurangabad Call Now 8617697112 Aurangabad Escorts 24x7
(INDIRA) Call Girl Aurangabad Call Now 8617697112 Aurangabad Escorts 24x7
 

Broadcast Lens Technology Part 1

  • 14. Human Visual System (middle layer of the eye)
  • 15. Aperture
− In general, the word aperture refers to an opening, a hole, or any other type of narrow opening.
• When used in relation to the mechanism of a lens, it stands for the size of the lens's opening that determines the amount of light directed to the camera's imager.
• The diameter of the lens aperture is controlled by the lens iris.
• The iris consists of a combination of several thin diaphragms.
(Figure labels: pupil of the eye, iris; diaphragm, aperture)
  • 16. Iris
− The amount of light captured and directed to a camera's imager is adjusted by a combination of diaphragms integrated in the lens (this mechanism is called the lens iris).
• The iris works just like the pupil of the human eye.
• By opening and closing these diaphragms, the diameter of the opening (also called the aperture) changes, thus controlling the amount of light that passes through it. The amount of the iris opening is expressed by its F-stop.
(Figure labels: pupil of the eye, iris)
  • 18. Auto Iris − Auto iris is a convenient function that detects the amount of light entering the lens and automatically opens or closes the iris to maintain appropriate exposure. • Auto iris is especially useful in situations where manual iris adjustment can be difficult, such as in ENG applications. • Auto iris lenses control the iris aperture by detecting and analyzing the amplitude of the video signal produced in the camera. • An iris control signal is generated according to the amplitude of this video signal, to either open or close the iris for correct exposure. 18
  • 19. Focal Length
– The focal length describes the distance between a lens and the point where light passing through it converges on the optical axis. This point is where images captured by the lens are in focus and is called the focal point.
(Figure: focal length and focal point for a single lens and for a compound lens, measured from the principal point.)
  • 20. A lens with a short focal length: – Captures a large area of the subject to provide a wide angle view. – Amount of light entering the lens is that reflected from a large area of the subject. A lens with a long focal length: – Captures only a small area of the subject to provide a magnified or close-up view of the subject . – Only the light reflected from a small area of the subject enters the lens, resulting in a darker image. The longer the focal length, the less light that enters the lens. Focal Length 20
  • 21. F-number
− It describes how bright a lens is or, more simply, the maximum amount of light a lens can direct to the camera's image sensor.
F-number = f (Focal Length) / D (Maximum Iris Opening Diameter)
  • 22. F-number
This amount of light is determined by two factors:
I. The widest iris opening that the lens allows, or its maximum aperture: a wider iris opening (aperture diameter) simply means more light passing through the lens (bigger D).
II. The focal length of the lens: the longer the focal length, the less light that enters the lens, so a brighter lens needs a shorter focal length (smaller f).
F-number = f (Focal Length) / D (Maximum Iris Opening Diameter)
  • 23. F-number
– Interestingly, in this definition, brighter lenses are described with smaller F-numbers. This can be understood by substituting a shorter focal length and a larger maximum iris opening in the equation.
– A lens's F-number is usually labeled on its front.
– Since zoom lenses offer a variable focal length, these are described with an F-number range across the entire zoom range (e.g., F2.8 - F4.0).
– While F-number is strictly used to describe a lens's brightness performance, a parameter often mixed up with this is F-stop.
F-number = f (Focal Length) / D (Maximum Iris Opening Diameter)
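The F-number relation above can be checked numerically. Below is a minimal Python sketch (not from the original slides; the 50 mm focal length and the iris diameters are made-up illustration values) that simply evaluates F-number = f / D:

```python
def f_number(focal_length_mm: float, iris_diameter_mm: float) -> float:
    """F-number = focal length / maximum iris opening diameter."""
    return focal_length_mm / iris_diameter_mm

# Hypothetical example values: a 50 mm lens whose iris can open to 35.7 mm
# works out to about F1.4; if the iris could only open to 17.9 mm it would
# be about F2.8, i.e. a dimmer (higher F-number) lens.
if __name__ == "__main__":
    for d in (35.7, 17.9):
        print(f"f = 50 mm, D = {d} mm -> F{f_number(50.0, d):.1f}")
```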
  • 24. F-stop
− The F-number indicates the maximum amount of incident light, with the lens iris fully opened.
− The F-stop, by contrast, indicates the amount of incident light at smaller iris openings.
Notes:
• F-stops are calibrated from the lens's widest iris opening to its smallest using the same equation as the F-number, but with the diameter (D) being that of the given iris opening.
• The most important difference to note is that F-stops are a global reference for judging the amount of light that should be allowed through the lens during a camera shoot.
F-stop = f (Focal Length) / D (Aperture Diameter)
(A higher F-stop means more light is stopped, i.e., less light transmission.)
  • 25. F-stop and Optical Speed
F1.4 - F2 - F2.8 - F4 - F5.6 - F8 - F11 - F16 - F22
• Smaller numbers let in more light: the lower the number, the "faster" the lens (recall shutter speed).
• Bigger numbers let in less light: the higher the number, the "slower" the lens.
• Each stop lets in half as much light as the one before it.
• Some lenses are faster than others; zoom lenses are slower than prime lenses.
F-stop = f (Focal Length) / D (Aperture Diameter)
  • 26. Main wide camera • f/1.8 aperture • 26mm equivalent focal length • 12-megapixel resolution • 100% autofocus pixels • Optical image stabilisation F-stop 26 Ultra wide camera • f/2.4 aperture • 13mm equivalent focal length • 0.5x, 120-degree field-of-view • 12-megapixel resolution • 120° field of view Telephoto • f/2.0 aperture • 56mm equivalent focal length • 2x optical zoom • 12-megapixel resolution • Optical image stabilisation (OIS) iPhone 11 Pro Aperture differences
  • 27. F-stop and Depth of Field
− It is also important to note that the F-stop is a key factor that affects depth of field. The smaller the F-stop, the shallower the depth of field, and vice versa.
(Figure: a wide aperture gives a shallow depth of field with more light reaching the image sensor; a small aperture gives a deep depth of field with less light reaching the image sensor.)
  • 29. F-stop and Depth of Field
(Figure: aperture and depth of field; a dot of light from the subject imaged at F1.4, F5.6 and F22, showing that the larger the aperture, the narrower the depth-of-field range around the focus distance.)
  • 30. F-stop
− From the viewpoint of lens characteristics, shooting with the aperture set in the range of F4 to F8 is generally recommended for a good quality picture.
− Set the FILTER control to bring the aperture setting into that range.
− However, this may not apply when a special composition is desired.
F-stop = f (Focal Length) / D (Aperture Diameter)
  • 31. F-stop
− F-stops are a global reference for judging the amount of light that should be allowed through the lens during a camera shoot.
− F-stop calibrations increase by a factor of √2, such as 1.4, 2, 2.8, 4, 5.6, 8, 11, 16, and 22.
− As the value of the F-stop increments by one step (e.g., 5.6 to 8.0), the amount of light passing through the lens decreases by one half.
− This relation is due to the fact that the F-stop is a function of the iris diameter, while the incident light is a function of the square of the diameter.
F-stop = f (Focal Length) / D (Aperture Diameter)
  • 32-33. Fractional Stops
− The F-stop for an aperture value AV is F-stop = 2^(AV/2) = (√2)^AV. The one-stop unit is also known as the EV (Exposure Value) unit.
− Each step multiplies the transmitted light by 2^(−step): a full stop halves the light, while half-, third- and quarter-stop steps pass roughly 71%, 79% and 84% of it respectively.
− Full stops (1 EV): AV = 0, 1, 2, 3, 4, … giving F-stops 1.0, 1.4, 2, 2.8, 4, …
− Half stops (1/2 EV): AV = 0, 0.5, 1, 1.5, 2, … giving 1.00, 1.19, 1.41, 1.68, 2.00, … (marked 1.0, 1.2, 1.4, 1.7, 2).
− Third stops (1/3 EV): AV = 0, 1/3, 2/3, 1, 4/3, 5/3, 2, … giving 1.00, 1.12, 1.26, 1.41, 1.59, 1.78, 2.00, … (marked 1.0, 1.1, 1.2, 1.4, 1.6, 1.8, 2).
− Quarter stops (1/4 EV): AV = 0, 0.25, 0.5, 0.75, 1, 1.25, 1.5, 1.75, 2, … giving 1.00, 1.09, 1.19, 1.30, 1.41, 1.54, 1.68, 1.83, 2.00 (marked 1.0, 1.1, 1.2, 1.3, 1.4, 1.5, 1.7, 1.8, 2).
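As a cross-check of the series above, this short Python sketch (illustrative only, assuming the relation F-stop = 2^(AV/2) quoted on the slide) regenerates the full-, half-, third- and quarter-stop values together with the relative transmitted light 2^(−AV):

```python
from fractions import Fraction

def f_stop(av: Fraction) -> float:
    """F-stop for aperture value AV: N = 2 ** (AV / 2) = sqrt(2) ** AV."""
    return 2.0 ** (float(av) / 2.0)

def stop_series(step: Fraction, max_av: int = 2):
    """Yield (AV, F-stop, relative transmitted light) from AV = 0 up to max_av."""
    av = Fraction(0)
    while av <= max_av:
        yield av, f_stop(av), 2.0 ** (-float(av))  # light halves per full stop
        av += step

if __name__ == "__main__":
    for label, step in (("full", Fraction(1)), ("1/2", Fraction(1, 2)),
                        ("1/3", Fraction(1, 3)), ("1/4", Fraction(1, 4))):
        values = ", ".join(f"{n:.2f}" for _, n, _ in stop_series(step))
        print(f"{label:>4}-stop series: {values}")
```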
  • 35. Real-world Luminance Levels and the High-level Functionality of the HVS
(Figure: luminance scale from absolute darkness through starlight, moonlight, interior lighting, sky and direct sunlight at roughly 1.6 billion cd/m², with the ranges covered by cinema, SDR TV, HDR TV and the flexible adjustment range of the human eye marked against it.)
− The dynamic range of a typical camera and lens system is typically 10^5 with a fixed iris.
  • 36. Light Levels in Stop 36
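Since the slide above expresses light levels in stops, a contrast ratio can be converted to stops with a base-2 logarithm. The sketch below (an illustration, not part of the slides) shows, for example, that the 10^5 : 1 camera-and-lens range quoted earlier corresponds to roughly 16.6 stops:

```python
import math

def ratio_to_stops(contrast_ratio: float) -> float:
    """Number of photographic stops spanned by a given contrast ratio."""
    return math.log2(contrast_ratio)

if __name__ == "__main__":
    for ratio in (100, 10_000, 100_000):  # 10^2, 10^4, 10^5
        print(f"{ratio:>9,} : 1  ->  {ratio_to_stops(ratio):.1f} stops")
```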
  • 37. T-number
− As many people know, movie camera lenses are rated by a T-number instead of an F-number.
− The F-number expresses the speed of the lens on the assumption that the lens transmits 100% of the incident light.
− In reality, different lenses have different transmittance, so two lenses with the same F-number may actually have different speeds.
− The T-number solves this problem by taking both the diaphragm diameter and the transmittance into account.
− Two lenses with the same T-number will always give the same image brightness.
T-number = F-number / √(Transmittance (%)) × 10
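As a quick numerical illustration of the T-number relation (the F-number and transmittance values below are hypothetical, purely for demonstration):

```python
import math

def t_number(f_number: float, transmittance_percent: float) -> float:
    """T-number = 10 * F-number / sqrt(transmittance in %)."""
    return 10.0 * f_number / math.sqrt(transmittance_percent)

# Hypothetical example: two F2.0 lenses with 100% and 81% transmittance
# would be rated roughly T2.0 and T2.2 respectively.
if __name__ == "__main__":
    for t in (100.0, 81.0):
        print(f"F2.0 lens, {t:.0f}% transmittance -> T{t_number(2.0, t):.1f}")
```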
  • 38. − If you have zoomed with a zoom lens open to full aperture, you may have noted a drop in video level at the telephoto end. This is called the F drop. − F drop is a major determinant of the value of zoom lenses used in live on-site sports broadcasts, which require a long focal length and must frequently contend with twilight or inadequate artificial illumination. F-Drop 38
  • 39. − The entrance pupil of a zoom lens changes in diameter as the focal length is changed. − As you zoom toward the telephoto end, the entrance pupil gradually enlarges. When the entrance pupil diameter is equal to the diameter of the focusing lens group, it cannot become any larger, so the F- number drops. That is the reason for the F drop. − If entrance pupil (effective aperture) diameter > front lens diameter, then F-Number drops. − To eliminate F drop completely, the focusing lens group has to be larger than the entrance pupil at the telephoto end of the zoom. It has to be at least equal to the focal length at the telephoto end divided by the F-number. − To reduce the size and weight of a zoom lens, it is common to allow a certain amount of F drop. For better composition effect, however, in some studio zoom lenses the focusing group is made large enough that no F drop occurs. − F drop is a major determinant of the value of zoom lenses used in live on-site sports broadcasts, which require a long focal length and must frequently contend with twilight or inadequate artificial illumination. F-Drop 39
  • 40. Pedestal/Master Black
Pedestal or master black: the set-up level, i.e., the absolute black level or the darkest black that can be reproduced by the camera.
(Figure labels: black level, blanking level)
  • 41. Pedestal/Master Black
− Pedestal, also called master black, refers to the absolute black level or the darkest black that can be reproduced by the camera.
− The pedestal can be adjusted as an offset to the set-up level.
− Since the pedestal represents the lowest signal level available, it is used as the base reference for all other signal levels.
• If the pedestal level is set too low due to improper adjustment, the entire image will appear darker than it should be (the image will appear blackish and heavier).
• If the pedestal level is set too high, the image will look lighter than it should be (the image will look foggy, with less contrast).
− By lowering the pedestal level, it is possible to intentionally increase the clearness of an image, for example when shooting a foggy scene or when shooting subjects through a window.
  • 42. Dynamic Range − In general, dynamic range indicates the difference or ratio of the smallest and largest amount of information an electrical device can handle. − For a camera, “Dynamic Range” indicates: − The range between the smallest and largest amount of light that can be handled. • The native dynamic range of high performance SDR (Standard Dynamic Range) video cameras is still in the range of merely 600% (in HDR (High Dynamic Range) video it is more than 1000%). • This 600% dynamic range implies that the camera’s CCD can generate a video signal six times larger in amplitude than the 1.0 V video standard. 42
  • 43. − Many methods have been developed to get around this and enable more effective light handling. − These include : • Automatic Gain Control • The electronic shutter • ND filters Dynamic Range 43
  • 44. Minimum Illumination
− Minimum illumination indicates the minimum amount of light required for shooting with a camera, particularly in the dark. It is expressed in lux.
− When comparing minimum illumination specifications, it is important to consider the conditions under which they were measured.
• Cameras provide a gain-up function that amplifies the signal when a sufficient level is not obtained. Although convenient for shooting under low light, gain-up also boosts the signal's noise level.
− Minimum illumination is usually measured with the highest gain-up setting provided by the camera, and therefore does not represent the true sensitivity of the camera.
• Simply keep in mind that minimum illumination specifications are evaluated with this gain-up setting applied.
  • 45. Sensitivity
− The sensitivity of a camera indicates its ability to shoot in low-light areas without noise being introduced (it defines a camera's raw response to light).
− Sensitivity is sometimes confused with minimum illumination, but there is a significant difference between the two.
• Minimum illumination describes the lowest light level at which a camera can capture images, without taking noise factors into account.
− For this reason, to determine a camera's true performance in low-light shooting, it is better to refer to sensitivity specifications first.
  • 46. − The camera’s sensitivity is measured by opening the camera’s lens iris from its closed position until the white area of the grayscale chart reaches 100% video level (on a waveform monitor). The lens’s F-stop reading at this state is the camera’s sensitivity. − In CCD cameras, sensitivity is largely governed by: • The aperture ratio (size) of the photosensitive sites • On- Chip Lens structure − The more light gathered onto each photo-sensor, the larger the CCD output and the higher the sensitivity. Sensitivity 46
  • 47. − Sensitivity is described using the camera lens F-stop number • Camera A: Sensitivity: f11 at 2000 lx (3200K, 89.9% reflectance) • Camera B: Sensitivity: f8 at 2000 lx (3200K, 89.9% reflectance) − The larger the F-number indication (Camera A), the higher the sensitivity • To make a fair comparison between cameras, sensitivity specifications are indicated with the conditions that were used to measure them. • In the above two cases, a 2000 lx/3200K illuminant was used to light a grayscale chart that reflects 89.9% of the light hitting its surface. Sensitivity 47
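The two example specifications can be compared quantitatively: since transmitted light scales with 1/F², the sensitivity difference in stops is 2·log2(F_A / F_B). The sketch below (using the f11 and f8 figures quoted above) shows that camera A is roughly one stop more sensitive than camera B:

```python
import math

def stop_difference(f_a: float, f_b: float) -> float:
    """Stops of extra sensitivity of camera A over camera B, given the
    F-stop each needs to reach 100% video level under the same lighting."""
    return 2.0 * math.log2(f_a / f_b)

if __name__ == "__main__":
    # Camera A: f11 at 2000 lx, Camera B: f8 at 2000 lx (from the slide above)
    print(f"Camera A vs Camera B: {stop_difference(11.0, 8.0):.2f} stops")
```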
  • 49. Frequency Response
(Figure: multiburst test signal with packets at 0.5 MHz, 1 MHz, 2 MHz, 4 MHz, 4.8 MHz and 5.8 MHz, used for measurement of frequency response.)
  • 50. 50
  • 51. Color Temperature
– On the Kelvin scale, zero kelvin (0 K) is defined as "absolute zero" temperature.
– This is the temperature at which molecular energy or molecular motion no longer exists.
– Since heat is a result of molecular motion, temperatures lower than 0 K do not exist.
– Kelvin is calculated as: K = °C + 273.15
  • 52. Color Temperature – The spectral distribution of light emitted from a piece of carbon (a black body that absorbs all radiation without transmission and reflection) is determined only by its temperature. – When heated above a certain temperature, carbon will start glowing and emit a color spectrum particular to that temperature. – This discovery led researchers to use the temperature of heated carbon as a reference to describe different spectrums of light. – This is called Color temperature. 52 Transmission, Reflection, Absorb
  • 53. Color Temperature, Recall
(Figure: spectral distribution plot; vertical axis 0 to 250, horizontal axis 300 to 800 nm.)
  • 55. 55 Cool & Warm Colors, Recall
  • 56. Color Temperature
(Figure: color temperatures of natural and artificial light sources.)
  • 57. – Our eyes are adaptive to changes in light source colors – i.e., the color of a particular object will always look the same under all light sources: sunlight, halogen lamps, candlelight, etc. – However, with color video cameras this is not the case, bringing us to the definition of “color temperature.” – When shooting images with a color video camera, it is important for the camera to be color balanced according to the type of light source (or the illuminant) used. Color Temperature 57 This is because different light source types emit different colors of light (known as color spectrums) and video cameras capture this difference.
  • 58. Color Temperature
(Figure: examples where the camera color temperature is lower than the environment color temperature, and where the camera color temperature is higher than the environment color temperature.)
  • 59. – In video technology, color temperature is used to describe the spectral distribution of light emitted from a light source. – The cameras do not automatically adapt to the different spectrums of light emitted from different light source types. – In such cases, color temperature is used as a reference to adjust the camera’s color balance to match the light source used. • For example, if a 3200K (Kelvin) light source is used, the camera must also be color balanced at 3200K. Color Temperature 59
  • 60. Color Temperature Conversion – All color cameras are designed to operate at a certain color temperature . – For example, Sony professional video cameras are designed to be color balanced at 3200K, meaning that the camera will reproduce colors correctly provided that a 3200K illuminant is used. – This is the color temperature for indoor shooting when using common halogen lamps. 60
  • 61. Cameras must also provide the ability to shoot under illuminants with color temperatures other than 3200K. – For this reason, video cameras have a number of selectable color conversion filters placed before the prism system. – These filters optically convert the spectrum distribution of the ambient color temperature (illuminant) to that of 3200K, the camera’s operating temperature. – For example, when shooting under an illuminant of 5600K, a 5600K color conversion filter is used to convert the incoming light’s spectrum distribution to that of approximately 3200K. Color Temperature Conversion 61
  • 63. Color Temperature Conversion
– When only one optical filter wheel is available within the camera, this allows all filters to be neutral density types, providing flexible exposure control.
– Such cameras also allow color temperature conversion via electronic means.
– The electronic color conversion filter typically allows the operator to change the color temperature from 2,000K to 20,000K.
  • 65. − “Why do we need color conversion filters if we can correct the change of color temperature electrically (white balance)?". • White balance electrically adjusts the amplitudes of the red (R) and blue (B) signals to be equally balanced to the green (G) by use of video amplifiers. • We must keep in mind that using electrical amplification will result in degradation of signal-to-noise ratio. • Although it may be possible to balance the camera for all color temperatures using the R/G/B amplifier gains, this is not practical from a signal-to-noise ratio point of view, especially when large gain up is required. The color conversion filters reduce the gain adjustments required to achieve correct white balance. Color Temperature Conversion 65
  • 66. Variable Color Temperature − The Variable Color Temp. Function allows the operator to change the color temperature from 20,000K to 2,000K 66
  • 67. Preset Matrix Function
– Presets for three matrices can be set.
– The matrix levels can be preset for different lighting conditions.
– The settings can be easily controlled from the control panel.
  • 68. White Balance & Color Temperature 68
  • 69. White Balance & Color Temperature
Different light source types emit different colors of light (known as color spectrums), and video cameras capture this difference.
(Figure: spectral distributions, intensity versus wavelength in nm, of daylight, incandescent, fluorescent, halogen, cool white LED and warm white LED sources.)
  • 70. White Balance
− Video cameras are not adaptive to the different spectral distributions of each light source type.
• In order to obtain the same color reproduction under different light sources, color temperature variations must be compensated by converting the ambient color temperature to the camera's operating color temperature (optically or electrically).
• Even once the incoming light's color temperature is converted to the camera's operating color temperature, this conversion alone does not complete the color balancing of the camera, so a more precise color balancing adjustment must be made.
A second adjustment must be made to precisely match the incoming light's color temperature to that of the camera, known as "white balance".
  • 71. White Balance White balance refers to shooting a pure white object, or a grayscale chart, and adjusting the camera’s video amplifiers so the Red, Green, and Blue channels all output the same video level. 71
  • 73. White Balance
− Why does performing this adjustment for the given light source ensure that the color "white" and all other colors are correctly reproduced?
• The color "white" is reproduced by combining red, green, and blue in an equal 1:1:1 ratio.
• White balance adjusts the gains of the R/G/B video amplifiers to provide this output ratio for a white object shot under the given light source type.
• Once these gains are correctly set for that light source, other colors are also output with the correct red, green, and blue ratios.
(SDTV) Y = 0.3R + 0.59G + 0.11B
  • 74. White Balance
– For example, when a pure yellow object is shot, the outputs from the red, green, and blue video amplifiers will have a 1:1:0 ratio (yellow is formed by equally adding red and green).
– In contrast, if the white balance is not adjusted and the video amplifiers have incorrect gains for that light source type, the yellow color would be output incorrectly with, for example, a red, green, and blue channel ratio of 1:0.9:0.1.
– Note: white balance must be readjusted after changing the lens.
(SDTV) Y = 0.3R + 0.59G + 0.11B
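The gain adjustment described on the last two slides can be sketched numerically as follows. This is a simplified illustration, not the camera's actual processing: given the mean R, G and B levels measured from a white reference, the R and B amplifier gains are scaled so that all three channels match the green channel, and the same gains then correct other colors shot under that light.

```python
def white_balance_gains(r, g, b):
    """Return (R gain, G gain, B gain) that equalize a white reading to green."""
    return g / r, 1.0, g / b

def apply_gains(rgb, gains):
    """Apply per-channel gains to an (R, G, B) triple."""
    return tuple(round(v * k, 3) for v, k in zip(rgb, gains))

if __name__ == "__main__":
    # Hypothetical white-patch readings under a warm light source (R high, B low):
    white = (1.20, 1.00, 0.70)
    gains = white_balance_gains(*white)
    print("gains:", tuple(round(k, 3) for k in gains))
    print("white after balance:", apply_gains(white, gains))                # (1.0, 1.0, 1.0)
    # The same gains also correct other colors shot under that light, e.g. yellow:
    print("yellow after balance:", apply_gains((1.20, 1.00, 0.00), gains))  # (1.0, 1.0, 0.0)
```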
  • 75. 75 White Balance – Camera Shading – Even brightness white source • Ambi-Illuminator – Often the center can be brighter than the edges – Measure light output with a luminance spot meter – Set camera gain to 0dB & camera controls to zero – Set camera F-stop between f4 to f5.6 • Adjust distance of camera to source – Defocus Camera
  • 76. 76 White Balance – Camera Shading – Select WFM display and configure for RGB parade. – No color hue should be present – Red, green, blue channels must be balanced – Ideally RGB should be at same level and flat Original RGB parade waveform After white shading adjustment
  • 77. 77 White Balance with the Vector Display  Monochrome image should be centered tightly on the vector graticule  Off-center ovular shape indicates shading error  Use gain controls on the vector display to confirm correct white balance
  • 78. 78 − A neutral gray scale with the color balance skewed toward warm light. • Notice how the trace on the vectorscope is pulled toward red/orange. − The same chart with color balance skewed toward blue. − Notice how the trace on the vectorscope trace is pulled toward blue. − The gray scale with neutral color balance — the vectorscope shows a small dot right in the center, indicating that there is no color at all: zero saturation. Color Balancing with Vectorscope
  • 79. 79 − Parade view on the waveform monitor clearly shows the incorrect color balance of what should be a neutral gray chart. − On the waveform, the Red channel is high, while Green is a bit lower and Blue is very low (top end of each channel is circled in this illustration). − This is why so many colorists and DITs say that they “live and die by parade view.” Color Balancing with Waveform Monitor
  • 80. Preset White – Preset White is a white-balance selection used in shooting scenarios • When the white balance cannot be adjusted • Or when the color temperature of the shooting environment is already known (3200K or 5600K for instance). – This means that by simply choosing the correct color conversion filter, optical or electronic, the approximate white balance can be achieved. – It must be noted however, that this method is not as accurate as when taking white balance. o By selecting Preset White, the R/G/B amplifiers used for white- balance correction are set to their center values. Center Values 80
  • 81. AWB (Auto White Balance) − Unlike the human eye, cameras are not adaptive to different color temperatures of different light source types or environments. • This means that the camera must be adjusted each time a different light source is used, otherwise the color of an object will not look the same when the light source changes. • This is achieved by adjusting the camera’s white balance to make a ‘white’ object always appear white. • Once the camera is adjusted to reproduce white correctly, all other colors are also reproduced as they should be. 81
  • 82. AWB (Auto White Balance)
− AWB is achieved by framing the camera on a white object, typically a piece of white paper or cloth, or a grayscale chart, so that it occupies more than 70% of the display.
− Pressing the AWB button on the camera body then instantly adjusts the camera white balance to match the lighting environment.
(Figure: Macbeth chart.)
  • 83. ATW (Auto Tracing White Balance)
– AWB is used to set the correct color balance for one particular shooting environment or color temperature.
– ATW continuously adjusts the camera color balance in accordance with any change in color temperature.
• For example, imagine shooting a scene that moves from indoors to outdoors. Since the color temperatures of the indoor lighting and outdoor sunlight are very different, the white balance must be changed in real time in accordance with the ambient color temperature.
  • 84. Black Balance − To ensure accurate color reproduction throughout all video levels, it is important that the red, green, and blue channels are also in correct balance when there is no incoming light. − When there is no incoming light, the camera’s red, green, and blue outputs represent the “signal floors” of the red, green, and blue signals, and unless these signal floors are matched, the color balance of other signal levels will not match either. 84
  • 85. Black Balance
− It is necessary when:
• Using the camera for the first time
• Using the camera after a significant period out of use
• There is a sudden change in temperature
– Without this adjustment, the red, green, and blue color balance cannot be precisely matched even with correct white balance adjustments.
  • 86. 86
  • 87. Basic Composition of Beam Splitter and Image Sensor 87 Aperture
  • 88. Basic Composition of Beam Splitter and Image Sensor
Requirements for using a zoom lens correctly:
• Flange back adjustment
• White balance adjustment, white shading adjustment
• Cleaning
  • 89.
(Figure: three-chip prism block; incident light is split by dichroic and total-reflection surfaces and passes through the red, green and blue trimming filters onto the red, green and blue sensors.)
  • 90. Prism and Dichroic Layers
(Figure labels: green cast, magenta cast.)
  • 91. Prism and Dichroic Layers 91 Transmittance of dichroic coating Spectral characteristic of blue-reflecting dichroic coating Spectral characteristic of an entire color separation system
  • 92. Prism and Dichroic Layers 92
  • 93. – The dichroic layer is used to reflect one specific color while passing other colors through itself. – The three-color prisms use a combination of total reflection layers and color selective reflection layers to confine a certain color. – For example, the blue prism will confine only the blue light, and will direct this to the blue imager. – White shading is seen in cameras that adopt a dichroic layer in their color separation system. Prism and Dichroic Layers 93
  • 94. Characteristic Variations due to Polarization
– Light can be thought of as a mixture of transverse waves, some oscillating perpendicular to the plane of incidence (S components) and some oscillating parallel to it (P components).
– Natural light contains an equal mixture of S and P components, but light reflected from a glossy surface is polarized, because the S components are reflected more strongly than the P components.
– A dichroic coating has different characteristics for S-polarized light and P-polarized light. The color of polarized light is therefore different from its original.
– This effect can be prevented by placing a quarter-wave plate in front of the prism to change the plane polarization of incident light to circular polarization.
• A quartz filter used as a quarter-wave plate can almost completely eliminate the polarization effect.
• A disadvantage is the high cost of the filter material.
(Figures: polarization characteristics of a color separation prism; correction of polarization by a quartz filter.)
  • 95. Quarter-Wave Plate
– A quarter-wave plate has an internal optic axis.
– It generates a quarter-wave phase difference between light polarized in the plane parallel to the optic axis and light polarized in the plane perpendicular to the optic axis.
– Circularly polarized light can be thought of as a composition of two components that are polarized in perpendicular planes and are one-quarter wavelength out of phase.
– A quarter-wave plate therefore has the following properties:
• It changes circularly polarized light into light polarized in a plane at 45 degrees to its optic axis.
• It changes light polarized in a plane at 45 degrees to its optic axis into circularly polarized light.
  • 96. Quarter-Wave Plate
– A quartz plate is double-refractive (birefringent), with different indices of refraction for ordinary rays and extraordinary rays.
– If the refractive index for ordinary rays is n_o, the refractive index for extraordinary rays is n_e, and the thickness of the quartz plate is d, then the plate is a quarter-wave plate for wavelengths λ satisfying:
(N + 1/4) λ = |n_o − n_e| d,  N: integer,  n_o = 1.5443,  n_e = 1.5534
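Using the quoted indices, the sketch below (with a purely hypothetical plate thickness, chosen only for illustration) lists the visible wavelengths for which a given quartz thickness satisfies the quarter-wave condition (N + 1/4)·λ = |n_o − n_e|·d:

```python
N_O, N_E = 1.5443, 1.5534          # ordinary / extraordinary indices of quartz

def quarter_wave_lambdas(thickness_nm: float, lo_nm=400.0, hi_nm=700.0):
    """Visible wavelengths satisfying (N + 1/4) * lambda = |n_o - n_e| * d."""
    retardance = abs(N_O - N_E) * thickness_nm   # optical path difference in nm
    matches = []
    n = 0
    while True:
        lam = retardance / (n + 0.25)
        if lam < lo_nm:
            break                                # wavelengths only get shorter
        if lam <= hi_nm:
            matches.append((n, lam))
        n += 1
    return matches

if __name__ == "__main__":
    # Hypothetical 0.5 mm (500,000 nm) thick quartz plate:
    for n, lam in quarter_wave_lambdas(500_000.0):
        print(f"N = {n:2d}  ->  lambda = {lam:.1f} nm")
```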
  • 97. Flange-Back/Back Focal Length Ff (flange focal length) ring lock screw. 97
  • 99. Flange-Back/Back Focal Length
– Flange-back is an important specification to keep in mind when choosing a lens.
– Flange-back describes the distance from the camera's lens-mount plane (ring surface or flange) to the imager's surface.
– In other words, flange-back is the distance at which the mounted lens will correctly frame images on the camera's image sensor.
– Therefore, it is necessary to select a lens that matches the flange-back specification of the given camera.
Back Focal Length
– Similar to flange-back is the back focal length, which describes the distance from the very end of the lens (the end of the cylinder that fits into the camera mount opening) to the imager's surface.
– The back focal length of the camera is slightly shorter than its flange-back.
  • 100. Flange-Back
Flange-back is measured differently depending on whether the camera uses a three-chip or a one-chip imaging system.
– The flange-back of a one-chip camera is simply the distance between the lens mount plane and the imager's surface.
– The flange-back of a three-chip camera additionally includes:
• The distance that light travels through the prism system used to separate it into R, G, and B color components.
• The distance that light travels through this glass material is converted to the equivalent distance if it had traveled through air.
− If a glass block of thickness d (mm) and refractive index n is inserted behind the lens, the flange-back is affected according to the formula:
FB (in air) = FB (actual) − (1 − 1/n) × d
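A small numerical illustration of the glass-path formula follows; the glass thickness and refractive index used here are made-up example values, not a real prism specification:

```python
def flange_back_in_air(fb_actual_mm: float, glass_thickness_mm: float,
                       refractive_index: float) -> float:
    """FB(in air) = FB(actual) - (1 - 1/n) * d for a glass block behind the lens."""
    return fb_actual_mm - (1.0 - 1.0 / refractive_index) * glass_thickness_mm

if __name__ == "__main__":
    # Hypothetical prism path: 33 mm of glass with n = 1.6 inside a 48.00 mm
    # actual flange-back shortens the equivalent in-air distance by about 12.4 mm.
    print(f"{flange_back_in_air(48.0, 33.0, 1.6):.2f} mm")
```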
  • 101. Flange-Back − In today’s cameras, flange-back is determined by the lens-mount system that the camera uses. • Three-chip cameras use the bayonet mount system • One-chip security cameras use either the C-Mount or CS-Mount system. • The flange-back of the C-Mount and CS-Mount systems is standardized as 17.526 mm and 12.5 mm, respectively. • There are three flange-back standards for the bayonet mount system: 35.74 mm, 38.00 mm, and 48.00 mm. 101
  • 102. Flange-Back Adjustment
F.B. adjustment: "to fit the flange back of the zoom lens to the flange back of the camera."
• Without it, focus shifts during zooming.
Tracking adjustment: "F.B. adjustment for the R, G, B channels."
• Tracking adjustment is not needed in CCD/CMOS cameras, because the mounting positions of the CCD/CMOS sensors are standardized in accordance with the longitudinal chromatic aberration of the lens.
  • 104. Flange-Back Adjustment Procedure (Sony instructions)
1. Set the iris control to manual, and open the iris fully.
2. Place a flange focal length adjustment chart approximately 3 meters from the camera and adjust the lighting to get an appropriate video output level.
3. Loosen the Ff (flange focal length) ring lock screw.
4. With either manual or power zoom, set the zoom ring to telephoto.
5. Aim at the flange focal length adjustment chart and bring it into focus with the distance (focus) ring.
6. Set the zoom ring to wide angle.
7. Turn the Ff ring to bring the chart into focus. Take care not to move the distance ring.
8. Repeat steps 4 through 7 until the image is in focus at both telephoto and wide angle.
9. Tighten the Ff ring lock screw.
Place a Siemens star chart at about 3 m for a studio or ENG lens, and 5 to 7 m for an outdoor lens.
  • 105. Flange-Back Adjustment Procedure (Canon instructions: back focus adjustment)
− If the relation between the image plane of the lens and the image plane of the camera is incorrect, the object goes out of focus during zooming. Follow the procedure below to adjust the back focus of the lens.
1. Select an object at an appropriate distance (1.6 to 3 m recommended). Use any object with sharp contrast to facilitate the adjustment work.
2. Set the iris fully open.
3. Set the lens to the telephoto end by turning the zoom ring.
4. Bring the object into focus by turning the focus ring.
5. Set the lens to the widest angle by turning the zoom ring.
6. Loosen the flange back lock screw, and turn the flange back adjusting ring to bring the object into focus.
7. Repeat steps 3 to 6 a few times until the object is brought into focus at both the widest angle and telephoto ends.
8. Tighten the flange back lock screw.
  • 106. Flare – Flare is caused by numerous diffused (scattered) reflections of the incoming light within the camera lens. – This results in the black level of each red, green, and blue channel being raised, and/or inaccurate color balance between the three channels. 106 R channel G channel B channel Inaccuracy of color in darker regions of the grayscale Pedestal level balance incorrect due to the flare effect (B channel pedestal higher than R channel and G channel)
• 107. Flare [Figure: iris, CCD imager, and waveform monitor traces comparing an ideal lens with a real lens; flare from the real lens raises the black level] 107
• 108. Flare [Figure: the same ideal-lens vs. real-lens comparison on the waveform monitor, continued] 108
• 109. Flare – On a video monitor, flare causes the picture to appear as a misty (foggy) image, sometimes with a color cast. – In order to minimize the flare effect, a flare adjustment function is provided, which optimizes the pedestal level and corrects the balance between the three channels electronically. Test card for overall flare measurement Test card for localized flare measurement 109
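Electronically, flare correction amounts to removing a black-level offset that grows with the overall scene level. The sketch below is only a simplified conceptual model added here for illustration (it is not the processing of any specific camera); the per-channel flare_gain stands in for the value set during flare adjustment, and the frame data is hypothetical:

```python
import numpy as np

def correct_flare(channel: np.ndarray, flare_gain: float) -> np.ndarray:
    """Simplified flare compensation for one color channel (values 0..1).

    Scattered light lifts the black level roughly in proportion to the
    average picture level, so that amount is subtracted back out.
    """
    offset = flare_gain * float(channel.mean())
    return np.clip(channel - offset, 0.0, 1.0)

# Hypothetical frame whose blacks are lifted by flare in the blue channel.
rng = np.random.default_rng(1)
blue = np.clip(rng.random((1080, 1920)) * 0.8 + 0.02, 0.0, 1.0)
print("black level before:", round(float(blue.min()), 3),
      "after:", round(float(correct_flare(blue, 0.05).min()), 3))
```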
• 110. Master Flare Function − The master FLARE function allows a single control (VR) to adjust the master flare level while keeping the R, G, and B channels tracking together. − This makes it possible to adjust flare during operation, since the color balance is never upset. 110
• 111. 111 Lens Flare − Lens flare is light scattered within the lens system. − Flare manifests itself as a shift in black levels with a change in light level. Camera Alignment with Diamond Display
• 112. 112 Camera Alignment with Diamond Display [Diamond-display readout: blacks lifted slightly cool; green-blue white point slightly blue; green-red white point slightly green] Flare Adjustment (with a black-lift chip chart): • Iris the camera down. • Set the black level to 0 mV. • Adjust the iris so the white chip is 1 to 2 f-stops above 700 mV. • Adjust the flare controls so the black chip sits at 0 mV.
  • 113. White Shading Shading: Any horizontal or vertical non-linearity introduced during the image capture. White shading: It is a phenomenon in which a green or magenta cast appears on the upper and lower parts of the screen, even when white balance is correctly adjusted in the screen center. 113
• 114. – Due to differences in the angle of incidence of light on the dichroic coatings, when the white balance is correct at the center of the image, the upper and lower edges may have a green or magenta cast. – A dichroic coating exploits the interference of light. Different angles of incidence result in different light paths in a multilayer coating, causing variations in the color separation characteristic. As a general rule, the larger the angle of incidence, the more the characteristic is shifted in the short-wavelength direction. White Shading 114 Incidence characteristic of a blue-reflecting dichroic coating
  • 115. Relation between Exit Pupil and White Shading – The exit pupil refers to the (virtual) image of the diaphragm formed by the lenses behind the diaphragm. – A pencil of rays exiting from a zoom lens diverges from a point on the exit pupil, so the rays directed toward the upper and lower edges of the image strike the dichroic coating at different angles, as can be seen in Figure. The resulting differences in characteristics shade the upper and lower edges of the image toward magenta or green. White Shading 115 Entrance Pupil Exit Pupil Diaphragm
  • 116. White Shading 116 Relation between Exit Pupil and White Shading – Due to vignetting, when the lens is zoomed or stopped down, the exit pupil changes slightly, causing changes in the shading. – Use of an extender also causes shading effects by changing the exit pupil. – The amount of shading is related to the exit pupil of the lens, so white shading has to be readjusted when a lens is replaced by a lens with a different exit pupil distance. Vignetting
  • 117. Color Shading of Defocused Images – This effect is not present when the image is in focus, but when the subject has depth, so that part of it is defocused, the colors of the defocused part are shaded in the vertical direction. – As with white shading, the cause is the difference in spectral characteristics at different angles of incidence on the dichroic coating. White Shading 117 – Because rays a and b strike the dichroic coating at different angles, ray a is transmitted as magenta light and ray b as closer to green. • When the image is in focus, both rays arrive at the same point, and their colors average out so that no shading occurs. • When the image is out of focus, however, part of it looks magenta and part of it looks green. This effect is difficult to correct electronically.
  • 118. – The color-filtering characteristics of each prism slightly change according to the angle that the light enters each reflection layer (incident angle). – Different incident angles cause different light paths in the multilayer-structured dichroic coating layer, resulting in a change of the prism’s spectral characteristics. – This effect is seen as the upper and lower parts of the screen having a green or magenta cast, even with the white balance correctly adjusted in the center. White Shading, Type 1 118
• 119. − Another type of white shading is caused by a lens’s uneven transmission characteristics. • In this case, it is observed as the center of the image being brighter than the edges. • This can be corrected by applying a parabolic correction signal to the video amplifiers used for white balance. − A further cause of white shading is uneven sensitivity of the photosensors in the imager array. • In this case, the white shading is not confined to the upper and lower parts of the screen. White Shading, Type 2 and 3 119 Prism
• 120. [Figure: an ideal light box imaged through an ideal lens; the horizontal (52 µs) and vertical (20 ms) waveforms are flat, i.e. no shading] 120
• 121. Lens’s uneven transmission characteristics (Type 2) [Figure: the same ideal light box imaged through a real lens; the horizontal (52 µs) and vertical (20 ms) waveforms are bowed, with the center brighter than the edges] 121
• 122. Shading correction signals applied to the video amplifiers used for white balance [Figure: ± parabola and ± sawtooth correction waveforms and the resulting corrected signal] 122
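These waveforms can also be generated digitally. The sketch below is an illustration added here (not the actual circuit of any camera): it builds a sawtooth-plus-parabola correction across a line and uses it to flatten a centre-bright flat field of the type shown on the previous slides; the saw and para amounts stand in for the white-shading adjustment values.

```python
import numpy as np

def shading_correction(n: int, saw: float, para: float) -> np.ndarray:
    """Correction waveform across one line (H) or one field (V).

    x runs from -1 (left/top) to +1 (right/bottom): `saw` tilts the signal,
    `para` raises or lowers the edges relative to the centre.
    """
    x = np.linspace(-1.0, 1.0, n)
    return saw * x + para * x**2

# A flat white field shaded centre-bright by the lens (type 2), then corrected
# with a +parabola in the horizontal direction. Values are illustrative.
width = 1920
shaded = 0.5 - 0.1 * np.linspace(-1.0, 1.0, width) ** 2
corrected = shaded + shading_correction(width, saw=0.0, para=0.1)
print("centre-to-edge spread before:", round(float(np.ptp(shaded)), 3),
      "after:", round(float(np.ptp(corrected)), 3))
```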
  • 123. – The exit pupil refers to the (virtual) image of the diaphragm formed by the lenses behind the diaphragm. – The amount of shading is related to the exit pupil of the lens, so white shading has to be readjusted when a lens is replaced by a lens with a different exit pupil distance. – An extender also changes the exit pupil, hence the shading. White Shading Adjustment Note 123 Entrance Pupil Exit Pupil Diaphragm
• 124. V modulation is a type of white shading that occurs when there is a vertical offset between the optical axes of the lens and the prism. – This causes the red and blue light components to be projected off center on their associated imagers, which results in green and magenta casts at the top and bottom of the picture frame. – V modulation is caused by • the different characteristics of each lens and/or • the different optical axis at each zoom position – It can be compensated for in the camera. • Since this compensation data directly relates to the lens, it is automatically stored/recalled as part of the Lens File. V Modulation 124
• 125. Off Center Projection on R and B Imagers When the red and blue light components are projected off center on their associated image sensors, green and magenta casts appear at the top and bottom of the picture frame. off center on center Green Imager Blue Imager Red Imager 125
  • 126. Black Shading – Black shading is a phenomenon observed as unevenness in dark areas of the image due to dark current noise of the imaging device. – A black shading adjustment function is available to suppress this phenomenon to a negligible level. Dark Current Noise: − The noise induced in an imager by unwanted electric currents generated by various secondary factors, such as heat accumulated within the imaging device. 126
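In digital terms, black shading adjustment behaves like dark-frame subtraction: store the sensor’s dark-signal pattern and remove it from live video. The sketch below is a simplified model added for illustration with hypothetical data, not the actual processing of any camera; averaging several capped frames suppresses random noise in the stored pattern.

```python
import numpy as np

def measure_black_shading(dark_frames: np.ndarray) -> np.ndarray:
    """Average frames captured with the lens capped to estimate the fixed
    dark-current pattern (black shading) of the imager."""
    return dark_frames.mean(axis=0)

def correct_black_shading(frame: np.ndarray, shading: np.ndarray) -> np.ndarray:
    """Subtract the stored pattern so dark areas sit at a uniform black level."""
    return np.clip(frame - shading, 0.0, None)

# Hypothetical data: a left-to-right dark-current gradient plus random noise.
rng = np.random.default_rng(0)
pattern = np.tile(np.linspace(0.0, 0.01, 960), (540, 1))
darks = pattern + rng.normal(0.0, 0.002, (8, 540, 960))
live = 0.02 + pattern + rng.normal(0.0, 0.002, (540, 960))
corrected = correct_black_shading(live, measure_black_shading(darks))
print("black unevenness before:", round(float(np.ptp(live.mean(axis=0))), 4),
      "after:", round(float(np.ptp(corrected.mean(axis=0))), 4))
```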
• 127. Registration 127 – Registration means aligning the three images formed on the red, green, and blue channels so that they overlap precisely. – With three-tube pickup cameras, registration had to be adjusted before the camera was used. – In a CCD camera, the geometry is so stable that this adjustment is not necessary. Registration Examination
  • 128. 128
  • 129. Depth of Field 129 Deep Depth of Field Shallow Depth of Field
  • 130. 130 Circle of Confusion and Permissible Circle of Confusion Circle of Confusion – Since all lenses contain a certain amount of spherical aberration and astigmatism, they cannot perfectly converge rays from a subject point to form a true image point (i.e., an infinitely small dot with zero area). – In other words, images are formed from a composite of dots (not points) having a certain area, or size. – Since the image becomes less sharp as the size of these dots increases, the dots are called “circles of confusion.” – Thus, one way of indicating the quality of a lens is by the smallest dot it can form, or its “minimum circle of confusion.” Permissible Circle of Confusion – The maximum allowable dot size in an image is called the “permissible circle of confusion.” (The largest circle of confusion which still appears as a “point” in the image)
• 131. Permissible Circle of Confusion & Effect of the Image Sensor – The permissible circle of confusion is re-defined by the sampling of the image sensor: it is the distance between two sampling lines. – For the Super 35mm format, the vertical image height is 13.8 mm: 13.8 mm / 2160 vertical pixels = 0.0064 mm (pixel size ≈ 6.4 x 6.4 µm). – For the 2/3-inch 4K format, the vertical image height is 5.4 mm: 5.4 mm / 2160 vertical pixels = 0.0025 mm (pixel size ≈ 2.5 x 2.5 µm). – The permissible CoC is therefore constrained by the pixel pitch. 131
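The figures above follow directly from the sensor geometry. This short sketch, added here for illustration, derives the sampling-limited permissible CoC from the format’s vertical image height and vertical pixel count:

```python
def sampling_limited_coc(image_height_mm: float, vertical_pixels: int) -> float:
    """Permissible circle of confusion set by the sensor's line pitch."""
    return image_height_mm / vertical_pixels

print(f"Super 35mm, 4K: {sampling_limited_coc(13.8, 2160):.4f} mm")  # ~0.0064 mm
print(f"2/3-inch,  4K: {sampling_limited_coc(5.4, 2160):.4f} mm")    # ~0.0025 mm
```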
• 132. Depth of Field, Depth of Focus & Permissible Circle of Confusion [Figure: ray diagram showing the focal plane and the image sensor] 132
• 133. Depth of Field, Depth of Focus & Permissible Circle of Confusion [Figure: depth of field in object space and the corresponding depth of focus at the image sensor] 133
• 134. Depth of Field, Depth of Focus & Permissible Circle of Confusion [Figure: circles of confusion at the near and far limits compared with the permissible circle of confusion] 134
• 135. – In optics, a circle of confusion is an optical spot caused by a cone of light rays from a lens not coming to a perfect focus when imaging a point source. – If an image is out of focus by less than the “Permissible Circle of Confusion”, the defocus is undetectable. Depth of Field, Depth of Focus & Permissible Circle of Confusion 135 Permissible Circle of Confusion (CoC): the maximum non-convergence allowed while still appearing in focus Film/Sensor: where the light is recorded Depth of Field: the range that is in focus Focus Point Near limit of Focus Far limit of Focus
• 136. Permissible Circle of Confusion 136 Perfect Focus Acceptable Focus Unacceptable Focus Assumption: Permissible Circle of Confusion
  • 137. Permissible Circle of Confusion 137 Permissible Circle of Confusion Depth of Field (DoF) Focused Plane
• 138. Depth of Field 138 Depth of field is greater behind the subject than in front. f: focal length; F_NO: F-number; δ: permissible circle of confusion (CoC) diameter; l: subject distance (distance from the first principal point to the subject). d₁ (rear depth of field, toward the far limit) = (δ × F_NO × l²) / (f² − δ × F_NO × l). d₂ (front depth of field, toward the near limit) = (δ × F_NO × l²) / (f² + δ × F_NO × l).
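The two expressions translate directly into code. The Python sketch below is added for illustration (the example numbers are assumed, not taken from the slides): it evaluates d₁ and d₂, returns the near and far limits of the in-focus range, and shows how stopping down deepens the depth of field.

```python
def depth_of_field(f_mm: float, f_number: float, subject_mm: float, coc_mm: float):
    """Near/far limits of the in-focus range from the formulas above.

    f_mm: focal length f, f_number: F_NO, subject_mm: subject distance l,
    coc_mm: permissible CoC diameter delta. Valid while f**2 > delta * F_NO * l
    (i.e., the subject is closer than the hyperfocal distance).
    """
    k = coc_mm * f_number * subject_mm           # delta x F_NO x l
    rear = k * subject_mm / (f_mm**2 - k)        # d1: behind the subject
    front = k * subject_mm / (f_mm**2 + k)       # d2: in front of the subject
    return subject_mm - front, subject_mm + rear

# Assumed example: 2/3-inch 4K camera (delta ~ 0.0025 mm), 25 mm lens, subject at 3 m.
for f_no in (2.0, 4.0, 8.0):
    near, far = depth_of_field(25.0, f_no, 3000.0, 0.0025)
    print(f"F{f_no}: in focus from {near / 1000:.2f} m to {far / 1000:.2f} m")
```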
• 139. Depth of Field − When focusing a lens on an object, there is a certain distance range in front of and behind the focused object that also comes into focus. − Depth of field indicates the distance between the closest and furthest objects that are in focus. • When this distance is long, the depth of field is deep. • When this distance is short, the depth of field is shallow. 139
• 140. It is governed by the following three factors: I. The higher the F-number (F-stop), i.e. the smaller the iris aperture, the deeper the depth of field. II. The shorter the lens’s focal length, the deeper the depth of field. III. The further the distance between the camera and the subject, the deeper the depth of field. – Depth of field can therefore be controlled by changing these factors, allowing camera operators creative expression. – For example: a shallow depth of field is used for shooting portraits, where the subject is highlighted and the entire background is blurred. Depth of Field 140
  • 141. Focal Length and Depth of Field 141 1
  • 142. Aperture and Depth of Field 142 2
  • 143. Focus Distance and Depth of Field 143 3
  • 144. 144
• 145. Depth of Field Is Influenced by the Aperture Setting 145 Aperture and Depth of Field [Figure: a dot of light from the subject imaged through a large aperture (narrow depth of field) and through a small aperture (deep depth of field); C₀ = permissible circle of confusion at the sensor, with the DOF marked around the focus plane]
• 146. 146 Depth of Field Is Influenced by the Aperture Setting [Figure: two aperture settings compared along the optical axis, showing the circle of least confusion, the circle of confusion (CoC) at the sensor, the permissible circle of confusion, and the resulting depth of field, depth of focus, and near/far focus limits]
• 147. 147 Depth of Field Is Influenced by the Aperture Setting [Figure: depth of focus at different aperture settings]
  • 148. 148 Depth of Field Is Influenced by the Aperture Setting
• 149. Depth of Field Is Influenced by the Focal Length of the Lens 149 [Figure: two focal lengths f compared against the permissible circle of confusion] A longer focal length means a smaller depth of field range.
• 150. Depth of Field Is Influenced by the Focal Length of the Lens 150 [Figure: lenses set for sharpest focus on the scene, showing the depth of field (near and far limits) in object space and the corresponding depth of focus at the image sensor, bounded by the permissible circle of confusion]
• 151. Depth of Field Is Influenced by the Focal Length of the Lens 151 [Figure: focal lengths f₁ and f₂ producing depths of field D₁ and D₂ for the same image on the sensor]
• 152. Depth of Field Is Influenced by the Subject-to-Camera Distance 152 [Figure: two subject distances compared against the permissible circle of confusion] Longer subject distances mean a larger depth of field range.
• 153. 153 Depth of Field Calculator Apps
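The calculation behind such apps can be reproduced in a few lines. The sketch below is an illustration with assumed example values (not taken from any particular app); it combines the sampling-limited CoC from slide 131 with the depth-of-field formulas from slide 138 to show how aperture, focal length, and subject distance each change the in-focus range.

```python
def dof_limits(f_mm: float, f_number: float, subject_mm: float, coc_mm: float):
    """Near/far in-focus limits (valid while f**2 > coc * F_NO * subject distance)."""
    k = coc_mm * f_number * subject_mm
    return (subject_mm - k * subject_mm / (f_mm**2 + k),
            subject_mm + k * subject_mm / (f_mm**2 - k))

coc = 5.4 / 2160   # 2/3-inch 4K sampling-limited CoC, ~0.0025 mm
cases = [          # (focal length mm, F-number, subject distance mm), assumed examples
    (25, 2.8, 3000), (25, 8.0, 3000),   # stopping down -> deeper
    (50, 2.8, 3000),                    # longer focal length -> shallower
    (25, 2.8, 6000),                    # farther subject -> deeper
]
for f, n, l in cases:
    near, far = dof_limits(f, n, l, coc)
    print(f"f={f} mm, F{n}, subject {l / 1000:.0f} m -> {near / 1000:.2f} m to {far / 1000:.2f} m")
```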