What is Spatial Resolution?
A presentation for better understanding!
S.A.Quadri
CEDEC, USM, Malaysia
Effect of Spatial resolution on visualization




(Satellite image : Reference http://visibleearth.nasa.gov/view_rec.php?id=1427)
Image resolution
It is an umbrella term that describes the detail an image holds.
The term applies to raster digital images, film images, and other types of images.
Higher resolution means more image details.


Image resolution can be measured in various ways.
Resolution quantifies how close lines can be to each other and still be visibly resolved.
Resolution units can be tied to physical sizes (e.g. lines per mm, lines per inch), to the
overall size of a picture (lines per picture height, also known simply as lines, TV lines, or
TVL), or to angular subtense.
Line pairs are often used instead of lines.
A line pair comprises a dark line and an adjacent light line.
A line is either a dark line or a light line.
A resolution of 10 lines per millimeter means 5 dark lines alternating with 5 light lines, or 5
line pairs per millimeter (5 LP/mm).
Photographic lens and film resolution are most often quoted in line pairs per millimeter.
Resolution of digital images

The resolution of digital images can be described in many different ways.

The term resolution is often used for a pixel count in digital imaging, even though American, Japanese, &
international standards specify that it should not be so used, at least in the digital camera field.

•An image of N pixels high by M pixels wide can have any resolution less than N lines per picture height, or N TV
lines. But when the pixel counts are referred to as resolution, the convention is to describe the pixel resolution with
the set of two positive integer numbers, where the first number is the number of pixel columns (width) and the
second is the number of pixel rows (height), for example as 640 by 480.

•Another popular convention is to cite resolution as the total number of pixels in the image, typically given as
number of megapixels, which can be calculated by multiplying pixel columns by pixel rows and dividing by one
million.

•Other conventions include describing pixels per length unit or pixels per area unit, such as pixels per inch or per
square inch.

•According to the same standards, the number of effective pixels that an image sensor or digital camera has is the
count of elementary pixel sensors that contribute to the final image, as opposed to the number of total pixels,
which includes unused or light-shielded pixels around the edges.

None of these pixel resolutions are true resolutions, but they are widely referred to as such;
they serve as upper bounds on image resolution.
Effect of pixel resolutions

Below is an illustration of how the same image might appear at different pixel
resolutions, if the pixels were poorly rendered as sharp squares (normally, a
smooth image reconstruction from pixels would be preferred, but for illustration of
pixels, the sharp squares make the point better).
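As a rough sketch of the idea (assuming NumPy and an arbitrary grey-scale array called img; all names are illustrative only), the same image can be decimated to a lower pixel resolution and then blown back up as sharp, blocky squares:

# Sketch: the same image at decreasing pixel resolutions, rendered as
# nearest-neighbour blocks (sharp squares) at the original display size.
import numpy as np

def blocky_version(img, factor):
    """Decimate by `factor`, then repeat each pixel so the output keeps the
    original size but contains only 1/factor^2 as many distinct pixels."""
    small = img[::factor, ::factor]                       # lower pixel resolution
    big = np.repeat(np.repeat(small, factor, axis=0), factor, axis=1)
    return big[:img.shape[0], :img.shape[1]]

img = np.random.rand(256, 256)                            # placeholder image
versions = {f: blocky_version(img, f) for f in (1, 2, 4, 8, 16)}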
Further Explanation
An image that is 2048 pixels in width and 1536 pixels in height has a total of 2048×1536 = 3,145,728
pixels.
One could refer to it as 2048 by 1536 or a 3.1-megapixel image.
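As a quick check of the arithmetic above (a trivial sketch, nothing camera-specific):

# Pixel count and megapixel figure for the 2048 x 1536 example above.
width, height = 2048, 1536
total_pixels = width * height                  # 3,145,728
megapixels = total_pixels / 1_000_000          # about 3.1 MP
print(width, "x", height, "=", total_pixels, "pixels, about", round(megapixels, 1), "megapixels")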


Unfortunately, the count of pixels is not a real measure of the resolution of digital camera
images, because:
Color image sensors are typically set up to alternate color filter types over the light sensitive individual
pixel sensors.
Digital images ultimately require a red, green, and blue value for each pixel to be displayed or printed,
but one individual pixel in the image sensor will only supply one of those three pieces of information.
The image has to be interpolated or demosaiced to produce all three colors for each output pixel.
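A minimal bilinear demosaicing sketch makes the interpolation step concrete. It assumes an RGGB Bayer layout and uses SciPy; real cameras use more sophisticated (and often proprietary) algorithms, so this is only one common textbook approach:

# Bilinear demosaicing of an RGGB Bayer mosaic (illustrative sketch only).
import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear(raw):
    """raw: 2-D array of sensor values in an RGGB pattern. Returns H x W x 3 RGB."""
    h, w = raw.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1     # red sample locations
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1     # blue sample locations
    g_mask = 1 - r_mask - b_mask                          # green sample locations

    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0   # green interpolation kernel
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0   # red/blue interpolation kernel

    r = convolve(raw * r_mask, k_rb, mode='mirror')
    g = convolve(raw * g_mask, k_g,  mode='mirror')
    b = convolve(raw * b_mask, k_rb, mode='mirror')
    return np.dstack([r, g, b])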
Spatial resolution
The measure of how closely lines can be resolved in an image is called spatial resolution, and it depends on properties of the system
creating the image, not just the pixel resolution in pixels per inch (ppi).
For practical purposes the clarity of the image is decided by its spatial resolution, not the number of pixels in an image.


In effect, spatial resolution refers to the number of independent pixel values per unit length.


•The spatial resolution of computer monitors is generally 72 to 100 lines per inch, corresponding to pixel resolutions of 72 to 100 ppi.
•With scanners, optical resolution is used to distinguish spatial resolution from the number of pixels per inch.
•In geographic information systems (GISs), spatial resolution is measured by the ground sample distance (GSD) of an image, the pixel
spacing on the Earth's surface.
•In astronomy one often measures spatial resolution in data points per arc second subtended at the point of observation, since the physical
distance between objects in the image depends on their distance away & this varies widely with the object of interest.
•In electron microscopy, line or fringe resolution refers to the minimum separation detectable between adjacent parallel lines (e.g.
between planes of atoms), while point resolution instead refers to the minimum separation between adjacent points that can be both
detected & interpreted, e.g. as adjacent columns of atoms.
•In Stereoscopic 3D images, spatial resolution could be defined as the spatial information recorded or captured by two viewpoints of a
stereo camera (left & right camera).
It could be argued that such "spatial resolution" adds information to an image that does not depend on pixel count or dots per inch
alone when classifying and interpreting the overall resolution of a given photographic image or video frame.
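As an illustration of the GIS usage above, the ground sample distance of a simple nadir-looking camera can be estimated from the sensor pixel pitch, the focal length and the flying height. The pinhole-camera formula below and the example numbers are assumptions for illustration only:

# Ground sample distance (GSD) for a nadir-looking frame camera (pinhole model).
# GSD = pixel_pitch * altitude / focal_length
def ground_sample_distance(pixel_pitch_m, focal_length_m, altitude_m):
    return pixel_pitch_m * altitude_m / focal_length_m

# Example (illustrative numbers only): 10 micron pixels, 0.5 m focal length, 800 km orbit
gsd = ground_sample_distance(10e-6, 0.5, 800e3)
print("GSD is about", round(gsd, 1), "m per pixel")   # about 16.0 m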
Spatial resolution and Pixel count




        Just note the difference!
Spatial resolution      Pixel count
Spectral resolution

Color images distinguish light of different spectra.
Multi-spectral images resolve even finer differences of spectrum or wavelength than is needed to reproduce
color; that is, they can have higher spectral resolution (i.e. narrower wavelength bands).

Temporal resolution

Movie cameras and high-speed cameras can resolve events at different points in time.
The time resolution used for movies is usually 15 to 30 frames per second (frames/s),
while high-speed cameras may resolve 100 to 1000 frames/s, or even more.

Radiometric resolution

Radiometric resolution determines how finely a system can represent or distinguish differences of intensity,
and is usually expressed as a number of levels or a number of bits, for example the 8 bits or 256 levels that are
typical of computer image files.
The higher the radiometric resolution, the better subtle differences of intensity or reflectivity can be
represented, at least in theory.
In practice, the effective radiometric resolution is typically limited by the noise level, rather than by the
number of bits of representation.
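A small sketch of what radiometric resolution means in practice: quantising the same data to different bit depths gives 2^n distinguishable levels (the ramp "image" and function names below are purely illustrative):

# Quantise an image to n bits: 2**n radiometric levels.
import numpy as np

def quantize(img, bits):
    levels = 2 ** bits                        # number of distinguishable intensity levels
    return np.floor(img * (levels - 1) + 0.5) / (levels - 1)

img = np.linspace(0.0, 1.0, 1000)             # smooth intensity ramp as a stand-in image
for bits in (1, 4, 8):
    print(bits, "bits ->", len(np.unique(quantize(img, bits))), "levels")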
Resolution in various media
This is a list of resolutions for various media.

Analog and early digital
352×240 : Video CD
300×480 : Umatic, Betamax, VHS, Video8
350×480 : Super Betamax, Betacam
420×480 : LaserDisc, Super VHS, Hi8
640×480 : Analog broadcast (NTSC)
670×480 : Enhanced Definition Betamax
768×576 : Analog broadcast (PAL, SECAM)

Digital
720×480 : D-VHS, DVD, miniDV, Digital8, Digital Betacam
720×480 : Widescreen DVD (anamorphic)
1280×720 : D-VHS, HD DVD, Blu-ray, HDV (miniDV)
1440×1080 : HDV (miniDV)
1920×1080 : HDV (miniDV), AVCHD, HD DVD, Blu-ray, HDCAM SR
2048×1080 : 2K Digital Cinema
4096×2160 : 4K Digital Cinema
7680×4320 : UHDTV

Film
35 mm film is scanned for release on DVD at 1080 or 2000 lines as of 2005.
However, some photography sources give 5380×3620 as the resolution of 35 mm film.
That corresponds to about 19.5 Mpix, although the underlying spatial resolution of the film is unchanged by how finely it is scanned.
IMAX, including IMAX HD and OMNIMAX: approximately 10,000×7000 (7000 lines) resolution.
That is about 70 Mpix, which may be considered the highest resolution of the media listed here.
Spatial Resolution and Pixel Size

The terms image resolution and pixel size are often used interchangeably.
In reality, they are not equivalent: an image sampled at a small pixel size does not necessarily have a high resolution.
The following three images illustrate this point. The first image is a SPOT image of 10 m pixel size.
It was derived by merging a SPOT panchromatic image of 10 m resolution with a SPOT multispectral image of 20 m
resolution.
The effective resolution is thus determined by the resolution of the panchromatic image, which is 10 m.
This image is further processed to degrade the resolution while maintaining the same pixel size.
The next two images are blurred versions of the image with coarser resolution, but still digitized at the same
pixel size of 10 m.

Even though they have the same pixel size as the first image, they do not have the same resolution.
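A hedged sketch of the degradation just described: blur the image (coarser effective resolution) while keeping the same 10 m pixel grid, so the pixel size is unchanged but fine detail is lost. The array img10 and the blur strengths are stand-ins, not the actual SPOT processing:

# Same pixel size, lower effective resolution: blur, but keep the grid unchanged.
import numpy as np
from scipy.ndimage import gaussian_filter

img10 = np.random.rand(512, 512)              # stand-in for the 10 m pixel-size image
blur_a = gaussian_filter(img10, sigma=1.0)    # roughly halves the effective resolution
blur_b = gaussian_filter(img10, sigma=2.0)    # roughly quarters the effective resolution
# All three arrays are 512 x 512 with 10 m pixels, but only the first resolves 10 m detail.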
RESOLUTION AND SHARPNESS
To determine resolution, a raster is normally used, employing increasingly fine bars and gaps. A common example in
real images would be a picket fence displayed in perspective.
In the image of the fence, shown in Fig. 1, it is evident that the gaps between the boards become increasingly difficult
to discriminate as the distance becomes greater.
 This effect is the basic problem of every optical image.

In the foreground of the image, where the boards and gaps have not yet been squeezed together by the perspective, a
large difference in brightness is recognized.
The more the boards and gaps are squeezed together in the distance, the less difference is seen in the brightness.
To better understand this effect, the brightness values are shown along the yellow arrow in an x / y diagram (Fig. 2).
The brightness difference seen in the y-axis is called contrast.
The curve itself functions like a harmonic oscillation; because the brightness does not change over time but spatially
from left to right, the x-axis is called spatial frequency.
It can be clearly seen in Fig. 1 that the finer the reproduced structure, the more the contrast
will be “slurred” at that point in the image.

The limit of the resolution has been reached when one can no longer clearly differentiate
between the structures.

This means the resolution limit (red circle indicated in Fig. 2) lies at the spatial frequency
where there is just enough contrast left to clearly differentiate between board and gap.
Resolution = Sharpness?
Are resolution and sharpness the same? By looking at the images shown below, one can quickly determine which image
is sharper.
Although the image on the left comprises twice as many pixels, the image on the right, whose contrast at coarse details
is increased with a filter, looks at first glance to be distinctly sharper.

The resolution limit describes how much information makes up each image, but not how a person evaluates this
information.
The human eye, in fact, is able to resolve extremely fine details.
This ability is also valid for objects at a greater distance.
The decisive physiological point, however, is that fine details do not contribute to the subjective perception of
sharpness.
 Therefore, it’s important to clearly separate the two terms, resolution and sharpness.
MTF
Modulation transfer function describes the relationship between resolution and sharpness, and is the basis for a
scientific confirmation of the phenomenon described earlier.
The modulation component in MTF means approximately the same as contrast.
If we evaluate the contrast (modulation) not only where the resolution reaches its limit, but over as many spatial
frequencies as possible and connect these points with a curve, we arrive at the so-called MTF.

As shown in the figure, the x-axis shows the already-established spatial frequency expressed in lp/mm, while the y-axis
shows the modulation instead of the brightness.
A modulation of 1 (or 100%) is the ratio of the brightness of a completely white image to the brightness of a
completely black image.
The higher the spatial frequency (in other words, the finer the structures in the image), the lower the transferred
modulation (lp = line pairs).
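A sketch of how one point on an MTF curve could be measured numerically: pass a sine pattern of a given spatial frequency through the system (approximated here by a Gaussian blur standing in for real optics) and compute the output modulation (Imax - Imin)/(Imax + Imin):

# One MTF sample per spatial frequency: modulation of a blurred sine pattern.
import numpy as np
from scipy.ndimage import gaussian_filter1d

def modulation(signal):
    return (signal.max() - signal.min()) / (signal.max() + signal.min())

x = np.linspace(0.0, 10.0, 10_000)                            # position in mm
for lp_per_mm in (1, 5, 10, 20):                              # spatial frequencies
    pattern = 0.5 + 0.5 * np.sin(2 * np.pi * lp_per_mm * x)   # input modulation = 1
    blurred = gaussian_filter1d(pattern, sigma=30)            # stand-in for the optics
    print(lp_per_mm, "lp/mm -> MTF about", round(modulation(blurred), 2))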




Conclusions:
•Sharpness does not depend only on resolution.
•The modulation at lower spatial frequencies is essential.
•Contrast in coarse details is significantly more important for the impression of sharpness than contrast at the
resolution limit.
Resolution of the human eye
The fovea of the human eye (the part of the retina that is responsible for sharp central vision) includes
about 140 000 sensor-cells per square millimeter.
This means that if two objects are projected with a separation distance of more than 4 µm on the fovea,
a human with a normal visual acuity (20/20) can resolve them.
On the object side, this corresponds to 0.2 mm in a distance of 1 m (or 1 minute of arc).
In practice of course, this depends on whether the viewer is concentrating only on the center of the
viewing field, whether the object is moving very slowly or not at all, and whether the object has good
contrast to the background. Allowing for some amount of tolerance, this would be around 0.3 mm at 1
m distance (= 1.03 minutes of arc). In a certain range, one can assume a linear relation between
distance and the detail size.
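The 0.3 mm at 1 m figure follows from simple trigonometry; a minimal check, assuming roughly one arc minute of visual acuity:

# Smallest resolvable detail for about 1 arc minute of visual acuity.
import math

def resolvable_detail(distance_m, arc_minutes=1.0):
    return distance_m * math.tan(math.radians(arc_minutes / 60.0))

for d in (1.0, 10.0):
    print("at", d, "m:", round(resolvable_detail(d) * 1000, 2), "mm")
# about 0.29 mm at 1 m (matching the ~0.3 mm tolerance figure above) and 2.9 mm at 10 m,
# i.e. linear in distance for small angles.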
This hypothesis can be easily proved !!!
Pin the test pattern displayed in Figure below on a well-lit wall and walk away 10 m.
One should be able to clearly differentiate between the lines and gaps in Figure.
Of course, this requires an ideal visual acuity of 20/20.
Nevertheless, if you can’t resolve the pattern in Figure,
you might consider paying a visit to an ophthalmologist!
How do we interpret optical images?

Let us see the significance of spatial resolution and various other related terms:

Four main types of information contained in an optical image are often utilized for
image interpretation:

•Radiometric Information (i.e. brightness, intensity, tone),

•Spectral Information (i.e. color, hue),

•Textural Information,

•Geometric and Contextual Information.


          They are illustrated in the following examples.
There are different types of images :

   •Panchromatic Images

   •Multispectral Images

   •Color Composite Images

   •True Color Composite images

   •False Color Composite images

   •Natural Color Composite
Panchromatic image
A panchromatic image consists of only one band.
It is usually displayed as a grey scale image.
A panchromatic image may be interpreted in a similar way to a black-and-white aerial photograph of the area.
The Radiometric Information is the main information type utilized in the interpretation.




                A panchromatic image extracted from a SPOT panchromatic scene at a ground resolution of 10 m.


           (Reference: http://visibleearth.nasa.gov/view_rec.php?id=1427 and
           http://earthobservatory.nasa.gov/Contact/index_ve.php?do=s)
Multispectral Images
A multispectral image consists of several bands of data.
For visual display, each band of the image may be displayed one at a time as a grey scale image, or three bands at a time
as a color composite image.
Interpretation of a multispectral color composite image will require the knowledge of the spectral reflectance signature of the targets in the
scene.
In this case, the spectral information content of the image is utilized in the interpretation.

The following 3 images show the 3 bands of a multispectral image extracted from a SPOT multispectral scene at a ground resolution of 20 m.




    (Reference: http://visibleearth.nasa.gov/view_rec.php?id=1427 and http://earthobservatory.nasa.gov/Contact/index_ve.php?do=s)
Color Composite Images

In displaying a color composite image, three primary colors (red, green and blue) are used.
When these three colors are combined in various proportions, they produce different colors in the visible
spectrum.
Associating each spectral band (not necessarily a visible band) to a separate primary color results in a
color composite image.
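A minimal sketch (assuming NumPy and three co-registered band arrays of equal size; the band names are placeholders) of how a colour composite is formed by assigning one band to each primary colour:

# Colour composite: assign one spectral band to each of the R, G, B display channels.
import numpy as np

def stretch(band):
    """Simple linear stretch to the 0-1 range for display."""
    b = band.astype(float)
    return (b - b.min()) / (b.max() - b.min() + 1e-12)

def colour_composite(band_r, band_g, band_b):
    """The bands need not be visible-light bands; the assignment is a display choice."""
    return np.dstack([stretch(band_r), stretch(band_g), stretch(band_b)])

# e.g. a Landsat TM "true colour" display (see below) would use
# composite = colour_composite(band3, band2, band1)   # hypothetical band arrays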
True Color Composite
If a multispectral image consists of the three visual primary color bands (red, green, blue), the three bands may be
combined to produce a "true color" image.
The bands 3 (red band), 2 (green band) and 1 (blue band) of a LANDSAT TM image or an IKONOS multispectral
image can be assigned respectively to the R, G, and B colors for display.
In this way, the colors of the resulting color composite image resemble closely what would be observed by the human
eyes.




                            A 1-m resolution true-color IKONOS image

 (Reference: http://visibleearth.nasa.gov/view_rec.php?id=1427 and http://earthobservatory.nasa.gov/Contact/index_ve.php?do=s)
False Color Composite
 The display color assignment for any band of a multispectral image can be done in an entirely arbitrary manner.
 In this case, the color of a target in the displayed image does not have any resemblance to its actual colour.
 The resulting product is known as a false colour composite image.
 There are many possible schemes of producing false colour composite images.
 Some schemes are suitable for detecting certain objects in the image.




                             False colour composite multispectral SPOT image



(Reference: http://visibleearth.nasa.gov/view_rec.php?id=1427 and http://earthobservatory.nasa.gov/Contact/index_ve.php?do=s)
Natural Colour Composite
For optical images lacking one or more of the three visual primary colour bands (i.e. red, green and blue), the spectral
bands (some of which may not be in the visible region) may be combined in such a way that the appearance of the
displayed image resembles a visible colour photograph, i.e. vegetation in green, water in blue, soil in brown or grey, etc.

Some people refer to this composite as a "true colour" composite. However, this term is misleading since in many
instances the colors are only simulated to look similar to the "true" colors of the targets. The term "natural colour" is
preferred.




                                   Natural colour composite multispectral SPOT image



    (Reference: http://visibleearth.nasa.gov/view_rec.php?id=1427 and http://earthobservatory.nasa.gov/Contact/index_ve.php?do=s)
Vegetation Indices
Different bands of a multispectral image may be combined to accentuate the vegetated areas.
One such combination is the ratio of the near-infrared band to the red band. This ratio is known as the
Ratio Vegetation Index (RVI):

RVI = NIR/Red

Normalized Difference Vegetation Index (NDVI)
Since vegetation has high NIR reflectance but low red reflectance, vegetated areas will have higher RVI
values compared to non-vegetated areas. Another commonly used vegetation index is the Normalized
Difference Vegetation Index (NDVI) computed by


NDVI = (NIR - Red)/(NIR + Red)
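A short numerical sketch of both indices; the arrays nir and red are assumed to hold co-registered reflectance bands, and the sample values are illustrative only:

# Ratio Vegetation Index and Normalized Difference Vegetation Index.
import numpy as np

def rvi(nir, red, eps=1e-12):
    return nir / (red + eps)

def ndvi(nir, red, eps=1e-12):
    return (nir - red) / (nir + red + eps)

nir = np.array([0.45, 0.30, 0.05])   # illustrative reflectances: vegetation, soil, water
red = np.array([0.08, 0.20, 0.04])
print(ndvi(nir, red))                # vegetation gives the highest NDVI (about 0.70 here)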
Textural Information
Texture is an important aid in visual image interpretation, especially for high spatial resolution imagery.
It is also possible to characterize the textural features numerically, and algorithms for computer-aided
automatic discrimination of different textures in an image are available.




                        IKONOS 1-m resolution pan-sharpened color image of an oil palm plantation.
         Even though the general colour is green throughout, three distinct land cover types can be identified from the
                                                       image texture.


   (Reference: http://visibleearth.nasa.gov/view_rec.php?id=1427 and http://earthobservatory.nasa.gov/Contact/index_ve.php?do=s)
Remote Sensing Satellites
Optical remote sensing makes use of visible, near infrared & short-wave infrared sensors to form images of the earth's surface
by detecting the solar radiation reflected from targets on the ground.
Different materials reflect and absorb differently at different wavelengths.
Thus, the targets can be differentiated by their spectral reflectance signatures in the remotely sensed images.
Optical remote sensing systems are classified into the following types, depending on the number of spectral bands used in the imaging
process.


Several remote sensing satellites are currently available, providing imagery suitable for various types of applications.
Each of these satellite-sensor platforms is characterized by:
•Wavelength bands employed in image acquisition,
•Spatial resolution of the sensor,
•The coverage area and the temporal coverage, i.e. how frequently a given location on the earth's surface can be imaged by the
imaging system.
In terms of the spatial resolution, the satellite imaging systems can be classified into:
•Low resolution systems (approx. 1 km or more)
•Medium resolution systems (approx. 100 m to 1 km)
•High resolution systems (approx. 5 m to 100 m)
•Very high resolution systems (approx. 5 m or less)
In terms of the spectral regions used in data acquisition, the satellite imaging systems can be classified into:
•Optical imaging systems (include visible, near infrared, and shortwave infrared systems)
•Thermal imaging systems
•Synthetic aperture radar (SAR) imaging systems
Optical/thermal imaging systems can be classified according to the number of spectral bands used:
•Monospectral or panchromatic (single wavelength band, "black-and-white", grey-scale image) systems
•Multispectral (several spectral bands) systems
•Superspectral (tens of spectral bands) systems
•Hyperspectral (hundreds of spectral bands) systems



Synthetic aperture radar imaging systems can be classified according to the combination of frequency bands &
polarization modes used in data acquisition, e.g.:
•Single frequency (L-band, or C-band, or X-band)
•Multiple frequency (Combination of two or more frequency bands)
•Single polarization (VV, or HH, or HV)
•Multiple polarization (Combination of two or more polarization modes)
References :

http://www.ssec.wisc.edu/sose/pirs/pirs_m2_res.html

http://www.crisp.nus.edu.sg/~research/tutorial/opt_int.htm


http://www.arri.de/fileadmin/media/arri.com/downloads/Camera/Tutorials/SystemsTechnologyBrochure.pdf


http://en.wikipedia.org/wiki/Image_resolution
Thank You

Weitere ähnliche Inhalte

Was ist angesagt?

DIGITAL IMAGE PROCESSING - LECTURE NOTES
DIGITAL IMAGE PROCESSING - LECTURE NOTESDIGITAL IMAGE PROCESSING - LECTURE NOTES
DIGITAL IMAGE PROCESSING - LECTURE NOTESEzhilya venkat
 
Digital image processing 1
Digital  image processing 1Digital  image processing 1
Digital image processing 1Dhaval Jalalpara
 
Digital image processing
Digital image processingDigital image processing
Digital image processinglakhveer singh
 
Digital image processing
Digital image processingDigital image processing
Digital image processingVandana Verma
 
Digital image processing
Digital image processingDigital image processing
Digital image processingMuheeb Awawdeh
 
Image enhancement ppt nal2
Image enhancement ppt nal2Image enhancement ppt nal2
Image enhancement ppt nal2Surabhi Ks
 
Remote Sensing: Resolution Merge
Remote Sensing: Resolution MergeRemote Sensing: Resolution Merge
Remote Sensing: Resolution MergeKamlesh Kumar
 
Introduction to digital image processing
Introduction to digital image processingIntroduction to digital image processing
Introduction to digital image processingHossain Md Shakhawat
 
Image enhancement technique digital image analysis, in remote sensing ,P K MANI
Image enhancement technique  digital image analysis, in remote sensing ,P K MANIImage enhancement technique  digital image analysis, in remote sensing ,P K MANI
Image enhancement technique digital image analysis, in remote sensing ,P K MANIP.K. Mani
 
Digital image processing
Digital image processingDigital image processing
Digital image processingChetan Hulsure
 
Lecture 08 tilted photograph
Lecture 08  tilted photographLecture 08  tilted photograph
Lecture 08 tilted photographSarhat Adam
 
Aerial photography- Concept and Terminologies
Aerial photography- Concept and Terminologies Aerial photography- Concept and Terminologies
Aerial photography- Concept and Terminologies UTTIYACHATTOPADHYAY2
 
Geometry of Aerial Photographs.pdf
Geometry of Aerial Photographs.pdfGeometry of Aerial Photographs.pdf
Geometry of Aerial Photographs.pdfkunedzimwefrancisca
 
Topic stereoscopy, Parallax, Relief displacement
Topic  stereoscopy, Parallax, Relief displacementTopic  stereoscopy, Parallax, Relief displacement
Topic stereoscopy, Parallax, Relief displacementsrinivas2036
 

Was ist angesagt? (20)

DIGITAL IMAGE PROCESSING - LECTURE NOTES
DIGITAL IMAGE PROCESSING - LECTURE NOTESDIGITAL IMAGE PROCESSING - LECTURE NOTES
DIGITAL IMAGE PROCESSING - LECTURE NOTES
 
Types of stereoscope
Types of stereoscopeTypes of stereoscope
Types of stereoscope
 
Digital image processing 1
Digital  image processing 1Digital  image processing 1
Digital image processing 1
 
Spatial resolution
Spatial resolutionSpatial resolution
Spatial resolution
 
Stereoscopy
StereoscopyStereoscopy
Stereoscopy
 
Hyperspectral Imaging
Hyperspectral ImagingHyperspectral Imaging
Hyperspectral Imaging
 
Digital image processing
Digital image processingDigital image processing
Digital image processing
 
Digital image processing
Digital image processingDigital image processing
Digital image processing
 
Digital image processing
Digital image processingDigital image processing
Digital image processing
 
Image enhancement ppt nal2
Image enhancement ppt nal2Image enhancement ppt nal2
Image enhancement ppt nal2
 
Digital image processing
Digital image processingDigital image processing
Digital image processing
 
Remote Sensing: Resolution Merge
Remote Sensing: Resolution MergeRemote Sensing: Resolution Merge
Remote Sensing: Resolution Merge
 
Introduction to digital image processing
Introduction to digital image processingIntroduction to digital image processing
Introduction to digital image processing
 
Image enhancement technique digital image analysis, in remote sensing ,P K MANI
Image enhancement technique  digital image analysis, in remote sensing ,P K MANIImage enhancement technique  digital image analysis, in remote sensing ,P K MANI
Image enhancement technique digital image analysis, in remote sensing ,P K MANI
 
Digital image processing
Digital image processingDigital image processing
Digital image processing
 
Hyperspectral Imaging
Hyperspectral ImagingHyperspectral Imaging
Hyperspectral Imaging
 
Lecture 08 tilted photograph
Lecture 08  tilted photographLecture 08  tilted photograph
Lecture 08 tilted photograph
 
Aerial photography- Concept and Terminologies
Aerial photography- Concept and Terminologies Aerial photography- Concept and Terminologies
Aerial photography- Concept and Terminologies
 
Geometry of Aerial Photographs.pdf
Geometry of Aerial Photographs.pdfGeometry of Aerial Photographs.pdf
Geometry of Aerial Photographs.pdf
 
Topic stereoscopy, Parallax, Relief displacement
Topic  stereoscopy, Parallax, Relief displacementTopic  stereoscopy, Parallax, Relief displacement
Topic stereoscopy, Parallax, Relief displacement
 

Andere mochten auch (20)

Nguyen khoi viet cardiac mri for the evaluation of ischemic heart disease jfi...
Nguyen khoi viet cardiac mri for the evaluation of ischemic heart disease jfi...Nguyen khoi viet cardiac mri for the evaluation of ischemic heart disease jfi...
Nguyen khoi viet cardiac mri for the evaluation of ischemic heart disease jfi...
 
Mri artifacts gamal mahdaly
Mri artifacts gamal mahdalyMri artifacts gamal mahdaly
Mri artifacts gamal mahdaly
 
Gre&SE
Gre&SEGre&SE
Gre&SE
 
Distortion Artifacts in MRI and their correction
Distortion Artifacts in MRI and their correctionDistortion Artifacts in MRI and their correction
Distortion Artifacts in MRI and their correction
 
Contrast based artifacts
Contrast based artifactsContrast based artifacts
Contrast based artifacts
 
Radiographic factor
Radiographic factorRadiographic factor
Radiographic factor
 
Mri artifacts
Mri artifactsMri artifacts
Mri artifacts
 
Advances in Brachytherapy Treatment Planning and Delivery
Advances in Brachytherapy Treatment Planning and DeliveryAdvances in Brachytherapy Treatment Planning and Delivery
Advances in Brachytherapy Treatment Planning and Delivery
 
Cardiac mri
Cardiac mriCardiac mri
Cardiac mri
 
Cardiac MRI principle
Cardiac MRI principleCardiac MRI principle
Cardiac MRI principle
 
Special Procedures: TBI, TSET and IORT
Special Procedures: TBI, TSET and IORTSpecial Procedures: TBI, TSET and IORT
Special Procedures: TBI, TSET and IORT
 
MRI artifacts
MRI artifactsMRI artifacts
MRI artifacts
 
Mr spectroscopy
Mr spectroscopyMr spectroscopy
Mr spectroscopy
 
Mr fluoroscopy
Mr fluoroscopyMr fluoroscopy
Mr fluoroscopy
 
Pns New
Pns NewPns New
Pns New
 
Fluroscopy
Fluroscopy Fluroscopy
Fluroscopy
 
Cardiac MRI
Cardiac MRICardiac MRI
Cardiac MRI
 
076 cardiac magnetic resonance imaging
076 cardiac magnetic resonance imaging076 cardiac magnetic resonance imaging
076 cardiac magnetic resonance imaging
 
Cardiac MRI
Cardiac MRICardiac MRI
Cardiac MRI
 
Dark room equipments and entrance
Dark room equipments and entranceDark room equipments and entrance
Dark room equipments and entrance
 

Ähnlich wie What is spatial Resolution

RDT-112-PRELIM-LESSON-2-NOTES.docx
RDT-112-PRELIM-LESSON-2-NOTES.docxRDT-112-PRELIM-LESSON-2-NOTES.docx
RDT-112-PRELIM-LESSON-2-NOTES.docxJianSoliman2
 
HA1 - Motion Graphics Now
HA1 - Motion Graphics NowHA1 - Motion Graphics Now
HA1 - Motion Graphics Nowdanhops888
 
Overview of Graphics System
Overview of Graphics SystemOverview of Graphics System
Overview of Graphics SystemPrathimaBaliga
 
Screen ratios, frame rate, video forats, compression
Screen ratios, frame rate, video forats, compressionScreen ratios, frame rate, video forats, compression
Screen ratios, frame rate, video forats, compressionsnailguinproductions
 
Digital imaging in dentistry / orthodontics courses
Digital imaging in dentistry / orthodontics courses Digital imaging in dentistry / orthodontics courses
Digital imaging in dentistry / orthodontics courses Indian dental academy
 
Recent advances digital imaging /certified fixed orthodontic courses by India...
Recent advances digital imaging /certified fixed orthodontic courses by India...Recent advances digital imaging /certified fixed orthodontic courses by India...
Recent advances digital imaging /certified fixed orthodontic courses by India...Indian dental academy
 
Recent advances digital imaging /certified fixed orthodontic courses by India...
Recent advances digital imaging /certified fixed orthodontic courses by India...Recent advances digital imaging /certified fixed orthodontic courses by India...
Recent advances digital imaging /certified fixed orthodontic courses by India...Indian dental academy
 
Chapter 10: Display Systems
Chapter 10: Display SystemsChapter 10: Display Systems
Chapter 10: Display Systemsaskme
 
Digital imaging /certified fixed orthodontic courses by Indian dental academy
Digital imaging /certified fixed orthodontic courses by Indian dental academy Digital imaging /certified fixed orthodontic courses by Indian dental academy
Digital imaging /certified fixed orthodontic courses by Indian dental academy Indian dental academy
 
Christie digital cinema projection choosing the right technology
Christie digital cinema projection choosing the right technologyChristie digital cinema projection choosing the right technology
Christie digital cinema projection choosing the right technologyCCS Presentation Systems Inc.
 
Digital image processing2.pptx
Digital image processing2.pptxDigital image processing2.pptx
Digital image processing2.pptxDivyanshAgarwal78
 

Ähnlich wie What is spatial Resolution (20)

RDT-112-PRELIM-LESSON-2-NOTES.docx
RDT-112-PRELIM-LESSON-2-NOTES.docxRDT-112-PRELIM-LESSON-2-NOTES.docx
RDT-112-PRELIM-LESSON-2-NOTES.docx
 
Digital imaging
Digital imagingDigital imaging
Digital imaging
 
Pixel
PixelPixel
Pixel
 
HA1 - Motion Graphics Now
HA1 - Motion Graphics NowHA1 - Motion Graphics Now
HA1 - Motion Graphics Now
 
Overview of Graphics System
Overview of Graphics SystemOverview of Graphics System
Overview of Graphics System
 
Screen ratios, frame rate, video forats, compression
Screen ratios, frame rate, video forats, compressionScreen ratios, frame rate, video forats, compression
Screen ratios, frame rate, video forats, compression
 
L3 cmp technicalfile_180911
L3 cmp technicalfile_180911L3 cmp technicalfile_180911
L3 cmp technicalfile_180911
 
Recent advances digital imaging
Recent advances digital imaging Recent advances digital imaging
Recent advances digital imaging
 
Digital imaging (2)
Digital imaging (2)Digital imaging (2)
Digital imaging (2)
 
Digital imaging in dentistry / orthodontics courses
Digital imaging in dentistry / orthodontics courses Digital imaging in dentistry / orthodontics courses
Digital imaging in dentistry / orthodontics courses
 
Digital imaging
Digital imagingDigital imaging
Digital imaging
 
Glossary
Glossary Glossary
Glossary
 
Recent advances digital imaging /certified fixed orthodontic courses by India...
Recent advances digital imaging /certified fixed orthodontic courses by India...Recent advances digital imaging /certified fixed orthodontic courses by India...
Recent advances digital imaging /certified fixed orthodontic courses by India...
 
Recent advances digital imaging /certified fixed orthodontic courses by India...
Recent advances digital imaging /certified fixed orthodontic courses by India...Recent advances digital imaging /certified fixed orthodontic courses by India...
Recent advances digital imaging /certified fixed orthodontic courses by India...
 
Chapter 10: Display Systems
Chapter 10: Display SystemsChapter 10: Display Systems
Chapter 10: Display Systems
 
Glossary
GlossaryGlossary
Glossary
 
Digital imaging /certified fixed orthodontic courses by Indian dental academy
Digital imaging /certified fixed orthodontic courses by Indian dental academy Digital imaging /certified fixed orthodontic courses by Indian dental academy
Digital imaging /certified fixed orthodontic courses by Indian dental academy
 
Pixels
PixelsPixels
Pixels
 
Christie digital cinema projection choosing the right technology
Christie digital cinema projection choosing the right technologyChristie digital cinema projection choosing the right technology
Christie digital cinema projection choosing the right technology
 
Digital image processing2.pptx
Digital image processing2.pptxDigital image processing2.pptx
Digital image processing2.pptx
 

Mehr von Sayed Abulhasan Quadri

Decentralized Data Fusion Algorithm using Factor Analysis Model
Decentralized Data Fusion Algorithm using Factor Analysis ModelDecentralized Data Fusion Algorithm using Factor Analysis Model
Decentralized Data Fusion Algorithm using Factor Analysis ModelSayed Abulhasan Quadri
 
Image quality improvement of Low-resolution camera using Data fusion technique
Image quality improvement of Low-resolution camera using Data fusion techniqueImage quality improvement of Low-resolution camera using Data fusion technique
Image quality improvement of Low-resolution camera using Data fusion techniqueSayed Abulhasan Quadri
 
Feature Extraction and Principal Component Analysis
Feature Extraction and Principal Component AnalysisFeature Extraction and Principal Component Analysis
Feature Extraction and Principal Component AnalysisSayed Abulhasan Quadri
 
Multi sensor data fusion system for enhanced analysis of deterioration in con...
Multi sensor data fusion system for enhanced analysis of deterioration in con...Multi sensor data fusion system for enhanced analysis of deterioration in con...
Multi sensor data fusion system for enhanced analysis of deterioration in con...Sayed Abulhasan Quadri
 
Multisensor data fusion for defense application
Multisensor data fusion for defense applicationMultisensor data fusion for defense application
Multisensor data fusion for defense applicationSayed Abulhasan Quadri
 
Sensor based structural health monitoring of concrete structures
Sensor based structural health monitoring of concrete structuresSensor based structural health monitoring of concrete structures
Sensor based structural health monitoring of concrete structuresSayed Abulhasan Quadri
 
Multi agent system to monitor structures
Multi agent system to monitor structuresMulti agent system to monitor structures
Multi agent system to monitor structuresSayed Abulhasan Quadri
 
Multisensor data fusion in object tracking applications
Multisensor data fusion in object tracking applicationsMultisensor data fusion in object tracking applications
Multisensor data fusion in object tracking applicationsSayed Abulhasan Quadri
 

Mehr von Sayed Abulhasan Quadri (9)

What is over-the-air programming
What is over-the-air programmingWhat is over-the-air programming
What is over-the-air programming
 
Decentralized Data Fusion Algorithm using Factor Analysis Model
Decentralized Data Fusion Algorithm using Factor Analysis ModelDecentralized Data Fusion Algorithm using Factor Analysis Model
Decentralized Data Fusion Algorithm using Factor Analysis Model
 
Image quality improvement of Low-resolution camera using Data fusion technique
Image quality improvement of Low-resolution camera using Data fusion techniqueImage quality improvement of Low-resolution camera using Data fusion technique
Image quality improvement of Low-resolution camera using Data fusion technique
 
Feature Extraction and Principal Component Analysis
Feature Extraction and Principal Component AnalysisFeature Extraction and Principal Component Analysis
Feature Extraction and Principal Component Analysis
 
Multi sensor data fusion system for enhanced analysis of deterioration in con...
Multi sensor data fusion system for enhanced analysis of deterioration in con...Multi sensor data fusion system for enhanced analysis of deterioration in con...
Multi sensor data fusion system for enhanced analysis of deterioration in con...
 
Multisensor data fusion for defense application
Multisensor data fusion for defense applicationMultisensor data fusion for defense application
Multisensor data fusion for defense application
 
Sensor based structural health monitoring of concrete structures
Sensor based structural health monitoring of concrete structuresSensor based structural health monitoring of concrete structures
Sensor based structural health monitoring of concrete structures
 
Multi agent system to monitor structures
Multi agent system to monitor structuresMulti agent system to monitor structures
Multi agent system to monitor structures
 
Multisensor data fusion in object tracking applications
Multisensor data fusion in object tracking applicationsMultisensor data fusion in object tracking applications
Multisensor data fusion in object tracking applications
 

Kürzlich hochgeladen

Unit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptxUnit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptxVishalSingh1417
 
Grant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingGrant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingTechSoup
 
Accessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactAccessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactdawncurless
 
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...christianmathematics
 
9548086042 for call girls in Indira Nagar with room service
9548086042  for call girls in Indira Nagar  with room service9548086042  for call girls in Indira Nagar  with room service
9548086042 for call girls in Indira Nagar with room servicediscovermytutordmt
 
microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introductionMaksud Ahmed
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdfQucHHunhnh
 
Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...
Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...
Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...fonyou31
 
Measures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and ModeMeasures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and ModeThiyagu K
 
Z Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot GraphZ Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot GraphThiyagu K
 
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Sapana Sha
 
Measures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDMeasures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDThiyagu K
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfciinovamais
 
IGNOU MSCCFT and PGDCFT Exam Question Pattern: MCFT003 Counselling and Family...
IGNOU MSCCFT and PGDCFT Exam Question Pattern: MCFT003 Counselling and Family...IGNOU MSCCFT and PGDCFT Exam Question Pattern: MCFT003 Counselling and Family...
IGNOU MSCCFT and PGDCFT Exam Question Pattern: MCFT003 Counselling and Family...PsychoTech Services
 
Disha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdfDisha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdfchloefrazer622
 
Sanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfSanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfsanyamsingh5019
 
fourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingfourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingTeacherCyreneCayanan
 
Introduction to Nonprofit Accounting: The Basics
Introduction to Nonprofit Accounting: The BasicsIntroduction to Nonprofit Accounting: The Basics
Introduction to Nonprofit Accounting: The BasicsTechSoup
 

Kürzlich hochgeladen (20)

Unit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptxUnit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptx
 
Mattingly "AI & Prompt Design: Structured Data, Assistants, & RAG"
Mattingly "AI & Prompt Design: Structured Data, Assistants, & RAG"Mattingly "AI & Prompt Design: Structured Data, Assistants, & RAG"
Mattingly "AI & Prompt Design: Structured Data, Assistants, & RAG"
 
Grant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingGrant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy Consulting
 
Accessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactAccessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impact
 
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
 
9548086042 for call girls in Indira Nagar with room service
9548086042  for call girls in Indira Nagar  with room service9548086042  for call girls in Indira Nagar  with room service
9548086042 for call girls in Indira Nagar with room service
 
microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introduction
 
Mattingly "AI & Prompt Design: The Basics of Prompt Design"
Mattingly "AI & Prompt Design: The Basics of Prompt Design"Mattingly "AI & Prompt Design: The Basics of Prompt Design"
Mattingly "AI & Prompt Design: The Basics of Prompt Design"
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdf
 
Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...
Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...
Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...
 
Measures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and ModeMeasures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and Mode
 
Z Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot GraphZ Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot Graph
 
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
 
Measures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDMeasures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SD
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdf
 
IGNOU MSCCFT and PGDCFT Exam Question Pattern: MCFT003 Counselling and Family...
IGNOU MSCCFT and PGDCFT Exam Question Pattern: MCFT003 Counselling and Family...IGNOU MSCCFT and PGDCFT Exam Question Pattern: MCFT003 Counselling and Family...
IGNOU MSCCFT and PGDCFT Exam Question Pattern: MCFT003 Counselling and Family...
 
Disha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdfDisha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdf
 
Sanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfSanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdf
 
fourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingfourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writing
 
Introduction to Nonprofit Accounting: The Basics
Introduction to Nonprofit Accounting: The BasicsIntroduction to Nonprofit Accounting: The Basics
Introduction to Nonprofit Accounting: The Basics
 

What is spatial Resolution

  • 1. What is Spatial Resolution ? A presentation for better understanding! S.A.Quadri CEDEC , USM , Malaysia
  • 2. Effect of Spatial resolution on visualization (Satellite image : Reference http://visibleearth.nasa.gov/view_rec.php?id=1427)
  • 3.
  • 4. Image resolution It is an umbrella term that describes the detail an image holds. The term applies to raster digital images, film images, and other types of images. Higher resolution means more image details. Image resolution can be measured in various ways. Resolution quantifies how close lines can be to each other and still be visibly resolved. Resolution units can be tied to physical sizes (e.g. lines per mm, lines per inch), to the overall size of a picture (lines per picture height, also known simply as lines, TV lines, or TVL), or to angular subtenant. Line pairs are often used instead of lines. A line pair comprises a dark line and an adjacent light line. A line is either a dark line or a light line. A resolution of 10 lines per millimeter means 5 dark lines alternating with 5 light lines, or 5 line pairs per millimeter (5 LP/mm). Photographic lens and film resolution are most often quoted in line pairs per millimeter.
  • 5. Resolution of digital images The resolution of digital images can be described in many different ways. The term resolution is often used for a pixel count in digital imaging, even though American, Japanese, & international standards specify that it should not be so used, at least in the digital camera field. •An image of N pixels high by M pixels wide can have any resolution less than N lines per picture height, or N TV lines. But when the pixel counts are referred to as resolution, the convention is to describe the pixel resolution with the set of two positive integer numbers, where the first number is the number of pixel columns (width) and the second is the number of pixel rows (height), for example as 640 by 480. •Another popular convention is to cite resolution as the total number of pixels in the image, typically given as number of megapixels, which can be calculated by multiplying pixel columns by pixel rows and dividing by one million. •Other conventions include describing pixels per length unit or pixels per area unit, such as pixels per inch or per square inch. •According to the same standards, the number of effective pixels that an image sensor or digital camera has is the count of elementary pixel sensors that contribute to the final image, as opposed to the number of total pixels, which includes unused or light-shielded pixels around the edges. None of these pixel resolutions are true resolutions, but they are widely referred to as such; they serve as upper bounds on image resolution.
  • 6. Effect of pixel resolutions Below is an illustration of how the same image might appear at different pixel resolutions, if the pixels were poorly rendered as sharp squares (normally, a smooth image reconstruction from pixels would be preferred, but for illustration of pixels, the sharp squares make the point better).
  • 7. Further Explanation An image that is 2048 pixels in width and 1536 pixels in height has a total of 2048×1536 = 3,145,728 pixels. One could refer to it as 2048 by 1536 or a 3.1-megapixel image. Unfortunately, the count of pixels is not a real measure of the resolution of digital camera images, because : Color image sensors are typically set up to alternate color filter types over the light sensitive individual pixel sensors. Digital images ultimately require a red, green, and blue value for each pixel to be displayed or printed, but one individual pixel in the image sensor will only supply one of those three pieces of information. The image has to be interpolated or demosaiced to produce all three colors for each output pixel.
  • 8. Spatial resolution The measure of how closely lines can be resolved in an image is called spatial resolution, and it depends on properties of the system creating the image, not just the pixel resolution in pixels per inch (ppi). For practical purposes the clarity of the image is decided by its spatial resolution, not the number of pixels in an image. In effect, spatial resolution refers to the number of independent pixel values per unit length. •The spatial resolution of computer monitors is generally 72 to 100 lines per inch, corresponding to pixel resolutions of 72 to 100 ppi. •With scanners, optical resolution is used to distinguish spatial resolution from the number of pixels per inch. •In geographic information systems (GISs), spatial resolution is measured by the ground sample distance (GSD) of an image, the pixel spacing on the Earth's surface. •In astronomy one often measures spatial resolution in data points per arc second subtended at the point of observation, since the physical distance between objects in the image depends on their distance away & this varies widely with the object of interest. •In electron microscopy, line or fringe resolution refers to the minimum separation detectable between adjacent parallel lines (e.g. between planes of atoms), while point resolution instead refers to the minimum separation between adjacent points that can be both detected & interpreted e.g. as adjacent columns of atoms, for instance. •In Stereoscopic 3D images, spatial resolution could be defined as the spatial information recorded or captured by two viewpoints of a stereo camera (left & right camera). It could be argued that such "spatial resolution" could add an image that then would not depend solely on pixel count or Dots per inch alone, when classifying and interpreting overall resolution of a given photographic image or video frame.
  • 9. Spatial resolution and Pixel count Just make out difference ! Spatial resolution Pixel count
  • 10. Spectral resolution Color images distinguish light of different spectra. Multi-spectral images resolve even finer differences of spectrum or wavelength than is needed to reproduce color. That is, they can have higher spectral resolution. i.e. (high strength of each band). Temporal resolution Movie cameras and high-speed cameras can resolve events at different points in time. The time resolution used for movies is usually 15 to 30 frames per second (frames/s), while high-speed cameras may resolve 100 to 1000 frames/s, or even more. Radiometric resolution Radiometric resolution determines how finely a system can represent or distinguish differences of intensity, and is usually expressed as a number of levels or a number of bits, for example, 8 bits or 256 levels that is typical of computer image files. The higher the radiometric resolution, the better subtle differences of intensity or reflectivity can be represented, at least in theory. In practice, the effective radiometric resolution is typically limited by the noise level, rather than by the number of bits of representation.
  • 11. Resolution in various media This is a list of resolutions for various media. Analog and early digital 352×240 : Video CD 300×480 : Umatic, Betamax, VHS, Video8 350×480 : Super Betamax, Betacam 420×480 : LaserDisc, Super VHS, Hi8 640×480 : Analog broadcast (NTSC) 670×480 : Enhanced Definition Betamax 768×576 : Analog broadcast (PAL, SECAM) Digital 720×480 : D-VHS, DVD, miniDV, Digital8, Digital Betacam 720×480 : Widescreen DVD (anamorphic) 1280×720 : D-VHS, HD DVD, Blue-ray, HDV (miniDV) 1440×1080 : HDV (miniDV) 1920×1080 : HDV (miniDV), AVCHD, HD DVD, Blu-ray, HDCAM SR 2048×1080 : 2K Digital Cinema 4096×2160 : 4K Digital Cinema 7680×4320 : UHDTV Film 35 mm film is scanned for release on DVD at 1080 or 2000 lines as of 2005. However some photography sources gives 5380 x 3620 as the resolution of 35mm film. It is similar to 19.5 Mpix, of course with identical spatial resolution. IMAX, including IMAX HD and OMNIMAX: approximately 10,000×7000 (7000 lines) resolution. It is about 70 Mpix, which may be considered to the biggest resolution.
  • 12. Spatial Resolution and Pixel Size The image resolution and pixel size are often used interchangeably. In reality, they are not equivalent. An image sampled at a small pixel size does not necessarily has a high resolution. The following three images illustrate this point. The first image is a SPOT image of 10 m pixel size. It was derived by merging a SPOT panchromatic image of 10 m resolution with a SPOT multispectral image of 20 m resolution. The effective resolution is thus determined by the resolution of the panchromatic image, which is 10 m. This image is further processed to degrade the resolution while maintaining the same pixel size. The next two images are the blurred versions of the image with larger resolution size, but still digitized at the same pixel size of 10 m. Even though they have the same pixel size as the first image, they do not have the same resolution
  • 13. RESOLUTION AND SHARPNESS To determine resolution, a raster is normally used, employing increasingly fine bars and gaps. A common example in real images would be a picket fence displayed to perspective. In the image of the fence, shown in Fig. 1, it is evident that the gaps between the boards become increasingly difficult to discriminate as the distance becomes greater. This effect is the basic problem of every optical image. In the foreground of the image, where the boards and gaps have not yet been squeezed together by the perspective, a large difference in brightness is recognized. The more the boards and gaps are squeezed together in the distance, the less difference is seen in the brightness. To better understand this effect, the brightness values are shown along the yellow arrow in an x / y diagram (Fig. 2). The brightness difference seen in the y-axis is called contrast. The curve itself functions like a harmonic oscillation; because the brightness does not change over time but spatially from left to right, the x-axis is called spatial frequency.
  • 14. It can be clearly seen in Fig. 1 that the finer the reproduced structure, the more the contrast will be “slurred” at that point in the image. The limit of the resolution has been reached when one can no longer clearly differentiate between the structures. This means the resolution limit (red circle indicated in Fig. 2) lies at the spatial frequency where there is just enough contrast left to clearly differentiate between board and gap.
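The contrast measurement sketched in Fig. 2 can be reproduced numerically. The snippet below is an illustrative sketch only, assuming NumPy and a grey-scale array `img`; the row index and the sliding-window definition of local contrast are assumptions, not part of the original figure. It extracts a brightness profile along one image row and computes the modulation (max − min)/(max + min) in each window, which falls off as the boards and gaps get finer.

```python
import numpy as np

def profile_contrast(img: np.ndarray, row: int, window: int = 32) -> np.ndarray:
    """Local contrast (Imax - Imin) / (Imax + Imin) along one row of the image."""
    profile = img[row].astype(float)
    contrast = []
    for start in range(0, profile.size - window, window):
        chunk = profile[start:start + window]
        hi, lo = chunk.max(), chunk.min()
        contrast.append((hi - lo) / (hi + lo + 1e-9))  # avoid division by zero
    return np.array(contrast)

# contrast = profile_contrast(img, row=400)
# Values near 1 mean board and gap are clearly separated; values near 0 mean
# the structure is no longer resolved.
```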
• 15. Resolution = Sharpness? Are resolution and sharpness the same? By looking at the images shown below, one can quickly determine which image is sharper. Although the image on the left comprises twice as many pixels, the image on the right, whose contrast at coarse details has been increased with a filter, looks distinctly sharper at first glance. The resolution limit describes how much information makes up each image, but not how a person evaluates this information. The human eye is, in fact, able to resolve extremely fine details, and this holds even for objects at a greater distance. The decisive physiological point, however, is that fine details do not contribute to the subjective perception of sharpness. Therefore, it is important to clearly separate the two terms, resolution and sharpness.
• 16. MTF
The modulation transfer function (MTF) describes the relationship between resolution and sharpness, and is the basis for a scientific confirmation of the phenomenon described earlier. The modulation component in MTF means approximately the same as contrast. If we evaluate the contrast (modulation) not only at the spatial frequency where the resolution reaches its limit, but over as many spatial frequencies as possible, and connect these points with a curve, we arrive at the so-called MTF. As shown in the figure, the x-axis shows the already-established spatial frequency, expressed in lp/mm, while the y-axis shows the modulation instead of the brightness. A modulation of 1 (or 100%) corresponds to the full contrast between a completely white and a completely black area. The higher the spatial frequency, in other words the finer the structures in the image, the lower the transferred modulation. (lp = line pairs)
Conclusions:
•Sharpness does not depend only on resolution.
•The modulation at lower spatial frequencies is essential.
•Contrast in coarse details is significantly more important for the impression of sharpness than contrast at the resolution limit.
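The MTF idea can be illustrated with a short simulation, a sketch only: bar patterns of several spatial frequencies are blurred by a fixed Gaussian standing in for the optics (the sampling, sigma and frequency values are arbitrary assumptions), and the surviving modulation is measured at each frequency. Plotting modulation against spatial frequency traces out an MTF-like curve.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def modulation(signal: np.ndarray) -> float:
    """Modulation (Imax - Imin) / (Imax + Imin) of a 1-D pattern."""
    return (signal.max() - signal.min()) / (signal.max() + signal.min())

x = np.linspace(0, 1, 4096)          # one "millimetre" sampled with 4096 points
sigma = 20                           # stand-in for the optics' blur, in samples
for lp_per_mm in (5, 20, 50, 100):
    pattern = 0.5 + 0.5 * np.sin(2 * np.pi * lp_per_mm * x)   # bars at this frequency
    blurred = gaussian_filter1d(pattern, sigma)
    print(lp_per_mm, "lp/mm ->", round(modulation(blurred), 3))
# The printed modulation drops as the spatial frequency rises - the MTF curve.
```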
• 17. Resolution of the human eye
The fovea of the human eye (the part of the retina responsible for sharp central vision) contains about 140,000 sensor cells per square millimeter. This means that if two objects are projected onto the fovea with a separation of more than about 4 µm, a human with normal visual acuity (20/20) can resolve them. On the object side, this corresponds to about 0.2 mm at a distance of 1 m (or about 1 minute of arc). In practice, of course, this depends on whether the viewer is concentrating only on the center of the visual field, whether the object is moving very slowly or not at all, and whether the object has good contrast against the background. Allowing for some tolerance, this would be around 0.3 mm at a 1 m distance (≈ 1.03 minutes of arc). Within a certain range, one can assume a linear relation between viewing distance and the resolvable detail size.
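As a quick check of the object-side numbers, the sketch below uses the small-angle relation detail ≈ distance × angle (the function name is illustrative); it reproduces the "0.3 mm at 1 m ≈ 1.03 arcmin" figure and shows the corresponding detail size at the 10 m distance used in the wall test on the next slide.

```python
import math

def resolvable_detail_mm(distance_m: float, acuity_arcmin: float) -> float:
    """Smallest resolvable detail (mm) at a given viewing distance,
    using the small-angle relation: detail ~= distance * angle."""
    angle_rad = math.radians(acuity_arcmin / 60.0)
    return distance_m * angle_rad * 1000.0   # metres -> millimetres

print(round(resolvable_detail_mm(1.0, 1.03), 2))   # -> 0.3  (mm at 1 m)
print(round(resolvable_detail_mm(10.0, 1.03), 1))  # -> 3.0  (mm at 10 m)
```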
• 18. This hypothesis can easily be tested! Pin the test pattern displayed in the figure below on a well-lit wall and walk 10 m away. One should be able to clearly differentiate between the lines and gaps in the figure. Of course, this requires an ideal visual acuity of 20/20. Nevertheless, if you cannot resolve the pattern, you might consider paying a visit to an ophthalmologist!
• 19. How do we interpret optical images?
Let us look at the significance of spatial resolution and various related terms. Four main types of information contained in an optical image are often utilized for image interpretation:
•Radiometric information (i.e. brightness, intensity, tone)
•Spectral information (i.e. color, hue)
•Textural information
•Geometric and contextual information
They are illustrated in the following examples.
• 20. There are different types of images:
•Panchromatic images
•Multispectral images
•Color composite images
•True color composite images
•False color composite images
•Natural color composite images
• 21. Panchromatic image
A panchromatic image consists of only one band and is usually displayed as a grey-scale image. It may be interpreted in much the same way as a black-and-white aerial photograph of the area. Radiometric information is the main information type utilized in the interpretation. The example is a panchromatic image extracted from a SPOT panchromatic scene at a ground resolution of 10 m. (Reference: http://visibleearth.nasa.gov/view_rec.php?id=1427 and http://earthobservatory.nasa.gov/Contact/index_ve.php?do=s)
• 22. Multispectral Images
A multispectral image consists of several bands of data. For visual display, each band may be displayed one at a time as a grey-scale image, or in combinations of three bands at a time as a color composite image. Interpretation of a multispectral color composite image requires knowledge of the spectral reflectance signatures of the targets in the scene. In this case, the spectral information content of the image is utilized in the interpretation. The following three images show the three bands of a multispectral image extracted from a SPOT multispectral scene at a ground resolution of 20 m. (Reference: http://visibleearth.nasa.gov/view_rec.php?id=1427 and http://earthobservatory.nasa.gov/Contact/index_ve.php?do=s)
• 23. Color Composite Images
In displaying a color composite image, three primary colors (red, green and blue) are used. When these three colors are combined in various proportions, they produce different colors in the visible spectrum. Associating each spectral band (not necessarily a visible band) with a separate primary color results in a color composite image.
• 24. True Color Composite
If a multispectral image contains the three visual primary color bands (red, green, blue), these bands may be combined to produce a "true color" image. Bands 3 (red), 2 (green) and 1 (blue) of a LANDSAT TM image or an IKONOS multispectral image can be assigned respectively to the R, G and B channels for display. In this way, the colors of the resulting composite closely resemble what would be observed by the human eye. A 1-m resolution true-color IKONOS image. (Reference: http://visibleearth.nasa.gov/view_rec.php?id=1427 and http://earthobservatory.nasa.gov/Contact/index_ve.php?do=s)
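A minimal sketch of building such a composite, assuming NumPy and three co-registered band arrays already loaded; the percentile stretch and the helper name are illustrative choices, not part of the Landsat or IKONOS processing chain.

```python
import numpy as np

def to_composite(band_r: np.ndarray, band_g: np.ndarray, band_b: np.ndarray) -> np.ndarray:
    """Stack three bands into an H x W x 3 RGB composite, stretching each band
    to its 2nd-98th percentile range so the display uses the full 0-255 scale."""
    def stretch(band: np.ndarray) -> np.ndarray:
        lo, hi = np.percentile(band, (2, 98))
        scaled = np.clip((band - lo) / (hi - lo + 1e-9), 0, 1)
        return (scaled * 255).astype(np.uint8)
    return np.dstack([stretch(band_r), stretch(band_g), stretch(band_b)])

# True-color assignment for Landsat TM: band 3 -> R, band 2 -> G, band 1 -> B.
# rgb = to_composite(tm_band3, tm_band2, tm_band1)
# A false-color composite (next slide) simply uses a different band-to-channel assignment.
```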
• 25. False Color Composite
The display color assignment for the bands of a multispectral image can be made in an entirely arbitrary manner. In this case, the colour of a target in the displayed image does not have any resemblance to its actual colour. The resulting product is known as a false colour composite image. There are many possible schemes for producing false colour composite images, and some schemes are better suited to detecting particular objects in the image. False colour composite multispectral SPOT image. (Reference: http://visibleearth.nasa.gov/view_rec.php?id=1427 and http://earthobservatory.nasa.gov/Contact/index_ve.php?do=s)
• 26. Natural Colour Composite
For optical images lacking one or more of the three visual primary colour bands (i.e. red, green and blue), the spectral bands (some of which may not be in the visible region) may be combined in such a way that the appearance of the displayed image resembles a visible colour photograph, i.e. vegetation in green, water in blue, soil in brown or grey, etc. Some people refer to this composite as a "true colour" composite. However, this term is misleading since in many instances the colours are only simulated to look similar to the "true" colours of the targets. The term "natural colour" is preferred. Natural colour composite multispectral SPOT image. (Reference: http://visibleearth.nasa.gov/view_rec.php?id=1427 and http://earthobservatory.nasa.gov/Contact/index_ve.php?do=s)
• 27. Vegetation Indices
Different bands of a multispectral image may be combined to accentuate the vegetated areas. One such combination is the ratio of the near-infrared band to the red band, known as the Ratio Vegetation Index (RVI):
RVI = NIR / Red
Since vegetation has high NIR reflectance but low red reflectance, vegetated areas have higher RVI values than non-vegetated areas. Another commonly used vegetation index is the Normalized Difference Vegetation Index (NDVI), computed as:
NDVI = (NIR − Red) / (NIR + Red)
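A minimal NumPy sketch of both indices, assuming `nir` and `red` are co-registered reflectance bands stored as floating-point arrays; the small epsilon added to the denominators is a safeguard of ours, not part of the formulas.

```python
import numpy as np

def rvi(nir: np.ndarray, red: np.ndarray, eps: float = 1e-9) -> np.ndarray:
    """Ratio Vegetation Index: NIR / Red."""
    return nir / (red + eps)

def ndvi(nir: np.ndarray, red: np.ndarray, eps: float = 1e-9) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red + eps)

# Example: dense vegetation (high NIR, low red) versus bare soil.
nir = np.array([0.50, 0.30])
red = np.array([0.08, 0.25])
print(ndvi(nir, red))   # ~[0.72, 0.09] - the vegetated pixel stands out with a high NDVI
```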
• 28. Textural Information
Texture is an important aid in visual image interpretation, especially for high spatial resolution imagery. It is also possible to characterize textural features numerically, and algorithms for computer-aided automatic discrimination of different textures in an image are available. IKONOS 1-m resolution pan-sharpened color image of an oil palm plantation: even though the general colour is green throughout, three distinct land cover types can be identified from the image texture. (Reference: http://visibleearth.nasa.gov/view_rec.php?id=1427 and http://earthobservatory.nasa.gov/Contact/index_ve.php?do=s)
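One simple way to characterize texture numerically, as mentioned above, is the local variance of the grey levels: smooth areas score low, strongly textured areas score high. The sketch below is an illustration only, assuming SciPy/NumPy and a grey-scale array `img`; the window size is an arbitrary choice. It uses the identity var = E[x²] − (E[x])².

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_variance(img: np.ndarray, size: int = 9) -> np.ndarray:
    """Per-pixel variance of grey levels in a size x size neighbourhood,
    computed as var = E[x^2] - (E[x])^2 with box-filtered local means."""
    img = img.astype(float)
    mean = uniform_filter(img, size)
    mean_sq = uniform_filter(img * img, size)
    return np.clip(mean_sq - mean * mean, 0, None)  # clip tiny negatives from rounding

# texture = local_variance(img)
# Thresholding or clustering `texture` separates land-cover types that share
# a similar colour but differ in how "busy" their surface pattern is.
```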
• 29. Remote Sensing Satellites
Optical remote sensing makes use of visible, near-infrared and short-wave infrared sensors to form images of the earth's surface by detecting the solar radiation reflected from targets on the ground. Different materials reflect and absorb differently at different wavelengths, so targets can be differentiated by their spectral reflectance signatures in the remotely sensed images. Optical remote sensing systems are classified into the following types depending on the number of spectral bands used in the imaging process. Several remote sensing satellites are currently available, providing imagery suitable for various types of applications. Each of these satellite-sensor platforms is characterized by:
•the wavelength bands employed in image acquisition,
•the spatial resolution of the sensor,
•the coverage area and the temporal coverage, i.e. how frequently a given location on the earth's surface can be imaged by the imaging system.
• 30. In terms of spatial resolution, satellite imaging systems can be classified into the following classes (a simple lookup for these classes is sketched after this slide):
•Low resolution systems (approx. 1 km or more)
•Medium resolution systems (approx. 100 m to 1 km)
•High resolution systems (approx. 5 m to 100 m)
•Very high resolution systems (approx. 5 m or less)
In terms of the spectral regions used in data acquisition, satellite imaging systems can be classified into:
•Optical imaging systems (including visible, near-infrared and shortwave-infrared systems)
•Thermal imaging systems
•Synthetic aperture radar (SAR) imaging systems
Optical/thermal imaging systems can be classified according to the number of spectral bands used:
•Monospectral or panchromatic systems (single wavelength band, "black-and-white", grey-scale image)
•Multispectral systems (several spectral bands)
•Superspectral systems (tens of spectral bands)
•Hyperspectral systems (hundreds of spectral bands)
Synthetic aperture radar imaging systems can be classified according to the combination of frequency bands and polarization modes used in data acquisition, e.g.:
•Single frequency (L-band, C-band or X-band)
•Multiple frequency (combination of two or more frequency bands)
•Single polarization (VV, HH or HV)
•Multiple polarization (combination of two or more polarization modes)
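A minimal sketch of the spatial-resolution classification listed above; the function name and the handling of the exact 5 m, 100 m and 1 km boundaries are assumptions, while the class ranges come from the slide itself.

```python
def resolution_class(pixel_size_m: float) -> str:
    """Classify a satellite imaging system by its spatial resolution (metres)."""
    if pixel_size_m <= 5:
        return "very high resolution"
    if pixel_size_m <= 100:
        return "high resolution"
    if pixel_size_m < 1000:
        return "medium resolution"
    return "low resolution"

# Examples from earlier slides:
print(resolution_class(1))    # IKONOS pan-sharpened, 1 m -> very high resolution
print(resolution_class(10))   # SPOT panchromatic, 10 m   -> high resolution
print(resolution_class(20))   # SPOT multispectral, 20 m  -> high resolution
```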