Robert Collins
CSE486, Penn State




Lecture 2: Intensity Surfaces and Gradients
Visualizing Images

Recall two ways of visualizing an image: as an intensity pattern (the level at which we “see it”) and as a 2d array of numbers (the level at which the computer works).
                     Bridging the Gap

    Motivation: we want to visualize images at a level high
    enough to retain human insight, but low enough to allow
    us to readily translate our insights into mathematical
    notation and, ultimately, computer algorithms that
    operate on arrays of numbers.
Images as Surfaces

Surface height proportional to pixel grey value (dark = low, light = high).
Examples

Note: see demoImSurf.m in the matlab examples directory on the course web site if you want to generate plots like these.

Mean = 164 Std = 1.8
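
To generate a plot like this yourself, a minimal sketch along these lines may help (this is not the actual demoImSurf.m; the image file name and patch coordinates are placeholders):

    % Minimal sketch, not the actual demoImSurf.m; 'cameraman.tif' and the
    % patch coordinates are placeholders -- substitute your own image/region.
    I = double(imread('cameraman.tif'));   % read image, convert to double
    patch = I(1:64, 1:64);                 % pick a small region of interest
    surf(patch);                           % surface height = pixel grey value
    colormap(gray); shading interp;
    fprintf('Mean = %.1f  Std = %.1f\n', mean(patch(:)), std(patch(:)));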
Examples

Mean = 111 Std = 15.4




                     How does this visualization help us?
Terrain Concepts

(image: www.pianoladynancy.com/wallpapers/1003/wp86.jpg)
Basic notions:
  Uphill / downhill
  Contour lines (curves of constant elevation)
  Steepness of slope
  Peaks / Valleys (local extrema)

More mathematical notions:
  Tangent plane
  Normal vectors
  Curvature

Gradient vectors (vectors of partial derivatives) will help us define/compute all of these.
Math Example : 1D Gradient

Consider the function f(x) = 100 - 0.5 * x^2.
Gradient is df(x)/dx = - 2 * 0.5 * x = - x

Geometric interpretation: the gradient at x0 is the slope of the tangent line to the curve at the point x0:

  slope = Δy / Δx = df(x)/dx evaluated at x = x0
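
As a quick illustration (my own example, not from the slides), the tangent line at a point x0 can be drawn directly from f(x) and df(x)/dx:

    % Plot f(x) = 100 - 0.5*x^2 and its tangent line at x0 = -2.
    f  = @(x) 100 - 0.5*x.^2;
    df = @(x) -x;                          % gradient df(x)/dx
    x  = linspace(-10, 10, 200);
    x0 = -2;
    tangent = f(x0) + df(x0)*(x - x0);     % line through (x0, f(x0)) with slope df(x0)
    plot(x, f(x), 'b', x, tangent, 'r--', x0, f(x0), 'ko');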
Example: for f(x) = 100 - 0.5 * x^2, df(x)/dx = - x, so at x0 = -2 the gradient (the slope of the tangent line there) is 2.
Evaluating df(x)/dx = - x at sample points along the curve gives gradients of 3, 2, 1, 0, -1, -2, -3 as x increases through the peak.
Gradients on the left side of the peak are positive; gradients on the right side of the peak are negative.

Note: the sign of the gradient at a point tells you what direction to go to travel “uphill”.
Math Example : 2D Gradient

f(x,y) = 100 - 0.5 * x^2 - 0.5 * y^2
df(x,y)/dx = - x,   df(x,y)/dy = - y
Gradient = [df(x,y)/dx , df(x,y)/dy] = [- x , - y]

The gradient is the vector of partial derivatives with respect to the x and y axes.
Plotted as a vector field, the gradient vector at each pixel points “uphill”. The gradient indicates the direction of steepest ascent.

The gradient is 0 at the peak (also at any flat spots and local minima, but there are none of those for this function).
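
A short sketch of how such a picture can be produced (plotting choices are my own, not taken from the slides):

    % Contour lines and gradient field of f(x,y) = 100 - 0.5*x^2 - 0.5*y^2.
    [x, y] = meshgrid(-5:1:5, -5:1:5);
    f  = 100 - 0.5*x.^2 - 0.5*y.^2;
    gx = -x;                               % df(x,y)/dx
    gy = -y;                               % df(x,y)/dy
    contour(x, y, f); hold on;             % curves of constant elevation
    quiver(x, y, gx, gy);                  % gradient vectors point uphill (toward the peak)
    axis equal; hold off;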
Let g = [gx, gy] be the gradient vector at point/pixel (x0, y0).

Vector g points uphill (direction of steepest ascent).
Vector -g points downhill (direction of steepest descent).
Vector [gy, -gx] is perpendicular to g and denotes the direction of constant elevation, i.e. it is tangent to the contour line passing through the point (x0, y0).
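
As a concrete (illustrative) check at one arbitrary point, say (x0,y0) = (1,2), where g = [-1,-2]:

    % Numeric check at (x0,y0) = (1,2); the point is arbitrary.
    f = @(x,y) 100 - 0.5*x.^2 - 0.5*y.^2;
    g = [-1, -2];                          % gradient [-x0, -y0] at (1,2)
    t = [g(2), -g(1)];                     % [gy, -gx], along the contour line
    s = 1e-3;                              % small step size
    f(1 + s*g(1), 2 + s*g(2)) - f(1, 2)    % > 0: stepping along g goes uphill
    f(1 + s*t(1), 2 + s*t(2)) - f(1, 2)    % ~ 0: elevation is (to first order) constant along t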
                                        And so on for all points
Image Gradient

The same is true of 2D image gradients. The underlying function is numerical (tabulated) rather than algebraic, so we need numerical derivatives.
Numerical Derivatives
See also T&V, Appendix A.2

Taylor Series expansion:
  f(x + h) = f(x) + h * df(x)/dx + (h^2 / 2) * d^2f(x)/dx^2 + ...

Manipulate (solve for df(x)/dx):
  df(x)/dx ≈ [f(x + h) - f(x)] / h

This is the finite forward difference.
Similarly, expanding in the other direction:
  f(x - h) = f(x) - h * df(x)/dx + (h^2 / 2) * d^2f(x)/dx^2 - ...

Manipulate:
  df(x)/dx ≈ [f(x) - f(x - h)] / h

This is the finite backward difference.
Subtracting the two Taylor Series expansions cancels the even-order terms:
  f(x + h) - f(x - h) = 2h * df(x)/dx + ...

so
  df(x)/dx ≈ [f(x + h) - f(x - h)] / (2h)

This is the finite central difference.
Summary:

  Finite forward difference:   df(x)/dx ≈ [f(x + h) - f(x)] / h          (error O(h))
  Finite backward difference:  df(x)/dx ≈ [f(x) - f(x - h)] / h          (error O(h))
  Finite central difference:   df(x)/dx ≈ [f(x + h) - f(x - h)] / (2h)   (error O(h^2): more accurate)
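
A small numerical check (my own example, reusing f(x) = 100 - 0.5 * x^2 from earlier) illustrates the accuracy difference:

    % Compare the three differences against the true derivative df(x)/dx = -x at x = 3.
    f = @(x) 100 - 0.5*x.^2;
    x = 3;  h = 0.1;  true_deriv = -x;         % exact value: -3
    forward  = (f(x + h) - f(x)) / h;          % error O(h)
    backward = (f(x) - f(x - h)) / h;          % error O(h)
    central  = (f(x + h) - f(x - h)) / (2*h);  % error O(h^2)
    [forward backward central] - true_deriv    % central error is smallest
                                               % (exactly 0 here since f is quadratic)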
Example: Temporal Gradient

A video is a sequence of image frames I(x,y,t). Each frame has two spatial indices x, y and one temporal (time) index t.
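
One plausible way to compute a temporal gradient image (a sketch only; the variable names and the random stand-in frames are my own) is to apply the finite central difference along the time index:

    % Temporal gradient of a video stored as an H-by-W-by-T array V.
    V  = rand(120, 160, 10);               % stand-in for real grayscale frames
    t  = 5;                                % an interior frame index
    It = (V(:,:,t+1) - V(:,:,t-1)) / 2;    % central difference in time at frame t
    imagesc(It); colormap(gray); colorbar; % visualize the temporal gradient image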
Consider the sequence of intensity values observed at a single pixel over time. This gives a 1D function f(x) over time, shown alongside its 1D gradient (derivative) df(x)/dx over time.
                      Temporal Gradient (cont)
         What does the temporal intensity gradient at each pixel look like over time?
Example: Spatial Image Gradients

Applied to an image I(x,y):

  Partial derivative wrt x:  Ix = dI(x,y)/dx ≈ [I(x+1,y) - I(x-1,y)] / 2
  Partial derivative wrt y:  Iy = dI(x,y)/dy ≈ [I(x,y+1) - I(x,y-1)] / 2
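
These central differences are easy to compute over a whole image; a minimal sketch (the image file name is a placeholder, and border pixels are left unhandled):

    % Spatial gradients Ix, Iy by central differences (interior pixels only).
    I  = double(imread('cameraman.tif'));  % placeholder image
    Ix = zeros(size(I));  Iy = zeros(size(I));
    Ix(:, 2:end-1) = (I(:, 3:end) - I(:, 1:end-2)) / 2;   % d/dx: difference across columns
    Iy(2:end-1, :) = (I(3:end, :) - I(1:end-2, :)) / 2;   % d/dy: difference across rows
    imagesc(Ix); colormap(gray); title('Ix = dI/dx');
    figure; imagesc(Iy); colormap(gray); title('Iy = dI/dy');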
Functions of Gradients
(panels: I(x,y), Ix, Iy)

Magnitude of gradient: sqrt(Ix.^2 + Iy.^2). Measures the steepness of the slope at each pixel.
Angle of gradient: atan2(Iy, Ix). Denotes similarity of orientation of slope.
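
Both maps follow directly from Ix and Iy; a self-contained sketch (same placeholder image as above, interior pixels only):

    % Gradient magnitude and angle from central-difference Ix, Iy.
    I   = double(imread('cameraman.tif'));                 % placeholder image
    Ix  = (I(2:end-1, 3:end) - I(2:end-1, 1:end-2)) / 2;   % dI/dx on interior pixels
    Iy  = (I(3:end, 2:end-1) - I(1:end-2, 2:end-1)) / 2;   % dI/dy on interior pixels
    mag = sqrt(Ix.^2 + Iy.^2);             % steepness of slope at each pixel
    ang = atan2(Iy, Ix);                   % orientation of slope at each pixel
    imagesc(mag); colormap(gray); title('gradient magnitude');
    figure; imagesc(ang); colormap(hsv); title('gradient angle');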
(panels: I(x,y), atan2(Iy, Ix))

What else do we observe in this image? Enhanced detail in low contrast areas (e.g. folds in coat; imaging artifacts in sky).
Next Time: Linear Operators

Gradients are an example of linear operators, i.e. the value at a pixel is computed as a linear combination of the values of neighboring pixels.
