IRIS BIOMETRIC SYSTEM
                     CS635




     Dept. of Computer Science & Engineering
                  NIT Rourkela
Iris Biometrics
 Iris is the externally visible, colored ring around the pupil

 The flowery pattern is unique for each individual

 The right and left eye of any given individual have unrelated iris patterns

 The iris is stable throughout life

 The pattern is highly random (high entropy)
Anatomical Structure of Iris

                               Eyelash

                               Iris Boundary
                               Pupil Boundary
                               Pupil
                               Iris

                               Sclera

                               Eyelid
Advantages of Iris Recognition
 Highly protected, internal organ of the eye

 Externally visible patterns imaged from a distance

 Patterns apparently stable throughout life

 Iris shape is far more predictable than that of the face

 No need for a person to touch any equipment
Disadvantages of Iris Recognition
 Localization fails for dark irises

 Highly susceptible to changes in weather or due to infection

 Obscured by eyelashes, lenses, reflections

 A well-trained and co-operative user is required

 Expensive Acquisition Devices

                  Occlusion due to eyelashes
The remarkable story of Sharbat Gula, first photographed in 1984 aged 12 in
a refugee camp in Pakistan by National Geographic photographer Steve
McCurry, and traced 18 years later to a remote part of Afghanistan where
she was again photographed by McCurry, is told by National Geographic in
their magazine (April 2002 issue).

National Geographic turned to the inventor of automatic iris recognition, John Daugman, a
professor of computer science at England's University of Cambridge. His biometric technique
uses mathematical calculations, and the numbers Daugman got left no question in his mind that
the haunted eyes of the young Afghan refugee and the eyes of the adult Sharbat Gula belong to
the same person.
Generic Iris Biometric System
Literature Review
 Flom and Safir

 Daugman's Approach

 Wildes Approach

 Proposed Implementation
Flom and Safir
 In 1987, the authors obtained a patent for an unimplemented
  conceptual design of an iris biometrics system

 Their description suggested
      highly controlled conditions
      headrest
      target image to direct the subject’s gaze
      manual operator
      Pupil expansion and contraction was controlled by changing the
       illumination to force the pupil to a predetermined size
 To detect the pupil,
      Threshold based approach


   Extraction of Iris Descriptors
      Pattern recognition tools
      Edge detection algorithms
      Hough transform


   Iris features could be stored on a credit card or identification card to
    support a verification task
Daugman’s Approach
 Daugman’s 1994 patent described an operational iris
  Daugman s
  recognition system in some detail.

 Improvements over Flom and Safir's approach

 Image Acquisition
     Images should be acquired under near-infrared illumination

 Iris Localization
    An integro-differential operator for detecting the iris boundary by
     searching the parameter space.

 Iris Normalization
    mapping the extracted iris region into polar coordinate system
 Feature Encoding
    2D Wavelet demodulation


 Matching
    Hamming distance, which measures the fraction of bits for which two iris
     codes disagree
Wildes Approach
 Wildes describes an iris biometrics system developed at Sarnoff
  Labs

 Image Acquisition
     a diffuse light source
     low light level camera

 Iris Localization
      Computing a binary edge map
     Hough transform to detect circles

 Feature Extraction
     Laplacian of Gaussian filter at multiple scales

 Matching
     normalized correlation
Proposed Implementation
 The iris recognition system developed consists of
    Image Acquisition
    Preprocessing
    Iris Localization
       Pupil Detection
       Iris Detection
    Iris Normalization
    Feature Extraction
       Haar Wavelet
    Matching
Image Acquisition
 The iris image is acquired from a
  CCD-based iris camera

 The camera is placed 9 cm away from
  the subject's eye

 The source of light is placed at a
  distance of approximately 12 cm from
  the user's eye

 The distance between the source of
  light and the CCD camera is found to
  be approximately 8 cm
Image Acquisition System: (a) System with frame grabber
       (b) CCD Camera (c) Light Source (d) User
Preprocessing
 The detection of pupil fails whenever there is a spot on
  the pupil area

 Preprocessing removes the effect of spots/holes lying on
  the pupillary area

 The preprocessing module first transforms the true color
  (RGB) image into an intensity image
Steps involved in preprocessing
 Binarization

 Find the complement of the binary image

 Hole filling using a four-connected approach

 Complement of the hole-filled image (a minimal code sketch of these steps follows)
Preprocessing and noise removal
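The four preprocessing steps listed above can be written down compactly. Below is a minimal sketch in Python, assuming NumPy/SciPy are available; the threshold value and the luminance weights for the RGB-to-intensity conversion are illustrative choices, not values taken from the original system.

```python
import numpy as np
from scipy.ndimage import binary_fill_holes

def preprocess(rgb, threshold=70):
    """Remove bright spots/holes from the pupillary area of an RGB eye image."""
    # RGB -> intensity image (standard luminance weights, assumed here)
    gray = (0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]).astype(np.uint8)

    binary = gray > threshold            # binarization: dark pupil -> False, rest -> True
    mask = ~binary                       # complement: pupil -> True, specular spots become holes
    filled = binary_fill_holes(mask)     # hole filling (default structure is 4-connected)
    clean_binary = ~filled               # complement of the hole-filled image: solid dark pupil

    spots = filled & binary              # bright pixels that lay inside the pupillary area
    cleaned = gray.copy()
    cleaned[spots] = 0                   # paint the spots dark so pupil detection does not fail
    return cleaned, clean_binary
```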
Iris Localization
 The important steps involved in iris localization are
    Pupil Detection

    Iris Detection
Pupil Detection
 The steps involved in pupil detection are
    Thresholding

    Edge Detection

    Circular Hough Transform
Thresholding
 The pupil is the darkest portion of the eye

 The pupil area is obtained after thresholding the input image
Edge Detection
 After thresholding the image, edges are obtained using a Canny edge
  detector
Circular Hough Transform (CHT)
 CHT is used to transform a set of edge points in the image space
  into a set of accumulated votes in a parameter space

 For each edge point, votes are accumulated in an accumulator array
  for all parameter combinations.

 The array elements that contain the highest number of votes
  indicate the presence of the shape
 For every edge pixel (p), find the candidate center point using

                            x_t = x_p - r · cos(θ)
                            y_t = y_p - r · sin(θ)

  where (x_p, y_p) is the location of edge point p,
  r ∈ [r_min, r_max], and
  (x_t, y_t) is the candidate circle center
 For each radius r in the range [r_min, r_max]
     The center point (x_t, y_t) is computed
     The accumulator array is incremented by one for the calculated center point
      Accum[x_t, y_t, r] = Accum[x_t, y_t, r] + 1


 The point with the maximum value in the accumulator is taken as the
  circle center, with corresponding radius r (a code sketch follows)
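A minimal voting implementation of the procedure above is sketched below, assuming the binary edge map produced by the Canny stage; in practice a library routine such as OpenCV's HoughCircles would normally be used instead.

```python
import numpy as np

def circular_hough(edge_map, r_min, r_max, n_angles=180):
    """Return (xc, yc, r) of the circle with the highest number of accumulator votes."""
    h, w = edge_map.shape
    radii = np.arange(r_min, r_max + 1)
    accum = np.zeros((h, w, radii.size), dtype=np.int32)
    thetas = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)

    ys, xs = np.nonzero(edge_map)                       # edge pixels p = (x_p, y_p)
    for xp, yp in zip(xs, ys):
        for k, r in enumerate(radii):
            # candidate centers: x_t = x_p - r*cos(theta), y_t = y_p - r*sin(theta)
            xt = np.round(xp - r * np.cos(thetas)).astype(int)
            yt = np.round(yp - r * np.sin(thetas)).astype(int)
            ok = (xt >= 0) & (xt < w) & (yt >= 0) & (yt < h)
            np.add.at(accum, (yt[ok], xt[ok], k), 1)    # Accum[x_t, y_t, r] += 1

    yc, xc, k = np.unravel_index(np.argmax(accum), accum.shape)
    return xc, yc, radii[k]
```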
Iris Detection
 Steps involved are
    Histogram Equalization

    Concentric Circles of different radii are drawn from the detected
     pupil center

    The intensities lying over the perimeter of the circle are summed
     up

    Among the candidate iris circles, the one having the maximum
     change in intensity with respect to the previously drawn circle is
     the iris outer boundary (sketched below)
(a) Histogram Equalization   (b) Concentric Circles   (c) Iris Detected
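A minimal sketch of this boundary search is given below. It assumes the pupil center has already been found by the circular Hough transform, that the image has been histogram-equalized, and that a plausible radius search range is supplied; all parameter values are illustrative.

```python
import numpy as np

def find_iris_radius(gray, xc, yc, r_start, r_end, n_samples=360):
    """Pick the radius whose summed circle intensity changes most from the previous circle."""
    thetas = np.linspace(0, 2 * np.pi, n_samples, endpoint=False)
    sums = []
    for r in range(r_start, r_end):
        xs = np.clip(np.round(xc + r * np.cos(thetas)).astype(int), 0, gray.shape[1] - 1)
        ys = np.clip(np.round(yc + r * np.sin(thetas)).astype(int), 0, gray.shape[0] - 1)
        sums.append(gray[ys, xs].astype(float).sum())   # intensities over the circle perimeter
    sums = np.asarray(sums)
    best = np.argmax(np.abs(np.diff(sums))) + 1         # maximum change w.r.t. the previous circle
    return r_start + best
```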
Iris Normalization
 Localizing the iris in an image delineates the annular portion from the
  rest of the image

 The annular ring is transformed into a rectangular strip

 The coordinate system is changed by unwrapping the iris from
  Cartesian coordinates to their polar equivalent
                 I(x(ρ, θ), y(ρ, θ)) → I(ρ, θ)
                 with
                 x_p(ρ, θ) = x_p0(θ) + r_p * cos(θ)
                 y_p(ρ, θ) = y_p0(θ) + r_p * sin(θ)
                 x_i(ρ, θ) = x_i0(θ) + r_i * cos(θ)
                 y_i(ρ, θ) = y_i0(θ) + r_i * sin(θ)


 where r_p and r_i are respectively the radii of the pupil and the iris,
 while (x_p(θ), y_p(θ)) and (x_i(θ), y_i(θ)) are the coordinates of the
  pupillary and limbic boundaries in the direction θ. Each sample point
  (x(ρ, θ), y(ρ, θ)) is taken as the linear combination of these two
  boundary points. The value of θ belongs to [0; 2π] and ρ belongs to [0; 1]
Recognition using Haar Wavelet
 One-dimensional transformation on each row, followed by a
  one-dimensional transformation of each column.

 Extracted coefficients would be
    Approximation
    Vertical
    Horizontal
    Diagonal

 Approximation coefficients are further decomposed into the next
  level

 A 4-level decomposition is used
Graphical Representation - Wavelet decomposition
                   (level = 2)
Approach
 For a 2×2 matrix

           X = | a   b |
               | c   d |

  the column-wise summation (sums and differences of the column pair in each row) gives

           x = | a+b   a-b |
               | c+d   c-d |

  and the row-wise summation followed by averaging gives

           y = (1/2) | a+b+c+d   a-b+c-d |
                     | a+b-c-d   a-b-c+d |
Example

 Input (4×4):
   2   2   9   5
   2   0   5   2
   6   7   4   7
   3   4   8   8

 Column-wise summation:
   4   14    0    4
   2    7    2    3
  13   11   -1   -3
   7   16   -1    0

 Row-wise summation:
   6   21    2    7
  20   27   -2   -3
   2    7   -2    1
   6   -5    0   -3

 Finding average (divide by 2):
   3   10.5    1    3.5
  10   13.5   -1   -1.5
   1    3.5   -1    0.5
   3   -2.5    0   -1.5

 In the averaged result: top-left quadrant = Approximation, top-right = Horizontal,
 bottom-left = Vertical, bottom-right = Diagonal
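The worked example can be reproduced with a few lines of NumPy. This sketch implements exactly the sum/difference-then-average convention of the slides (not the orthonormal Haar filters used by wavelet libraries); the function and variable names are illustrative.

```python
import numpy as np

def haar_level(block):
    """One level of the 2x2 Haar step: column-wise sums/differences, then row-wise, then /2."""
    a = np.asarray(block, dtype=float)
    # column-wise summation: sums and differences of adjacent column pairs in each row
    step1 = np.hstack([a[:, 0::2] + a[:, 1::2], a[:, 0::2] - a[:, 1::2]])
    # row-wise summation: sums and differences of adjacent row pairs, then finding the average
    step2 = np.vstack([step1[0::2] + step1[1::2], step1[0::2] - step1[1::2]]) / 2.0
    h, w = a.shape[0] // 2, a.shape[1] // 2
    approx, horizontal = step2[:h, :w], step2[:h, w:]
    vertical, diagonal = step2[h:, :w], step2[h:, w:]
    return approx, horizontal, vertical, diagonal

x = [[2, 2, 9, 5], [2, 0, 5, 2], [6, 7, 4, 7], [3, 4, 8, 8]]
print(haar_level(x))   # reproduces the Approximation/Horizontal/Vertical/Diagonal blocks above
```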
Iris Strip after Decomposition
Feature Template
 At level 4, the coefficient matrices are combined into a single feature
  matrix or feature template FV = [CD4 CV4 CH4].

                      Iris(i) = { 1,  FV(i) ≥ 0
                                { 0,  FV(i) < 0
 where Iris is the binarized iris code
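A minimal sketch of building the binary template is given below. It uses PyWavelets for the 4-level Haar decomposition (an assumption, the slides do not name a library); PyWavelets applies orthonormal scaling rather than the sum/average convention above, but because that scaling is positive, the signs of the detail coefficients, and therefore the binarized code, are not affected.

```python
import numpy as np
import pywt

def iris_code(strip, level=4):
    """FV = [CD4 CV4 CH4] binarized by sign: Iris(i) = 1 if FV(i) >= 0 else 0."""
    coeffs = pywt.wavedec2(np.asarray(strip, dtype=float), 'haar', level=level)
    cH4, cV4, cD4 = coeffs[1]                                       # detail matrices at level 4
    fv = np.concatenate([cD4.ravel(), cV4.ravel(), cH4.ravel()])    # feature template FV
    return (fv >= 0).astype(np.uint8)
```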
Matching
 The database template (S) is matched with the query template (T) using
  the Hamming distance approach
                     MS_Iris = (1 / (n × m)) ∑_{i=1..n} ∑_{j=1..m} T(i, j) ⊕ S(i, j)



  where n × m is the size of the template and ⊕ is the bitwise XOR
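The matching score is the fraction of disagreeing bits. A minimal sketch, assuming two binary templates of equal size:

```python
import numpy as np

def matching_score(t, s):
    """Hamming distance MS_Iris between query template T and stored template S."""
    t = np.asarray(t, dtype=np.uint8)
    s = np.asarray(s, dtype=np.uint8)
    return np.count_nonzero(np.bitwise_xor(t, s)) / t.size
```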
Institutes working - Databases
  Bath University
  MMU
  UBIRIS
  CASIA V3
  IIT Kanpur
Databases Used


       Name         Images (Subjects × Images per subject = Total Images)

       BATH         50 × 20 = 1000

       CASIA V3     249 × 11 ≈ 2655 (approx.)

       Iris IITK    600 × 3 = 1800
Performance



                Database          Accuracy (%)   FAR (%)   FRR (%)

                IIT Kanpur           94.87         1.06      9.18
                Bath University      95.08         2.33      7.50
                CASIA V3             95.07         4.71      5.12
[Figure] Accuracy vs. Threshold graph for IITK, BATH and CASIA V3
Some more work on Iris..
 Matching score level fusion

 Dual stage approach for iris feature extraction

 Feature level clustering of large biometric database

 Indexing database using energy histogram

 Local feature descriptor for Iris

 Use of annular iris images for recognition
Matching Score Level Fusion
 A novel technique for iris recognition using texture and phase features

 Texture features
    Haar Wavelet


 Phase features
    LOG Gabor Wavelet


 Fusion
    Weighted Sum of Score technique
                    MS_final = α · MS_Haar + β · MS_Gabor
   where α and β are static weight values
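A minimal sketch of the fusion rule; the slides only state that the weights are static, so the default values here are illustrative:

```python
def fuse_scores(ms_haar, ms_gabor, alpha=0.5, beta=0.5):
    """Weighted sum of scores: MS_final = alpha * MS_Haar + beta * MS_Gabor."""
    return alpha * ms_haar + beta * ms_gabor
```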
Results

     Table showing accuracy values in percentage for the BATH and
     IITK databases

      Databases →           BATH                   IITK

      Approaches ↓   FAR    FRR     Acc     FAR    FRR     Acc

     Haar Wavelet    1.61   11.08   93.64   0.33   7.88   95.89

     LOG Gabor       1.63   9.55    94.40   1.30   6.05   96.31

     Fusion          0.36   8.38    95.62   0.16   4.50   97.66
Accuracy Graphs




       (a) BATH   (b) IITK
ROC Curves




       (a) BATH   (b) IITK
Dual Stage Approach for Iris Feature Extraction
 Features are detected
    Harris Corner Detector
        Autocorrelation matrix


 For each detected corner (i), the following information is recorded
     (x, y)
         coordinates of the ith corner point
     Hi
         entropy information of a window (wi) around the corner


 Matching is done in a dual stage (a feature-extraction sketch follows)
     Stage 1: Pairing corner points using Euclidean distance
     Stage 2: Finding Mutual Information (MI) of potential corners
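A rough sketch of the feature-extraction part is given below: Harris corners are detected and the entropy of a small window around each corner is recorded. OpenCV's cornerHarris is assumed; the window size, Harris parameters and response threshold are illustrative, and the two matching stages (Euclidean pairing, then mutual information) are only indicated in a comment.

```python
import cv2
import numpy as np

def corner_entropy_features(gray, win=11, thresh_ratio=0.01):
    """Return a list of (x, y, H) tuples: corner location and window entropy H_i."""
    resp = cv2.cornerHarris(np.float32(gray), 2, 3, 0.04)    # autocorrelation-based response
    ys, xs = np.nonzero(resp > thresh_ratio * resp.max())
    half = win // 2
    feats = []
    for x, y in zip(xs, ys):
        w = gray[max(y - half, 0):y + half + 1, max(x - half, 0):x + half + 1]
        hist, _ = np.histogram(w, bins=256, range=(0, 256), density=True)
        hist = hist[hist > 0]
        feats.append((x, y, float(-np.sum(hist * np.log2(hist)))))
    # Stage 1 would pair corners across images by Euclidean distance on (x, y);
    # Stage 2 would keep only pairs whose windows have high mutual information.
    return feats
```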
Results

ROC curve for the Euclidean, MI and Dual Stage approaches on the CASIA dataset
Feature Level Clustering
 Clustering the signature database
    Fuzzy C-Means clustering (a minimal sketch follows)
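A minimal from-scratch sketch of Fuzzy C-Means over the signature (feature-vector) database; the fuzzifier m and the iteration count are illustrative assumptions.

```python
import numpy as np

def fuzzy_c_means(x, c, m=2.0, n_iter=100, seed=0):
    """x: (n_samples, n_features). Returns cluster centers and the membership matrix u (c x n)."""
    rng = np.random.default_rng(seed)
    n = x.shape[0]
    u = rng.random((c, n))
    u /= u.sum(axis=0)                                           # memberships sum to 1 per sample
    for _ in range(n_iter):
        um = u ** m
        centers = (um @ x) / um.sum(axis=1, keepdims=True)       # fuzzily weighted means
        d = np.linalg.norm(x[None, :, :] - centers[:, None, :], axis=2) + 1e-10
        u = 1.0 / np.sum((d[:, None, :] / d[None, :, :]) ** (2.0 / (m - 1)), axis=1)
    return centers, u
```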
Results

      Table: Bin miss rate for different clusters using FCM and K-means

                  No. of Clusters      FCM         K-means
                         2               1             0
                         3               2             0
                         4               3             1
                         5               8             8
                         6              11            12
                         7              12            18
                         8              16            21
                         9              17            25
[Figure] Graph showing bin miss rate by varying the number of clusters for FCM
and K-Means
Indexing Database
 Features are extracted using blockwise DCT (sketched after this list)

 Coefficients are reordered into subbands

 Histogram is obtained for each subband (Hi)

 A global key is obtained using histogram binning approach

 B-tree is traversed using global key
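A rough sketch of the key construction is given below. The 8x8 block size, the diagonal grouping of DCT coefficients into subbands and the bin count are illustrative assumptions; the slides do not specify them, and the B-tree traversal itself is omitted.

```python
import numpy as np
from scipy.fft import dctn

def index_key(strip, block=8, n_subbands=4, bins=8):
    """Blockwise DCT -> subband histograms H_i -> concatenated global key."""
    h = strip.shape[0] // block * block
    w = strip.shape[1] // block * block
    img = np.asarray(strip[:h, :w], dtype=float)

    # subband id of each coefficient position inside a block (low to high frequency)
    i, j = np.indices((block, block))
    band = np.minimum((i + j) * n_subbands // (2 * block - 1), n_subbands - 1)

    coeffs = [[] for _ in range(n_subbands)]
    for r in range(0, h, block):
        for c in range(0, w, block):
            d = dctn(img[r:r + block, c:c + block], norm='ortho')
            for b in range(n_subbands):
                coeffs[b].extend(d[band == b])

    hists = [np.histogram(np.asarray(cb), bins=bins)[0] for cb in coeffs]   # one H_i per subband
    return np.concatenate(hists)      # global key used to traverse the B-tree
```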
System Diagram
Results
Subband (#)        CASIA                       BATH                      IITK
             Classes (#)  BM    PR       Classes  BM   PR       Classes  BM    PR
1            2            0.00  99.69    5        04   26.14    5        1.5   41.44
2            5            1.60  35.96    23       12   7.69     19       5.0   17.21
3            16           3.60  22.70    66       26   3.04     46       5.5   9.24
4            39           13.2  10.23    130      36   1.42     93       10.0  4.77
5            82           24.0  6.12     197      38   0.92     148      12.5  3.25
6            158          30.8  3.46     313      56   0.49     252      15.5  1.56
7            233          35.6  2.63     399      60   0.30     396      20.5  0.92
8            304          40.0  1.77     492      70   0.16     584      29.0  0.50
9            387          42.0  1.22     583      72   0.09     744      37.5  0.27
10           519          43.6  0.63     648      72   0.06     856      44.0  0.20
Local feature descriptor for Iris
 To further improve accuracy
 Features are extracted using Speeded Up Robust
  Features
    Uses Hessian Matrix
    Descriptor is formed using Haar Wavelet responses


 Pairing of features using a nearest neighbor approach (sketched below)
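A rough sketch of the extraction and pairing steps; SURF lives in OpenCV's contrib ("non-free") module, so a contrib build is assumed, and the Hessian threshold and the ratio test used to accept a nearest-neighbour pairing are illustrative.

```python
import cv2

def surf_match(img1, img2, hessian=400, ratio=0.75):
    """Count nearest-neighbour SURF pairings between two grayscale iris images."""
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=hessian)
    kp1, des1 = surf.detectAndCompute(img1, None)
    kp2, des2 = surf.detectAndCompute(img2, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    pairs = matcher.knnMatch(des1, des2, k=2)
    # keep a pairing only when the nearest neighbour is clearly better than the second nearest
    good = [p[0] for p in pairs if len(p) == 2 and p[0].distance < ratio * p[1].distance]
    return len(good), len(kp1), len(kp2)
```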
Results
 Accuracy has been increased considerably

                BATH                     CASIA                    IITK

  Approaches    FAR     FRR     Acc      FAR     FRR     Acc      FAR     FRR     Acc

  Gabor         1.63    8.02    95.16    3.47    44.94   75.78    2.64    21.09   88.13

  Harris        29.04   39.03   65.97    17.18   34.73   74.05    24.95   21.95   76.55

  SIFT          0.77    16.41   91.54    15.12   28.22   78.32    1.99    31.37   83.31

  SURF          2.66    6.36    95.48    4.58    3.85    95.77    0.02    0.01    99.98
Annular Iris Recognition
Results



 Database →                BATH                   CASIA                   IITK

 Test cases ↓      FAR     FRR     Acc     FAR    FRR     Acc     FAR    FRR     Acc

 Normalized Iris   10.35   21.11   84.26   3.31   5.13    95.77   0.86   5.52    98.60

 Annular Iris      2.37    1.97    97.84   1.44   4.07    97.23   4.65   1.41    97.15
Results
Iris Biometric-Applications
    IBM Looks Airline Security in the Eye
    IrisGuard, Inc.
    Securimetrics, Inc.
    Panasonic
    London Heathrow Airport
    Amsterdam Schiphol Airport
    Charlotte Airport, USA
    IrisAccess, LG Corp., South Korea
    IrisPass, OKI Electric Industries, Japan
    EyeTicket Corp., USA
Case studies on Iris




Source: http://www.cl.cam.ac.uk/~jgd1000/afghan.html
Employees at Albany International Airport (NY)
Frequent Flyers at Schiphol Airport (NL)
Frequent Flyers at Schiphol Airport (NL) may enroll in the "Privium" programme,
enabling them to enter The Netherlands without passport presentation.
Condominium residents in Tokyo gain entry to the building by their iris patterns, and the
elevator is automatically called and programmed to bring them to their residential floor.
The United Nations High Commissioner for Refugees administers cash grants
to refugees returning to Afghanistan.
Frequent Flyers at Frankfurt/Main Airport can pass quickly through Immigration Control
without passport inspection if their iris patterns have been enrolled for this purpose.
The check-in procedure for passengers at Narita Airport (Japan) is expedited by
recognition of their iris patterns.
References
 A. K. Jain, A. Ross, and S. Prabhakar, "An introduction to biometric
  recognition," IEEE Transactions on Circuits and Systems for Video
  Technology, vol. 14, no. 1, pp. 4-20, Jan. 2004

 A. Ross, A. K. Jain, and J. Z. Qian, "Information Fusion in Biometrics",
  Proc. 3rd International Conference on Audio- and Video-Based Biometric
  Person Authentication, pp. 354-359, Sweden, June 6-8, 2001

 Phalguni Gupta, Ajita Rattani, Hunny Mehrotra, Anil K. Kaushik,
  "Multimodal biometrics system for efficient human recognition", Proc.
  SPIE International Society for Optical Engineering, 6202, 62020Y, 2006

 L. Flom, A. Safir, Iris recognition system, U.S. Patent 4641394, 1987

 J. Daugman, How iris recognition works, Proceedings of the 2002
  International Conference on Image Processing, vol. 1, pp. I-33 - I-36, 2002

 R. P. Wildes, Iris recognition: an emerging biometric technology,
  Proceedings of the IEEE, vol. 85, no. 9, pp. 1348-1363, Sep. 1997

 K. W. Bowyer, K. Hollingsworth, P. J. Flynn, Image Understanding for Iris
  Biometrics: A Survey, Computer Vision and Image Understanding, 2007
Link for Databases
 Casia: http://www.cbsr.ia.ac.cn/IrisDatabase.htm

 Bath University: http://www.bath.ac.uk/elec-eng/research/sipg/irisweb/index.htm

 MMU: http://pesona.mmu.edu.my/~ccteo/

 UBIRIS: http://iris.di.ubi.pt/

 IITK: http://www.cse.iitk.ac.in/users/biometrics/
Thank you.
