PRESENTED BY :
Ankita Pandey
ME ECE - 112604
CONTENT

• Introduction
• Properties of Hopfield network
• Hopfield network derivation
• Hopfield network example
• Applications
• References

10/31/2012                  PRESENTATION ON HOPFIELD NETWORK   2
INTRODUCTION

The Hopfield neural network, proposed by John Hopfield in 1982, can be seen

• as a network with associative memory
• as a tool for different pattern-recognition problems.

It is a fully connected, single-layer, auto-associative network

• i.e. it has only one layer, with each neuron connected to every other neuron.

All the neurons act as both inputs and outputs.
The Hopfield network (model) consists of a set of neurons and a corresponding set of unit delays, forming a multiple-loop feedback system, as shown in the figure.
The number of feedback loops is equal to the number of neurons.

The output of each neuron is fed back, via a unit-delay element, to each of the other neurons in the network.

• There is no self-feedback in the network.
PROPERTIES OF HOPFIELD NETWORK

1. A recurrent network with all nodes connected to all other nodes.
2. Nodes have binary outputs (either 0/1 or -1/1).
3. Weights between the nodes are symmetric.
4. No connection from a node to itself is allowed.
5. Nodes are updated asynchronously (i.e. nodes are selected at random).
6. The network has no hidden nodes or layers.
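These properties (symmetric weights, zero self-connections, bipolar outputs, random asynchronous updates) can be illustrated with a minimal Python sketch. The function name `async_update` and the toy weight matrix are assumptions for illustration, not part of the original slides (the matrix happens to be the 0101 matrix derived later in this deck):

```python
import random

# Minimal sketch of one run of asynchronous updates for a discrete
# Hopfield network with bipolar states (-1, +1).
def async_update(weights, state, steps=100):
    n = len(state)
    state = list(state)
    for _ in range(steps):
        j = random.randrange(n)  # nodes are selected at random
        # no self-connection: weights[j][j] is 0, so it contributes nothing
        net = sum(weights[j][i] * state[i] for i in range(n))
        state[j] = 1 if net >= 0 else -1
    return state

# Symmetric weights, zero diagonal, storing the pattern (-1, 1, -1, 1)
W = [[0, -1, 1, -1],
     [-1, 0, -1, 1],
     [1, -1, 0, -1],
     [-1, 1, -1, 0]]
print(async_update(W, [-1, 1, -1, 1]))  # stored pattern is a fixed point
```

Because the stored pattern is a fixed point of the update rule, the output equals the input regardless of the random update order.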
HOPFIELD NETWORK

Consider the noiseless dynamical model of the neuron shown in fig. 1.

The synaptic weights w_{j1}, w_{j2}, ..., w_{jN} represent conductances.

The respective inputs x_1(t), x_2(t), ..., x_N(t) represent potentials; N is the number of inputs.

These inputs are applied to a current-summing junction characterized as follows:
• Low input resistance.
• Unity current gain.
• High output resistance.
ADDITIVE MODEL OF A NEURON

[Figure: additive model of a neuron.]
The total current flowing toward the input node of the nonlinear element (activation function) is:

    Σ_{i=1}^{N} w_{ji} x_i(t) + I_j

The total current flowing away from the input node of the nonlinear element is:

    v_j(t)/R_j + C_j dv_j(t)/dt

• where the first term is due to the leakage resistance
• and the second term is due to the leakage capacitance.
• Applying KCL to the input node of the nonlinearity, we get

    C_j dv_j(t)/dt + v_j(t)/R_j = Σ_{i=1}^{N} w_{ji} x_i(t) + I_j        ..........(1)

• The capacitive term C_j dv_j(t)/dt adds dynamics to the model of a neuron.
• The output of neuron j is determined by using the nonlinear relation

    x_j(t) = φ_j(v_j(t))

• The RC model described by eq. (1) is referred to as the additive model.
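Eq. (1) can be integrated numerically. The following forward-Euler sketch is a hypothetical illustration; the function name `simulate` and all parameter values (C_j = R_j = 1, gain a = 2, the two-neuron weights and bias currents) are assumptions:

```python
import math

# Forward-Euler sketch of eq. (1):
#   C_j dv_j/dt + v_j/R_j = sum_i w_ji x_i(t) + I_j,  with x_j = tanh(a v_j / 2)
# All parameter values are illustrative assumptions.
def simulate(w, I, C=1.0, R=1.0, a=2.0, dt=0.01, steps=5000):
    n = len(I)
    v = [0.0] * n  # membrane potentials v_j
    for _ in range(steps):
        x = [math.tanh(a * vj / 2) for vj in v]  # neuron outputs x_j
        dv = [(-v[j] / R + sum(w[j][i] * x[i] for i in range(n)) + I[j]) / C
              for j in range(n)]
        v = [v[j] + dt * dv[j] for j in range(n)]
    return [math.tanh(a * vj / 2) for vj in v]

# Two neurons with mutually inhibitory symmetric weights settle at
# outputs of opposite sign.
print(simulate([[0, -1], [-1, 0]], I=[0.5, -0.5]))
```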
A feature of the additive model is that the signal x_i(t), applied to neuron j by the adjoining neuron i,

• is a slowly varying function of time t.

Thus, a recurrent network consists of an interconnection of N neurons,

• each of which is assumed to have the same mathematical model, described by the equation:

    C_j dv_j(t)/dt + v_j(t)/R_j = Σ_{i=1}^{N} w_{ji} x_i(t) + I_j,   j = 1, 2, ..., N        ..........(2)
Now we use eq. (2), which is based on the additive model of the neuron.

Assumptions:

• The matrix of synaptic weights is symmetric, as shown by w_ji = w_ij for all i and j.
• Each neuron has a nonlinear activation of its own, hence the use of φ_i in eq. (2).
• The inverse of the nonlinear activation function exists, so we can write

    v = φ_i⁻¹(x)        ..........(3)
• Let the sigmoid function φ_i(v) be defined by the hyperbolic tangent function

    x = φ_i(v) = tanh(a_i v / 2) = (1 − exp(−a_i v)) / (1 + exp(−a_i v))

• which has a slope of a_i/2 at the origin.
    • a_i is referred to as the gain of neuron i.

The inverse I/O relation of eq. (3) may be written as

    v = φ_i⁻¹(x) = −(1/a_i) log((1 − x) / (1 + x))        ..........(4)
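A quick numerical round trip confirms that eq. (4) inverts the hyperbolic-tangent sigmoid. This is only a sketch; the values a_i = 2 and v = 0.7 are arbitrary assumptions:

```python
import math

# Numerical check that eq. (4) inverts the tanh nonlinearity:
#   x = tanh(a_i v / 2)   <=>   v = -(1/a_i) * log((1 - x) / (1 + x))
a_i = 2.0
v = 0.7
x = math.tanh(a_i * v / 2)
v_back = -(1 / a_i) * math.log((1 - x) / (1 + x))
print(abs(v - v_back) < 1e-12)  # True
```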
• The standard form of the inverse I/O relation for a neuron of unity gain is:

    φ⁻¹(x) = −log((1 − x) / (1 + x))

• We can rewrite eq. (4) in terms of this standard relation as

    φ_i⁻¹(x) = (1/a_i) φ⁻¹(x)
Plot of (a) Sigmoidal Nonlinearity and (b) its Inverse

[Figure: (a) the sigmoidal nonlinearity φ_i(v); (b) its inverse φ_i⁻¹(x).]
• The energy function of the Hopfield network is defined by:

    E = −(1/2) Σ_{i=1}^{N} Σ_{j=1}^{N} w_{ji} x_i x_j + Σ_{j=1}^{N} (1/R_j) ∫_0^{x_j} φ_j⁻¹(x) dx − Σ_{j=1}^{N} I_j x_j

• Differentiating E with respect to time, we get

    dE/dt = −Σ_{j=1}^{N} ( Σ_{i=1}^{N} w_{ji} x_i − v_j/R_j + I_j ) dx_j/dt

• Substituting the value in parentheses from eq. (2), we get

    dE/dt = −Σ_{j=1}^{N} C_j (dv_j/dt)(dx_j/dt)        ..........(5)
• The inverse relation that defines v_j in terms of x_j is

    v_j = φ_j⁻¹(x_j)

• Using this relation in eq. (5), we have

    dE/dt = −Σ_{j=1}^{N} C_j ( d φ_j⁻¹(x_j)/dt ) (dx_j/dt)
          = −Σ_{j=1}^{N} C_j (dx_j/dt)² ( d φ_j⁻¹(x_j)/dx_j )        ..........(6)

• From fig. (b) we see that the inverse I/O relation φ_j⁻¹(x_j) is a monotonically increasing function of the output x_j. Therefore,

    d φ_j⁻¹(x_j)/dx_j ≥ 0   for all x_j.
Also,

    (dx_j/dt)² ≥ 0   for all x_j.

Hence all the factors that make up the sum on the R.H.S. of eq. (6) are non-negative.

Thus the energy function E satisfies

    dE/dt ≤ 0
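The result dE/dt ≤ 0 can be checked numerically. The sketch below integrates a two-neuron continuous model with forward Euler and evaluates the energy function along the trajectory; all weights, currents, and parameter values are assumptions for illustration:

```python
import math

# Integrate the continuous model (forward Euler, assumed parameters) and
# record the Lyapunov energy E at every step.
a, R, C, dt = 2.0, 1.0, 1.0, 0.001
W = [[0, 1], [1, 0]]  # symmetric coupling, zero diagonal
I = [0.1, -0.1]

def phi(v):
    return math.tanh(a * v / 2)

def phi_inv_int(x):
    # Integral of phi^{-1}(s) = (2/a) atanh(s) from 0 to x:
    # (2/a) * (x atanh(x) + ln(1 - x^2) / 2)
    return (2 / a) * (x * math.atanh(x) + 0.5 * math.log(1 - x * x))

def energy(x):
    n = len(x)
    quad = -0.5 * sum(W[j][i] * x[i] * x[j] for i in range(n) for j in range(n))
    leak = sum(phi_inv_int(xj) / R for xj in x)
    ext = -sum(I[j] * x[j] for j in range(n))
    return quad + leak + ext

v = [0.3, -0.2]
energies = []
for _ in range(3000):
    x = [phi(vj) for vj in v]
    energies.append(energy(x))
    dv = [(-v[j] / R + sum(W[j][i] * x[i] for i in range(2)) + I[j]) / C
          for j in range(2)]
    v = [v[j] + dt * dv[j] for j in range(2)]

# E never increases along the trajectory (up to discretization noise)
print(all(e2 <= e1 + 1e-9 for e1, e2 in zip(energies, energies[1:])))
```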
We may make the following two statements:

• The energy function E is a Lyapunov function of the continuous Hopfield model.
• The model is stable in accordance with Lyapunov's Theorem 1.

The time evolution of the continuous Hopfield model, described by the system of nonlinear first-order differential equations, represents a trajectory in state space

• which seeks the minima of the energy function E and comes to a stop at such fixed points.
• From eq. (6), the derivative dE/dt vanishes only if

    dx_j(t)/dt = 0   for all j.

• Thus we can say

    dE/dt < 0   except at a fixed point        ..........(7)

• Eq. (7) forms the basis for the following theorem:
    • The energy function E of a Hopfield network is a monotonically decreasing function of time.
HOPFIELD NETWORK EXAMPLE

Connections of a four-neuron Hopfield network [Figure: a Hopfield neural network]:

              | Neuron 1 (N1) | Neuron 2 (N2) | Neuron 3 (N3) | Neuron 4 (N4)
Neuron 1 (N1) | (N/A)         | N2->N1        | N3->N1        | N4->N1
Neuron 2 (N2) | N1->N2        | (N/A)         | N3->N2        | N4->N2
Neuron 3 (N3) | N1->N3        | N2->N3        | (N/A)         | N4->N3
Neuron 4 (N4) | N1->N4        | N2->N4        | N3->N4        | (N/A)
• The connection weights, put into this array (also called a weight matrix), allow the neural network to recall certain patterns when they are presented.
• For example, the table below shows the correct values to use to recall the pattern 0101.

Weight matrix used to recall 0101:

              | Neuron 1 (N1) | Neuron 2 (N2) | Neuron 3 (N3) | Neuron 4 (N4)
Neuron 1 (N1) |       0       |      -1       |       1       |      -1
Neuron 2 (N2) |      -1       |       0       |      -1       |       1
Neuron 3 (N3) |       1       |      -1       |       0       |      -1
Neuron 4 (N4) |      -1       |       1       |      -1       |       0
Calculating The Weight Matrix

Step 1: Convert 0101 to bipolar

• Bipolar is nothing more than a way to represent binary values as −1's and 1's rather than 0's and 1's.
• To convert 0101 to bipolar, we convert all of the zeros to −1's. This results in:
    0 = −1
    1 = 1
    0 = −1
    1 = 1
• The final result is the array (−1, 1, −1, 1).
Step 2: Multiply (−1, 1, −1, 1) by its transpose

For this step we consider (−1, 1, −1, 1) to be a column matrix and multiply it by its transpose, i.e. the same values written as a row, forming the outer product.

Multiplying the two matrices, entry (i, j) is the product of the i-th and j-th elements:

    −1 × (−1) = 1     1 × (−1) = −1     −1 × (−1) = 1     1 × (−1) = −1
    −1 × 1 = −1       1 × 1 = 1         −1 × 1 = −1       1 × 1 = 1
    −1 × (−1) = 1     1 × (−1) = −1     −1 × (−1) = 1     1 × (−1) = −1
    −1 × 1 = −1       1 × 1 = 1         −1 × 1 = −1       1 × 1 = 1
• And the matrix is:

     1  -1   1  -1
    -1   1  -1   1
     1  -1   1  -1
    -1   1  -1   1

Step 3: Set the main (northwest) diagonal to zero

• The reason behind this is that neurons in a Hopfield network are not connected to themselves.
• So positions [1][1], [2][2], [3][3] and [4][4] in our two-dimensional array, or matrix, get set to zero. This results in the weight matrix for the bit pattern 0101.
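The three steps above can be sketched in a few lines of Python; the helper name `weight_matrix` is an assumption for illustration:

```python
# Sketch of the three steps: bipolar conversion, outer product
# ("multiply by its transpose"), and zeroing the main diagonal.
def weight_matrix(pattern):  # pattern: list of 0/1 bits
    bipolar = [1 if b else -1 for b in pattern]  # step 1
    n = len(bipolar)
    W = [[bipolar[i] * bipolar[j] for j in range(n)]  # step 2: outer product
         for i in range(n)]
    for i in range(n):  # step 3: no self-connections
        W[i][i] = 0
    return W

print(weight_matrix([0, 1, 0, 1]))
# [[0, -1, 1, -1], [-1, 0, -1, 1], [1, -1, 0, -1], [-1, 1, -1, 0]]
```

The result matches the weight matrix for 0101 tabulated above.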
Recalling a Pattern

• To do this, we present the pattern to the input neurons. Each neuron activates based upon the input pattern.
• For example, when neuron 1 is presented with 0101, its activation is the sum of all weights whose corresponding input-pattern bit is 1.
• The activation of each neuron (columns a–d are its weights from N1–N4 multiplied by the corresponding input bits):

         a    b    c    d   a+b+c+d
    N1   0   -1    0   -1     -2
    N2   0    0    0    1      1
    N3   0   -1    0   -1     -2
    N4   0    1    0    0      1

The final output vector is then (−2, 1, −2, 1).
Now, the threshold value determines what range of activations will cause the neuron to fire.

The threshold usually used for a Hopfield network is any value greater than zero.

So the following neurons would fire:

• N1: activation is −2, would not fire (0)
• N2: activation is 1, would fire (1)
• N3: activation is −2, would not fire (0)
• N4: activation is 1, would fire (1)
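The recall computation just described (sum the weights at the 1-bits, then apply the greater-than-zero threshold) can be sketched as follows; the helper name `recall` is an assumption:

```python
# Sketch of the recall step: each neuron sums the weights from the active
# (value 1) inputs, then fires if its activation exceeds the zero threshold.
def recall(W, pattern):
    acts = [sum(w for w, bit in zip(row, pattern) if bit == 1) for row in W]
    return [1 if a > 0 else 0 for a in acts]

W = [[0, -1, 1, -1],
     [-1, 0, -1, 1],
     [1, -1, 0, -1],
     [-1, 1, -1, 0]]
print(recall(W, [0, 1, 0, 1]))  # [0, 1, 0, 1] -- the stored pattern echoes back
```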
We assign a binary 1 to all neurons that fired, and a binary 0 to all neurons that did not fire.

The final binary output from the Hopfield network is therefore 0101.

This is the same as the input pattern.

An auto-associative neural network, such as a Hopfield network, will echo a pattern back if the pattern is recognized.
APPLICATIONS

• Image detection and recognition
• Enhancing X-ray images
• Medical image restoration
References

• Jacek M. Zurada, Introduction to Artificial Neural Systems (10th edition)
• Simon Haykin, Neural Networks (2nd edition)
• Satish Kumar, Neural Networks: A Classroom Approach (2nd edition)
• http://www.learnartificialneuralnetworks.com/hopfield.html
• http://www.heatonresearch.com/articles/2/page6.html
• http://www.thebigblob.com/hopfield-network/#associative-memory
• http://www.dsi.unive.it/~pelillo/Didattica/RetiNeurali/Introduction_To_ANN_lesson_6.pdf
• http://en.wikipedia.org/wiki/Hopfield_network
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptxVS Mahajan Coaching Centre
 
Accessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactAccessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactdawncurless
 
9548086042 for call girls in Indira Nagar with room service
9548086042  for call girls in Indira Nagar  with room service9548086042  for call girls in Indira Nagar  with room service
9548086042 for call girls in Indira Nagar with room servicediscovermytutordmt
 
Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104misteraugie
 
Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)eniolaolutunde
 
A Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformA Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformChameera Dedduwage
 
Separation of Lanthanides/ Lanthanides and Actinides
Separation of Lanthanides/ Lanthanides and ActinidesSeparation of Lanthanides/ Lanthanides and Actinides
Separation of Lanthanides/ Lanthanides and ActinidesFatimaKhan178732
 
Sanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfSanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfsanyamsingh5019
 

Kürzlich hochgeladen (20)

Introduction to Nonprofit Accounting: The Basics
Introduction to Nonprofit Accounting: The BasicsIntroduction to Nonprofit Accounting: The Basics
Introduction to Nonprofit Accounting: The Basics
 
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
 
JAPAN: ORGANISATION OF PMDA, PHARMACEUTICAL LAWS & REGULATIONS, TYPES OF REGI...
JAPAN: ORGANISATION OF PMDA, PHARMACEUTICAL LAWS & REGULATIONS, TYPES OF REGI...JAPAN: ORGANISATION OF PMDA, PHARMACEUTICAL LAWS & REGULATIONS, TYPES OF REGI...
JAPAN: ORGANISATION OF PMDA, PHARMACEUTICAL LAWS & REGULATIONS, TYPES OF REGI...
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdf
 
Mattingly "AI & Prompt Design: The Basics of Prompt Design"
Mattingly "AI & Prompt Design: The Basics of Prompt Design"Mattingly "AI & Prompt Design: The Basics of Prompt Design"
Mattingly "AI & Prompt Design: The Basics of Prompt Design"
 
Arihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdfArihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdf
 
Student login on Anyboli platform.helpin
Student login on Anyboli platform.helpinStudent login on Anyboli platform.helpin
Student login on Anyboli platform.helpin
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
 
Grant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingGrant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy Consulting
 
Disha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdfDisha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdf
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
 
Accessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactAccessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impact
 
9548086042 for call girls in Indira Nagar with room service
9548086042  for call girls in Indira Nagar  with room service9548086042  for call girls in Indira Nagar  with room service
9548086042 for call girls in Indira Nagar with room service
 
Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104
 
Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)
 
Advance Mobile Application Development class 07
Advance Mobile Application Development class 07Advance Mobile Application Development class 07
Advance Mobile Application Development class 07
 
A Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformA Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy Reform
 
Código Creativo y Arte de Software | Unidad 1
Código Creativo y Arte de Software | Unidad 1Código Creativo y Arte de Software | Unidad 1
Código Creativo y Arte de Software | Unidad 1
 
Separation of Lanthanides/ Lanthanides and Actinides
Separation of Lanthanides/ Lanthanides and ActinidesSeparation of Lanthanides/ Lanthanides and Actinides
Separation of Lanthanides/ Lanthanides and Actinides
 
Sanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfSanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdf
 

HOPFIELD NETWORK
  • 1. PRESENTED BY : Ankita Pandey ME ECE - 112604
  • 2. CONTENT: Introduction; Properties of Hopfield network; Hopfield network derivation; Hopfield network example; Applications; References. 10/31/2012 PRESENTATION ON HOPFIELD NETWORK 2
  • 3. INTRODUCTION The Hopfield neural network, proposed by John Hopfield in 1982, can be seen as a network with associative memory and can be used for different pattern-recognition problems. It is a fully connected, single-layer, auto-associative network: it has only one layer, with each neuron connected to every other neuron. All the neurons act as both inputs and outputs.
  • 4. INTRODUCTION The Hopfield network (model) consists of a set of neurons and a corresponding set of unit delays, forming a multiple-loop feedback system, as shown in the figure.
  • 5. INTRODUCTION The number of feedback loops equals the number of neurons. The output of each neuron is fed back, via a unit-delay element, to each of the other neurons in the network; there is no self-feedback.
  • 6. PROPERTIES OF HOPFIELD NETWORK (1) A recurrent network with all nodes connected to all other nodes. (2) Nodes have binary outputs (either 0/1 or -1/+1). (3) Weights between the nodes are symmetric. (4) No connection from a node to itself is allowed. (5) Nodes are updated asynchronously (i.e., nodes are selected at random). (6) The network has no hidden nodes or layers.
  • 7. HOPFIELD NETWORK Consider the noiseless dynamical model of the neuron shown in Fig. 1. The synaptic weights $w_{j1}, w_{j2}, \ldots, w_{jN}$ represent conductances, and the respective inputs $x_1(t), x_2(t), \ldots, x_N(t)$ represent potentials, where N is the number of inputs. These inputs are applied to a current-summing junction characterized as follows: low input resistance, unity current gain, high output resistance.
  • 8. ADDITIVE MODEL OF A NEURON (figure: block diagram of the additive neural network model)
  • 9. HOPFIELD NETWORK The total current flowing toward the input node of the nonlinear element (activation function) is $\sum_{i=1}^{N} w_{ji} x_i(t) + I_j$. The total current flowing away from the input node of the nonlinear element is $\frac{v_j(t)}{R_j} + C_j \frac{dv_j(t)}{dt}$, where the first term is due to the leakage resistance and the second term is due to the leakage capacitance.
  • 10. HOPFIELD NETWORK Applying KCL to the input node of the nonlinearity, we get $C_j \frac{dv_j(t)}{dt} + \frac{v_j(t)}{R_j} = \sum_{i=1}^{N} w_{ji} x_i(t) + I_j$ ... (1). The capacitive term $C_j \frac{dv_j(t)}{dt}$ adds dynamics to the model of a neuron. The output of neuron j is determined by the nonlinear relation $x_j(t) = \varphi_j(v_j(t))$. The RC model described by eq. (1) is referred to as the additive model.
  • 11. HOPFIELD NETWORK A feature of the additive model is that the signal $x_i(t)$ applied to neuron j by the adjoining neuron i is a slowly varying function of the time t. Thus, consider a recurrent network consisting of an interconnection of N neurons, each of which is assumed to have the same mathematical model, described by the equation $C_j \frac{dv_j(t)}{dt} + \frac{v_j(t)}{R_j} = \sum_{i=1}^{N} w_{ji} x_i(t) + I_j, \quad j = 1, 2, \ldots, N$ ... (2)
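The coupled differential equations in eq. (2) are easy to explore numerically. The sketch below (an illustrative Python/NumPy example, not part of the original slides) Euler-integrates eq. (2) for a toy two-neuron network; the weights, bias currents, and time constants are arbitrary example values.

```python
import numpy as np

def simulate_additive_model(W, I, C, R, a=1.0, dt=1e-3, steps=5000):
    """Euler-integrate eq. (2): C_j dv_j/dt = -v_j/R_j + sum_i w_ji x_i(t) + I_j,
    with the output nonlinearity x_j = tanh(a * v_j / 2)."""
    v = np.zeros(len(I))                 # membrane potentials, started at rest
    for _ in range(steps):
        x = np.tanh(a * v / 2.0)         # neuron outputs x_j(t)
        dv = (-v / R + W @ x + I) / C    # right-hand side of eq. (2)
        v = v + dt * dv                  # forward-Euler step
    return np.tanh(a * v / 2.0)

# Toy two-neuron network: symmetric weights, zero diagonal (no self-feedback)
W = np.array([[0.0, 1.0],
              [1.0, 0.0]])
I = np.array([0.5, -0.5])                # bias currents (illustrative values)
x_final = simulate_additive_model(W, I, C=np.ones(2), R=np.ones(2))
```

With a symmetric weight matrix the trajectory settles toward a fixed point, which is the stability result derived on the following slides.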
  • 12. HOPFIELD NETWORK We now use eq. (2), which is based on the additive model of the neuron. Assumptions: the matrix of synaptic weights is symmetric, $w_{ji} = w_{ij}$ for all i and j; each neuron has a nonlinear activation $\varphi_i(\cdot)$ of its own, hence the use of the subscript i in eq. (2); and the inverse of the nonlinear activation function exists, so we can write $v = \varphi_i^{-1}(x)$ ... (3)
  • 13. HOPFIELD NETWORK Let the sigmoid function $\varphi_i(v)$ be defined by the hyperbolic tangent function $\varphi_i(v) = \tanh\left(\frac{a_i v}{2}\right) = \frac{1 - \exp(-a_i v)}{1 + \exp(-a_i v)}$, which has a slope of $a_i/2$ at the origin; $a_i$ is referred to as the gain of neuron i. The inverse I/O relation of eq. (3) may be written as $v = \varphi_i^{-1}(x) = -\frac{1}{a_i} \log\left(\frac{1 - x}{1 + x}\right)$ ... (4)
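As a quick check on eqs. (3)-(4), a few lines of Python (illustrative code, not from the slides) confirm that the inverse relation recovers the potential v from the output x = φᵢ(v):

```python
import math

def phi(v, a=1.0):
    # Sigmoid nonlinearity: phi_i(v) = tanh(a_i*v/2)
    #                                = (1 - exp(-a_i*v)) / (1 + exp(-a_i*v))
    return math.tanh(a * v / 2.0)

def phi_inv(x, a=1.0):
    # Inverse I/O relation, eq. (4): v = -(1/a_i) * log((1 - x) / (1 + x))
    return -(1.0 / a) * math.log((1.0 - x) / (1.0 + x))

v = 0.7
x = phi(v, a=2.0)
assert abs(phi_inv(x, a=2.0) - v) < 1e-9   # round trip recovers v
```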
  • 14. HOPFIELD NETWORK The standard form of the inverse I/O relation for a neuron of unity gain is $\varphi^{-1}(x) = -\log\left(\frac{1 - x}{1 + x}\right)$. We can rewrite eq. (4) in terms of this standard relation as $\varphi_i^{-1}(x) = \frac{1}{a_i} \varphi^{-1}(x)$
  • 15. Plot of (a) the sigmoidal nonlinearity and (b) its inverse (figure)
  • 16. HOPFIELD NETWORK The energy function of the Hopfield network is defined by $E = -\frac{1}{2} \sum_{i=1}^{N} \sum_{j=1}^{N} w_{ji} x_i x_j + \sum_{j=1}^{N} \frac{1}{R_j} \int_0^{x_j} \varphi_j^{-1}(x)\,dx - \sum_{j=1}^{N} I_j x_j$. Differentiating E with respect to time, we get $\frac{dE}{dt} = -\sum_{j=1}^{N} \left( \sum_{i=1}^{N} w_{ji} x_i - \frac{v_j}{R_j} + I_j \right) \frac{dx_j}{dt}$. Substituting the quantity in parentheses from eq. (2), we get $\frac{dE}{dt} = -\sum_{j=1}^{N} C_j \frac{dv_j}{dt} \frac{dx_j}{dt}$ ... (5)
  • 17. HOPFIELD NETWORK The inverse relation $v_j = \varphi_j^{-1}(x_j)$ defines $v_j$ in terms of $x_j$. Using this relation in eq. (5), we have $\frac{dE}{dt} = -\sum_{j=1}^{N} C_j \frac{d}{dt}\left[\varphi_j^{-1}(x_j)\right] \frac{dx_j}{dt} = -\sum_{j=1}^{N} C_j \left(\frac{dx_j}{dt}\right)^2 \frac{d}{dx_j}\left[\varphi_j^{-1}(x_j)\right]$ ... (6). From fig. (b) we see that the inverse I/O relation $\varphi_j^{-1}(x_j)$ is a monotonically increasing function of the output $x_j$. Therefore $\frac{d}{dx_j} \varphi_j^{-1}(x_j) \geq 0$ for all $x_j$.
  • 18. HOPFIELD NETWORK Also, $\left(\frac{dx_j}{dt}\right)^2 \geq 0$ for all $x_j$. Hence all the factors that make up the sum on the R.H.S. of eq. (6) are nonnegative. Thus, for the energy function E so defined, $\frac{dE}{dt} \leq 0$.
  • 19. HOPFIELD NETWORK The energy function E is a Lyapunov function of the continuous Hopfield model, so the model is stable in accordance with Lyapunov's Theorem 1. We may make the following two statements: the time evolution of the continuous Hopfield model described by the system of nonlinear first-order differential equations represents a trajectory in the state space, and this trajectory seeks the minima of the energy function E and comes to a stop at fixed points.
  • 20. HOPFIELD NETWORK From eq. (6), the derivative $\frac{dE}{dt}$ vanishes only if $\frac{dx_j(t)}{dt} = 0$ for all j. Thus we can say $\frac{dE}{dt} < 0$ except at a fixed point ... (7). Eq. (7) forms the basis for the following theorem: the energy function E of a Hopfield network is a monotonically decreasing function of time.
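The monotone decrease of E can be verified numerically. The sketch below (an illustrative Python/NumPy check, not part of the slides) integrates a two-neuron version of eq. (2) and evaluates the energy function at every step; for φ(v) = tanh(av/2), the integral term in E evaluates in closed form to (2/a)(x·artanh(x) + ½·ln(1 − x²)).

```python
import numpy as np

def energy(x, W, I, R, a=1.0):
    # E = -1/2 sum_ij w_ji x_i x_j + sum_j (1/R_j) * int_0^{x_j} phi^{-1}(s) ds
    #     - sum_j I_j x_j, with phi^{-1}(s) = (2/a) * artanh(s)
    integral = (2.0 / a) * (x * np.arctanh(x) + 0.5 * np.log(1.0 - x**2))
    return -0.5 * x @ W @ x + np.sum(integral / R) - I @ x

W = np.array([[0.0, 1.0], [1.0, 0.0]])   # symmetric, zero diagonal
I = np.array([0.3, -0.2])
R = np.ones(2); C = np.ones(2); a = 1.0; dt = 1e-3
v = np.array([0.4, -0.1])                # arbitrary starting potentials
energies = []
for _ in range(2000):
    x = np.tanh(a * v / 2.0)
    energies.append(energy(x, W, I, R, a))
    v += dt * (-v / R + W @ x + I) / C   # Euler step of eq. (2)

# E never increases along the trajectory, consistent with eq. (7)
assert all(e2 <= e1 + 1e-9 for e1, e2 in zip(energies, energies[1:]))
```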
  • 21. HOPFIELD NETWORK EXAMPLE Connections of a Hopfield neural network with four neurons N1-N4: each neuron receives a connection from every other neuron and none from itself (N1 from N2, N3, N4; N2 from N1, N3, N4; N3 from N1, N2, N4; N4 from N1, N2, N3).
  • 22. HOPFIELD NETWORK EXAMPLE The connection weights, put into an array also called a weight matrix, allow the neural network to recall certain patterns when they are presented. For example, the weight matrix used to recall the pattern 0101 is: row N1: 0, -1, 1, -1; row N2: -1, 0, -1, 1; row N3: 1, -1, 0, -1; row N4: -1, 1, -1, 0.
  • 23. Calculating the Weight Matrix. Step 1: Convert 0101 to bipolar. Bipolar is nothing more than a way to represent binary values as -1's and 1's rather than 0's and 1's. To convert 0101 to bipolar we convert all of the zeros to -1's: 0 = -1, 1 = 1, 0 = -1, 1 = 1. The final result is the array (-1, 1, -1, 1).
  • 24. Calculating the Weight Matrix. Step 2: Multiply (-1, 1, -1, 1) by its transpose (the original slides say "inverse", but the operation shown is the outer product of the bipolar vector with its transpose). Each entry of the product is the product of the corresponding bipolar elements, for example (-1)(-1) = 1 and (1)(-1) = -1, giving the matrix with rows (1, -1, 1, -1), (-1, 1, -1, 1), (1, -1, 1, -1), (-1, 1, -1, 1).
  • 25. Calculating the Weight Matrix. Step 3: Set the northwest-to-southeast diagonal to zero. The reason is that Hopfield networks do not have their neurons connected to themselves, so positions [1][1], [2][2], [3][3] and [4][4] in the matrix are set to zero. This results in the weight matrix for the bit pattern 0101.
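Steps 1-3 can be summarized in a short Python/NumPy sketch (illustrative code, not from the slides):

```python
import numpy as np

def weight_matrix(pattern_bits):
    """Weight matrix for one stored pattern, per steps 1-3 of the slides."""
    bipolar = np.where(np.array(pattern_bits) == 0, -1, 1)  # step 1: 0 -> -1
    W = np.outer(bipolar, bipolar)                          # step 2: outer product
    np.fill_diagonal(W, 0)                                  # step 3: no self-connections
    return W

W = weight_matrix([0, 1, 0, 1])
# W reproduces the slide's matrix for 0101:
# [[ 0 -1  1 -1]
#  [-1  0 -1  1]
#  [ 1 -1  0 -1]
#  [-1  1 -1  0]]
```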
  • 26. Recalling a Pattern. To do this we present the pattern to each input neuron, and each neuron activates based upon the input pattern. For example, when neuron 1 is presented with 0101, its activation is the sum of all its weights from neurons whose input bit is 1. The activations are: N1: 0 + (-1) + 0 + (-1) = -2; N2: 0 + 0 + 0 + 1 = 1; N3: 0 + (-1) + 0 + (-1) = -2; N4: 0 + 1 + 0 + 0 = 1. The final output vector is then (-2, 1, -2, 1).
  • 27. Recalling a Pattern. The threshold value determines what range of activations will cause a neuron to fire. The threshold usually used for a Hopfield network is any value greater than zero, so the neurons fire as follows: N1, activation -2, does not fire (0); N2, activation 1, fires (1); N3, activation -2, does not fire (0); N4, activation 1, fires (1).
  • 28. Recalling a Pattern. We assign a binary 1 to all neurons that fired and a binary 0 to all neurons that did not fire. The final binary output from the Hopfield network is 0101, which is the same as the input pattern. An auto-associative neural network, such as a Hopfield network, will echo a pattern back if the pattern is recognized.
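The recall procedure of the last three slides can be sketched as follows (illustrative Python/NumPy, using one synchronous pass rather than the asynchronous updates mentioned in the properties slide):

```python
import numpy as np

def recall(W, pattern_bits, threshold=0):
    """Present a binary pattern: each neuron's activation is the sum of weights
    from neurons whose input bit is 1; it fires where activation > threshold."""
    x = np.array(pattern_bits)
    activation = W @ x                   # for 0101 this gives [-2, 1, -2, 1]
    return (activation > threshold).astype(int)

W = np.array([[ 0, -1,  1, -1],
              [-1,  0, -1,  1],
              [ 1, -1,  0, -1],
              [-1,  1, -1,  0]])
print(recall(W, [0, 1, 0, 1]))           # the stored pattern echoes back: [0 1 0 1]
```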
  • 29. APPLICATIONS Image detection and recognition; enhancing X-ray images; medical image restoration.
  • 30. References: Jacek M. Zurada, Introduction to Artificial Neural Systems (10th edition); Simon Haykin, Neural Networks (2nd edition); Satish Kumar, Neural Networks: A Classroom Approach (2nd edition); http://www.learnartificialneuralnetworks.com/hopfield.html; http://www.heatonresearch.com/articles/2/page6.html; http://www.thebigblob.com/hopfield-network/#associative-memory; http://www.dsi.unive.it/~pelillo/Didattica/RetiNeurali/Introduction_To_ANN_lesson_6.pdf; http://en.wikipedia.org/wiki/Hopfield_network