An Introduction to
Artificial Neural Network
Dr Iman Ardekani

Content
- Biological Neurons
- Modeling Neurons
- McCulloch-Pitts Neuron
- Network Architecture
- Learning Process
- Perceptron
- Linear Neuron
- Multilayer Perceptron
Ramón y Cajal (Spanish scientist, 1852-1934):
1. The brain is composed of individual cells called neurons.
2. Neurons are connected to each other by synapses.
Biological Neurons
Neuron Structure (Biology)
Biological Neurons
(Diagram: dendrites, cell body containing the nucleus, axon, and axon terminals)
Synaptic Junction (Biology)
Biological Neurons
(Diagram: a synapse connecting two neurons)
Neuron Function (Biology)
1. Dendrites receive signals from other neurons.
2. Neurons can process (or transfer) the received signals.
3. Axon terminals deliver the processed signals to other tissues.
Biological Neurons
What kind of signals? Electrical impulse signals.
Modeling Neurons (Computer Science)
Biological Neurons
(Diagram: a neuron modeled as a process block mapping inputs to outputs)

Modeling Neurons
The net input signal is a linear combination of the input signals xi.
Each output is a function of the net input signal.
Biological Neurons
(Diagram: inputs x1, ..., x10 feeding a neuron that produces output y)
- McCulloch and Pitts (1943) introduced the idea of neural networks as computing machines.
- Hebb (1949) invented the first rule for self-organized learning.
- Rosenblatt (1958) proposed the perceptron as the first model for learning with a teacher.
Modelling Neurons
The net input signal received through the synaptic junctions is
net = b + Σwixi = b + WᵀX
Weight vector: W = [w1 w2 … wm]ᵀ
Input vector: X = [x1 x2 … xm]ᵀ
Each output is a function of the net input signal (f is called the activation function):
y = f(net) = f(b + WᵀX)
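As a minimal Python sketch of this model (the weights, inputs, and bias below are illustrative, not from the slides), the net input and output of a single neuron can be computed as:

```python
def neuron_output(w, x, b, f):
    """y = f(net) = f(b + W^T X) for a single neuron."""
    net = b + sum(wi * xi for wi, xi in zip(w, x))  # bias plus weighted sum
    return f(net)

# Illustrative 3-input neuron with a hard-limiter activation
hard_limiter = lambda net: 1 if net >= 0 else 0
w = [0.5, -0.2, 0.1]   # weight vector W
x = [1.0, 2.0, 3.0]    # input vector X
y = neuron_output(w, x, b=0.1, f=hard_limiter)  # net = 0.5, so y = 1
```

Passing a different activation function f changes the neuron's behavior without changing the model.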
Modelling Neurons
General Model for Neurons
(Diagram: inputs x1, x2, …, xm weighted by w1, w2, …, wm, summed with bias b, and passed through f)
y = f(b + Σwnxn)
Activation Functions
Modelling Neurons
- Threshold function / hard limiter: good for classification
- Linear function: simple computation
- Sigmoid function: continuous and differentiable
(Plots of the three functions omitted)
Sigmoid Function
Modelling Neurons
f(x) = 1 / (1 + e^(-ax))
The sigmoid approaches the threshold function as a goes to infinity.
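This limit is easy to check numerically; a small Python sketch (the sample points are arbitrary):

```python
import math

def sigmoid(x, a=1.0):
    """f(x) = 1 / (1 + e^(-a*x)); a controls the steepness."""
    return 1.0 / (1.0 + math.exp(-a * x))

def hard_limiter(x):
    """Threshold function: 1 for x >= 0, otherwise 0."""
    return 1 if x >= 0 else 0

# As a grows, the sigmoid approaches the hard limiter
for a in (1, 10, 100):
    print(f"a={a}: f(-0.1)={sigmoid(-0.1, a):.4f}, f(0.1)={sigmoid(0.1, a):.4f}")
```

At a = 100 the sigmoid already agrees with the hard limiter to within about 10^-4 at these points.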
McCulloch-Pitts Neuron
Modelling Neurons
(Diagram: inputs x1, …, xm with weights w1, …, wm, bias b, a summation, and a hard limiter)
y ∈ {0, 1}
Modelling Neurons
(Diagram: the same neuron with a sigmoid activation instead of the hard limiter)
y ∈ [0, 1]
Single-input McCulloch-Pitts neurode with b = 0, w1 = -1, for binary inputs:

x1  net  y
0    0   1
1   -1   0

Conclusion? This neurode implements the logical NOT function.
McCulloch-Pitts Neuron
Two-input McCulloch-Pitts neurode with b = -1, w1 = w2 = 1, for binary inputs:

x1  x2  net  y
0   0   ?    ?
0   1   ?    ?
1   0   ?    ?
1   1   ?    ?

McCulloch-Pitts Neuron
Two-input McCulloch-Pitts neurode with b = -1, w1 = w2 = 1, for binary inputs:

x1  x2  net  y
0   0   -1   0
0   1    0   1
1   0    0   1
1   1    1   1

This neurode implements the logical OR function.
McCulloch-Pitts Neuron
Two-input McCulloch-Pitts neurode with b = -2, w1 = w2 = 1, for binary inputs:

x1  x2  net  y
0   0   ?    ?
0   1   ?    ?
1   0   ?    ?
1   1   ?    ?

McCulloch-Pitts Neuron
Two-input McCulloch-Pitts neurode with b = -2, w1 = w2 = 1, for binary inputs:

x1  x2  net  y
0   0   -2   0
0   1   -1   0
1   0   -1   0
1   1    0   1

This neurode implements the logical AND function.
McCulloch-Pitts Neuron
Every basic Boolean function can be implemented using combinations of McCulloch-Pitts neurons.
McCulloch-Pitts Neuron
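The truth tables above can be checked directly in code; a minimal Python sketch of the McCulloch-Pitts neurode, using the weights and biases from the slides (the XOR composition anticipates the multi-neuron solution discussed later):

```python
def mp_neuron(inputs, weights, b):
    """McCulloch-Pitts neurode: hard limiter over b + sum(w_i * x_i)."""
    net = b + sum(w * x for w, x in zip(weights, inputs))
    return 1 if net >= 0 else 0

def NOT(x):      return mp_neuron([x], [-1], b=0)        # b = 0, w1 = -1
def OR(x1, x2):  return mp_neuron([x1, x2], [1, 1], b=-1)  # b = -1, w1 = w2 = 1
def AND(x1, x2): return mp_neuron([x1, x2], [1, 1], b=-2)  # b = -2, w1 = w2 = 1

# XOR is not computable by a single neurode, but a combination works:
# x1 XOR x2 = (x1 OR x2) AND NOT(x1 AND x2)
def XOR(x1, x2): return AND(OR(x1, x2), NOT(AND(x1, x2)))
```

Running each gate over all binary inputs reproduces the truth tables on the preceding slides.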
The McCulloch-Pitts neuron can be used as a classifier that separates the input signals into two classes (perceptron):
McCulloch-Pitts Neuron
(Diagram: the x1-x2 plane split into Class A (y = 1) and Class B (y = 0) by a straight line)

Class A → y = 1 → net ≥ 0 → b + w1x1 + w2x2 ≥ 0
Class B → y = 0 → net < 0 → b + w1x1 + w2x2 < 0

Therefore the decision boundary is a line (in general, a hyperplane) given by
b + w1x1 + w2x2 = 0
Where do w1 and w2 come from?
McCulloch-Pitts Neuron

Solution: More Neurons Required
McCulloch-Pitts Neuron
(Diagrams: x1 AND x2 and x1 OR x2 are each linearly separable in the x1-x2 plane; x1 XOR x2 is not)

Nonlinear Classification
McCulloch-Pitts Neuron
(Diagram: Class A (y = 1) and Class B (y = 0) arranged so that no single line separates them)
Single Layer Feed-forward Network
Network Architecture
Single layer: there is only one computational layer.
Feed-forward: the input layer projects to the output layer, not vice versa.
(Diagram: an input layer of sources fully connected to an output layer of neurons)
Multilayer Feed-forward Network
Network Architecture
(Diagram: an input layer of sources, a hidden layer of neurons, and an output layer of neurons)
Single Layer Recurrent Network
Network Architecture
(Diagram: a single layer of neurons whose outputs feed back into the layer's inputs)

Multilayer Recurrent Network
Network Architecture
(Diagram: multiple layers of neurons with feedback connections between layers)
The mechanism by which a neural network adjusts its weights (synaptic junction weights):
- Supervised learning: with a teacher
- Unsupervised learning: without a teacher
Learning Processes

Supervised Learning
Learning Processes
(Diagram: the environment provides input data to both a teacher and a learner; the teacher's desired response is compared with the learner's actual response to form an error signal)

Unsupervised Learning
Learning Processes
(Diagram: the environment provides input data directly to the learner)
Neurons learn based on a competitive task.
A competition rule is required (a competitive-learning rule).
- The goal of the perceptron is to classify input data into two classes, A and B
- This works only when the two classes can be separated by a linear boundary
- The perceptron is built around the McCulloch-Pitts neuron model
- A linear combiner followed by a hard limiter
- Accordingly, the neuron can produce +1 and 0
Perceptron
(Diagram: inputs x1, …, xm with weights w1, …, wm, bias b, a summation, and a hard limiter producing y)
Equivalent Representation
Perceptron
(Diagram: a constant input 1 with weight w0 replaces the separate bias b)
net = WᵀX
Weight vector: W = [w0 w1 … wm]ᵀ
Input vector: X = [1 x1 x2 … xm]ᵀ

There exists a weight vector W such that
WᵀX > 0 for every input vector X belonging to A
WᵀX ≤ 0 for every input vector X belonging to B
Perceptron
Elementary Perceptron Learning Algorithm
1) Initiation: w(0) = a random weight vector
2) At time index n, form the input vector x(n)
3) IF (wᵀx(n) > 0 and x(n) belongs to A) or (wᵀx(n) ≤ 0 and x(n) belongs to B) THEN w(n) = w(n-1); otherwise w(n) = w(n-1) + ηx(n) if x(n) belongs to A, and w(n) = w(n-1) - ηx(n) if x(n) belongs to B
4) Repeat from step 2 until w(n) converges
Perceptron
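The algorithm can be sketched in Python; this is a minimal illustration (the function name and the AND-learning example are mine, not from the slides), using the standard two-sided perceptron update: add ηx(n) for a misclassified class-A input and subtract it for a misclassified class-B input.

```python
import random

def train_perceptron(class_A, class_B, eta=0.1, max_epochs=100, seed=0):
    """Elementary perceptron learning on two linearly separable classes.
    A constant 1 is prepended to each input so w[0] plays the role of the bias."""
    data = [([1.0] + list(x), +1) for x in class_A] + \
           [([1.0] + list(x), -1) for x in class_B]
    rng = random.Random(seed)
    w = [rng.uniform(-0.5, 0.5) for _ in data[0][0]]     # 1) random w(0)
    for _ in range(max_epochs):
        errors = 0
        for x, side in data:                             # 2) form x(n)
            net = sum(wi * xi for wi, xi in zip(w, x))
            ok = net > 0 if side > 0 else net <= 0       # 3) correctly classified?
            if not ok:
                w = [wi + eta * side * xi for wi, xi in zip(w, x)]
                errors += 1
        if errors == 0:                                  # 4) converged
            break
    return w

# Learn the AND function: only (1, 1) belongs to class A
w = train_perceptron(class_A=[(1, 1)], class_B=[(0, 0), (0, 1), (1, 0)])
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop stops after a finite number of updates.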
- When the activation function is simply f(x) = x, the neuron acts like an adaptive filter.
- In this case: y = net = wᵀx
- w = [w1 w2 … wm]ᵀ
- x = [x1 x2 … xm]ᵀ
Linear Neuron
(Diagram: inputs x1, …, xm with weights w1, …, wm, bias b, and a linear summation producing y)
Linear Neuron Learning (Adaptive Filtering)
Linear Neuron
(Diagram: the environment provides input data to a teacher and to the linear neuron; the error signal is the desired response d(n) minus the actual response y(n))
LMS Algorithm
To minimize the value of the cost function defined as
E(w) = 0.5 e²(n)
where e(n) is the error signal
e(n) = d(n) - y(n) = d(n) - wᵀ(n)x(n)
the weight vector can be updated as follows:
wi(n+1) = wi(n) - μ(dE/dwi)
Linear Neuron
LMS Algorithm (continued)
dE/dwi = e(n) de(n)/dwi = e(n) d/dwi {d(n) - wᵀ(n)x(n)} = -e(n) xi(n)
Therefore
wi(n+1) = wi(n) + μe(n)xi(n)
or, in vector form,
w(n+1) = w(n) + μe(n)x(n)
Linear Neuron
Summary of LMS Algorithm
1) Initiation: w(0) = a random weight vector
2) At time index n, form the input vector x(n)
3) y(n)=wT(n)x(n)
4) e(n)=d(n)-y(n)
5) w(n+1)=w(n)+μe(n)x(n)
6) Repeat from step 2 until w(n) converges
Linear Neuron
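The LMS recursion above can be sketched as follows; the system-identification example (the vector w_true and the Gaussian inputs) is illustrative, not from the slides.

```python
import random

def lms(samples, mu=0.05, dim=3):
    """LMS adaptation of a linear neuron y(n) = w(n)^T x(n).
    samples: iterable of (x(n), d(n)) pairs."""
    w = [0.0] * dim                                      # 1) initial weights
    for x, d in samples:                                 # 2) form x(n)
        y = sum(wi * xi for wi, xi in zip(w, x))         # 3) y(n) = w^T x(n)
        e = d - y                                        # 4) e(n) = d(n) - y(n)
        w = [wi + mu * e * xi for wi, xi in zip(w, x)]   # 5) weight update
    return w

# Identify an unknown linear system w_true from its input/output pairs
rng = random.Random(1)
w_true = [0.5, -0.3, 0.8]
samples = []
for _ in range(2000):
    x = [rng.gauss(0, 1) for _ in range(3)]
    d = sum(wt * xi for wt, xi in zip(w_true, x))   # noise-free desired response
    samples.append((x, d))
w = lms(samples)
```

With noise-free data and a small step size μ, the weights converge to w_true; with noisy data they fluctuate around it.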
- To solve the XOR problem
- To solve the nonlinear classification problem
- To deal with more complex problems
Multilayer Perceptron
- The activation function in the multilayer perceptron is usually a sigmoid function.
- This is because the sigmoid is differentiable, unlike the hard limiter used in the elementary perceptron.
Multilayer Perceptron
Architecture
Multilayer Perceptron
(Diagram: an input layer of sources, a hidden layer of neurons, and an output layer of neurons)

Architecture
Multilayer Perceptron
(Diagram: layer k receives inputs from layer k-1 and sends outputs to layer k+1)
Consider a single neuron in a multilayer perceptron (neuron k)
Multilayer Perceptron
(Diagram: inputs y0 = 1, y1, …, ym with weights wk,0, wk,1, …, wk,m feeding neuron k)
netk = Σwk,i yi
yk = f(netk)
Multilayer Perceptron Learning (Back Propagation Algorithm)
Multilayer Perceptron
(Diagram: layer k-1 provides input data to a teacher and to neuron k of layer k; the error signal ek(n) is the desired response dk(n) minus the actual response yk(n))
Back Propagation Algorithm:
Cost function of neuron k:
Ek = 0.5 ek²
Cost function of network:
E = ΣEj = 0.5Σej²
ek = dk - yk
ek = dk - f(netk)
ek = dk - f(Σwk,iyi)
Multilayer Perceptron
Multilayer Perceptron
Back Propagation Algorithm:
To minimize the value of the cost function E(wk,i), the weight vector can be updated using a gradient-based algorithm as follows:
wk,i(n+1) = wk,i(n) - μ(dE/dwk,i)
dE/dwk,i = ?
Multilayer Perceptron
Back Propagation Algorithm:
dE/dwk,i = (dE/dek)(dek/dyk)(dyk/dnetk)(dnetk/dwk,i) = (dE/dnetk)(dnetk/dwk,i) = δk yi
where
δk = dE/dnetk = (dE/dek)(dek/dyk)(dyk/dnetk) = -ek f'(netk)
is called the local gradient, using
dE/dek = ek
dek/dyk = -1
dyk/dnetk = f'(netk)
dnetk/dwk,i = yi
Multilayer Perceptron
Back Propagation Algorithm:
Substituting dE/dwk,i = δk yi into the gradient-based algorithm:
wk,i(n+1) = wk,i(n) - μ δk yi
δk = -ek f'(netk) = -{dk - yk} f'(netk)
If k is an output neuron, we have all the terms of δk.
What if k is a hidden neuron?
Multilayer Perceptron
Back Propagation Algorithm:
When k is hidden:
δk = dE/dnetk = (dE/dyk)(dyk/dnetk) = (dE/dyk) f'(netk)
dE/dyk = ?
Multilayer Perceptron

Since E = 0.5Σej²,
dE/dyk = Σ ej (dej/dyk) = Σ ej (dej/dnetj)(dnetj/dyk) = -Σ ej f'(netj) wjk = Σ δj wjk
using dej/dnetj = -f'(netj), dnetj/dyk = wjk, and δj = -ej f'(netj)
Multilayer Perceptron
We had δk = (dE/dyk) f'(netk).
Substituting dE/dyk = Σ δj wjk into δk results in
δk = f'(netk) Σ δj wjk
which gives the local gradient for the hidden neuron k.
Multilayer Perceptron
Summary of Back Propagation Algorithm
1) Initiation: w(0) = a random weight vector
At time index n, get the input data and:
2) Calculate netk and yk for all the neurons
3) For output neurons, calculate ek = dk - yk
4) For output neurons, δk = -ek f'(netk)
5) For hidden neurons, δk = f'(netk) Σ δj wjk
6) Update every neuron's weights by wk,i(NEW) = wk,i(OLD) - μ δk yi
7) Repeat steps 2-6 for the next data set (next time index)
Multilayer Perceptron
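The summarized algorithm can be sketched in pure Python for a network with one hidden layer and a single output neuron; the weights, input, and learning rate below are illustrative, and the local gradient is defined as δk = dE/dnetk, so the hidden-layer gradient used here is δk = f'(netk) Σ δj wjk.

```python
import math

def f(net):
    """Sigmoid activation."""
    return 1.0 / (1.0 + math.exp(-net))

def f_prime(net):
    """Sigmoid derivative: f'(net) = f(net)(1 - f(net))."""
    y = f(net)
    return y * (1.0 - y)

def forward(W1, W2, x):
    """Step 2: net_k and y_k for one hidden layer and a single output."""
    net1 = [sum(w * xi for w, xi in zip(row, x)) for row in W1]
    y1 = [1.0] + [f(n) for n in net1]   # hidden outputs; leading 1 is the bias input
    net2 = sum(w * yi for w, yi in zip(W2, y1))
    return net1, y1, net2, f(net2)

def backprop_grads(W1, W2, x, d):
    """Steps 3-5: error, local gradients, and the weight gradients dE/dw."""
    net1, y1, net2, y2 = forward(W1, W2, x)
    e = d - y2                                        # step 3: e = d - y
    delta2 = -e * f_prime(net2)                       # step 4: output neuron
    delta1 = [f_prime(n) * delta2 * w                 # step 5: hidden neurons
              for n, w in zip(net1, W2[1:])]
    gW2 = [delta2 * yi for yi in y1]                  # dE/dw2_i = delta2 * y_i
    gW1 = [[dk * xi for xi in x] for dk in delta1]    # dE/dw1_{k,i} = delta1_k * x_i
    return gW2, gW1

# Step 6 on one sample: w(NEW) = w(OLD) - mu * dE/dw
W1 = [[0.1, 0.2, -0.1], [0.05, -0.3, 0.2]]   # 2 hidden neurons, inputs (1, x1, x2)
W2 = [0.3, -0.2, 0.1]                         # output neuron over (1, y1, y2)
x, d, mu = [1.0, 1.0, 0.0], 1.0, 0.5
gW2, gW1 = backprop_grads(W1, W2, x, d)
W2 = [w - mu * g for w, g in zip(W2, gW2)]
W1 = [[w - mu * g for w, g in zip(row, grow)] for row, grow in zip(W1, gW1)]
```

Iterating these updates over a data set such as XOR, with random initial weights, trains the network past the limits of a single perceptron.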
THE END
 
Biogenic Sulfur Gases as Biosignatures on Temperate Sub-Neptune Waterworlds
Biogenic Sulfur Gases as Biosignatures on Temperate Sub-Neptune WaterworldsBiogenic Sulfur Gases as Biosignatures on Temperate Sub-Neptune Waterworlds
Biogenic Sulfur Gases as Biosignatures on Temperate Sub-Neptune Waterworlds
 
FAIRSpectra - Enabling the FAIRification of Analytical Science
FAIRSpectra - Enabling the FAIRification of Analytical ScienceFAIRSpectra - Enabling the FAIRification of Analytical Science
FAIRSpectra - Enabling the FAIRification of Analytical Science
 
(May 9, 2024) Enhanced Ultrafast Vector Flow Imaging (VFI) Using Multi-Angle ...
(May 9, 2024) Enhanced Ultrafast Vector Flow Imaging (VFI) Using Multi-Angle ...(May 9, 2024) Enhanced Ultrafast Vector Flow Imaging (VFI) Using Multi-Angle ...
(May 9, 2024) Enhanced Ultrafast Vector Flow Imaging (VFI) Using Multi-Angle ...
 
Grade 7 - Lesson 1 - Microscope and Its Functions
Grade 7 - Lesson 1 - Microscope and Its FunctionsGrade 7 - Lesson 1 - Microscope and Its Functions
Grade 7 - Lesson 1 - Microscope and Its Functions
 
FAIRSpectra - Enabling the FAIRification of Spectroscopy and Spectrometry
FAIRSpectra - Enabling the FAIRification of Spectroscopy and SpectrometryFAIRSpectra - Enabling the FAIRification of Spectroscopy and Spectrometry
FAIRSpectra - Enabling the FAIRification of Spectroscopy and Spectrometry
 
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
 
Genetics and epigenetics of ADHD and comorbid conditions
Genetics and epigenetics of ADHD and comorbid conditionsGenetics and epigenetics of ADHD and comorbid conditions
Genetics and epigenetics of ADHD and comorbid conditions
 

Artificial Neural Network

  • 1. An Introduction to Artificial Neural Network Dr Iman Ardekani
  • 2. Biological Neurons Modeling Neurons McCulloch-Pitts Neuron Network Architecture Learning Process Perceptron Linear Neuron Multilayer Perceptron Content
  • 3. Ramón y Cajal (Spanish scientist, 1852~1934): 1. The brain is composed of individual cells called neurons. 2. Neurons are connected to each other by synapses. Biological Neurons
  • 4. Neuron Structure (Biology) Biological Neurons (figure: dendrites, cell body, nucleus, axon, axon terminals)
  • 6. Neuron Function (Biology) 1. Dendrites receive signals from other neurons. 2. Neurons can process (or transfer) the received signals. 3. Axon terminals deliver the processed signal to other tissues. Biological Neurons What kind of signals? Electrical impulse signals
  • 7. Modeling Neurons (Computer Science) Biological Neurons (figure: inputs → process → outputs)
  • 8. Modeling Neurons The net input signal is a linear combination of the input signals xi. Each output is a function of the net input signal. Biological Neurons (figure: inputs x1…x10 combined into a single output y)
  • 9. - McCulloch and Pitts (1943) for introducing the idea of neural networks as computing machines - Hebb (1949) for inventing the first rule for self-organized learning - Rosenblatt (1958) for proposing the perceptron as the first model for learning with a teacher Modeling Neurons
  • 10. The net input signal received through the synaptic junctions is net = b + Σ wi xi = b + W^T X Weight vector: W = [w1 w2 … wm]^T Input vector: X = [x1 x2 … xm]^T Each output is a function of the net stimulus signal (f is called the activation function) y = f(net) = f(b + W^T X) Modeling Neurons
  • 11. General Model for Neurons Modeling Neurons y = f(b + Σ wi xi) (figure: inputs x1 … xm weighted by w1 … wm, summed with bias b, passed through the activation f to give y)
  • 12. Activation functions Modeling Neurons Threshold function / hard limiter: good for classification. Linear function: simple computation. Sigmoid function: continuous and differentiable.
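The neuron model and the three activation functions above can be sketched in a few lines of Python. This is a minimal illustration; the function names and the test values are mine, not from the slides:

```python
import math

def threshold(net):
    # Hard limiter: 1 when net >= 0, otherwise 0 (good for classification)
    return 1 if net >= 0 else 0

def linear(net):
    # Linear activation: f(x) = x (simple computation)
    return net

def sigmoid(net, a=1.0):
    # Sigmoid: f(x) = 1 / (1 + e^(-ax)); continuous and differentiable
    return 1.0 / (1.0 + math.exp(-a * net))

def neuron(x, w, b, f):
    # net input = bias + linear combination of the inputs; output = f(net)
    net = b + sum(wi * xi for wi, xi in zip(w, x))
    return f(net)

print(neuron([1, 0], [0.5, 0.5], -0.25, threshold))          # 1 (net = 0.25 >= 0)
print(round(neuron([1, 0], [0.5, 0.5], -0.25, sigmoid), 3))  # 0.562
print(sigmoid(0.25, a=100))  # close to 1
```

Increasing a sharpens the sigmoid's transition, matching the next slide's remark that it approaches the threshold function as a goes to infinity.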
  • 13. Sigmoid Function Modeling Neurons sigmoid function: f(x) = 1 / (1 + e^(-ax)), which approaches the threshold function as a goes to infinity
  • 14. McCulloch-Pitts Neuron Modeling Neurons y ∈ {0, 1} (figure: inputs x1 … xm weighted by w1 … wm, summed with bias b, passed through a hard limiter to give y)
  • 16. Single-input McCulloch-Pitts neurode with b = 0, w1 = -1 for binary inputs: (x1, net, y) = (0, 0, 1) and (1, -1, 0). Conclusion? The neuron computes the logical NOT. McCulloch-Pitts Neuron
  • 17. Two-input McCulloch-Pitts neurode with b = -1, w1 = w2 = 1 for binary inputs: (x1, x2, net, y) = (0, 0, ?, ?), (0, 1, ?, ?), (1, 0, ?, ?), (1, 1, ?, ?) McCulloch-Pitts Neuron
  • 18. Two-input McCulloch-Pitts neurode with b = -1, w1 = w2 = 1 for binary inputs: (x1, x2, net, y) = (0, 0, -1, 0), (0, 1, 0, 1), (1, 0, 0, 1), (1, 1, 1, 1) (the logical OR) McCulloch-Pitts Neuron
  • 19. Two-input McCulloch-Pitts neurode with b = -2, w1 = w2 = 1 for binary inputs: (x1, x2, net, y) = (0, 0, ?, ?), (0, 1, ?, ?), (1, 0, ?, ?), (1, 1, ?, ?) McCulloch-Pitts Neuron
  • 20. Two-input McCulloch-Pitts neurode with b = -2, w1 = w2 = 1 for binary inputs: (x1, x2, net, y) = (0, 0, -2, 0), (0, 1, -1, 0), (1, 0, -1, 0), (1, 1, 0, 1) (the logical AND) McCulloch-Pitts Neuron
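The truth tables above can be reproduced by implementing the McCulloch-Pitts neuron directly. A small sketch; the helper names are mine, the weights and biases are the slides':

```python
def mp_neuron(x, w, b):
    # McCulloch-Pitts neuron: hard limiter over b + sum(wi * xi), output in {0, 1}
    net = b + sum(wi * xi for wi, xi in zip(w, x))
    return 1 if net >= 0 else 0

def NOT(x1):       # slide 16: b = 0, w1 = -1
    return mp_neuron([x1], [-1], 0)

def OR(x1, x2):    # slide 18: b = -1, w1 = w2 = 1
    return mp_neuron([x1, x2], [1, 1], -1)

def AND(x1, x2):   # slide 20: b = -2, w1 = w2 = 1
    return mp_neuron([x1, x2], [1, 1], -2)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "OR:", OR(x1, x2), "AND:", AND(x1, x2))
```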
  • 21. Every basic Boolean function can be implemented using combinations of McCulloch-Pitts Neurons. McCulloch-Pitts Neuron
  • 22. The McCulloch-Pitts neuron can be used as a classifier that separates the input signals into two classes (perceptron): McCulloch-Pitts Neuron (figure: points of class A (y=1) and class B (y=0) in the x1-x2 plane) Class A → y = 1 → net ≥ 0 → b + w1x1 + w2x2 ≥ 0 Class B → y = 0 → net < 0 → b + w1x1 + w2x2 < 0
  • 23. Class A → b + w1x1 + w2x2 ≥ 0 Class B → b + w1x1 + w2x2 < 0 Therefore, the decision boundary is a line (in general, a hyperplane) given by b + w1x1 + w2x2 = 0 Where do w1 and w2 come from? McCulloch-Pitts Neuron
  • 24. Solution: More Neurons Required McCulloch-Pitts Neuron (figure: in the x1-x2 plane, x1 AND x2 and x1 OR x2 are linearly separable, but x1 XOR x2 is not)
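One way to see why more neurons suffice: XOR can be composed from AND, OR, and NOT in two layers, using x1 XOR x2 = (x1 OR x2) AND NOT(x1 AND x2). The decomposition is standard; the particular weights below are my choices:

```python
def mp(x, w, b):
    # McCulloch-Pitts neuron: hard limiter over b + sum(wi * xi)
    return 1 if b + sum(wi * xi for wi, xi in zip(w, x)) >= 0 else 0

def XOR(x1, x2):
    # Layer 1: OR (b=-1, w=1,1) and NAND (b=1, w=-1,-1); layer 2: AND (b=-2, w=1,1)
    or_out = mp([x1, x2], [1, 1], -1)
    nand_out = mp([x1, x2], [-1, -1], 1)
    return mp([or_out, nand_out], [1, 1], -2)

print([XOR(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [0, 1, 1, 0]
```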
  • 26. Single Layer Feed-forward Network Network Architecture Single layer: there is only one computational layer. Feed-forward: the input layer projects to the output layer, not vice versa. (figure: input layer of sources fully connected to an output layer of neurons)
  • 27. Multi Layer Feed-forward Network Network Architecture (figure: input layer of sources, a hidden layer of neurons, and an output layer of neurons)
  • 28. Single Layer Recurrent Network Network Architecture (figure: one layer of neurons whose outputs are fed back as inputs)
  • 29. Multi Layer Recurrent Network Network Architecture (figure: multiple layers of neurons with feedback connections)
  • 30. The mechanism by which a neural network can adjust its weights (synaptic junction weights): - Supervised learning: with a teacher - Unsupervised learning: without a teacher Learning Processes
  • 31. Supervised Learning Learning Processes (figure: the environment supplies input data to both teacher and learner; the error signal is the teacher's desired response minus the learner's actual response)
  • 32. Unsupervised Learning Learning Processes (figure: the environment supplies input data to the learner) Neurons learn based on a competitive task. A competition rule is required (competitive-learning rule).
  • 33. - The goal of the perceptron is to classify input data into one of two classes, A or B - This works only when the two classes can be separated by a linear boundary - The perceptron is built around the McCulloch-Pitts Neuron model - A linear combiner followed by a hard limiter - Accordingly, the neuron produces outputs of 1 and 0 Perceptron (figure: inputs x1 … xm weighted by w1 … wm, summed with bias b, passed through a hard limiter to give y)
  • 34. Equivalent Presentation Perceptron With w0 = b and a constant input x0 = 1: net = W^T X Weight vector: W = [w0 w1 … wm]^T Input vector: X = [1 x1 x2 … xm]^T
  • 35. There exists a weight vector w such that we may state w^T x > 0 for every input vector x belonging to A w^T x ≤ 0 for every input vector x belonging to B Perceptron
  • 36. Elementary perceptron Learning Algorithm 1) Initiation: w(0) = a random weight vector 2) At time index n, form the input vector x(n) 3) IF (w^T x > 0 and x belongs to A) or (w^T x ≤ 0 and x belongs to B) THEN w(n) = w(n-1); otherwise w(n) = w(n-1) + ηx(n) when x belongs to A, and w(n) = w(n-1) - ηx(n) when x belongs to B 4) Repeat from 2 until w(n) converges Perceptron
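The algorithm above can be sketched as follows, with the bias folded into w0 as in slide 34. A toy illustration; the data set and the function names are mine:

```python
def train_perceptron(samples, eta=1.0, epochs=100):
    # samples: list of (x, label) pairs, label 1 for class A and 0 for class B.
    # The bias is folded into w0 via the augmented input [1, x1, ..., xm].
    m = len(samples[0][0])
    w = [0.0] * (m + 1)
    for _ in range(epochs):
        errors = 0
        for x, label in samples:
            xa = [1.0] + list(x)
            y = 1 if sum(wi * xi for wi, xi in zip(w, xa)) > 0 else 0
            if y != label:
                # Misclassified: move w toward x for class A, away from x for class B
                sign = 1.0 if label == 1 else -1.0
                w = [wi + sign * eta * xi for wi, xi in zip(w, xa)]
                errors += 1
        if errors == 0:  # converged: every sample classified correctly
            break
    return w

# Linearly separable toy data (class A lies above the line x1 + x2 = 1.5)
data = [([0, 0], 0), ([1, 0], 0), ([0, 1], 0), ([1, 1], 1), ([2, 1], 1)]
w = train_perceptron(data)
predict = lambda x: 1 if sum(wi * xi for wi, xi in zip(w, [1.0] + list(x))) > 0 else 0
print([predict(x) for x, _ in data])  # [0, 0, 0, 1, 1]
```

Since the two classes are linearly separable, the perceptron convergence theorem guarantees the loop terminates with zero errors.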
  • 37. - When the activation function is simply f(x) = x, the neuron acts like an adaptive filter. - In this case: y = net = w^T x - w = [w1 w2 … wm]^T - x = [x1 x2 … xm]^T Linear Neuron
  • 38. Linear Neuron Learning (Adaptive Filtering) Linear Neuron (figure: the environment supplies input data to the linear neuron; the error signal is the desired response d(n) minus the actual response y(n))
  • 39. LMS Algorithm To minimize the value of the cost function defined as E(w) = 0.5 e²(n) where e(n) is the error signal e(n) = d(n) - y(n) = d(n) - w^T(n)x(n) In this case, the weight vector can be updated as follows wi(n+1) = wi(n) - μ(dE/dwi) Linear Neuron
  • 40. LMS Algorithm (continued) dE/dwi = e(n) de(n)/dwi = e(n) d/dwi{d(n) - w^T(n)x(n)} = -e(n)xi(n), so wi(n+1) = wi(n) + μe(n)xi(n), or in vector form w(n+1) = w(n) + μe(n)x(n) Linear Neuron
  • 41. Summary of LMS Algorithm 1) Initiation: w(0) = a random weight vector 2) At time index n, form the input vector x(n) 3) y(n)=wT(n)x(n) 4) e(n)=d(n)-y(n) 5) w(n+1)=w(n)+μe(n)x(n) 6) Repeat 2 until w(n) converges Linear Neuron
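The LMS summary translates almost line by line into code. A sketch; the toy regression target d = 2x + 1 and the step size are my choices:

```python
def lms(samples, mu=0.1, epochs=200):
    # samples: list of (x, d) pairs for a linear neuron y = w^T x
    # (bias folded in via the constant input x0 = 1)
    m = len(samples[0][0])
    w = [0.0] * (m + 1)
    for _ in range(epochs):
        for x, d in samples:
            xa = [1.0] + list(x)
            y = sum(wi * xi for wi, xi in zip(w, xa))        # linear activation: y = net
            e = d - y                                        # e(n) = d(n) - y(n)
            w = [wi + mu * e * xi for wi, xi in zip(w, xa)]  # w(n+1) = w(n) + mu e(n) x(n)
    return w

# Toy target d = 2*x + 1: the neuron should recover bias ~1 and weight ~2
data = [([x], 2 * x + 1) for x in (0.0, 0.5, 1.0, 1.5, 2.0)]
w = lms(data)
print([round(wi, 2) for wi in w])  # close to [1.0, 2.0]
```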
  • 42. - To solve the XOR problem - To solve nonlinear classification problems - To deal with more complex problems Multilayer Perceptron
  • 43. - The activation function in the multilayer perceptron is usually a sigmoid function. - This is because the sigmoid is a differentiable function, unlike the hard limiter used in the elementary perceptron. Multilayer Perceptron
  • 44. Architecture Multilayer Perceptron (figure: input layer of sources, a hidden layer of neurons, and an output layer of neurons)
  • 45. Architecture Multilayer Perceptron (figure: layer k receives inputs from layer k-1 and sends outputs to layer k+1)
  • 46. Consider a single neuron in a multilayer perceptron (neuron k) Multilayer Perceptron netk = Σ wk,i yi yk = f(netk) (figure: inputs y0 = 1, y1, …, ym weighted by wk,0, wk,1, …, wk,m)
  • 47. Multilayer Perceptron Learning (Back Propagation Algorithm) Multilayer Perceptron (figure: layer k-1 feeds neuron k in layer k; the teacher supplies the desired response dk(n), and the error signal is ek(n) = dk(n) - yk(n))
  • 48. Back Propagation Algorithm: Cost function of neuron k: Ek = 0.5 ek² Cost function of the network: E = ΣEj = 0.5Σej² ek = dk - yk ek = dk - f(netk) ek = dk - f(Σwk,iyi) Multilayer Perceptron
  • 49. Back Propagation Algorithm: Cost function of neuron k: Ek = 0.5 ek² ek = dk - yk ek = dk - f(netk) ek = dk - f(Σwk,iyi) Multilayer Perceptron
  • 50. Cost function of the network: E = ΣEj = 0.5Σej² Multilayer Perceptron
  • 51. Back Propagation Algorithm: To minimize the value of the cost function E(wk,i), the weight vector can be updated using a gradient-based algorithm as follows wk,i(n+1) = wk,i(n) - μ(dE/dwk,i) dE/dwk,i = ? Multilayer Perceptron
  • 52. Back Propagation Algorithm: dE/dwk,i = (dE/dek)(dek/dyk)(dyk/dnetk)(dnetk/dwk,i) = (dE/dnetk)(dnetk/dwk,i) = δk yi Local gradient: δk = dE/dnetk = -ek f'(netk), using dE/dek = ek, dek/dyk = -1, dyk/dnetk = f'(netk), dnetk/dwk,i = yi Multilayer Perceptron
  • 53. Back Propagation Algorithm: Substituting dE/dwk,i = δk yi into the gradient-based algorithm: wk,i(n+1) = wk,i(n) - μ δk yi δk = -ek f'(netk) = -{dk - yk} f'(netk) If k is an output neuron, we have all the terms of δk. What if k is a hidden neuron? Multilayer Perceptron
  • 54. Back Propagation Algorithm: When k is hidden δk = dE/dnetk = (dE/dyk)(dyk/dnetk) = f'(netk)(dE/dyk) dE/dyk = ? Multilayer Perceptron
  • 55. dE/dyk = ? With E = 0.5Σej²: dE/dyk = Σ ej (dej/dyk) = Σ ej (dej/dnetj)(dnetj/dyk) = Σ ej (-f'(netj)) wjk = -Σ ej f'(netj) wjk = Σ δj wjk Multilayer Perceptron
  • 56. We had δk = f'(netk)(dE/dyk) Substituting dE/dyk = Σ δj wjk into δk results in δk = f'(netk) Σ δj wjk which gives the local gradient for the hidden neuron k Multilayer Perceptron
  • 57. Summary of Back Propagation Algorithm 1) Initiation: w(0) = a random weight vector. At time index n, get the input data and 2) Calculate netk and yk for all the neurons 3) For output neurons, calculate ek = dk - yk 4) For output neurons, δk = -ek f'(netk) 5) For hidden neurons, δk = f'(netk) Σ δj wjk 6) Update every neuron's weights by wk,i(NEW) = wk,i(OLD) - μ δk yi 7) Repeat steps 2~6 for the next data set (next time index) Multilayer Perceptron
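The full recipe can be sketched end to end on the XOR problem from slide 42. A minimal illustration; the network size, learning rate, and seed are my choices, and f'(net) is written as y(1 - y) for the sigmoid:

```python
import math
import random

def train_xor(n_hidden=3, epochs=10000, mu=0.5, seed=0):
    # A tiny 2-input MLP with n_hidden sigmoid units and one sigmoid output,
    # trained on XOR with the back-propagation steps above.
    random.seed(seed)
    rnd = lambda: random.uniform(-1.0, 1.0)
    W_h = [[rnd() for _ in range(3)] for _ in range(n_hidden)]  # per unit: [b, w1, w2]
    W_o = [rnd() for _ in range(n_hidden + 1)]                  # output unit: [b, v1, ..., vn]
    sig = lambda net: 1.0 / (1.0 + math.exp(-net))
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

    def forward(x1, x2):
        xa = [1.0, x1, x2]
        y_h = [sig(sum(w * xi for w, xi in zip(wk, xa))) for wk in W_h]
        y_o = sig(sum(w * h for w, h in zip(W_o, [1.0] + y_h)))
        return xa, y_h, y_o

    for _ in range(epochs):
        for (x1, x2), d in data:
            xa, y_h, y_o = forward(x1, x2)
            e = d - y_o
            delta_o = -e * y_o * (1.0 - y_o)                 # output: delta = -e f'(net)
            delta_h = [y * (1.0 - y) * delta_o * W_o[k + 1]  # hidden: f'(net_k) sum_j delta_j w_jk
                       for k, y in enumerate(y_h)]
            ha = [1.0] + y_h
            W_o = [w - mu * delta_o * h for w, h in zip(W_o, ha)]  # w <- w - mu delta y
            W_h = [[w - mu * dk * xi for w, xi in zip(wk, xa)]
                   for wk, dk in zip(W_h, delta_h)]
    return [forward(x1, x2)[2] for (x1, x2), _ in data]

outputs = train_xor()
print([round(y) for y in outputs])
```

With these settings the rounded outputs typically come out as [0, 1, 1, 0]; since gradient descent can occasionally stall in a local minimum, a different seed or more hidden units may be needed.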

Editor's Notes

  1. - Historically important