Artificial Neural Networks (ANNs)
XOR Step-By-Step
MENOUFIA UNIVERSITY
FACULTY OF COMPUTERS AND INFORMATION
ALL DEPARTMENTS
ARTIFICIAL INTELLIGENCE
Ahmed Fawzy Gad
ahmed.fawzy@ci.menofia.edu.eg
Classification Example
The XOR truth table:

A | B | Output
--|---|-------
0 | 1 |   1
1 | 0 |   1
0 | 0 |   0
1 | 1 |   0
Neural Networks
Input – Hidden – Output

[Plot: the four XOR points (A, B) on the plane, labeled by class — the two classes cannot be separated by a single straight line.]

XOR can't be solved linearly.
A single-layer perceptron can't work.
Use a hidden layer.
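The claim that no single linear threshold unit can reproduce XOR can be illustrated by brute force. This is a sketch, not a proof: the weight/bias grid below is an arbitrary choice, but since XOR is provably not linearly separable, no real weights succeed.

```python
# Sketch: search a coarse grid of (w1, w2, b) for a single linear threshold
# unit  1 if w1*A + w2*B + b >= 0 else 0  that matches XOR on all 4 inputs.
import itertools

xor = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
grid = [v / 2 for v in range(-10, 11)]  # values from -5.0 to 5.0

found = any(
    all((1 if w1 * a + w2 * b + c >= 0 else 0) == y
        for (a, b), y in xor.items())
    for w1, w2, c in itertools.product(grid, repeat=3)
)
print(found)  # False -- no linear separator exists for XOR
```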
Input Layer
The input layer has one neuron per input: A and B.
Hidden Layer
Start with two neurons in the hidden layer.
Output Layer
A single output neuron produces the prediction Yj (1 or 0).
Weights
Each connection between neurons has a weight Wi.
Weights: Input Layer – Hidden Layer
Four weights (W1, W2, W3, W4) connect the two inputs to the two hidden neurons.
Weights: Hidden Layer – Output Layer
Two weights (W5, W6) connect the hidden neurons to the output neuron.
All Layers
The complete network: inputs A and B, input–hidden weights W1–W4, hidden–output weights W5 and W6, and output Yj.
Activation Function
Each neuron in the hidden and output layers applies an activation function.

Activation Function: Components
An activation function takes an input s and produces an output.
Activation Function: Inputs
The input to the activation function is s, the sum of products (SOP) of the neuron's inputs and weights:
s = SOP(Xi, Wi)
where Xi = inputs and Wi = weights. For the network above, X1 and X2 denote the two inputs.
Sum of Products
s = Σ(i=1..m) Xi·Wi

Each hidden/output layer neuron has its own SOP:
S1 = X1·W1 + X2·W3
S2 = X1·W2 + X2·W4
S3 = S1·W5 + S2·W6
Activation Function: Outputs
The activation function maps s to an output F(s); the output neuron's F(s) gives the class label Yj.
Each hidden/output layer neuron has its own activation function.
Activation Functions
Common choices: Piecewise Linear, Sigmoid, Binary.

Which activation function to use?
There are TWO class labels, so we need an activation function that gives TWO outputs: the Binary function.
Each neuron therefore applies the binary (bin) activation function to its SOP.
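The binary activation can be sketched in code. The threshold at 0 follows the bin definition given later in the worked example (1 when s ≥ 0, else 0):

```python
# Binary (threshold) activation: two possible outputs, one per class label.
def bin_activation(s):
    """Return 1 when the SOP s is >= 0, else 0."""
    return 1 if s >= 0 else 0

print(bin_activation(0.5))   # 1
print(bin_activation(-0.5))  # 0
```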
Bias
Each hidden/output layer neuron also receives a bias.
• Hidden layer neurons: biases b1 and b2, each fed by a fixed input of +1.
• Output layer neuron: bias b3, fed by a fixed input of +1.
Bias: Add Bias to SOP
The bias is added to each neuron's sum of products:
S1 = +1·b1 + X1·W1 + X2·W3
S2 = +1·b2 + X1·W2 + X2·W4
S3 = +1·b3 + S1·W5 + S2·W6
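The three bias-augmented SOPs above can be sketched as follows. The numeric inputs, weights, and biases below are placeholders for illustration only:

```python
# Sum of products plus bias: s = +1*b + sum(x_i * w_i).
def sop(bias, inputs, weights):
    return bias + sum(x * w for x, w in zip(inputs, weights))

x1, x2 = 1, 0                       # placeholder inputs
b1 = b2 = b3 = 0.5                  # placeholder biases
w1 = w2 = w3 = w4 = w5 = w6 = 1.0   # placeholder weights

s1 = sop(b1, (x1, x2), (w1, w3))  # S1 = +1*b1 + X1*W1 + X2*W3
s2 = sop(b2, (x1, x2), (w2, w4))  # S2 = +1*b2 + X1*W2 + X2*W4
s3 = sop(b3, (s1, s2), (w5, w6))  # S3 = +1*b3 + S1*W5 + S2*W6
print(s1, s2, s3)
```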
Bias Importance
Consider the line y = ax + b. The bias b is the y-intercept:
• b = 0: the line passes through the origin.
• b = +v: the line shifts up by v.
• b = −v: the line shifts down by v.
Without the intercept, the line is forced through the origin; the bias frees it to sit anywhere.
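The intercept's effect can be shown in two lines of code (slope and shift values here are arbitrary examples):

```python
# y = a*x + b: b shifts the line up or down without changing its slope.
def line(a, b, x):
    return a * x + b

print(line(2, 0, 0))     # 0    -> passes through the origin
print(line(2, 1.5, 0))   # 1.5  -> shifted up
print(line(2, -1.5, 0))  # -1.5 -> shifted down
```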
Bias Importance
The same concept applies to a neuron's bias, which shifts the SOP:
S = Σ(i=1..m) Xi·Wi + BIAS
Learning Rate
0 ≤ η ≤ 1
Summary of Parameters
• Inputs Xm: X(n) = (X0, X1, X2, …)
• Weights Wm: W(n) = (W0, W1, W2, …)
• Bias: treated as a weight with a fixed input of +1
• Sum of Products (SOP) s: s = X0·W0 + X1·W1 + X2·W2 + …
• Activation Function: bin
• Outputs: Yj
• Learning Rate: 0 ≤ η ≤ 1
Other Parameters
• Step n: n = 0, 1, 2, …
• Desired Output dj:
d(n) = 1 if x(n) belongs to C1 (class 1)
d(n) = 0 if x(n) belongs to C2 (class 0)
Neural Networks Training Steps
1. Weights Initialization
2. Inputs Application
3. Sum of Inputs-Weights Products
4. Activation Function Response Calculation
5. Weights Adaptation
6. Back to Step 2
Regarding 5th Step: Weights Adaptation
• If the predicted output Y is not the same as the desired output d, then the weights are adapted according to the following equation:
W(n+1) = W(n) + η·[d(n) − Y(n)]·X(n)
where
W(n) = [b(n), W1(n), W2(n), W3(n), …, Wm(n)]
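The adaptation rule above can be sketched as an element-wise update over the weight vector, with the bias stored as the first weight and a fixed +1 as the first input. The numeric values below are illustrative only:

```python
# W(n+1) = W(n) + eta * (d(n) - Y(n)) * X(n), element-wise.
def adapt(weights, inputs, desired, predicted, eta):
    error = desired - predicted
    return [w + eta * error * x for w, x in zip(weights, inputs)]

w = [-0.5, 1.0, 1.0]  # [b, w1, w2] -- example weight vector
x = [1, 1, 0]         # X[0] = +1 is the fixed bias input

same = adapt(w, x, desired=1, predicted=1, eta=0.001)    # error = 0: unchanged
nudged = adapt(w, x, desired=1, predicted=0, eta=0.001)  # each weight moves by eta*x
print(same, nudged)
```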
Neural Networks Training Example

Step n=0
• In each step of the solution, the parameters of the neural network must be known.
• Parameters of step n=0:
η = .001
X(n) = X(0) = (+1, +1, +1, 1, 0)
W(n) = W(0) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (-1.5, -.5, -.5, 1, 1, 1, 1, -2, 1)
d(n) = d(0) = 1

Step n=0 – SOP and Output – S1
S1 = +1*b1 + X1*W1 + X2*W3 = +1*-1.5 + 1*1 + 0*1 = -.5
Y(S1) = BIN(S1) = BIN(-.5) = 0
where bin(s) = +1 if s ≥ 0, and 0 if s < 0.

Step n=0 – SOP and Output – S2
S2 = +1*b2 + X1*W2 + X2*W4 = +1*-.5 + 1*1 + 0*1 = .5
Y(S2) = BIN(.5) = 1

Step n=0 – SOP and Output – S3
The output neuron receives the activated hidden outputs Y(S1) and Y(S2):
S3 = +1*b3 + Y(S1)*W5 + Y(S2)*W6 = +1*-.5 + 0*-2 + 1*1 = .5
Y(S3) = BIN(.5) = 1

Step n=0 – Output
Y(n) = Y(0) = Y(S3) = 1

Step n=0 – Predicted vs. Desired
Y(n) = Y(0) = 1
d(n) = d(0) = 1
∵ Y(n) = d(n) ∴ Weights are correct. No adaptation.
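The step n=0 forward pass can be sketched in code with the deck's values, X = (1, 0), biases (b1, b2, b3) = (-1.5, -0.5, -0.5), and weights (w1..w6) = (1, 1, 1, 1, -2, 1):

```python
# Forward pass for step n=0; the output neuron's SOP uses the *activated*
# hidden outputs, exactly as in the hand calculation above.
def bin_act(s):
    return 1 if s >= 0 else 0

x1, x2 = 1, 0
b1, b2, b3 = -1.5, -0.5, -0.5
w1, w2, w3, w4, w5, w6 = 1, 1, 1, 1, -2, 1

s1 = b1 + x1 * w1 + x2 * w3   # -0.5
y1 = bin_act(s1)              # 0
s2 = b2 + x1 * w2 + x2 * w4   # 0.5
y2 = bin_act(s2)              # 1
s3 = b3 + y1 * w5 + y2 * w6   # 0.5
y = bin_act(s3)               # 1 -> matches d(0) = 1
print(y)
```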
Neural Networks Training Example

Step n=1
• Parameters of step n=1:
η = .001
X(n) = X(1) = (+1, +1, +1, 0, 1)
W(n) = W(1) = W(0) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (-1.5, -.5, -.5, 1, 1, 1, 1, -2, 1)
d(n) = d(1) = 1

Step n=1 – SOPs and Outputs
S1 = +1*-1.5 + 0*1 + 1*1 = -.5 → Y(S1) = BIN(-.5) = 0
S2 = +1*-.5 + 0*1 + 1*1 = .5 → Y(S2) = BIN(.5) = 1
S3 = +1*-.5 + 0*-2 + 1*1 = .5 → Y(S3) = BIN(.5) = 1

Step n=1 – Predicted vs. Desired
Y(n) = Y(1) = 1
d(n) = d(1) = 1
∵ Y(n) = d(n) ∴ Weights are correct. No adaptation.
Neural Networks Training Example

Step n=2
• Parameters of step n=2:
η = .001
X(n) = X(2) = (+1, +1, +1, 0, 0)
W(n) = W(2) = W(1) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (-1.5, -.5, -.5, 1, 1, 1, 1, -2, 1)
d(n) = d(2) = 0

Step n=2 – SOPs and Outputs
S1 = +1*-1.5 + 0*1 + 0*1 = -1.5 → Y(S1) = BIN(-1.5) = 0
S2 = +1*-.5 + 0*1 + 0*1 = -.5 → Y(S2) = BIN(-.5) = 0
S3 = +1*-.5 + 0*-2 + 0*1 = -.5 → Y(S3) = BIN(-.5) = 0

Step n=2 – Predicted vs. Desired
Y(n) = Y(2) = 0
d(n) = d(2) = 0
∵ Y(n) = d(n) ∴ Weights are correct. No adaptation.
Neural Networks Training Example

Step n=3
• Parameters of step n=3:
η = .001
X(n) = X(3) = (+1, +1, +1, 1, 1)
W(n) = W(3) = W(2) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (-1.5, -.5, -.5, 1, 1, 1, 1, -2, 1)
d(n) = d(3) = 0

Step n=3 – SOPs and Outputs
S1 = +1*-1.5 + 1*1 + 1*1 = .5 → Y(S1) = BIN(.5) = 1
S2 = +1*-.5 + 1*1 + 1*1 = 1.5 → Y(S2) = BIN(1.5) = 1
S3 = +1*-.5 + 1*-2 + 1*1 = -1.5 → Y(S3) = BIN(-1.5) = 0

Step n=3 – Predicted vs. Desired
Y(n) = Y(3) = 0
d(n) = d(3) = 0
∵ Y(n) = d(n) ∴ Weights are correct. No adaptation.
Final Weights
The current weights predicted the desired outputs for all four training samples, so no adaptation was needed:
(b1, b2, b3, w1, w2, w3, w4, w5, w6) = (-1.5, -.5, -.5, 1, 1, 1, 1, -2, 1)
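As a final check, the network with these weights can be run over all four XOR inputs:

```python
# Verify the final weights reproduce XOR on every input.
def bin_act(s):
    return 1 if s >= 0 else 0

def predict(a, b):
    b1, b2, b3 = -1.5, -0.5, -0.5
    w1, w2, w3, w4, w5, w6 = 1, 1, 1, 1, -2, 1
    h1 = bin_act(b1 + a * w1 + b * w3)   # hidden neuron 1
    h2 = bin_act(b2 + a * w2 + b * w4)   # hidden neuron 2
    return bin_act(b3 + h1 * w5 + h2 * w6)

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, predict(a, b))  # matches the XOR truth table
```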
Genetic Algorithm (GA) Optimization - Step-by-Step ExampleGenetic Algorithm (GA) Optimization - Step-by-Step Example
Genetic Algorithm (GA) Optimization - Step-by-Step ExampleAhmed Gad
 
ICCES 2017 - Crowd Density Estimation Method using Regression Analysis
ICCES 2017 - Crowd Density Estimation Method using Regression AnalysisICCES 2017 - Crowd Density Estimation Method using Regression Analysis
ICCES 2017 - Crowd Density Estimation Method using Regression AnalysisAhmed Gad
 
Backpropagation: Understanding How to Update ANNs Weights Step-by-Step
Backpropagation: Understanding How to Update ANNs Weights Step-by-StepBackpropagation: Understanding How to Update ANNs Weights Step-by-Step
Backpropagation: Understanding How to Update ANNs Weights Step-by-StepAhmed Gad
 
Computer Vision: Correlation, Convolution, and Gradient
Computer Vision: Correlation, Convolution, and GradientComputer Vision: Correlation, Convolution, and Gradient
Computer Vision: Correlation, Convolution, and GradientAhmed Gad
 
Python for Computer Vision - Revision
Python for Computer Vision - RevisionPython for Computer Vision - Revision
Python for Computer Vision - RevisionAhmed Gad
 
Anime Studio Pro 10 Tutorial as Part of Multimedia Course
Anime Studio Pro 10 Tutorial as Part of Multimedia CourseAnime Studio Pro 10 Tutorial as Part of Multimedia Course
Anime Studio Pro 10 Tutorial as Part of Multimedia CourseAhmed Gad
 
Operations in Digital Image Processing + Convolution by Example
Operations in Digital Image Processing + Convolution by ExampleOperations in Digital Image Processing + Convolution by Example
Operations in Digital Image Processing + Convolution by ExampleAhmed Gad
 
MATLAB Code + Description : Real-Time Object Motion Detection and Tracking
MATLAB Code + Description : Real-Time Object Motion Detection and TrackingMATLAB Code + Description : Real-Time Object Motion Detection and Tracking
MATLAB Code + Description : Real-Time Object Motion Detection and TrackingAhmed Gad
 
MATLAB Code + Description : Very Simple Automatic English Optical Character R...
MATLAB Code + Description : Very Simple Automatic English Optical Character R...MATLAB Code + Description : Very Simple Automatic English Optical Character R...
MATLAB Code + Description : Very Simple Automatic English Optical Character R...Ahmed Gad
 
Graduation Project - Face Login : A Robust Face Identification System for Sec...
Graduation Project - Face Login : A Robust Face Identification System for Sec...Graduation Project - Face Login : A Robust Face Identification System for Sec...
Graduation Project - Face Login : A Robust Face Identification System for Sec...Ahmed Gad
 
Introduction to MATrices LABoratory (MATLAB) as Part of Digital Signal Proces...
Introduction to MATrices LABoratory (MATLAB) as Part of Digital Signal Proces...Introduction to MATrices LABoratory (MATLAB) as Part of Digital Signal Proces...
Introduction to MATrices LABoratory (MATLAB) as Part of Digital Signal Proces...Ahmed Gad
 

More from Ahmed Gad (20)

ICEIT'20 Cython for Speeding-up Genetic Algorithm
ICEIT'20 Cython for Speeding-up Genetic AlgorithmICEIT'20 Cython for Speeding-up Genetic Algorithm
ICEIT'20 Cython for Speeding-up Genetic Algorithm
 
NumPyCNNAndroid: A Library for Straightforward Implementation of Convolutiona...
NumPyCNNAndroid: A Library for Straightforward Implementation of Convolutiona...NumPyCNNAndroid: A Library for Straightforward Implementation of Convolutiona...
NumPyCNNAndroid: A Library for Straightforward Implementation of Convolutiona...
 
Python for Computer Vision - Revision 2nd Edition
Python for Computer Vision - Revision 2nd EditionPython for Computer Vision - Revision 2nd Edition
Python for Computer Vision - Revision 2nd Edition
 
Multi-Objective Optimization using Non-Dominated Sorting Genetic Algorithm wi...
Multi-Objective Optimization using Non-Dominated Sorting Genetic Algorithm wi...Multi-Objective Optimization using Non-Dominated Sorting Genetic Algorithm wi...
Multi-Objective Optimization using Non-Dominated Sorting Genetic Algorithm wi...
 
M.Sc. Thesis - Automatic People Counting in Crowded Scenes
M.Sc. Thesis - Automatic People Counting in Crowded ScenesM.Sc. Thesis - Automatic People Counting in Crowded Scenes
M.Sc. Thesis - Automatic People Counting in Crowded Scenes
 
Derivation of Convolutional Neural Network from Fully Connected Network Step-...
Derivation of Convolutional Neural Network from Fully Connected Network Step-...Derivation of Convolutional Neural Network from Fully Connected Network Step-...
Derivation of Convolutional Neural Network from Fully Connected Network Step-...
 
Introduction to Optimization with Genetic Algorithm (GA)
Introduction to Optimization with Genetic Algorithm (GA)Introduction to Optimization with Genetic Algorithm (GA)
Introduction to Optimization with Genetic Algorithm (GA)
 
Derivation of Convolutional Neural Network (ConvNet) from Fully Connected Net...
Derivation of Convolutional Neural Network (ConvNet) from Fully Connected Net...Derivation of Convolutional Neural Network (ConvNet) from Fully Connected Net...
Derivation of Convolutional Neural Network (ConvNet) from Fully Connected Net...
 
Avoid Overfitting with Regularization
Avoid Overfitting with RegularizationAvoid Overfitting with Regularization
Avoid Overfitting with Regularization
 
Genetic Algorithm (GA) Optimization - Step-by-Step Example
Genetic Algorithm (GA) Optimization - Step-by-Step ExampleGenetic Algorithm (GA) Optimization - Step-by-Step Example
Genetic Algorithm (GA) Optimization - Step-by-Step Example
 
ICCES 2017 - Crowd Density Estimation Method using Regression Analysis
ICCES 2017 - Crowd Density Estimation Method using Regression AnalysisICCES 2017 - Crowd Density Estimation Method using Regression Analysis
ICCES 2017 - Crowd Density Estimation Method using Regression Analysis
 
Backpropagation: Understanding How to Update ANNs Weights Step-by-Step
Backpropagation: Understanding How to Update ANNs Weights Step-by-StepBackpropagation: Understanding How to Update ANNs Weights Step-by-Step
Backpropagation: Understanding How to Update ANNs Weights Step-by-Step
 
Computer Vision: Correlation, Convolution, and Gradient
Computer Vision: Correlation, Convolution, and GradientComputer Vision: Correlation, Convolution, and Gradient
Computer Vision: Correlation, Convolution, and Gradient
 
Python for Computer Vision - Revision
Python for Computer Vision - RevisionPython for Computer Vision - Revision
Python for Computer Vision - Revision
 
Anime Studio Pro 10 Tutorial as Part of Multimedia Course
Anime Studio Pro 10 Tutorial as Part of Multimedia CourseAnime Studio Pro 10 Tutorial as Part of Multimedia Course
Anime Studio Pro 10 Tutorial as Part of Multimedia Course
 
Operations in Digital Image Processing + Convolution by Example
Operations in Digital Image Processing + Convolution by ExampleOperations in Digital Image Processing + Convolution by Example
Operations in Digital Image Processing + Convolution by Example
 
MATLAB Code + Description : Real-Time Object Motion Detection and Tracking
MATLAB Code + Description : Real-Time Object Motion Detection and TrackingMATLAB Code + Description : Real-Time Object Motion Detection and Tracking
MATLAB Code + Description : Real-Time Object Motion Detection and Tracking
 
MATLAB Code + Description : Very Simple Automatic English Optical Character R...
MATLAB Code + Description : Very Simple Automatic English Optical Character R...MATLAB Code + Description : Very Simple Automatic English Optical Character R...
MATLAB Code + Description : Very Simple Automatic English Optical Character R...
 
Graduation Project - Face Login : A Robust Face Identification System for Sec...
Graduation Project - Face Login : A Robust Face Identification System for Sec...Graduation Project - Face Login : A Robust Face Identification System for Sec...
Graduation Project - Face Login : A Robust Face Identification System for Sec...
 
Introduction to MATrices LABoratory (MATLAB) as Part of Digital Signal Proces...
Introduction to MATrices LABoratory (MATLAB) as Part of Digital Signal Proces...Introduction to MATrices LABoratory (MATLAB) as Part of Digital Signal Proces...
Introduction to MATrices LABoratory (MATLAB) as Part of Digital Signal Proces...
 

Recently uploaded

Unit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptxUnit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptxVishalSingh1417
 
Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...
Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...
Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...fonyou31
 
Paris 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityParis 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityGeoBlogs
 
The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13Steve Thomason
 
BAG TECHNIQUE Bag technique-a tool making use of public health bag through wh...
BAG TECHNIQUE Bag technique-a tool making use of public health bag through wh...BAG TECHNIQUE Bag technique-a tool making use of public health bag through wh...
BAG TECHNIQUE Bag technique-a tool making use of public health bag through wh...Sapna Thakur
 
Key note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdfKey note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdfAdmir Softic
 
A Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformA Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformChameera Dedduwage
 
Arihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdfArihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdfchloefrazer622
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxiammrhaywood
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...EduSkills OECD
 
Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104misteraugie
 
Disha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdfDisha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdfchloefrazer622
 
Measures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and ModeMeasures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and ModeThiyagu K
 
Measures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDMeasures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDThiyagu K
 
fourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingfourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingTeacherCyreneCayanan
 
Accessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactAccessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactdawncurless
 
Z Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot GraphZ Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot GraphThiyagu K
 
Student login on Anyboli platform.helpin
Student login on Anyboli platform.helpinStudent login on Anyboli platform.helpin
Student login on Anyboli platform.helpinRaunakKeshri1
 

Recently uploaded (20)

Unit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptxUnit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptx
 
Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...
Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...
Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...
 
Paris 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityParis 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activity
 
The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13
 
BAG TECHNIQUE Bag technique-a tool making use of public health bag through wh...
BAG TECHNIQUE Bag technique-a tool making use of public health bag through wh...BAG TECHNIQUE Bag technique-a tool making use of public health bag through wh...
BAG TECHNIQUE Bag technique-a tool making use of public health bag through wh...
 
Key note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdfKey note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdf
 
A Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformA Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy Reform
 
Arihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdfArihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdf
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
 
Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104
 
Disha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdfDisha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdf
 
Mattingly "AI & Prompt Design: Structured Data, Assistants, & RAG"
Mattingly "AI & Prompt Design: Structured Data, Assistants, & RAG"Mattingly "AI & Prompt Design: Structured Data, Assistants, & RAG"
Mattingly "AI & Prompt Design: Structured Data, Assistants, & RAG"
 
Measures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and ModeMeasures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and Mode
 
Measures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDMeasures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SD
 
fourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingfourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writing
 
Accessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactAccessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impact
 
Z Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot GraphZ Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot Graph
 
Student login on Anyboli platform.helpin
Student login on Anyboli platform.helpinStudent login on Anyboli platform.helpin
Student login on Anyboli platform.helpin
 
Mattingly "AI & Prompt Design: The Basics of Prompt Design"
Mattingly "AI & Prompt Design: The Basics of Prompt Design"Mattingly "AI & Prompt Design: The Basics of Prompt Design"
Mattingly "AI & Prompt Design: The Basics of Prompt Design"
 

Artificial Neural Networks (ANNs) - XOR - Step-By-Step

  • 1. Artificial Neural Networks (ANNs) XOR Step-By-Step MENOUFIA UNIVERSITY FACULTY OF COMPUTERS AND INFORMATION ALL DEPARTMENTS ARTIFICIAL INTELLIGENCE ‫المنوفية‬ ‫جامعة‬ ‫والمعلومات‬ ‫الحاسبات‬ ‫كلية‬ ‫األقسام‬ ‫جميع‬ ‫الذكاء‬‫اإلصطناعي‬ ‫المنوفية‬ ‫جامعة‬ Ahmed Fawzy Gad ahmed.fawzy@ci.menofia.edu.eg
  • 3. Neural Networks Input Hidden Output BA 01 1 10 00 0 11
  • 5. Neural Networks BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Can`t be Solved Linearly. Single Layer Perceptron Can`t Work. Use Hidden Layer.
  • 8. Hidden Layer Start by Two Neurons Input Output BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Hidden A B
  • 9. Output Layer Input Output BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Hidden 1/0 𝒀𝒋 A B
  • 10. Weights Input Output BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Hidden Weights=𝑾𝒊 A B 1/0 𝒀𝒋
  • 11. Weights Input Layer – Hidden Layer Input Output BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Hidden Weights=𝑾𝒊 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 A B 1/0 𝒀𝒋
  • 12. Weights Hidden Layer – Output Layer Input Output BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Hidden Weights=𝑾𝒊 𝑾 𝟓 𝑾 𝟔 A B 1/0 𝒀𝒋 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 13. All Layers Input Output BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Hidden Weights=𝑾𝒊 𝑾 𝟓 𝑾 𝟔 1/0 𝒀𝒋 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 14. Activation Function Output 𝒀𝒋 BA 01 1 10 00 0 11 Input Hidden 1/0 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 15. Activation Function Output 𝒀𝒋 BA 01 1 10 00 0 11 Input Hidden 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 1/0 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 16. Activation Function Output 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden 1/0 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 17. Activation Function Components Output 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden 1/0 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 18. Activation Function Inputs Output s 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden 1/0 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 19. Activation Function Inputs Output s 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden 1/0 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 s=SOP(𝑿𝒊, 𝑾𝒊)
  • 20. Activation Function Inputs Output s 𝑿𝒊=Inputs 𝑾𝒊=Weights 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden 1/0 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 s=SOP(𝑿𝒊, 𝑾𝒊)
  • 21. Activation Function Inputs Output s 𝑿𝒊=Inputs 𝑾𝒊=Weights 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden 1/0 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 s=SOP(𝑿𝒊, 𝑾𝒊)
  • 22. 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 Activation Function Inputs Output s 𝑿𝒊=Inputs 𝑾𝒊=Weights 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden S = Σ(𝒊=𝟏..𝒎) 𝑿𝒊 𝑾𝒊 s=SOP(𝑿𝒊, 𝑾𝒊) 1/0
  • 23. Activation Function Inputs Output s 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 S = Σ(𝒊=𝟏..𝒎) 𝑿𝒊 𝑾𝒊 1/0 Each Hidden/Output Layer Neuron has its SOP.
  • 24. Activation Function Inputs Output s 𝑿 𝟏 𝑿 𝟐 𝑺 𝟏=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟑) 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 S = Σ(𝒊=𝟏..𝒎) 𝑿𝒊 𝑾𝒊 1/0
  • 25. Activation Function Inputs Output s 𝑿 𝟏 𝑿 𝟐 𝑺 𝟐=(𝑿 𝟏 𝑾 𝟐+𝑿 𝟐 𝑾 𝟒) 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 S = Σ(𝒊=𝟏..𝒎) 𝑿𝒊 𝑾𝒊 1/0
  • 26. Activation Function Inputs Output s 𝑿 𝟏 𝑿 𝟐 𝑺 𝟑=(𝑺 𝟏 𝑾 𝟓+𝑺 𝟐 𝑾 𝟔) 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 S = Σ(𝒊=𝟏..𝒎) 𝑿𝒊 𝑾𝒊 1/0
  • 27. Activation Function Outputs Output F(s)s 𝑿 𝟏 𝑿 𝟐 Class Label 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden 1/0 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 28. Activation Function Outputs Output F(s)s 𝑿 𝟏 𝑿 𝟐 Class Label 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden 1/0 Each Hidden/Output Layer Neuron has its Activation Function. 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 30. Activation Functions Which activation function to use? Outputs Class Labels Activation Function TWO Class Labels TWO Outputs One that gives two outputs. Which activation function to use? 𝑪𝒋𝒀𝒋 BA 01 1 10 00 0 11 BA 01 1 10 00 0 11
  • 32. Activation Function Output F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 1/0 Input Hidden 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 33. Bias Input Output BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 1/0 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 34. Bias Hidden Layer Neurons Input Output BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0
  • 35. Bias Output Layer Neurons Input Output BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟑 1/0 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 36. All Bias Values Input Output BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 37. Bias Add Bias to SOP Input Output BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 𝑺 𝟏=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟑) 𝑺 𝟐=(𝑿 𝟏 𝑾 𝟐+𝑿 𝟐 𝑾 𝟒) 𝑺 𝟑=(𝑺 𝟏 𝑾 𝟓+𝑺 𝟐 𝑾 𝟔) 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 38. Bias Add Bias to SOP Input Output BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 𝑺 𝟏=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟑) 1/0 +1 𝒃 𝟑 𝑺 𝟏=(+𝟏𝒃 𝟏+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟑) 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 39. Bias Add Bias to SOP Input Output BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 𝑺 𝟐=(𝑿 𝟏 𝑾 𝟐+𝑿 𝟐 𝑾 𝟒) 1/0 +1 𝒃 𝟑 𝑺 𝟐=(+𝟏𝒃 𝟐+𝑿 𝟏 𝑾 𝟐+𝑿 𝟐 𝑾 𝟒) 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 40. Bias Add Bias to SOP Input Output BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 𝑺 𝟑=(𝑺 𝟏 𝑾 𝟓+𝑺 𝟐 𝑾 𝟔) 1/0 +1 𝒃 𝟑 𝑺 𝟑=(+𝟏𝒃 𝟑+𝑺 𝟏 𝑾 𝟓+𝑺 𝟐 𝑾 𝟔) 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
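The three biased sums of products above can be sketched in code. The weight and bias values below are the ones the slides' training example later assigns (b1 = −1.5, b2 = b3 = −0.5, w1..w4 = +1, w5 = −2, w6 = +1); the input pattern (1, 0) is chosen purely for illustration.

```python
# Sketch of the biased SOPs S1, S2, S3 from the slides.
# Weights/biases are taken from the training example at step n=0;
# the input pattern (X1, X2) = (1, 0) is an illustrative choice.

def sop(bias, inputs, weights):
    """Bias plus the sum of input-weight products."""
    return bias + sum(x * w for x, w in zip(inputs, weights))

X1, X2 = 1, 0
b1, b2, b3 = -1.5, -0.5, -0.5
W1, W2, W3, W4, W5, W6 = 1, 1, 1, 1, -2, 1

S1 = sop(b1, (X1, X2), (W1, W3))   # first hidden neuron
S2 = sop(b2, (X1, X2), (W2, W4))   # second hidden neuron
print(S1, S2)                      # -0.5 0.5

# S3 feeds on the *activated* hidden outputs, as the training example shows:
Y1, Y2 = (1 if S1 >= 0 else 0), (1 if S2 >= 0 else 0)
S3 = sop(b3, (Y1, Y2), (W5, W6))
print(S3)                          # 0.5
```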
  • 41. Bias Importance Input Output BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 42. Bias Importance Input Output X Y BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 43. Bias Importance Input Output X Y y=ax+b BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 44. Bias Importance Input Output X Y y=ax+b BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 45. Bias Importance Input Output X Y y=ax+b Y-Intercept BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 46. Bias Importance Input Output X Y y=ax+b Y-Intercept b=0 BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 47. Bias Importance Input Output X Y Y-Intercept b=0 BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 y=ax+b
  • 48. Bias Importance Input Output X Y Y-Intercept b=+v BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 y=ax+b
  • 49. Bias Importance Input Output X Y Y-Intercept b=+v BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 y=ax+b
  • 50. Bias Importance Input Output X Y Y-Intercept b=-v BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 y=ax+b
  • 51. Bias Importance Input Output X Y Y-Intercept b=+v BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 y=ax+b
  • 52. Bias Importance Input Output Same Concept Applies to Bias S = Σ(𝒊=𝟏..𝒎) 𝑿𝒊 𝑾𝒊 + BIAS BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 53. Bias Importance Input Output S = Σ(𝒊=𝟏..𝒎) 𝑿𝒊 𝑾𝒊 + BIAS BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 54. Bias Importance Input Output S = Σ(𝒊=𝟏..𝒎) 𝑿𝒊 𝑾𝒊 + BIAS BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 55. Bias Importance Input Output S = Σ(𝒊=𝟏..𝒎) 𝑿𝒊 𝑾𝒊 + BIAS BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 56. Bias Importance Input Output S = Σ(𝒊=𝟏..𝒎) 𝑿𝒊 𝑾𝒊 + BIAS BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 57. Learning Rate 𝟎 ≤ η ≤ 𝟏 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 58. Summary of Parameters Inputs 𝑿 𝒎 𝟎 ≤ η ≤ 𝟏 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐, …) F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 59. Summary of Parameters Weights 𝑾 𝒎 𝟎 ≤ η ≤ 𝟏 W(𝒏)=(𝑾 𝟎, 𝑾 𝟏,𝑾 𝟐, …) 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐, …) F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 60. Summary of Parameters Bias 𝟎 ≤ η ≤ 𝟏 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐, …) W(𝒏)=(𝑾 𝟎, 𝑾 𝟏,𝑾 𝟐, …) F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 61. Summary of Parameters Sum Of Products (SOP) 𝒔 s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐+…) 𝟎 ≤ η ≤ 𝟏 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐, …) W(𝒏)=(𝑾 𝟎, 𝑾 𝟏,𝑾 𝟐, …) F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 62. Summary of Parameters Activation Function 𝒃𝒊𝒏 s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐+…) 𝟎 ≤ η ≤ 𝟏 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐, …) W(𝒏)=(𝑾 𝟎, 𝑾 𝟏,𝑾 𝟐, …) F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 63. Summary of Parameters Outputs 𝒀𝒋 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐+…) 𝟎 ≤ η ≤ 𝟏 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐, …) W(𝒏)=(𝑾 𝟎, 𝑾 𝟏,𝑾 𝟐, …)
  • 64. Summary of Parameters Learning Rate η 𝟎 ≤ η ≤ 𝟏 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐+…) 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐, …) W(𝒏)=(𝑾 𝟎, 𝑾 𝟏,𝑾 𝟐, …)
  • 65. Other Parameters Step n 𝒏 = 𝟎, 𝟏, 𝟐, … F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐+…) 𝟎 ≤ η ≤ 𝟏 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐, …) W(𝒏)=(𝑾 𝟎, 𝑾 𝟏,𝑾 𝟐, …)
  • 66. Other Parameters Desired Output 𝒅𝒋 𝒏 = 𝟎, 𝟏, 𝟐, … 𝒅 𝒏 = 𝟏, 𝒙 𝒏 𝒃𝒆𝒍𝒐𝒏𝒈𝒔 𝒕𝒐 𝑪𝟏 (𝟏) 𝟎, 𝒙 𝒏 𝒃𝒆𝒍𝒐𝒏𝒈𝒔 𝒕𝒐 𝑪𝟐 (𝟎) BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐+…) 𝟎 ≤ η ≤ 𝟏 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐, …) W(𝒏)=(𝑾 𝟎, 𝑾 𝟏,𝑾 𝟐, …)
  • 67. Neural Networks Training Steps Weights Initialization Inputs Application Sum of Inputs-Weights Products Activation Function Response Calculation Weights Adaptation Back to Step 2 1 2 3 4 5 6
  • 68. Regarding 5th Step: Weights Adaptation • If the predicted output Y is not the same as the desired output d, then weights are to be adapted according to the following equation: 𝑾 𝒏 + 𝟏 = 𝑾 𝒏 + η 𝒅 𝒏 − 𝒀 𝒏 𝑿(𝒏) Where 𝑾 𝒏 = [𝒃 𝒏 , 𝑾 𝟏(𝒏), 𝑾 𝟐(𝒏), 𝑾 𝟑(𝒏), … , 𝑾 𝒎(𝒏)]
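The six training steps and this adaptation rule can be sketched together for a single neuron. The example below trains one binary-step neuron on the linearly separable AND problem, purely to illustrate the loop; the AND data, η = 1, and zero initialization are illustrative choices, not values from the slides (the slides' XOR example uses η = .001 and a multi-layer network).

```python
# Illustrative single-neuron training loop following the six steps:
# (1) initialize weights, (2) apply inputs, (3) form the SOP,
# (4) apply the activation, (5) adapt weights when Y != d, (6) repeat.
# Trains on AND (linearly separable) -- not the XOR network itself.

samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
W = [0, 0, 0]        # [bias weight, w1, w2] -- step 1: initialization
eta = 1              # learning rate, within 0 <= eta <= 1

for _ in range(20):                            # step 6: repeat over the data
    for (x1, x2), d in samples:
        X = [1, x1, x2]                        # step 2: bias input fixed at +1
        s = sum(w * x for w, x in zip(W, X))   # step 3: SOP
        Y = 1 if s >= 0 else 0                 # step 4: bin activation
        # step 5: W(n+1) = W(n) + eta*(d - Y)*X(n); no change when Y == d
        W = [w + eta * (d - Y) * x for w, x in zip(W, X)]

print(W)  # a weight vector that realizes AND
```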
  • 69. Neural Networks Training Example Step n=0 • In each step in the solution, the parameters of the neural network must be known. • Parameters of step n=0: η = .001 𝑋 𝑛 = 𝑋 0 = +1, +1, +1,1, 0 𝑊 𝑛 = 𝑊 0 = 𝑏1, 𝑏2, 𝑏3, 𝑤1, 𝑤2, 𝑤3, 𝑤4, 𝑤5, 𝑤6 = −1.5, −.5, −.5, 1, 1, 1, 1, −2, 1 𝑑 𝑛 = 𝑑 0 = 1 BA 01 1 => 1 10 00 0 => 0 11
  • 70. Neural Networks Training Example Step n=0 s 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 +1 −𝟏. 𝟓 +1 −. 𝟓 1/0 +1 −. 𝟓 −2 +𝟏 A B +𝟏 +𝟏 +𝟏 +𝟏 bin BA 01 1 => 1 10 00 0 => 0 11
  • 71. Neural Networks Training Example Step n=0 – SOP – 𝑺 𝟏 𝑺 𝟏=(+𝟏𝒃 𝟏+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟑) =+1*-1.5+1*1+0*1 =-.5 BA 01 1 => 1 10 00 0 => 0 11 s 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 +1 −𝟏. 𝟓 +1 −. 𝟓 1/0 +1 −. 𝟓 −2 +𝟏 A B +𝟏 +𝟏 +𝟏 +𝟏 bin
  • 72. Neural Networks Training Example Step n=0 – Output – 𝑺 𝟏 𝒀 𝑺 𝟏 = = 𝑩𝑰𝑵 𝑺 𝟏 = 𝑩𝑰𝑵 −. 𝟓 = 𝟎 𝒃𝒊𝒏 𝒔 = +𝟏, 𝒔 ≥ 𝟎 𝟎, 𝒔 < 𝟎 BA 01 1 => 1 10 00 0 => 0 11 s 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 +1 −𝟏. 𝟓 +1 −. 𝟓 1/0 +1 −. 𝟓 −2 +𝟏 A B +𝟏 +𝟏 +𝟏 +𝟏 bin
  • 73. Neural Networks Training Example Step n=0 – SOP – 𝑺 𝟐 𝑺 𝟐=(+𝟏𝒃 𝟐+𝑿 𝟏 𝑾 𝟐+𝑿 𝟐 𝑾 𝟒) =+1*-.5+1*1+0*1 =.5 BA 01 1 => 1 10 00 0 => 0 11 s 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 +1 −𝟏. 𝟓 +1 −. 𝟓 1/0 +1 −. 𝟓 −2 +𝟏 A B +𝟏 +𝟏 +𝟏 +𝟏 bin
  • 74. Neural Networks Training Example Step n=0 – Output – 𝑺 𝟐 𝒀 𝑺 𝟐 = 𝑩𝑰𝑵 𝑺 𝟐 = 𝑩𝑰𝑵 . 𝟓 = 1 𝒃𝒊𝒏 𝒔 = +𝟏, 𝒔 ≥ 𝟎 𝟎, 𝒔 < 𝟎 BA 01 1 => 1 10 00 0 => 0 11 s 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 +1 −𝟏. 𝟓 +1 −. 𝟓 1/0 +1 −. 𝟓 −2 +𝟏 A B +𝟏 +𝟏 +𝟏 +𝟏 bin
  • 75. Neural Networks Training Example Step n=0 – SOP – 𝑺 𝟑 𝑺 𝟑=(+𝟏𝒃 𝟑+𝑺 𝟏 𝑾 𝟓+𝑺 𝟐 𝑾 𝟔) =+1*-.5+0*-2+1*1 =.5 BA 01 1 => 1 10 00 0 => 0 11 s 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 +1 −𝟏. 𝟓 +1 −. 𝟓 1/0 +1 −. 𝟓 −2 +𝟏 A B +𝟏 +𝟏 +𝟏 +𝟏 bin
  • 76. Neural Networks Training Example Step n=0 – Output – 𝑺 𝟑 𝒀 𝑺3 = = 𝑩𝑰𝑵 𝑺3 = 𝑩𝑰𝑵 . 𝟓 = 1 𝒃𝒊𝒏 𝒔 = +𝟏, 𝒔 ≥ 𝟎 𝟎, 𝒔 < 𝟎 BA 01 1 => 1 10 00 0 => 0 11 s 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 +1 −𝟏. 𝟓 +1 −. 𝟓 1/0 +1 −. 𝟓 −2 +𝟏 A B +𝟏 +𝟏 +𝟏 +𝟏 bin
  • 77. Neural Networks Training Example Step n=0 - Output 𝒀 𝒏 = 𝒀 𝟎 = 𝒀 𝑺3 = 1 BA 01 1 => 1 10 00 0 => 0 11 s 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 +1 −𝟏. 𝟓 +1 −. 𝟓 1/0 +1 −. 𝟓 −2 +𝟏 A B +𝟏 +𝟏 +𝟏 +𝟏 bin
  • 78. Neural Networks Training Example Step n=0 Predicted Vs. Desired 𝒀 𝒏 = 𝒀 𝟎 = 1 𝐝 𝒏 = 𝒅 𝟎 = 1 ∵ 𝒀 𝒏 = 𝒅 𝒏 ∴ Weights are Correct. No Adaptation BA 01 1 => 1 10 00 0 => 0 11 s 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 +1 −𝟏. 𝟓 +1 −. 𝟓 1/0 +1 −. 𝟓 −2 +𝟏 A B +𝟏 +𝟏 +𝟏 +𝟏 bin
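The forward pass just traced can be reproduced for all four XOR patterns. This sketch hard-codes the slides' weights (b1 = −1.5, b2 = b3 = −0.5, w1..w4 = +1, w5 = −2, w6 = +1) and checks that the network already outputs A XOR B in every case, which is why no adaptation is needed at any step.

```python
# Forward pass of the 2-2-1 network with the slides' weights.
# Hidden neuron 1 behaves like AND, hidden neuron 2 like OR, and the
# output neuron combines them (OR minus a strong AND penalty) into XOR.

def bin_step(s):
    """Binary step activation: 1 if s >= 0, else 0."""
    return 1 if s >= 0 else 0

def forward(A, B):
    Y1 = bin_step(-1.5 + 1 * A + 1 * B)        # S1 = b1 + A*W1 + B*W3
    Y2 = bin_step(-0.5 + 1 * A + 1 * B)        # S2 = b2 + A*W2 + B*W4
    return bin_step(-0.5 + -2 * Y1 + 1 * Y2)   # S3 = b3 + Y1*W5 + Y2*W6

for A, B in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(A, B, "->", forward(A, B))           # matches A XOR B
```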
  • 79. Neural Networks Training Example Step n=1 • In each step in the solution, the parameters of the neural network must be known. • Parameters of step n=1: η = .001 𝑋 𝑛 = 𝑋 1 = +1, +1, +1,0, 1 𝑊 𝑛 = 𝑊 1 = 𝑊 0 = 𝑏1, 𝑏2, 𝑏3, 𝑤1, 𝑤2, 𝑤3, 𝑤4, 𝑤5, 𝑤6 = −1.5, −.5, −.5, 1, 1, 1, 1, −2, 1 𝑑 𝑛 = 𝑑 1 = +1 BA 01 1 => 1 10 00 0 => 0 11
  • 80. Neural Networks Training Example Step n=1 s 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 +1 −𝟏. 𝟓 +1 −. 𝟓 1/0 +1 −. 𝟓 −2 +𝟏 A B +𝟏 +𝟏 +𝟏 +𝟏 bin BA 01 1 => 1 10 00 0 => 0 11
  • 81. Neural Networks Training Example Step n=1 – SOP – S1: S1 = +1·b1 + X1·W1 + X2·W3 = +1·(−1.5) + 0·1 + 1·1 = −0.5
  • 82. Neural Networks Training Example Step n=1 – Output – S1: Y_S1 = BIN(S1) = BIN(−0.5) = 0, where bin(s) = 1 if s ≥ 0, and 0 if s < 0
  • 83. Neural Networks Training Example Step n=1 – SOP – S2: S2 = +1·b2 + X1·W2 + X2·W4 = +1·(−0.5) + 0·1 + 1·1 = 0.5
  • 84. Neural Networks Training Example Step n=1 – Output – S2: Y_S2 = BIN(S2) = BIN(0.5) = 1
  • 85. Neural Networks Training Example Step n=1 – SOP – S3 (using the hidden-layer outputs): S3 = +1·b3 + Y_S1·W5 + Y_S2·W6 = +1·(−0.5) + 0·(−2) + 1·1 = 0.5
  • 86. Neural Networks Training Example Step n=1 – Output – S3: Y_S3 = BIN(S3) = BIN(0.5) = 1
  • 87. Neural Networks Training Example Step n=1 – Output: Y(n) = Y(1) = Y_S3 = 1
  • 88. Neural Networks Training Example Step n=1 Predicted vs. Desired: Y(n) = Y(1) = 1 and d(n) = d(1) = 1. Since Y(n) = d(n), the weights are correct; no adaptation is needed.
  • 89. Neural Networks Training Example Step n=2 • In each step of the solution, the parameters of the neural network must be known. • Parameters of step n=2: η = .001, X(n) = X(2) = (+1, +1, +1, 0, 0), W(n) = W(2) = W(1) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (−1.5, −.5, −.5, 1, 1, 1, 1, −2, 1), d(n) = d(2) = 0
  • 90. Neural Networks Training Example Step n=2 (network diagram: biases b1 = −1.5, b2 = b3 = −.5; weights W1–W4 = +1, W5 = −2, W6 = +1; bin activation; XOR truth table: 00→0, 01→1, 10→1, 11→0)
  • 91. Neural Networks Training Example Step n=2 – SOP – S1: S1 = +1·b1 + X1·W1 + X2·W3 = +1·(−1.5) + 0·1 + 0·1 = −1.5
  • 92. Neural Networks Training Example Step n=2 – Output – S1: Y_S1 = BIN(S1) = BIN(−1.5) = 0, where bin(s) = 1 if s ≥ 0, and 0 if s < 0
  • 93. Neural Networks Training Example Step n=2 – SOP – S2: S2 = +1·b2 + X1·W2 + X2·W4 = +1·(−0.5) + 0·1 + 0·1 = −0.5
  • 94. Neural Networks Training Example Step n=2 – Output – S2: Y_S2 = BIN(S2) = BIN(−0.5) = 0
  • 95. Neural Networks Training Example Step n=2 – SOP – S3 (using the hidden-layer outputs): S3 = +1·b3 + Y_S1·W5 + Y_S2·W6 = +1·(−0.5) + 0·(−2) + 0·1 = −0.5
  • 96. Neural Networks Training Example Step n=2 – Output – S3: Y_S3 = BIN(S3) = BIN(−0.5) = 0
  • 97. Neural Networks Training Example Step n=2 – Output: Y(n) = Y(2) = Y_S3 = 0
  • 98. Neural Networks Training Example Step n=2 Predicted vs. Desired: Y(n) = Y(2) = 0 and d(n) = d(2) = 0. Since Y(n) = d(n), the weights are correct; no adaptation is needed.
  • 99. Neural Networks Training Example Step n=3 • In each step of the solution, the parameters of the neural network must be known. • Parameters of step n=3: η = .001, X(n) = X(3) = (+1, +1, +1, 1, 1), W(n) = W(3) = W(2) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (−1.5, −.5, −.5, 1, 1, 1, 1, −2, 1), d(n) = d(3) = 0
  • 100. Neural Networks Training Example Step n=3 (network diagram: biases b1 = −1.5, b2 = b3 = −.5; weights W1–W4 = +1, W5 = −2, W6 = +1; bin activation; XOR truth table: 00→0, 01→1, 10→1, 11→0)
  • 101. Neural Networks Training Example Step n=3 – SOP – S1: S1 = +1·b1 + X1·W1 + X2·W3 = +1·(−1.5) + 1·1 + 1·1 = 0.5
  • 102. Neural Networks Training Example Step n=3 – Output – S1: Y_S1 = BIN(S1) = BIN(0.5) = 1, where bin(s) = 1 if s ≥ 0, and 0 if s < 0
  • 103. Neural Networks Training Example Step n=3 – SOP – S2: S2 = +1·b2 + X1·W2 + X2·W4 = +1·(−0.5) + 1·1 + 1·1 = 1.5
  • 104. Neural Networks Training Example Step n=3 – Output – S2: Y_S2 = BIN(S2) = BIN(1.5) = 1
  • 105. Neural Networks Training Example Step n=3 – SOP – S3 (using the hidden-layer outputs): S3 = +1·b3 + Y_S1·W5 + Y_S2·W6 = +1·(−0.5) + 1·(−2) + 1·1 = −1.5
  • 106. Neural Networks Training Example Step n=3 – Output – S3: Y_S3 = BIN(S3) = BIN(−1.5) = 0
  • 107. Neural Networks Training Example Step n=3 – Output: Y(n) = Y(3) = Y_S3 = 0
  • 108. Neural Networks Training Example Step n=3 Predicted vs. Desired: Y(n) = Y(3) = 0 and d(n) = d(3) = 0. Since Y(n) = d(n), the weights are correct; no adaptation is needed.
  • 109. Final Weights: the current weights predicted the desired output for all four XOR inputs, so no adaptation was ever needed. Final weights: b1 = −1.5, b2 = b3 = −.5, W1 = W2 = W3 = W4 = +1, W5 = −2, W6 = +1.
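The four steps above can be verified in one loop: a sketch that runs every XOR input through the network with the final weights and checks each prediction against the desired output (function names are ours; the weights and the bin activation are those from the slides):

```python
def bin_step(s):
    """Binary step activation: 1 when s >= 0, else 0."""
    return 1 if s >= 0 else 0

def forward(x1, x2):
    """Forward pass with the final weights: b1=-1.5, b2=b3=-0.5, W1-W4=+1, W5=-2, W6=+1."""
    s1 = bin_step(-1.5 + x1 + x2)       # hidden neuron 1
    s2 = bin_step(-0.5 + x1 + x2)       # hidden neuron 2
    return bin_step(-0.5 - 2 * s1 + s2) # output neuron

# XOR truth table: every prediction matches the desired output
for (x1, x2), d in [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]:
    assert forward(x1, x2) == d
print("all four XOR cases predicted correctly")
```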