Neural Network - Feed Forward - Back Propagation Visualization
1. Neural Network - Forward and Back
Propagation, Gradient Descent
Visualize Forward and Back Propagation in a
simple Neural Network using IBM Rational
Rhapsody MDA tool to design the Neural
Network and execute the model. Minimize
the Cost (Loss) using Gradient Descent
2. The Problem: wX + b = Y
•Question: Given a set of inputs X, a set of
targets Y (expected, desired), and initial
values for the Weights, can the Neural
Network model compute the outputs,
Yhat (actual, computed, predicted), to
match the Y targets within a predefined
error (i.e. |Y – Yhat| <= 1e-6)?
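As a concrete illustration of the question (the array shapes, seed, and variable names are illustrative assumptions, not from the slides), the problem data and the stopping test might look like:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
X = rng.uniform(-1.0, 1.0, size=3)    # inputs
Y = rng.uniform(-1.0, 1.0, size=3)    # desired targets
w = rng.uniform(-0.1, 0.1, size=3)    # initial weights
b = 0.0                               # bias

Yhat = w * X + b                      # predicted (computed) outputs
# Converged when every output matches its target within the error bound
converged = np.all(np.abs(Y - Yhat) <= 1e-6)
```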
3. The Problem: wX + b = Y (cont.)
•Answer: Yes! Use the gradient descent
algorithm to find the minimum of a
function.
•Use the Forward Propagation (FP) and Back
Propagation (BP) capability of the NN
model to minimize the Cost (Loss) of the
NN and thus find the right weights.
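A minimal sketch of gradient descent finding the minimum of a one-variable function (the example function, learning rate, and step count are illustrative assumptions):

```python
def gradient_descent(grad, w0, lr=0.1, steps=200):
    """Repeatedly step against the gradient to approach a minimum."""
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)
    return w

# Minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3);
# the minimum is at w = 3.
w_min = gradient_descent(lambda w: 2 * (w - 3.0), w0=0.0)
```

The same idea drives the NN training loop: the gradients are just computed by back propagation instead of by a closed-form derivative.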
4. The Problem: wX + b = Y (cont.)
•In FP compute the Outputs for each
neuron in each Hidden Layer
•the output of the Input Layer is the
transferred input, X, and does not
need to be computed
•compute the Cost at the Output Layer
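The FP step above, sketched for a tiny fully connected network (the sigmoid activation, layer sizes, and weight values are assumptions for illustration):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, layers):
    """layers: list of (weights, biases); weights[j][i] connects input i to neuron j."""
    out = x  # the Input Layer just transfers X; nothing to compute
    for weights, biases in layers:
        # Each neuron's output: activation of its weighted input sum
        out = [sigmoid(sum(w_ji * o for w_ji, o in zip(w_j, out)) + b_j)
               for w_j, b_j in zip(weights, biases)]
    return out

def cost(y, yhat):
    """Squared-error cost at the Output Layer."""
    return sum((t - a) ** 2 for t, a in zip(y, yhat)) / len(y)

layers = [([[0.1, -0.2], [0.05, 0.0]], [0.0, 0.0]),  # hidden layer: 2 neurons
          ([[0.3, 0.3]], [0.0])]                     # output layer: 1 neuron
yhat = forward([0.5, -0.5], layers)
```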
5. The Problem: wX + b = Y (cont.)
•In BP compute the Gradients and update the
Weights (coefficients) to minimize the Cost
•Iterate again (FP→BP) until the Cost
approaches the predefined error.
• Xs and Ys are randomly generated within (-1, 1)
• Weights are randomly initialized within (-0.1, 0.1)
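The BP step for a single linear neuron under a squared-error cost, as an illustrative sketch (the learning rate, data, and iteration count are assumptions):

```python
# One neuron: yhat = w * x + b, cost C = (yhat - y)^2
x, y = 0.5, 0.8
w, b = 0.05, 0.0          # weight initialized within (-0.1, 0.1)
lr = 0.5

for _ in range(100):                   # iterate FP -> BP
    yhat = w * x + b                   # forward propagation
    dC_dyhat = 2 * (yhat - y)          # gradient of the cost
    w -= lr * dC_dyhat * x             # back propagation: update weight
    b -= lr * dC_dyhat                 # ... and bias
final_cost = (w * x + b - y) ** 2
```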
6. An Artificial Neural Network is well suited to an
Object Oriented design:
•ANNModel (“layer manager”) has Layers
• Maintains a list of Layers
• Delegates operations to the Layers
•Layer (“neuron manager”) has Neurons
• Maintains a list of Neurons
• Delegates operations to the Neurons
• ANNModel does not have DIRECT access to the
Neurons but through its Layers
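One way the delegation described above might look in code (the class and method names are assumptions sketched from the slide, not Rhapsody-generated code):

```python
class Neuron:
    def activate(self, inputs):
        return sum(inputs)            # placeholder activation

class Layer:
    """'Neuron manager': maintains a list of Neurons and delegates to them."""
    def __init__(self, size):
        self.neurons = [Neuron() for _ in range(size)]

    def activate(self, inputs):
        return [n.activate(inputs) for n in self.neurons]

class ANNModel:
    """'Layer manager': reaches the Neurons only through its Layers."""
    def __init__(self, sizes):
        self.layers = [Layer(s) for s in sizes]

    def forward(self, x):
        out = x
        for layer in self.layers:     # delegate each operation to the Layers
            out = layer.activate(out)
        return out

model = ANNModel([2, 1])
```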
7. An Artificial Neural Network is well suited to an
Object Oriented design (cont.):
•Neuron maintains a list of Weights that is
indexed by the Ids of the Previous Layer
Neurons
• This indexing constitutes the connection between
a current layer and the previous layer in a NN
• A Neuron encapsulates data and operations to:
• Activate a neuron (i.e. compute its output)
• Compute its Gradient
• Update its Weights (compute momentum and
delta weight)
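A sketch of the weight indexing described above (the attribute names, initial weight, and the momentum formula are assumptions for illustration):

```python
class Neuron:
    def __init__(self, nid, prev_ids, init_w=0.05):
        self.nid = nid
        # Weights indexed by the Ids of the previous layer's Neurons:
        # this indexing IS the connection between the two layers.
        self.weights = {pid: init_w for pid in prev_ids}
        self.momentum = {pid: 0.0 for pid in prev_ids}

    def activate(self, prev_outputs):
        """prev_outputs: dict mapping previous-layer neuron id -> output."""
        return sum(self.weights[pid] * prev_outputs[pid] for pid in self.weights)

    def update_weights(self, gradient, prev_outputs, lr=0.1, mu=0.9):
        """Compute each delta weight with a momentum term, then apply it."""
        for pid in self.weights:
            delta = -lr * gradient * prev_outputs[pid] + mu * self.momentum[pid]
            self.momentum[pid] = delta
            self.weights[pid] += delta

n = Neuron(nid=0, prev_ids=[0, 1])
out = n.activate({0: 1.0, 1: 2.0})
```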
8. Artificial Neural Network is suitable to an
Object Oriented design (cont):
•Activator (“singleton”)
•Encapsulates the activation functions and
their corresponding derivatives
•The Neuron interacts with the Activator
when it needs to use the activation function
and the corresponding derivatives
•It is instantiated by the ANNModel at
startup
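One plausible shape for the Activator (the singleton mechanism and the particular activation function are assumptions; a Rhapsody model may realize the singleton differently):

```python
import math

class Activator:
    """Singleton encapsulating activation functions and their derivatives."""
    _instance = None

    def __new__(cls):
        # Every construction attempt returns the same shared instance
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

    @staticmethod
    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    @staticmethod
    def sigmoid_prime(z):
        s = Activator.sigmoid(z)
        return s * (1.0 - s)   # derivative expressed via sigmoid itself

# The ANNModel would instantiate this once at startup;
# every Neuron then interacts with the same instance.
a1, a2 = Activator(), Activator()
```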
9. Use the Rational Rhapsody MDA tool to design a NN
and visualize the two major states of execution in
a NN:
•Forward Propagation state
• Compute the Outputs
• Compute the Error (Cost, Loss)
•Back Propagation state
• Minimize the Error (Cost) by using gradient descent
• Compute gradients - For Output layer, For Hidden layer
• Update weights - For Output layer, For Hidden layer
10. The NN Model has two major states of execution:
1. Forward_Propagation state:
•Activation sub state - compute the Outputs
•Compute_Cost sub state - compute Error
2. Back_Propagation state:
•Compute_Gradients sub state
•Update_Weights sub state
x. Configuration state - the inputs, target outputs, and the
weights are generated before training.
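The states and sub states above can be sketched as a training loop (the state names echo the slide; the tiny one-weight model, learning rate, and tolerance are assumptions):

```python
# Configuration state: inputs, target outputs, and weights
# are generated before training.
x, y, w = 0.5, 0.25, 0.05
lr, tolerance = 0.5, 1e-6
state = "Forward_Propagation"

while True:
    if state == "Forward_Propagation":
        yhat = w * x                       # Activation sub state
        cost = (yhat - y) ** 2             # Compute_Cost sub state
        if cost <= tolerance:
            break                          # Cost reached the predefined error
        state = "Back_Propagation"
    else:
        grad = 2 * (yhat - y) * x          # Compute_Gradients sub state
        w -= lr * grad                     # Update_Weights sub state
        state = "Forward_Propagation"
```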