An artificial neural network (ANN) is a computational model based on the structure and functions of biological neural networks. It works on real-valued, discrete-valued, and vector-valued inputs.
4. ● Dendrites: A short branched extension of a nerve cell, along which impulses
received from other cells at synapses are transmitted to the cell body.
● Axon: An axon is a long, slender projection of a nerve cell, or neuron, that
typically conducts electrical impulses away from the neuron's cell body.
● Synapse: A junction between two nerve cells, consisting of a minute gap
across which impulses pass by diffusion of a neurotransmitter.
● Cell body or Soma: The soma or "cell body" is the bulbous, non-process
portion of a neuron or other brain cell type, containing the cell nucleus.
● Nucleus: The nucleus is a membrane-enclosed organelle found in eukaryotic
cells.
5. Brain facts
● The brain has a density of approximately 10^11 neurons.
● Each neuron is connected to approximately 10^4 other neurons.
● The switching time of a biological neuron is approximately 10^-3 seconds.
● Yet it takes only around 10^-1 seconds to recognize someone known.
● The switching time of a computer is approximately 10^-10 seconds.
● Computers are still not efficient enough to take complex decisions like humans.
● The brain's neural net is embarrassingly parallel.
7. Artificial Neural Network
● An ANN works on real-valued, discrete-valued, and vector-valued inputs.
● An artificial neuron is a mathematical function conceived as a model of biological neurons.
● The artificial neuron receives one or more inputs and sums them to produce an output.
● The inputs to each node are weighted, and the weighted sum is passed through a non-linear function known as an activation function or transfer function (sketched in code below).
● The activation function usually has a sigmoid shape, but it may also take the form of other non-linear functions, piecewise linear functions, or step functions.
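A minimal sketch of a single artificial neuron as described above, written in Scala. The object and function names, the bias term, and the concrete weight values are illustrative assumptions, not code from the slides.

object ArtificialNeuron {

  // Sigmoid activation: squashes any real value into (0, 1).
  def sigmoid(z: Double): Double = 1.0 / (1.0 + math.exp(-z))

  // Weighted sum of the inputs plus a bias, passed through an activation function.
  def neuron(inputs: Seq[Double],
             weights: Seq[Double],
             bias: Double,
             activation: Double => Double): Double = {
    val weightedSum = inputs.zip(weights).map { case (x, w) => x * w }.sum + bias
    activation(weightedSum)
  }

  def main(args: Array[String]): Unit = {
    val out = neuron(Seq(3.0, 5.0), Seq(0.4, -0.2), 0.1, sigmoid)
    println(f"Neuron output: $out%.4f")
  }
}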
11. Neural net with sigmoid function
● Weights from the first feature to the first (hidden) layer = W11, W12, W13
● Weights from the second feature to the first (hidden) layer = W21, W22, W23
● Weighted inputs to the 1st, 2nd, and 3rd hidden neurons = (X1W11 + X2W21), (X1W12 + X2W22), (X1W13 + X2W23)
e.g. for input (3, 5): Z21 = 3W11 + 5W21, Z22 = 3W12 + 5W22, Z23 = 3W13 + 5W23
● def sigmoid(z) = 1/(1 + e^-z)
● Hidden activations: Z1 = sigmoid(Z21), Z2 = sigmoid(Z22), Z3 = sigmoid(Z23)
● Weights for the output layer = W31, W32, W33
● Weighted input for the output layer: Z = Z1W31 + Z2W32 + Z3W33
● Y = sigmoid(Z) → your predicted output (see the sketch below)
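A sketch of the forward pass just described, using the same naming (W11..W13 and W21..W23 for the hidden layer, W31..W33 for the output layer). The concrete weight values are made up for illustration; only the structure follows the slide.

object SigmoidNet {

  def sigmoid(z: Double): Double = 1.0 / (1.0 + math.exp(-z))

  def main(args: Array[String]): Unit = {
    val (x1, x2) = (3.0, 5.0)               // input (3, 5) from the example

    // Hidden-layer weights (illustrative values).
    val (w11, w12, w13) = (0.1, 0.2, 0.3)   // weights for the first feature
    val (w21, w22, w23) = (0.4, 0.5, 0.6)   // weights for the second feature

    // Weighted inputs to the three hidden neurons.
    val z21 = x1 * w11 + x2 * w21
    val z22 = x1 * w12 + x2 * w22
    val z23 = x1 * w13 + x2 * w23

    // Hidden-layer activations.
    val (z1, z2, z3) = (sigmoid(z21), sigmoid(z22), sigmoid(z23))

    // Output layer.
    val (w31, w32, w33) = (0.7, 0.8, 0.9)
    val z = z1 * w31 + z2 * w32 + z3 * w33
    val y = sigmoid(z)                      // predicted output

    println(f"Predicted output: $y%.4f")
  }
}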
12. See what Tom Mitchell has to say
"A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E."
– T. Mitchell (1997)
13. Backpropagation
● Inspired by the biological neural net
● The backward propagation of errors
● Uses gradient descent for the weight updates
● It is supervised learning (sketched in code below)
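A minimal sketch of backpropagation with gradient descent for a 2-input, 3-hidden-neuron, 1-output sigmoid network (the same shape as slide 11), trained on the squared error E = 1/2 * (t - y)^2. The learning rate, initial weights, iteration count, and training example are assumptions for illustration.

object Backprop {

  def sigmoid(z: Double): Double = 1.0 / (1.0 + math.exp(-z))

  def main(args: Array[String]): Unit = {
    val eta = 0.5                 // learning rate (assumed value)
    val x   = Array(3.0, 5.0)     // one training input
    val t   = 1.0                 // its target output

    // w1(i)(j): weight from input i to hidden neuron j; w2(j): hidden neuron j to output.
    var w1 = Array(Array(0.1, 0.2, 0.3), Array(0.4, 0.5, 0.6))
    var w2 = Array(0.7, 0.8, 0.9)

    for (_ <- 1 to 1000) {
      // Forward pass.
      val zHidden = Array.tabulate(3)(j => x(0) * w1(0)(j) + x(1) * w1(1)(j))
      val aHidden = zHidden.map(sigmoid)
      val y       = sigmoid(aHidden.zip(w2).map { case (a, w) => a * w }.sum)

      // Backward pass: propagate the error from the output back to the hidden layer.
      val deltaOut    = (y - t) * y * (1.0 - y)
      val deltaHidden = Array.tabulate(3)(j => deltaOut * w2(j) * aHidden(j) * (1.0 - aHidden(j)))

      // Gradient descent weight updates.
      w2 = Array.tabulate(3)(j => w2(j) - eta * deltaOut * aHidden(j))
      w1 = Array.tabulate(2)(i => Array.tabulate(3)(j => w1(i)(j) - eta * deltaHidden(j) * x(i)))
    }

    println("Trained hidden weights: " + w1.map(_.mkString(", ")).mkString(" | "))
    println("Trained output weights: " + w2.mkString(", "))
  }
}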
14. Gradient descent
● What is a gradient?
● Ans: An increase or decrease in the magnitude of a property observed in passing from one point or moment to another.
● Or, in mathematics, the gradient is a multi-variable generalization of the derivative.
● Error for one example = t - y (target minus predicted output)
● Squared error function: E(w) = 1/2 * Σd (td - od)^2
● Gradient: ∇E(w) = [∂E/∂w0, ∂E/∂w1, ..., ∂E/∂wn]
● Weight update: wi ← wi + Δwi, where Δwi = -η * ∂E/∂wi (see the sketch below)
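A sketch, in Scala, of the batch weight-update rule above applied to a single linear unit: the gradient of the squared error is accumulated over all training examples, then every weight moves opposite to its gradient, scaled by the learning rate η. The function name, the (features, target) example format, and the parameter names are illustrative assumptions.

object GradientDescent {

  // One pass over all training examples: accumulate Δwi = η * Σd (td - od) * xi, then update.
  def gradientDescentStep(weights: Array[Double],
                          examples: Seq[(Array[Double], Double)], // (features x, target t)
                          eta: Double): Array[Double] = {
    val deltas = Array.fill(weights.length)(0.0)
    for ((x, t) <- examples) {
      val o = x.zip(weights).map { case (xi, wi) => xi * wi }.sum  // linear output
      for (i <- weights.indices) deltas(i) += eta * (t - o) * x(i) // equals -eta * dE/dwi
    }
    weights.zip(deltas).map { case (w, d) => w + d }
  }
}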
17. Stochastic gradient descent
● The local minima problem
● The gradient descent training rule computes weight updates after summing the error over all the training examples.
● Stochastic gradient descent approximates the gradient descent search by updating the weights incrementally, following the calculation of the error for each individual example (see the sketch below).
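A sketch contrasting the stochastic rule with the batch rule above: here the weights are updated immediately after the error of each individual example, which approximates the true gradient and can help avoid settling into a local minimum. The function name, shuffling step, and example format are illustrative assumptions.

object StochasticGradientDescent {

  // One epoch of stochastic updates for a single linear unit.
  def sgdEpoch(weights: Array[Double],
               examples: Seq[(Array[Double], Double)], // (features x, target t)
               eta: Double): Array[Double] = {
    var w = weights.clone()
    for ((x, t) <- scala.util.Random.shuffle(examples)) {
      val o = x.zip(w).map { case (xi, wi) => xi * wi }.sum            // linear output
      w = w.zip(x).map { case (wi, xi) => wi + eta * (t - o) * xi }    // update per example
    }
    w
  }
}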
So this is what a biological neural network looks like. It has all these parts: dendrites, axon, synapse, cell body, and so on.
So what do dendrites do?
What does the axon do?
What does the synapse do?
What does the soma, or cell body, do?
What is the nucleus?
For example, it requires approximately 10^-1 seconds to visually recognize your mother. Notice the sequence of neuron firings that can take place during this 10^-1-second interval cannot possibly be longer than a few hundred steps, given the switching speed of single neurons. This observation has led many to speculate that the information-processing abilities of biological neural systems must follow from highly parallel processes operating on representations that are distributed over many neurons. One motivation for ANN systems is to capture this kind of highly parallel computation based on distributed representations.
The hidden layer: don't ask me what a hidden layer is, why there is a hidden layer, or why it has this many neurons. These are all the dark secrets of artificial intelligence.
In mathematics, a discrete valuation is an integer valuation on a field K; that is, a function from K to the integers (together with infinity) satisfying certain conditions.
The artificial neuron receives one or more inputs (representing the dendrites) and sums them to produce an output, or activation (representing the neuron's axon).
Linear functions are those whose graph is a straight line. A linear function has the following form: y = f(x) = a + bx. A linear function has one independent variable and one dependent variable; the independent variable is x and the dependent variable is y.
In mathematics, a piecewise linear function is a real-valued function defined on the real numbers or a segment thereof, whose graph is composed of straight-line sections. It is a piecewise-defined function, each of whose pieces is an affine function.
A step function is a function that increases or decreases abruptly from one constant value to another.
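A small sketch of the activation shapes mentioned above as plain Scala functions. The names and the breakpoints of the piecewise linear variant are chosen for illustration, not taken from the slides.

object Activations {
  // Sigmoid: smooth S-shaped curve from 0 to 1.
  def sigmoid(z: Double): Double = 1.0 / (1.0 + math.exp(-z))

  // Piecewise linear: straight-line segments clipped to the range [0, 1].
  def piecewiseLinear(z: Double): Double =
    if (z <= -1.0) 0.0 else if (z >= 1.0) 1.0 else (z + 1.0) / 2.0

  // Step: jumps abruptly from one constant value to another at the threshold.
  def step(z: Double): Double = if (z >= 0.0) 1.0 else 0.0
}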
I know this slide looks horrible
1. Feature branches are the red dots; these are the branches that change from time to time.
2. In Scrum we create engineering tasks, so each engineering task can be a feature here.
3. You write code, write test cases, take a pull from develop, and send a pull request.
Once the pull request is merged, the code is in the develop branch and a yellow dot gets created.
This is the branch where all your code is stored until we make the next release.
And you can see the release here with the green dots. This is where all the QA testing happens; if any bug occurs, it must be fixed in this branch. Once everything is fine and gets a green flag, we merge it to both develop and master.
The master branch is the branch that is always for production.
Now, for the situation where the QA people also couldn't catch a bug and it appears directly in production, we create a hotfix branch. This branch is merged directly to master as well as develop, and into the release branch if there is one.
So now you see this can be compared to DTAP: Development, Testing, Acceptance, Production.
Development is the local machine.
Testing works on develop.
Acceptance works on release.
Production works on master.
Now explain the open source libraries with it as well.
At the end of this slide I would just like to tell you that:
For effective development with Git, use gitflow; for everything else, contact Mayank sir.