RIZVI COLLEGE OF ENGINEERING
"A PROJECT ON
ARTIFICIAL NEURAL NETWORK"
By
Arafat Shaikh
Overview
• What is Artificial Neural Network?
• Biological Neuron
• Comparing ANN & BNN
• Model
• Architecture
• Learning
What is Artificial Neural Network?
• An Artificial Neural Network (ANN) is an efficient computing system whose central theme is borrowed from the analogy of biological neural networks.
• ANNs are also known as "artificial neural systems," "parallel distributed processing systems," or "connectionist systems."
• An ANN consists of a large collection of units that are interconnected in some pattern to allow communication between the units.
• These units, also referred to as nodes or neurons, are simple processors which operate in parallel.
• Every neuron is connected to other neurons through connection links. Each connection link is associated with a weight that carries information about the input signal.
• This weight is the most useful information the neurons have for solving a particular problem, because it excites or inhibits the signal being communicated.
• Each neuron has an internal state, called an activation signal. Output signals, which are produced by combining the input signals and the activation rule, may be sent to other units.
Biological Neuron
• A neuron (also known as a neurone or nerve cell) is a cell that carries electrical impulses. Neurons are the basic units of the nervous system, whose most important part is the brain.
• Every neuron is made of a cell body (also called a soma), dendrites and an axon. Dendrites and axons are nerve fibres. There are about 86 billion neurons in the human brain, roughly 10% of all brain cells. The neurons are supported by glial cells such as astrocytes.
• Neurons are connected to one another and to other tissues. They do not touch; instead they form tiny gaps called synapses. These gaps can be chemical synapses or electrical synapses and pass the signal from one neuron to the next.
Biological Neuron
• Dendrite — It receives signals from other neurons.
• Soma (cell body) — It sums all the incoming signals to generate input.
• Axon — When the sum reaches a threshold value, neuron fires and the
signal travels down the axon to the other neurons.
• Synapses — The point of interconnection of one neuron with other
neurons. The amount of signal transmitted depends on the strength
(synaptic weights) of the connections.
• The connections can be inhibitory (decreasing strength) or excitatory
(increasing strength) in nature.
• So, a neural network, in general, is a highly interconnected network of
billions of neurons with trillions of interconnections between them.
How is the Brain Different from Computers?
• Processing elements: the brain uses biological neurons (nerve cells); computers use silicon transistors.
• Scale: the brain has roughly 200 billion neurons and 32 trillion interconnections; a computer has on the order of a billion bytes of RAM and trillions of bytes of disk storage.
• Element size: a neuron is about 10⁻⁶ m; a single transistor is about 10⁻⁹ m.
• Energy consumption: roughly 10⁻¹⁶ joules per operation for the brain versus roughly 10⁻⁶ joules per operation for a computer.
• Capability: the brain learns; the computer is programmed.
Comparing ANN with BNN
• Since ANNs borrow their central concept from BNNs, there are a lot of similarities, though there are differences too.
• The similarities are shown in the following table (Biological Neural Network → Artificial Neural Network):
• Soma → Node
• Dendrites → Input
• Synapse → Weights or interconnections
• Axon → Output
The following table shows some differences between BNN and ANN:
• Processing: a BNN is massively parallel and slow, but superior to an ANN; an ANN is massively parallel and fast, but inferior to a BNN.
• Size: a BNN has about 10¹¹ neurons and 10¹⁵ interconnections; an ANN has 10² to 10⁴ nodes (depending mainly on the type of application and the network designer).
• Learning: a BNN can tolerate ambiguity; an ANN requires very precise, structured and formatted data to tolerate ambiguity.
• Fault tolerance: BNN performance degrades with even partial damage; an ANN is capable of robust performance, and hence has the potential to be fault tolerant.
• Storage capacity: a BNN stores information in the synapses; an ANN stores information in contiguous memory locations.
Analogy of ANN with BNN
• The dendrites in a biological neural network are analogous to the weighted inputs (based on their synaptic interconnections) in an artificial neural network.
• The cell body is analogous to the artificial neuron unit, which comprises a summation unit and a threshold unit.
• The axon carries the output and is analogous to the output unit of an artificial neural network. So, ANNs are modelled on the working of basic biological neurons.
Model of Artificial Neural Network
• Artificial neural networks can be viewed as weighted directed graphs in which artificial neurons are nodes and directed edges with weights are connections between neuron outputs and neuron inputs.
• The artificial neural network receives input from the external world in the form of patterns and images in vector form. These inputs are mathematically designated by the notation x(n) for n inputs.
• Each input is multiplied by its corresponding weight. Weights are the information used by the neural network to solve a problem. Typically, a weight represents the strength of the interconnection between neurons inside the network.
• The weighted inputs are all summed up inside the computing unit (artificial neuron). If the weighted sum is zero, a bias is added to make the output non-zero, or to scale up the system response. The bias has its own weight, and its input is always equal to '1'.
• The sum can correspond to any numerical value from 0 to infinity.
• In order to limit the response to a desired value, a threshold value is set up. For this, the sum is passed through an activation function.
• The activation function is the set of transfer functions used to get the desired output. There are linear as well as non-linear activation functions.
• Some of the commonly used activation functions are the binary (threshold) function and the sigmoidal and tan-hyperbolic functions (both nonlinear).
• Binary — The output has only two values, 0 or 1. For this, a threshold value is set up. If the net weighted input exceeds the threshold, the output is 1; otherwise it is 0.
• Sigmoidal — This function has an 'S'-shaped curve; the tan-hyperbolic function can also be used to approximate the output from the net input. The sigmoid is defined as f(x) = 1 / (1 + exp(-𝝈x)), where 𝝈 is the steepness parameter. (Both the binary and sigmoid activations appear in the sketch below.)
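To make the computation above concrete, here is a minimal sketch of a single artificial neuron in Python. It is illustrative only: the input values, weights and bias are made-up numbers, and the two activation functions follow the binary-threshold and sigmoid definitions given above.

```python
import math

def binary_activation(net, threshold=0.0):
    """Binary (step) activation: output 1 if the net input exceeds the threshold."""
    return 1 if net > threshold else 0

def sigmoid_activation(net, steepness=1.0):
    """Logistic sigmoid f(x) = 1 / (1 + exp(-sigma * x)) with steepness sigma."""
    return 1.0 / (1.0 + math.exp(-steepness * net))

def artificial_neuron(inputs, weights, bias, activation):
    """Weighted sum of the inputs plus bias, passed through an activation function."""
    net = sum(x * w for x, w in zip(inputs, weights)) + bias
    return activation(net)

# Made-up example values (not from the slides).
x = [0.5, -1.0, 2.0]   # input vector
w = [0.4, 0.3, 0.1]    # one weight per input
b = 1.0                # bias: an input fixed at 1, carrying its own weight

print(artificial_neuron(x, w, b, binary_activation))   # 1  (net = 1.1 > 0)
print(artificial_neuron(x, w, b, sigmoid_activation))  # ~0.75
```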
Architecture
A typical neural network contains a large number of artificial neurons called units arranged
in a series of layers.
A typical artificial neural network comprises the following layers:
• Input layer — It contains those units (artificial neurons) which receive input from the outside world, on which the network will learn, recognize or otherwise process.
• Output layer — It contains units that give the network's response, reflecting what it has learned about the task.
• Hidden layer — These units sit between the input and output layers. The job of a hidden layer is to transform the input into something that the output units can use.
• Most neural networks are fully connected, meaning that each hidden neuron is connected to every neuron in its previous (input) layer and in the next (output) layer, as in the sketch below.
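The fully connected, layered structure described above can be sketched as a small forward pass. This is an illustrative example, not part of the original slides: the layer sizes and the randomly drawn weights are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary layer sizes for illustration: 3 inputs, 4 hidden units, 2 outputs.
n_in, n_hidden, n_out = 3, 4, 2

# Fully connected: every unit in one layer connects to every unit in the next.
W1 = rng.normal(size=(n_in, n_hidden))   # input -> hidden weights
b1 = np.zeros(n_hidden)
W2 = rng.normal(size=(n_hidden, n_out))  # hidden -> output weights
b2 = np.zeros(n_out)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    """Propagate an input vector through the input, hidden and output layers."""
    hidden = sigmoid(x @ W1 + b1)   # hidden layer transforms the raw input
    output = sigmoid(hidden @ W2 + b2)
    return output

print(forward(np.array([0.2, -0.5, 1.0])))
```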
Learning in Biology (Humans)
• Learning = learning by adaptation
• The young animal learns that the green fruits are sour, while the
yellowish/reddish ones are sweet. The learning happens by adapting the
fruit picking behaviour.
• At the neural level, learning happens through changes in the synaptic strengths, the elimination of some synapses, and the building of new ones.
• The objective of adapting the responses on the basis of the information received from the environment is to achieve a better state. E.g., the animal likes to eat many energy-rich, juicy fruits that fill its stomach and make it feel happy.
• In other words, the objective of learning in biological organisms is to optimise the amount of available resources and happiness, or, in general, to achieve a state closer to optimal.
Learning in Artificial Neural Networks
Types of Learning in Neural Networks
• Supervised Learning — The training data is input to the network and the desired output is known; the weights are adjusted until the output yields the desired value.
• Unsupervised Learning — The input data is used to train the network, but the desired output is not known. The network classifies the input data and adjusts the weights by extracting features from the input data.
• Reinforcement Learning — Here the value of the desired output is unknown, but the network receives feedback on whether its output is right or wrong. It is sometimes described as semi-supervised learning.
• Offline Learning — The adjustment of the weight vector and threshold is done only after the whole training set has been presented to the network. It is also called batch learning.
• Online Learning — The adjustment of the weights and threshold is done after presenting each training sample to the network. (The sketch after this list contrasts online and offline updates.)
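As a rough illustration of supervised learning and of the online/offline distinction, the sketch below trains a single linear unit on a made-up data set. The delta-rule style update and the learning rate are assumptions made for the sake of the example, not something the slides prescribe.

```python
import numpy as np

# Toy supervised data (made up): inputs X and desired outputs y for a single linear unit.
X = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0], [0.0, 0.0]])
y = np.array([1.0, 1.0, 2.0, 0.0])   # here the target happens to be the sum of the inputs
eta = 0.1                            # learning rate (arbitrary choice)

def online_learning(X, y, epochs=50):
    """Online learning: update the weights after every single training sample."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, ti in zip(X, y):
            error = ti - xi @ w       # desired output minus actual output
            w += eta * error * xi     # delta-rule style update (an assumption)
    return w

def offline_learning(X, y, epochs=200):
    """Offline (batch) learning: one update per full pass over the training set."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        errors = y - X @ w
        w += eta * X.T @ errors / len(X)
    return w

print(online_learning(X, y))    # both approach w ≈ [1, 1]
print(offline_learning(X, y))
```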
Learning Data Sets in ANN
• Training set: A set of examples used for learning, that is, to fit the parameters [i.e. the weights] of the network. One epoch comprises one full training cycle over the training set.
• Validation set: A set of examples used to tune the parameters [i.e. the architecture] of the network, for example to choose the number of hidden units in a neural network.
• Test set: A set of examples used only to assess the performance [generalization] of a fully specified network, i.e. how well it predicts outputs for inputs it has not been trained on. (A simple way to carve out these three sets is sketched below.)
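A minimal sketch of splitting a data set into these three subsets is shown below. The data, the 70/15/15 split ratios and the random seed are arbitrary choices used only for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Made-up dataset: 100 examples with 5 features each, plus a binary label per example.
X = rng.normal(size=(100, 5))
y = rng.integers(0, 2, size=100)

# Shuffle, then carve out 70% training, 15% validation, 15% test data.
indices = rng.permutation(len(X))
n_train = int(0.70 * len(X))
n_val = int(0.15 * len(X))

train_idx = indices[:n_train]
val_idx = indices[n_train:n_train + n_val]
test_idx = indices[n_train + n_val:]

X_train, y_train = X[train_idx], y[train_idx]   # used to fit the weights
X_val, y_val = X[val_idx], y[val_idx]           # used to tune architecture / hyperparameters
X_test, y_test = X[test_idx], y[test_idx]       # used only for the final generalization estimate

print(len(X_train), len(X_val), len(X_test))    # 70 15 15
```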
Learning principle for artificial neural networks
• Learning occurs when the weights inside the network get updated after
many iterations.
• For example — suppose we have inputs in the form of patterns for two different classes of pattern, 'I' and 'O', as shown, with b as the bias and y as the desired output.
• We want to classify each input pattern as either 'I' or 'O'.
• Following are the steps performed:
• The 9 inputs x1 to x9, along with the bias b (an input whose value is always 1), are fed to the network for the first pattern.
• Initially, the weights are initialized to zero.
• Then the weights are updated for each input using the formula Δwi = xi·y for i = 1 to 9 (Hebb's rule).
• Finally, the new weights are found using the formula:
• wi(new) = wi(old) + Δwi
• Wi(new) = [1 1 1 -1 1 -1 1 1 1 1] (the nine input weights followed by the bias weight)
• The second pattern is input to the network. This time, the weights are not initialized to zero; the initial weights are the final weights obtained after presenting the first pattern. By doing so, the network carries forward what it learned from the first pattern.
• The same steps are repeated for the second pattern.
• The new weights are Wi(new) = [0 0 0 -2 2 -2 0 0 0], with the bias weight reduced to 0.
• These weights reflect the network's ability to classify the two input patterns successfully. (A short sketch of this procedure follows.)
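The steps above can be reproduced in a few lines of code. The bipolar 3x3 encodings of the 'I' and 'O' patterns are assumptions, since the slide's figure is not reproduced here; the update itself follows Hebb's rule Δwi = xi·y with the bias treated as an input fixed at 1.

```python
import numpy as np

# Assumed bipolar 3x3 encodings of the two patterns (the original figure is not shown here).
pattern_I = np.array([1, 1, 1, -1, 1, -1, 1, 1, 1])   # target y = +1
pattern_O = np.array([1, 1, 1, 1, -1, 1, 1, 1, 1])    # target y = -1

w = np.zeros(9)   # weights for x1..x9, initialized to zero
b = 0.0           # bias weight (the bias input is fixed at 1)

def hebb_update(w, b, x, y):
    """Hebb's rule: delta_wi = xi * y and delta_b = y; new weights = old + delta."""
    return w + x * y, b + y

# Present the first pattern, then the second, carrying the weights forward.
w, b = hebb_update(w, b, pattern_I, +1)
print(w, b)   # weights after pattern 'I':  [1 1 1 -1 1 -1 1 1 1], bias 1
w, b = hebb_update(w, b, pattern_O, -1)
print(w, b)   # weights after pattern 'O':  [0 0 0 -2 2 -2 0 0 0], bias 0
```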
Uses of ANN
Using ANNs requires an understanding of their characteristics.
• Choice of model: This depends on the data representation and the
application. Overly complex models slow learning.
• Learning algorithm: Numerous trade-offs exist between learning
algorithms. Almost any algorithm will work well with the correct
hyperparameters for training on a particular data set. However, selecting and
tuning an algorithm for training on unseen data requires significant
experimentation.
• Robustness: If the model, cost function and learning algorithm are
selected appropriately, the resulting ANN can become robust.
ANN capabilities fall within the following broad categories:
• Function approximation, or regression analysis, including time series
prediction, fitness approximation and modelling.
• Classification, including pattern and sequence recognition, novelty
detection and sequential decision making.
• Data processing, including filtering, clustering, blind source separation and
compression.
• Robotics, including directing manipulators and prostheses.
• Control, including computer numerical control.
Uses of ANN
• Classification — A neural network can be trained to classify a given pattern or data set into predefined classes. Feed-forward networks are typically used for this.
• Prediction — A neural network can be trained to produce the outputs that are expected from a given input, e.g. stock market prediction.
• Clustering — A neural network can be used to identify distinguishing features of the data and group it into different categories without any prior knowledge of the data.
• The following networks are used for clustering:
• Competitive networks
• Adaptive Resonance Theory networks
• Kohonen Self-Organizing Maps
• Association — A neural network can be trained to remember a particular pattern, so that when a noisy pattern is presented, the network associates it with the closest pattern in its memory or discards it, e.g. Hopfield networks, which perform recognition, classification and clustering.
Applications
• Because of their ability to reproduce and model nonlinear processes,
ANNs have found many applications in a wide range of disciplines.
• Application areas include system identification and control (vehicle
control, trajectory prediction, process control, natural
resources management), quantum chemistry, game-playing and decision
making (backgammon, chess, poker), pattern recognition (radar
systems, face identification, signal classification, object recognition and
more), sequence recognition (gesture, speech, handwritten text
recognition), medical diagnosis, finance (e.g. automated trading
systems), data mining, visualization, machine translation, social network
filtering and e-mail spam filtering.
• ANNs have been used to diagnose cancers, including lung cancer, prostate
cancer, colorectal cancer and to distinguish highly invasive cancer cell lines
from less invasive lines using only cell shape information.
• ANNs have been used for building black-box models in the geosciences: hydrology, ocean modelling and coastal engineering, and geomorphology are just a few examples of this kind.
Since neural networks are best at identifying patterns or trends in data,
they are well suited for prediction or forecasting needs including:
• sales forecasting
• industrial process control
• customer research
• data validation
• risk management
• target marketing
Summary
• Artificial neural networks are inspired by the learning processes that take
place in biological systems.
• Artificial neurons and neural networks try to imitate the working
mechanisms of their biological counterparts.
• Learning can be perceived as an optimisation process.
• Biological neural learning happens by the modification of the synaptic
strength. Artificial neural networks learn in the same way.
• The synapse strength modification rules for artificial neural networks can
be derived by applying mathematical optimisation methods.
Summary
• Learning tasks of artificial neural networks can be reformulated as
function approximation tasks.
• Neural networks can be considered as nonlinear function approximating
tools (i.e., linear combinations of nonlinear basis functions), where the
parameters of the networks should be found by applying optimisation
methods.
• The optimisation is done with respect to the approximation error
measure.
• In general, it is enough to have a single-hidden-layer neural network (MLP, RBF or other) to learn the approximation of a nonlinear function; a small sketch of this idea follows. In such cases, general optimisation can be applied to find the change rules for the synaptic weights.
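As a rough illustration of this last point, the sketch below fits a single-hidden-layer network of tanh units to a nonlinear target function by gradient descent on the squared approximation error. The target function, hidden-layer size, learning rate and number of steps are all arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: a nonlinear function, approximated by one hidden layer of tanh units.
X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(X)

n_hidden = 20                      # arbitrary choice
W1 = rng.normal(scale=1.0, size=(1, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_hidden, 1))
b2 = np.zeros(1)
lr = 0.1

for step in range(5000):
    # Forward pass: a linear combination of nonlinear basis functions tanh(x*W1 + b1).
    H = np.tanh(X @ W1 + b1)
    pred = H @ W2 + b2

    # Mean squared approximation error and its gradients (backpropagation).
    err = pred - y
    grad_pred = 2 * err / len(X)
    grad_W2 = H.T @ grad_pred
    grad_b2 = grad_pred.sum(axis=0)
    grad_H = grad_pred @ W2.T * (1 - H ** 2)    # derivative of tanh
    grad_W1 = X.T @ grad_H
    grad_b1 = grad_H.sum(axis=0)

    # Gradient-descent step on every parameter.
    for p, g in ((W1, grad_W1), (b1, grad_b1), (W2, grad_W2), (b2, grad_b2)):
        p -= lr * g

print("final mean squared error:",
      float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)))
```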