1
Statistical Pattern
Recognition
A Review
Presented by: SYED ATIF CHISHTI
The review paper is divided into nine sections:
 Introduction.
 Statistical Pattern Recognition.
 The Curse of Dimensionality
 Dimensionality Reduction.
 Classifiers.
 Classifier Combination.
 Error Estimation.
 Unsupervised Classification.
 Frontiers of Pattern Recognition.
2
3
Introduction
Topics covered:
Pattern Recognition & Example.
Template Matching.
Statistical Approach
Syntactic Approach
Neural Networks.
4
Objective
 To summarize and compare well-known methods.
Goal
 The goal of PR is supervised or unsupervised classification.
Pattern
 As opposed to chaos,
 it is an entity,
 vaguely defined.
 Examples: fingerprint image, human face, speech signal,
handwritten cursive word.
 Selection of Training and Test samples.
 Definition of pattern classes
 Sensing environment
 Pattern representation
 Feature extraction and selection
 Cluster analysis
 Classifier design
5
6
7
 A template (a 2-D shape or prototype of the
pattern) is matched against stored
templates.
 Determines the similarity between two entities,
 typically via correlation.
Disadvantage
 Fails when patterns are distorted.
8
 Each pattern is represented as a point in
d-dimensional feature space (a vector of d features).
 The objective is to establish decision boundaries in
the feature space that separate patterns of
different classes.
 Discriminant-analysis-based approaches to
classification
 use a mean-squared-error criterion to
 construct decision boundaries of a
specified form.
9
 The simplest/elementary sub-patterns are called
primitives.
 Complex patterns are represented as the
interrelations of these primitives.
 A formal analogy is drawn between the structure
of patterns and the syntax of a language, in which
patterns are viewed as sentences and primitives
as the alphabet of the language.
Challenges
 Segmentation of noisy patterns.
10
 A massively parallel computing system
consisting of an extremely large number of
simple processors with many interconnections.
 Ability to learn complex nonlinear
input/output relationships.
 Examples: feed-forward networks, Self-Organizing
Maps (SOM).
11
12
 A pattern is represented by a set of d
features/attributes, viewed as a point in
d-dimensional feature space.
 The system operates in two modes: training
and classification.
13
Decision Making Process
 A pattern is assigned to one of C categories/classes
w1, w2, ..., wc based on a vector of d feature values
x = (x1, x2, ..., xd).
 Class-conditional probability: P(x|wi)
 Conditional risk: R(wi|x) = Σj L(wi,wj) · P(wj|x),
where L(wi,wj) is the loss incurred in deciding wi when the true class
is wj.
 Posterior probability: P(wj|x)
 For the 0/1 loss function: L(wi,wj) = 0 if i = j, 1 if i ≠ j.
 Minimizing the conditional risk then reduces to: assign input
pattern x to class wi if
P(wi|x) > P(wj|x) for all j ≠ i.
14
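The 0/1-loss rule above amounts to picking the class with the largest posterior. A minimal Python sketch, assuming two univariate Gaussian class-conditional densities with known parameters (the densities and values are illustrative, not from the slides):

```python
import math

def gaussian_pdf(x, mean, var):
    """Univariate class-conditional density p(x | w_i)."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def bayes_classify(x, params, priors):
    """Assign x to the class w_i maximizing P(w_i | x) ∝ p(x | w_i) P(w_i)."""
    posteriors = [gaussian_pdf(x, m, v) * p for (m, v), p in zip(params, priors)]
    return max(range(len(posteriors)), key=posteriors.__getitem__)

# two classes: w0 ~ N(0, 1), w1 ~ N(3, 1), equal priors
params = [(0.0, 1.0), (3.0, 1.0)]
priors = [0.5, 0.5]
print(bayes_classify(0.2, params, priors))  # 0: nearer to class 0's mean
print(bayes_classify(2.9, params, priors))  # 1: nearer to class 1's mean
```

With equal priors the rule reduces to comparing the class-conditional densities themselves.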
15
 If all of the class-conditional densities are known, the Bayes
decision rule can be used to design a classifier.
 If the form of the class-conditional densities is known
(e.g., multivariate Gaussian) but parameters (mean vectors
and covariance matrices) are not, we have a
parametric decision problem: replace the unknown
parameters with their estimated values.
 If the form of the class-conditional densities is not known, we are
in nonparametric mode. In such cases we either estimate the
density function (e.g., with a Parzen window) or directly construct the
decision boundary (e.g., with the k-NN rule).
 Optimizing a classifier to maximize its performance on the
training data will NOT guarantee the same performance on test data.
Statistical Pattern Recognition
(cont..)
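In the nonparametric mode mentioned above, the k-NN rule builds the decision boundary directly from labeled training samples. A minimal sketch (the toy 1-D samples are assumptions for illustration):

```python
from collections import Counter

def knn_classify(x, train, k=3):
    """Assign x the majority label among its k nearest training samples."""
    neighbors = sorted(train, key=lambda s: abs(s[0] - x))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# 1-D training samples: (feature value, class label)
train = [(0.1, 'w1'), (0.4, 'w1'), (0.5, 'w1'),
         (2.0, 'w2'), (2.3, 'w2'), (2.4, 'w2')]
print(knn_classify(0.3, train))  # 'w1'
print(knn_classify(2.2, train))  # 'w2'
```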
16
 The curse arises when the number of features is too large relative
to the number of training samples.
Performance of a classifier depends on
◦ the sample size,
◦ the number of features, and
◦ the classifier complexity.
Curse of dimensionality
◦ The naive table-lookup technique requires the number of
training data points to be an exponential function of the feature
dimension.
 A small number of features can reduce the curse of
dimensionality when the training sample is limited.
17
 If the number of training samples is small relative to the number
of features, classifier performance degrades.
Trunk's Example
 Two-class classification with equal prior probabilities,
multivariate Gaussian densities, and identity covariance matrices.
 The mean vector has the following components:
18
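The components themselves were on a slide image and did not survive extraction; in Trunk's example, as cited in the review, the i-th component of the mean vector shrinks as 1/√i, with the two class means symmetric about the origin:

```latex
m = \left(1,\ \frac{1}{\sqrt{2}},\ \frac{1}{\sqrt{3}},\ \dots,\ \frac{1}{\sqrt{d}}\right),
\qquad \mu_1 = m, \quad \mu_2 = -m .
```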
19
Case 1: Mean vector m is known:
 Use the Bayes decision rule with the 0/1 loss
function to construct the decision boundary;
the error probability decreases toward 0 as d grows.
Case 2: Mean vector m is unknown (estimated from n samples):
lim d→∞ Pe(n,d) = 1/2
Cases
20
Result
 We cannot keep increasing the number of features
when the parameters of the class-conditional
densities are estimated from a finite number of
samples: beyond some point, added features hurt performance.
 The dimensionality of the pattern (the number of features) should be kept small due to
 measurement cost and classification accuracy.
 It can reduce the curse of dimensionality when the training sample is
limited.
Disadvantage:
 A reduction in the number of features may lead to a loss of discrimination power
and lower the accuracy of the resulting recognition system.
Feature Selection:
 Feature selection refers to algorithms that select the best subset of the
input feature set.
Feature Extraction:
 Feature extraction algorithms are methods that create new features by
transforming the original feature set.
21
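As a sketch of feature extraction, here is principal component analysis with NumPy; the random data and the number of retained components are illustrative assumptions, not from the slides:

```python
import numpy as np

def pca_extract(X, k):
    """Project d-dimensional patterns onto the k principal components."""
    Xc = X - X.mean(axis=0)                  # center the data
    # principal directions via SVD of the centered data (numerically stable)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                     # n x k matrix of new features

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                # 100 patterns, d = 5 features
Z = pca_extract(X, 2)                        # extract k = 2 new features
print(Z.shape)                               # (100, 2)
```

The extracted features are ordered by explained variance, so the first column captures the most spread in the data.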
 Chernoff faces represent each pattern as a cartoon face, with
nose length, face curvature, and eye size as features.
 (In the Iris data) Setosa looks quite different from the other two classes.
 Two-dimensional plots: PCA and Fisher mapping.
22
23
24
25
 The designer may have access to multiple classifiers.
 Training sets collected at different times or in different
environments may use different feature sets.
 Each classifier may perform best in its own region of the feature space.
 Some classifiers show different results with different
initializations.
Schemes to combine multiple classifiers
 Parallel: all individual classifiers are invoked independently.
 Cascading: individual classifiers are invoked in a linear sequence.
 Tree-like: individual classifiers are combined into a structure
similar to a decision-tree classifier.
26
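The parallel scheme above can be as simple as majority voting over independently invoked classifiers. A sketch with stand-in threshold classifiers (the classifiers themselves are hypothetical):

```python
from collections import Counter

def majority_vote(classifiers, x):
    """Parallel combination: invoke every classifier independently,
    then return the most common label."""
    votes = Counter(clf(x) for clf in classifiers)
    return votes.most_common(1)[0][0]

# three stand-in classifiers that threshold a scalar feature differently
classifiers = [
    lambda x: 'w1' if x < 1.0 else 'w2',
    lambda x: 'w1' if x < 1.5 else 'w2',
    lambda x: 'w1' if x < 0.5 else 'w2',
]
print(majority_vote(classifiers, 0.7))  # 'w1' (2 of 3 vote w1)
print(majority_vote(classifiers, 1.2))  # 'w2' (2 of 3 vote w2)
```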
 Stacking
 Bagging
 Boosting
Combiner characteristics
 Trainable
 Adaptive
Expected output types
 Confidence
 Rank
 Abstract
27
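Of the ensemble methods listed, bagging trains each member on a bootstrap resample of the training set and combines them by voting. A sketch, using a deliberately toy threshold learner (the learner and data are assumptions for illustration):

```python
import random
from collections import Counter

def train_threshold(samples):
    """Toy learner (illustrative): threshold halfway between the class means."""
    m1 = sum(x for x, y in samples if y == 'w1') / max(1, sum(y == 'w1' for _, y in samples))
    m2 = sum(x for x, y in samples if y == 'w2') / max(1, sum(y == 'w2' for _, y in samples))
    t = (m1 + m2) / 2
    return lambda x: 'w1' if x < t else 'w2'

def bagging(samples, n_models=15, seed=0):
    """Train each member on a bootstrap resample; predict by majority vote."""
    rng = random.Random(seed)
    models = [train_threshold(rng.choices(samples, k=len(samples)))
              for _ in range(n_models)]
    return lambda x: Counter(m(x) for m in models).most_common(1)[0][0]

data = [(0.2, 'w1'), (0.5, 'w1'), (0.8, 'w1'),
        (2.1, 'w2'), (2.5, 'w2'), (2.8, 'w2')]
ensemble = bagging(data)
print(ensemble(0.4), ensemble(2.6))  # 'w1' 'w2'
```

Resampling makes the members differ, so the vote smooths out any individual member trained on an unlucky bootstrap.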
28
29
 The classification error, or error rate Pe, is the ultimate
measure of the performance of a classifier.
 For a consistent training rule, Pe approaches the
Bayes error as the sample size increases.
 A simple analytical expression for Pe is
impossible to write down, even for multivariate Gaussian
densities.
 The maximum-likelihood estimate P̂e of Pe is T/N, where T is the
number of misclassified samples out of N test samples.
30
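The maximum-likelihood estimate above is just the misclassification count over the test-set size. A sketch (the classifier and test set are illustrative):

```python
def error_rate(classifier, test_set):
    """ML estimate of Pe: T misclassified samples out of N test samples."""
    errors = sum(1 for x, label in test_set if classifier(x) != label)
    return errors / len(test_set)

clf = lambda x: 'w1' if x < 1.0 else 'w2'
test = [(0.3, 'w1'), (0.7, 'w1'), (1.4, 'w2'), (0.9, 'w2'), (2.2, 'w2')]
print(error_rate(clf, test))  # 0.2 (one error in five: the sample at 0.9)
```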
31
32
33
 The objective is to construct decision
boundaries based on unlabeled training data.
 Clustering algorithms are based on two techniques:
◦ iterative square-error clustering, and
◦ agglomerative hierarchical clustering.
34
35
 A given set of n patterns in d dimensions is
partitioned into k clusters. The mean vector of a
cluster is defined as the average of the patterns assigned to it.
 The square error for cluster Ck is the sum of
squared Euclidean distances between each
pattern in Ck and the cluster centre mk.
36
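The square-error criterion above is what iterative clustering such as k-means minimizes. A compact NumPy sketch (the two-blob data and k = 2 are illustrative assumptions):

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Iteratively minimize the summed squared distance to cluster means."""
    rng = np.random.default_rng(seed)
    means = X[rng.choice(len(X), size=k, replace=False)]   # initial centres
    for _ in range(iters):
        # distance from every pattern to every cluster mean
        d = np.linalg.norm(X[:, None, :] - means[None, :, :], axis=2)
        labels = d.argmin(axis=1)                          # assign to nearest mean
        # recompute means; keep the old centre if a cluster goes empty
        means = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                          else means[j] for j in range(k)])
    sq_err = ((X - means[labels]) ** 2).sum()              # square-error criterion
    return labels, means, sq_err

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (20, 2)),                # blob around (0, 0)
               rng.normal(3, 0.3, (20, 2))])               # blob around (3, 3)
labels, means, sq_err = kmeans(X, 2)
print(sorted(np.bincount(labels)))                         # cluster sizes
```

On well-separated blobs like these the algorithm recovers the two groups and the square error settles at the within-cluster scatter.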
37
38
39
40