Classifications
Mahmoud Rafeek Alfarra
http://mfarra.cst.ps
University College of Science & Technology- Khan yonis
Development of computer systems
2016
Chapter 3 – Lecture 1
Outline
- Definition of classification
- Basic principles of classification
- Typical applications of classification
- How Does Classification Work?
- Difference between Classification & Prediction
- Machine learning techniques
- Decision Trees
- k-Nearest Neighbors
Definition
- Classification is: techniques used to predict group membership for data instances.
- Classification is: a data mining function that assigns items in a collection to target categories or classes.
- The goal of classification is to accurately predict the target class for each case in the data.
Definition
For example:
- Identify loan applicants as low, medium, or high credit risks.
- A bank loan officer wants to analyze the data in order to know which customers (loan applicants) are risky and which are safe.
- A marketing manager at a company needs to predict whether a customer with a given profile will buy a new computer.
Definition
[Slides 5-7: illustrative figures.]
This type of learning is called supervised learning.
Basic principles of classification
We want to classify objects as boats and houses.
Basic principles of classification
- All objects before the coast line are boats and all objects after the coast line are houses.
- The coast line serves as a decision surface that separates the two classes.
Basic principles of classification
- The methods that build classification models (i.e., "classification algorithms") operate very similarly to the previous example.
- First, all objects are represented geometrically.
Basic principles of classification
- Then the algorithm seeks to find a decision surface that separates the classes of objects.
Basic principles of classification
- Unseen (new) objects are classified as "boats" if they fall below the decision surface and as "houses" if they fall above it.
Applications of classification
- Medical diagnosis: whether a tumor is primary or metastatic.
Applications of classification
- Credit/loan approval.
- Fraud detection: whether a transaction is fraudulent.
- Web page categorization: which category a page belongs to.
- Classifying secondary structures of protein as alpha-helix, beta-sheet, or random coil.
- Categorizing news stories as finance, weather, entertainment, sports, etc.
Classifications
Mahmoud Rafeek Alfarra
http://mfarra.cst.ps
University College of Science & Technology- Khan yonis
Development of computer systems
2016
Chapter 3 – Lecture 2
Outline
- Definition of classification
- Basic principles of classification
- Typical applications of classification
- How Does Classification Work?
- Difference between Classification & Prediction
- Machine learning techniques
- Decision Trees
- k-Nearest Neighbors
How Does Classification Work?
The data classification process includes two steps:
1. Building the classifier or model
2. Using the classifier for classification
How Does Classification Work?
[Slide 19: figure illustrating the two-step process.]

How Does Classification Work?
- Given a collection of records (the training set), where each record contains a set of attributes, one of which is the class.
- Find a model for the class attribute as a function of the values of the other attributes.
- Goal: previously unseen records should be assigned a class as accurately as possible.
- A test set is used to determine the accuracy of the model.
- Usually, the given data set is divided into training and test sets, with the training set used to build the model and the test set used to validate it.
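As a minimal sketch of the train/test split the slide describes (not from the original deck; the function name and the 70/30 default are illustrative assumptions):

```python
import random

def train_test_split(records, test_fraction=0.3, seed=42):
    """Shuffle the records, then cut them into a training and a test set."""
    rng = random.Random(seed)
    shuffled = records[:]                 # copy so the caller's list is untouched
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_fraction))
    return shuffled[:cut], shuffled[cut:]

train, test = train_test_split(list(range(10)))
print(len(train), len(test))  # 7 3
```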
Process 1, Model Construction: example

Training data:

NAME    RANK            YEARS  TENURED
Mike    Assistant Prof  3      no
Mary    Assistant Prof  7      yes
Bill    Professor       2      yes
Jim     Associate Prof  7      yes
Dave    Assistant Prof  6      no
Anne    Associate Prof  3      no

The classification algorithm learns a classifier (model) from the training data, e.g.:

IF rank = 'professor' OR years > 6
THEN tenured = 'yes'
Process 2, Using the Model in Prediction

Testing data:

NAME     RANK            YEARS  TENURED
Tom      Assistant Prof  2      no
Merlisa  Associate Prof  7      no
George   Professor       5      yes
Joseph   Assistant Prof  7      yes

The classifier is then applied to unseen data, e.g. (Jeff, Professor, 4): Tenured?
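The learned rule is simple enough to write directly as a function; a minimal sketch (the function name is an illustrative assumption):

```python
def tenured(rank, years):
    """The rule learned in Process 1: IF rank = 'professor' OR years > 6 THEN 'yes'."""
    return 'yes' if rank == 'professor' or years > 6 else 'no'

print(tenured('professor', 4))       # Jeff -> 'yes'
print(tenured('assistant prof', 2))  # Tom  -> 'no'
```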
Difference between Classification & Prediction

Classification:
- Has prior knowledge about the class.
- A model or a classifier will be constructed that predicts the class of an unseen object.

Prediction:
- Has prior knowledge about the class.
- A model or a predictor will be constructed that predicts a continuous-valued function or ordered value.
Difference between Classification & Prediction
- If you use a classification model to predict the treatment outcome for a new patient, it would be a prediction.
- In the book "Data Mining: Concepts and Techniques", Han and Kamber's view is that predicting class labels is classification, and predicting values is prediction.
Machine learning techniques
- Learning: things learn when they change their behavior in a way that makes them perform better in the future.
- Machine learning: the subfield of artificial intelligence concerned with the design and development of algorithms that allow computers (machines) to improve their performance over time (to learn) based on data, such as sensor data or databases.
Machine learning techniques
Comparing classification methods:
- Predictive accuracy: ability to correctly predict the class label.
- Speed: computation costs involved in generating and using the model.
- Robustness: ability to make correct predictions given noisy and/or missing values.
- Scalability: ability to construct the model efficiently given large amounts of data.
- Interpretability: level of understanding and insight provided by the model.
Machine learning techniques
Examples of machine learning techniques:
- Decision Trees
- Rule Induction
- k-Nearest Neighbors
- Naïve Bayesian Classifiers
- Neural Networks
Classifications
Decision Trees
Mahmoud Rafeek Alfarra
http://mfarra.cst.ps
University College of Science & Technology- Khan yonis
Development of computer systems
2016
Chapter 3 – Lecture 3
Outline
- Definition of classification
- Basic principles of classification
- Typical applications of classification
- How Does Classification Work?
- Difference between Classification & Prediction
- Machine learning techniques
- Decision Trees
- k-Nearest Neighbors
Outline
- Definition
- Decision tree consists of …
- Decision Tree Classification Task
- Apply Model to Test Data
- Building Tree
- A criterion for attribute selection
- Decision Tree to Decision Rules
Definition
- Decision tree learning is a common method used in data mining.
- It is an efficient method for producing classifiers from data.
- A decision tree is a tree-structured plan of a set of attributes to test in order to predict the output.
- It is a type of tree diagram used to determine the optimum course of action in situations having several possible alternatives with uncertain outcomes.
[Slides 32-35: figures illustrating the definition and examples of decision trees.]
Decision Trees: Definition
A decision tree consists of:
- An internal node: a test on an attribute, e.g. body temperature.
- A branch: an outcome of the test, e.g. warm.
- A leaf node: a class label, e.g. mammals.
At each node, one attribute is chosen to split the training examples into classes as distinct as possible. A new case is classified by following a matching path to a leaf node.
Example of a Decision Tree

Training data:

Tid  Refund  Marital Status  Taxable Income  Cheat
1    Yes     Single          125K            No
2    No      Married         100K            No
3    No      Single          70K             No
4    Yes     Married         120K            No
5    No      Divorced        95K             Yes
6    No      Married         60K             No
7    Yes     Divorced        220K            No
8    No      Single          85K             Yes
9    No      Married         75K             No
10   No      Single          90K             Yes

Model (decision tree), with Refund, MarSt, and TaxInc as the splitting attributes:

Refund?
├── Yes → NO
└── No → MarSt?
    ├── Married → NO
    └── Single, Divorced → TaxInc?
        ├── < 80K → NO
        └── > 80K → YES
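The same tree can be read as nested if/else tests, one per internal node; a minimal sketch (the function name and income-in-thousands convention are illustrative assumptions):

```python
def classify(refund, marital_status, taxable_income_k):
    """The example tree as nested tests; income is in thousands (e.g. 80 for 80K)."""
    if refund == 'Yes':                      # root test: Refund
        return 'No'
    if marital_status == 'Married':          # second test: MarSt
        return 'No'
    return 'No' if taxable_income_k < 80 else 'Yes'   # third test: TaxInc

print(classify('No', 'Married', 80))  # e.g. record (No, Married, 80K) -> 'No'
```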
Another Example of Decision Tree

Using the same training data as above, a different tree also fits:

MarSt?
├── Married → NO
└── Single, Divorced → Refund?
    ├── Yes → NO
    └── No → TaxInc?
        ├── < 80K → NO
        └── > 80K → YES

There could be more than one tree that fits the same data!
Decision Tree Classification Task

Training Set:

Tid  Attrib1  Attrib2  Attrib3  Class
1    Yes      Large    125K     No
2    No       Medium   100K     No
3    No       Small    70K      No
4    Yes      Medium   120K     No
5    No       Large    95K      Yes
6    No       Medium   60K      No
7    Yes      Large    220K     No
8    No       Small    85K      Yes
9    No       Medium   75K      No
10   No       Small    90K      Yes

Test Set:

Tid  Attrib1  Attrib2  Attrib3  Class
11   No       Small    55K      ?
12   Yes      Medium   80K      ?
13   Yes      Large    110K     ?
14   No       Small    95K      ?
15   No       Large    67K      ?

Induction: the tree induction algorithm learns a model (decision tree) from the training set. Deduction: the learned model is then applied to the test set to predict the missing class labels.
Apply Model to Test Data

Test data:

Refund  Marital Status  Taxable Income  Cheat
No      Married         80K             ?

Start from the root of the tree (the Refund tree shown earlier) and, at each internal node, follow the branch that matches the record: Refund = No, so take the No branch to MarSt; Marital Status = Married, so take the Married branch, which leads to a leaf.

Assign Cheat to "No".
Decision Trees: Building Tree
- There is a large number of decision-tree induction algorithms, described primarily in the machine-learning and applied-statistics literature.
- They are supervised learning methods that construct decision trees from a set of input-output samples.
- The optimal tree is the smallest one.
Decision Trees: Building Tree
Top-down tree construction:
◦ At the start, all training examples are at the root.
◦ Partition the examples recursively by choosing one attribute each time.
Decision Trees: Building Tree
Top-Down Induction of Decision Trees (greedy tree growing), by recursive partitioning (see the sketch after this list):
◦ find the "best" attribute test to install at the root
◦ split the data on the root test
◦ find the "best" attribute test to install at each new node
◦ split the data on the new test
◦ repeat until:
  ◦ all nodes are pure, or
  ◦ all nodes contain fewer than k cases, or
  ◦ the tree reaches a predetermined max depth, or
  ◦ there are no more attributes to test
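A minimal sketch of this greedy recursion, assuming a hypothetical helper `best_test` that scores candidate attribute tests (e.g. by the information gain introduced below); rows are (features_dict, label) pairs:

```python
def grow_tree(rows, attributes, max_depth, k=1, depth=0):
    """Greedy top-down induction: install the best test, split, and recurse."""
    labels = {label for _, label in rows}
    if len(labels) <= 1 or len(rows) < k or depth == max_depth or not attributes:
        # Stop: node is pure, too small, too deep, or nothing left to test.
        return max(labels or {'?'}, key=lambda y: sum(l == y for _, l in rows))
    attr = best_test(rows, attributes)        # find "best" attribute test (hypothetical helper)
    node = {'test': attr, 'branches': {}}
    for v in {f[attr] for f, _ in rows}:      # split data on the chosen test
        subset = [(f, l) for f, l in rows if f[attr] == v]
        node['branches'][v] = grow_tree(subset,
                                        [a for a in attributes if a != attr],
                                        max_depth, k, depth + 1)
    return node
```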
Decision Trees: Example

Find the "best" attribute test to install at the root:

Outlook   Temperature  Humidity  Windy  Play?
sunny     hot          high      false  No
sunny     hot          high      true   No
overcast  hot          high      false  Yes
rain      mild         high      false  Yes
rain      cool         normal    false  Yes
rain      cool         normal    true   No
overcast  cool         normal    true   Yes
sunny     mild         high      false  No
sunny     cool         normal    false  Yes
rain      mild         normal    false  Yes
sunny     mild         normal    true   Yes
overcast  mild         high      true   Yes
overcast  hot          normal    false  Yes
rain      mild         high      true   No
Decision Trees: Example
If Outlook is chosen at the root, the overcast branch does not need further splitting, because all of its examples are Yes.
A criterion for attribute selection
Which is the best attribute?
◦ The one which will result in the smallest tree.
◦ Heuristic: choose the attribute that produces the "purest" nodes.
A popular impurity criterion is information gain:
◦ Information gain increases with the average purity of the subsets that an attribute produces.
Strategy: choose the attribute that results in the greatest information gain (see the sketch below).
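A minimal sketch of information gain (entropy of the parent minus the weighted entropy of the children); rows are (features_dict, label) pairs, as in the tree-growing sketch above:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, attr):
    """Entropy of the whole set minus the weighted entropy of the subsets
    produced by splitting on `attr`."""
    parent = entropy([label for _, label in rows])
    remainder = 0.0
    for v in {f[attr] for f, _ in rows}:
        subset = [label for f, label in rows if f[attr] == v]
        remainder += len(subset) / len(rows) * entropy(subset)
    return parent - remainder
```

For the 14-example weather data above, splitting on Outlook gives the highest gain (about 0.247 bits), which is why it is installed at the root.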
Decision Tree to Decision Rules
A decision tree can easily be transformed into a set of rules by mapping the paths from the root node to the leaf nodes one by one.
Decision Tree to Decision Rules
[Slide 55: figure showing a tree converted to rules.]
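As an illustration (not from the original slide), the Refund tree shown earlier maps to one rule per root-to-leaf path, following the IF/THEN style used for the tenure rule above:

```
IF Refund = 'Yes' THEN Cheat = 'No'
IF Refund = 'No' AND MarSt = 'Married' THEN Cheat = 'No'
IF Refund = 'No' AND MarSt IN ('Single', 'Divorced') AND TaxInc < 80K THEN Cheat = 'No'
IF Refund = 'No' AND MarSt IN ('Single', 'Divorced') AND TaxInc > 80K THEN Cheat = 'Yes'
```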
Practice: Draw a decision tree using the following instances:

Weekend  Weather  Parents  Money  Decision (Category)
W1       Sunny    Yes      Rich   Cinema
W2       Sunny    No       Rich   Tennis
W3       Windy    Yes      Rich   Cinema
W4       Rainy    Yes      Poor   Cinema
W5       Rainy    No       Rich   Stay in
W6       Rainy    Yes      Poor   Cinema
W7       Windy    No       Poor   Cinema
W8       Windy    No       Rich   Shopping
W9       Windy    Yes      Rich   Cinema
W10      Sunny    No       Rich   Tennis
Practice: Convert the following to Decision Rules

(The same instances as above, W1-W10.)
Practice: Classify the following instances:

Weekend  Weather  Parents  Money  Decision (Category)
W11      Rainy    Yes      Rich   ?
W12      Windy    No       Poor   ?
W13      Sunny    No       Poor   ?
W14      Windy    Yes      Poor   ?
W15      Rainy    No       Poor   ?
Practice: Again using the following
[Slide 59: figure with additional practice data.]
Classifications
k-Nearest Neighbors
Mahmoud Rafeek Alfarra
http://mfarra.cst.ps
University College of Science & Technology- Khan yonis
Development of computer systems
2016
Chapter 3 – Lecture 4
Outline
- Definition of classification
- Basic principles of classification
- Typical applications of classification
- How Does Classification Work?
- Difference between Classification & Prediction
- Machine learning techniques
- Decision Trees
- k-Nearest Neighbors
Definition: K-Nearest Neighbor
- K-nearest neighbor is a supervised learning algorithm.
- A case is classified by a majority vote of its neighbors, being assigned to the class most common amongst its K nearest neighbors as measured by a distance function.
- If K = 1, the case is simply assigned to the class of its nearest neighbor.
- The purpose of this algorithm is to classify a new object based on attributes and training samples.
- It is also called instance-based learning.
Definition: K-Nearest Neighbor
[Slide 63: figure illustrating K-nearest neighbors.]
K-Nearest Neighbor Algorithm
Given a new instance x, find its nearest neighbor <x', y'> and return y' as the class of x.
K-Nearest Neighbor: Example
A simple example of the nearest neighbor prediction algorithm: if you look at the people in your neighborhood, you may notice that, in general, you all have similar incomes. So if your neighbors have an income greater than $50,000, you have a good chance of having a high income as well.
K-Nearest Neighbor Algorithm
- All instances correspond to points in the n-dimensional space.
- The nearest neighbors are defined in terms of Euclidean distance.
- The Euclidean distance between two points X = (x1, x2, …, xn) and Y = (y1, y2, …, yn) is:

  d(X, Y) = \sqrt{\sum_{i=1}^{n} (x_i - y_i)^2}
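The same distance written as code; a minimal sketch reused by the kNN examples that follow:

```python
from math import sqrt

def euclidean_distance(x, y):
    """Euclidean distance between two equal-length numeric vectors."""
    return sqrt(sum((xi - yi) ** 2 for xi, yi in zip(x, y)))

print(euclidean_distance((3, 7), (7, 7)))  # 4.0
```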
K-Nearest Neighbor Algorithm
Here is how to compute the K-nearest neighbors (KNN) algorithm, step by step (a sketch in code follows this list):
1) Determine the parameter K = number of nearest neighbors.
2) Calculate the distance between the query instance and all the training samples.
3) Sort the distances and determine the nearest neighbors based on the K-th minimum distance.
4) Gather the categories of the nearest neighbors.
5) Use the simple majority of the categories of the nearest neighbors as the prediction value for the query instance.
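A compact sketch of the five steps, reusing the `euclidean_distance` helper defined above (ties in the vote are broken by Counter's insertion order, an implementation convenience rather than part of the algorithm):

```python
from collections import Counter

def knn_classify(query, training, k=3):
    """Steps 1-5: rank training samples (point, label) by distance to `query`
    and return the majority label among the k nearest."""
    ranked = sorted(training, key=lambda s: euclidean_distance(query, s[0]))
    nearest_labels = [label for _, label in ranked[:k]]   # steps 2-4
    return Counter(nearest_labels).most_common(1)[0][0]   # step 5: majority vote
```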
K-Nearest Neighbor Algorithm
[Slide 68: figure illustrating the effect of distance standardization.]

K-Nearest Neighbor Algorithm
Using the standardized distance on the same training set, the unknown case returned a different neighbor, which is not a good sign of robustness.
Example
We have data from a questionnaire survey; here are four training samples:

X1  X2  Y
7   7   Bad
7   4   Bad
3   4   Good
1   4   Good

Test with X1 = 3 and X2 = 7.
Example
1. Suppose we use K = 3.
2. Calculate the (squared) distance between the query instance and all the training samples; the ranking is unchanged by omitting the square root:

X1  X2  Squared distance to (3, 7)
7   7   (7-3)^2 + (7-7)^2 = 16
7   4   (7-3)^2 + (4-7)^2 = 25
3   4   (3-3)^2 + (4-7)^2 = 9
1   4   (1-3)^2 + (4-7)^2 = 13
Example
3. Sort the distances and determine the nearest neighbors based on the K-th minimum distance:

X1  X2  Squared distance        Rank
7   7   (7-3)^2 + (7-7)^2 = 16  3
7   4   (7-3)^2 + (4-7)^2 = 25  4
3   4   (3-3)^2 + (4-7)^2 = 9   1
1   4   (1-3)^2 + (4-7)^2 = 13  2
Example
4. Gather the categories of the nearest neighbors. Notice in the second row that the category (Y) is not included, because the rank of this sample is more than 3 (= K):

X1  X2  Squared distance        Rank  Y
7   7   (7-3)^2 + (7-7)^2 = 16  3     Bad
7   4   (7-3)^2 + (4-7)^2 = 25  4     -
3   4   (3-3)^2 + (4-7)^2 = 9   1     Good
1   4   (1-3)^2 + (4-7)^2 = 13  2     Good
Example
5. Use the simple majority of the categories of the nearest neighbors as the prediction value for the query instance. We have 2 Good and 1 Bad; since 2 > 1, we conclude that the new test with X1 = 3 and X2 = 7 belongs to the Good category.
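The same worked example, reproduced with the `knn_classify` and `euclidean_distance` sketches defined earlier:

```python
training = [((7, 7), 'Bad'), ((7, 4), 'Bad'), ((3, 4), 'Good'), ((1, 4), 'Good')]
print(knn_classify((3, 7), training, k=3))  # -> 'Good'
```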
Scaling issues
◦ Attributes may have to be scaled to prevent distance measures from being dominated by one of the attributes.
◦ Example:
  ◦ height of a person may vary from 1.5m to 1.8m
  ◦ weight of a person may vary from 90lb to 300lb
  ◦ income of a person may vary from $10K to $1M
Solution: normalize the vectors (make all values fall between 0 and 1), as sketched below.
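One common way to do this is min-max scaling of each attribute to [0, 1]; a minimal sketch (the function name is an illustrative assumption):

```python
def min_max_normalize(rows):
    """Rescale each attribute (column) to [0, 1] so no attribute dominates
    the distance. `rows` is a list of equal-length numeric tuples."""
    mins = [min(col) for col in zip(*rows)]
    maxs = [max(col) for col in zip(*rows)]
    return [
        tuple((v - lo) / (hi - lo) if hi > lo else 0.0
              for v, lo, hi in zip(row, mins, maxs))
        for row in rows
    ]

data = [(1.5, 90, 10_000), (1.8, 300, 1_000_000)]
print(min_max_normalize(data))  # [(0.0, 0.0, 0.0), (1.0, 1.0, 1.0)]
```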
Strength and Weakness
Advantages:
◦ Robust to noisy training data.
◦ Effective if the training data is large.
Disadvantages:
◦ Need to determine the value of the parameter K (number of nearest neighbors).
◦ With distance-based learning, it is not clear which type of distance to use and which attributes to use to produce the best results. Shall we use all attributes or only certain attributes?
◦ Computation cost is quite high because we need to compute the distance of each query instance to all training samples.
Lazy vs. eager learning
Lazy learning (e.g. nearest neighbor):
◦ Simply stores the training data (or does only minor processing) and waits until it is given a test tuple.
◦ Less time in training but more time in predicting.
Eager learning (e.g. decision trees and neural networks):
◦ Given a training set, constructs a classification model before receiving new (e.g. test) data to classify.
Practice: Nearest Neighbor

Customer ID  Debt       Income     Marital Status  Risk
Abel         High       High       Married         Good
Ben          Low        High       Married         Doubtful
Candy        Medium     Very low   Unmarried       Poor
Dale         Very high  Low        Married         Poor
Ellen        High       Low        Unmarried       Poor
Fred         High       Very low   Married         Poor
George       Low        High       Unmarried       Doubtful
Harry        Low        Medium     Married         Doubtful
Igor         Very low   Very high  Married         Good
Jack         Very high  Medium     Married         Poor
Practice: Nearest Neighbor

(The same customer table as above, with an added empty Distance column to fill in for each training example.)
Practice: Nearest Neighbor

Customer ID  Debt  Income  Marital Status  Risk
Zeb          High  Medium  Married         ?