IOSR Journal of Computer Engineering (IOSR-JCE)
e-ISSN: 2278-0661, p-ISSN: 2278-8727, Volume 17, Issue 5, Ver. III (Sep. – Oct. 2015), PP 35-40
www.iosrjournals.org
DOI: 10.9790/0661-17533540

Improving Image Classification Result using Neural Networks

Seyed Peyman Zolnouri, Fardad Farokhi, Mehdi Nadiri Andabili
(Department of Electrical Engineering, Central Tehran Branch, Islamic Azad University, Tehran, Iran)
Abstract: Image classification is a complex process that depends on several factors. In this paper, the most common existing methods for image classification are compared first, and different structures of multilayer perceptron (MLP) and radial basis function (RBF) neural networks are then examined for classification. Finally, they are compared with non-neural algorithms. The proposed method gives better results than the other methods considered and achieves the best accuracy among them, with an average accuracy of 98.84%.
Keywords: Image Classification, MLP Neural Network, RBF, KNN
I. Introduction
In recent years, several methods have been presented for image classification, including minimum distance, maximum similarity, and the support vector machine (SVM). The main steps in image classification are determining an appropriate classification system, choosing a suitable classification approach, post-classification processing, and accuracy assessment [1].
A successful classification requires an appropriate classification system and sufficient training samples. Another method widely used in image classification is the neural network, which includes error back-propagation networks as well as RBF networks. In this paper, we improve classification accuracy using an MLP. The results show that the proposed method gives the best answer among the referenced algorithms. The paper is organised as follows: the SVM method, which has already been used for classification in previous articles, is discussed in Section II. The proposed algorithm, which is based on a back-propagation neural network, is presented in Section III. Simulation and comparison of the obtained results are described in Sections VI-VII. Finally, the conclusion of the paper is presented in Section VIII.
II. Image Classification Methods
1. Support Vector Machine (SVM)
The SVM is a multivariate machine learning system developed by Vapnik [2]. It is a supervised learning method well suited to data classification. The SVM estimates a classifier using machine learning theory: the main idea is to maximise the margin between the two classes while minimising the error. In this method, the two classes are separated by a linear boundary. The learning points with the minimum distance to the decision boundary can be considered as a subset that defines the decision boundary; these are the support vectors. The learning set is
given by $\{(x_i, y_i)\}_{i=1}^{N}$ with labels $y_i \in \{-1, +1\}$. The SVM is obtained by solving (1) with slack variables $\xi_i \ge 0$:

$\min_{w,\,\xi} \;\; \frac{1}{2}\|w\|^2 + C\sum_{i=1}^{N}\xi_i$   (1)

where $C$ is a parameter adjusted by the user and $\xi_i$ measures the difference between $a(x_i, w)$ and $y_i$ [3].
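As a concrete illustration of the soft-margin formulation in (1), the following sketch trains a linear SVM with scikit-learn; the penalty parameter C plays the role of the user-adjusted constant in (1). The toy data and variable names are placeholders, not the segmentation database used later in this paper.

```python
# Minimal soft-margin SVM sketch (assumes scikit-learn is installed).
# X: feature matrix (n_samples, n_features); y: labels in {-1, +1}.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)   # toy linearly separable labels

# C trades margin width against the slack terms of eq. (1).
clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```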
2. K-Nearest Neighbor (KNN)
K-nearest-neighbour classification is one of the most common and most suitable classification methods because it is easy to understand and requires no hypothesis about the data. In this classifier, a sample from the test set is assigned to the class with the highest number of votes among its k nearest neighbours. The Euclidean distance is used to find the nearest neighbours of a sample, according to the following relation.
$d_{eucl}(x, t) = \sum_{i=1}^{m} d_{eucl}(x_i, t_i)$   (2)

If the attribute values are continuous, $d_{eucl}$ is calculated as follows:

$d_{eucl}(x, t) = \sqrt{\sum_{i} \left(a_i(x) - a_i(t)\right)^2}$   (3)
Two major disadvantages of this method are that the value of k must be chosen by the user, which is one of the critical points of the KNN method, and that all nearest neighbours of a test sample are treated with the same degree of importance [4].
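The decision rule described above follows directly from equations (2)-(3); the sketch below is a minimal NumPy implementation of a k-nearest-neighbour classifier with plain Euclidean distance and equal neighbour weights (the second disadvantage noted above). Function and variable names are illustrative.

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Classify each test sample by majority vote among its k nearest
    training samples, using Euclidean distance (eqs. 2-3)."""
    preds = []
    for x in X_test:
        # Squared Euclidean distance from x to every training sample.
        d = np.sum((X_train - x) ** 2, axis=1)
        nearest = np.argsort(d)[:k]          # indices of the k closest samples
        votes = y_train[nearest]
        # Majority vote; every neighbour carries the same weight.
        preds.append(np.bincount(votes).argmax())
    return np.array(preds)
```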
3. Neural Network Classification
The book "Classification Algorithm", written in 2003 by Landgrebe, describes different image classification algorithms. In recent years, advanced classification approaches such as artificial neural networks, fuzzy sets, and expert systems have been introduced and are widely used in image classification. Classification approaches can be categorised as supervised or unsupervised, parametric or non-parametric, soft (fuzzy) or hard classification, and per-pixel approaches. Neural networks belong to the per-pixel algorithms [1].
The neural network is an important means for classification. Research shows that neural networks can be an alternative to traditional classification methods. Neural networks are used for classification in industry, trade, and science. In most papers, multilayer perceptron (MLP) neural networks are used for classification. A comparison between neural networks and traditional classification methods, which are generally statistical, shows that neural networks are non-linear, model-free methods, whereas the statistical methods are linear and model-based [5].
III. Implementation
As we know, the type and ordering of the input data are very important for neural networks and closely affect the final results. For this purpose, a series of processing steps must first be applied to the database so that the data are finally prepared for the network in two groups, namely training data and test data. The following steps were taken to prepare these data.
First step: To make the data uniform, all data are first converted to numerical form by removing the character labels in the Classes field of the database and replacing them with numbers. The number assigned to each class is given in Table 1.
Second step: If a feature has a constant value in all instances, it has no effect on network performance. As a result, the third feature, region-pixel-count, which has a constant value of 9, was removed at this preprocessing stage. The number of features thus decreases to 18 features that are useful for the network.
Table 1. Replacing Characters with numbers
Class Assigned Number
BRICKFACE 1
SKY 2
FOLIAGE 3
CEMENT 4
WINDOW 5
PATH 6
GRASS 7
Third step: Each feature column is then surveyed to find its maximum and minimum values, and all data in that column are normalised by the linear function (4), so that after this stage all values lie in the interval [-1, 1].
$f(x) = 2\left(\dfrac{x - \min}{\max - \min}\right) - 1$   (4)
Fourth step: Finally, one third of all data, containing an equal number of instances from each class, is taken as training data and the rest as test data. Moreover, the training data are shuffled randomly to remove the network's sensitivity to the order of instances.
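The four preparation steps can be expressed compactly in NumPy. The sketch below assumes the segmentation data have already been loaded into a string class column and a numeric feature matrix; the column index of region-pixel-count is a placeholder, and the per-class balancing of the training set is simplified to a plain random split.

```python
import numpy as np

CLASS_TO_NUMBER = {"BRICKFACE": 1, "SKY": 2, "FOLIAGE": 3, "CEMENT": 4,
                   "WINDOW": 5, "PATH": 6, "GRASS": 7}          # step 1 (Table 1)

def prepare(features, class_names, constant_col=2, train_fraction=1/3, seed=0):
    y = np.array([CLASS_TO_NUMBER[c] for c in class_names])

    # Step 2: drop the constant region-pixel-count feature (19 -> 18 features).
    X = np.delete(features, constant_col, axis=1)

    # Step 3: column-wise min-max normalisation into [-1, 1] (eq. 4).
    mn, mx = X.min(axis=0), X.max(axis=0)
    X = 2 * (X - mn) / (mx - mn) - 1

    # Step 4: one third for training, the rest for test (class balancing
    # omitted here), with the training portion shuffled to remove ordering.
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_train = int(train_fraction * len(X))
    tr, te = idx[:n_train], idx[n_train:]
    return X[tr], y[tr], X[te], y[te]
```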
IV. MLP Network With Back Propagation Algorithm
MLP neural networks are among the most widely used methods for classifying inputs. Several algorithms have been proposed to adjust the parameters of this type of network; back-propagation (BP) is one of them, modifying the network weights by propagating the error backwards and computing the error of each neuron. As shown in Fig. 1, an MLP network consists of several parts: an input section directly connected to the data; middle layers, also called hidden layers; an output layer whose number of neurons equals the number of classes; neurons built from activation functions; and weights that indicate the importance of each input.
Figure 1. A scheme of MLP network
Moreover, the BP algorithm adds another parameter, the learning rate, to those listed above. Since this type of network and algorithm follow the same predefined principles, differences in the results are due to differences in the parameter values, and tuning them by trial and error effectively determines the behaviour of the network. With this in mind, the parameter values are as follows.
• Input: the number of inputs always equals the number of features, here 18.
• Number of hidden layers: in this paper, the number of hidden layers starts at one and finally ends at two.
• The number of neurons in the output layer equals the number of classes, which is 7 for this database.
• The number of neurons in the hidden layers is 5 and 10 neurons per hidden layer, respectively.
Moreover, if the activation function is antisymmetric (odd, i.e. symmetric about the origin), learning with the BP algorithm is faster. The standard logistic function does not have this form, whereas the hyperbolic tangent with coefficients a = 1.7159 and b = 2/3 satisfies these conditions [6]. Before training starts, initial values must be assigned to the weights. It is best that these values have zero mean and that the standard deviation of the weights equals $\sigma_w = m^{-1/2}$ (m: the number of synaptic connections of a neuron). Here, the initial weights are selected within the interval [-0.5, +0.5].
In addition to the above items, the training mode can also play an important role in the final results. Basically, the sequential learning mode, in which the weights are updated after each example is presented, gives better results with this algorithm. Furthermore, sequential training is preferred from a computational point of view because it requires less local storage for each connection weight. Moreover, presenting the examples to the network in random order makes the search through weight space stochastic, which in turn reduces the probability of the network becoming trapped in local minima. In the batch learning mode, by contrast, the weights are updated only after all examples have been presented. In both modes, however, the learning speed depends on the learning-rate parameter: a low learning rate slows learning down and vice versa. Here, the learning rate starts at 0.01 and is then decreased in the middle of training.
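As a compact illustration of the settings above (18 inputs, 7 outputs, the scaled hyperbolic tangent with a = 1.7159 and b = 2/3, weights initialised in [-0.5, 0.5], sequential updates with a learning rate of 0.01), the following NumPy sketch performs one on-line back-propagation epoch for a single hidden layer. It assumes squared-error training with one-hot targets; it is a didactic sketch, not the exact code used for the experiments.

```python
import numpy as np

A, B = 1.7159, 2.0 / 3.0                       # scaled tanh coefficients [6]

def act(v):        return A * np.tanh(B * v)            # antisymmetric activation
def act_deriv(v):  return A * B * (1.0 - np.tanh(B * v) ** 2)

def init_layer(n_in, n_out, rng):
    # Zero-mean weights drawn uniformly from [-0.5, 0.5].
    return rng.uniform(-0.5, 0.5, size=(n_out, n_in)), np.zeros(n_out)

def train_epoch(X, T, W1, b1, W2, b2, lr=0.01):
    """One sequential (on-line) epoch: weights are updated after every example.
    T holds one-hot targets of shape (n_samples, n_classes)."""
    for x, t in zip(X, T):
        # Forward pass.
        v1 = W1 @ x + b1; h = act(v1)
        v2 = W2 @ h + b2; o = act(v2)
        # Backward pass (squared-error gradients).
        delta2 = (o - t) * act_deriv(v2)
        delta1 = (W2.T @ delta2) * act_deriv(v1)
        # Gradient-descent updates.
        W2 -= lr * np.outer(delta2, h); b2 -= lr * delta2
        W1 -= lr * np.outer(delta1, x); b1 -= lr * delta1
    return W1, b1, W2, b2

rng = np.random.default_rng(0)
W1, b1 = init_layer(18, 10, rng)       # 18 inputs -> 10 hidden neurons
W2, b2 = init_layer(10, 7, rng)        # 10 hidden neurons -> 7 output classes
```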
V. Radial Basis Function Network
This type of network is in fact a special kind of MLP with only one hidden layer, as shown in Fig. 2. Moreover, in contrast with the BP algorithm, modification of the network parameters proceeds from input towards output. As the name indicates, the kernel of the neurons of this network is built on radial functions such as the Gaussian function. Since the number of hidden layers is limited to one, the most important parameters of this network are the number of neurons (hidden-layer kernels), the type of activation function in each neuron, the radii and centres of the activation functions, and the weights.
Figure 2. RBF Network
In this paper, RBF networks with 10, 15 and 20 neurons are tested, all using Gaussian functions. For initialisation, the centres are selected randomly from the training examples, and the initial variance (radius) of each neuron is then calculated from (5):
$\sigma = \dfrac{d}{\sqrt{2M}}$   (5)
where d is the distance between the selected centres and M is the number of neurons. Furthermore, the weights are initialised as in the MLP-BP network. Since three kinds of parameters are modified during learning, three learning rates are required. According to the results obtained, the lower the learning rate for the centres, the better the network performs. Here, the three learning rates take the initial values given in Table 2.
Table 2. RBF learning rates
Learning rate   Value
Weight rate     0.001
Center rate     0.0001
Variance rate   0.001
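A minimal sketch of the RBF network described above: centres are drawn randomly from the training examples and the common width follows (5). For brevity the output weights are fitted here by least squares rather than by the three gradient learning rates of Table 2; that simplification, and all names, are assumptions of this sketch.

```python
import numpy as np

def build_rbf(X_train, T_train, n_neurons=20, seed=0):
    """T_train: one-hot targets, shape (n_samples, n_classes)."""
    rng = np.random.default_rng(seed)
    centers = X_train[rng.choice(len(X_train), n_neurons, replace=False)]

    # Common width from eq. (5): sigma = d / sqrt(2 M), with d the maximum
    # distance between the selected centres and M the number of neurons.
    d = max(np.linalg.norm(c1 - c2) for c1 in centers for c2 in centers)
    sigma = d / np.sqrt(2 * n_neurons)

    def hidden(X):
        # Gaussian activation of every sample against every centre.
        dist2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-dist2 / (2 * sigma ** 2))

    H = hidden(X_train)
    W, *_ = np.linalg.lstsq(H, T_train, rcond=None)   # output weights
    return lambda X: hidden(X) @ W                    # returns class scores
```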
VI. K-Nearest Neighbors Algorithm
As stated at the beginning, this algorithm is not a neural network; it acts purely on the Euclidean distance between samples. Unlike neural networks, which have two phases, training and testing, this method performs everything in a single stage. The goal of using this algorithm is to establish a baseline for the networks used in this paper. In addition, its different nature requires a somewhat different preparation of the input data. Since the best results of this algorithm on the studied database were obtained with a K = 3 neighbourhood, this value is used as the basis for comparison.
VII. Results
1. Image classification database
The UCI image segmentation database is used to evaluate the proposed method. The database has 2310 instances and 7 output classes; each instance corresponds to a 3×3 pixel region described by 19 features [3].
2. Simulation
Different network structures are tested, all on the database described above. The results of the RBF and MLP-BP networks are compared, and the best of them is then compared with the KNN algorithm. First, accuracy results are given for four different MLP-BP networks: the first has one hidden layer with 5 neurons and the second has one hidden layer with 10 neurons; adding a second hidden layer with the same number of neurons to the first and second networks gives the third and fourth networks, respectively. As seen in Fig. 3, the network with two hidden layers of 10 neurons each gives the best result among these configurations. From a practical point of view, however, the second network is preferred for implementation in embedded systems: with fewer layers and fewer neurons it needs less memory while still giving good results.
Figure 3. MLP Network Results
The results for three configurations of the RBF network are then described: the first has 10 neurons, and the second and third have 15 and 20 neurons, respectively. As seen in Fig. 4, the third configuration is better in four classes than the other configurations and differs only slightly from the first and second networks elsewhere. Overall, however, the RBF network, with an average accuracy of 84.47%, is significantly inferior to the MLP-BP network, which reaches an average accuracy of 98.84%. To demonstrate the performance of the MLP-BP network, its results are given in Table 3 alongside the results of the KNN algorithm. Examining Table 3 shows that the MLP-BP network performs better even than the KNN algorithm based on the Euclidean distance: it reaches almost 100% accuracy in four classes, and its minimum for the other classes is 96.7%. Moreover, comparing with the results in [3] shows that the proposed method performs better than that method.
Figure 4. RBF Network Results
Table 3. MLP versus KNN results
Classes
7 6 5 4 3 2 1
MLP
Sensitivity 0.985 1.00 0.845 0.94 0.95 1.00 0.995
Specificity 0.996 1.00 0.997 0.988 0.97 0.999 1.00
Accuracy 0.995 1.00 0.975 0.981 0.969 0.999 0.999
KNN
Sensitivity 0.965 1.00 0.86 0.810 0.930 1.00 0.995
Specificity 0.994 1.00 0.986 0.992 0.961 0.991 1.00
Accuracy 0.990 1.00 0.968 0.966 0.957 0.992 0.999
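The per-class sensitivity, specificity, and accuracy reported in Table 3 follow from one-vs-rest counts of the confusion matrix. A small helper, assuming integer class labels and illustrative names, might look like this:

```python
import numpy as np

def per_class_metrics(y_true, y_pred, classes):
    """Return {class: (sensitivity, specificity, accuracy)} from
    one-vs-rest counts, as reported in Table 3."""
    out = {}
    for c in classes:
        tp = np.sum((y_pred == c) & (y_true == c))
        tn = np.sum((y_pred != c) & (y_true != c))
        fp = np.sum((y_pred == c) & (y_true != c))
        fn = np.sum((y_pred != c) & (y_true == c))
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        acc = (tp + tn) / len(y_true)
        out[c] = (sens, spec, acc)
    return out
```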
VIII. Conclusion
In this paper, a new look at image classification using neural networks is presented. By testing different networks with various configurations and comparing the results with non-neural algorithms, it is shown that the MLP-BP algorithm performs better than the other networks considered. The importance of the data preparation method for the network is also demonstrated. As future work, we first intend to apply feature-reduction preprocessing to the database in order to decrease the number of features if there are dependencies between them. We also intend to use evolutionary techniques such as genetic algorithms to enlarge the search over the number of layers and the number of neurons in MLP-BP networks.
References
[1] D. Lu and Q. Weng, A survey of image classification methods and techniques for improving classification performance, International Journal of Remote Sensing, 2007, 823-870.
[2] M. He, F. M. Imran, B. Belkacem, and Sh. Mei, Improving hyperspectral image classification accuracy using iterative SVM with spatial-spectral information, IEEE Conference, 2013.
[3] J. T.-Y. Kwok, Moderating the outputs of support vector machine classifiers, IEEE Conference, 1999.
[4] N. Verbiest, C. Cornelis, and R. Jensen, Fuzzy rough positive region based nearest neighbour classification, IEEE World Congress on Computational Intelligence, 2012.
[5] G. P. Zhang, Neural networks for classification: a survey, IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 2000.
[6] Simon Haykin, Multilayer Perceptrons (in Neural Networks, 2nd ed., New Jersey: Prentice Hall, 1999), 156-178.
Weitere ähnliche Inhalte

Was ist angesagt?

A Time Series ANN Approach for Weather Forecasting
A Time Series ANN Approach for Weather ForecastingA Time Series ANN Approach for Weather Forecasting
A Time Series ANN Approach for Weather Forecastingijctcm
 
Artificial neural networks in hydrology
Artificial neural networks in hydrology Artificial neural networks in hydrology
Artificial neural networks in hydrology Jonathan D'Cruz
 
Targeted Visual Content Recognition Using Multi-Layer Perceptron Neural Network
Targeted Visual Content Recognition Using Multi-Layer Perceptron Neural NetworkTargeted Visual Content Recognition Using Multi-Layer Perceptron Neural Network
Targeted Visual Content Recognition Using Multi-Layer Perceptron Neural Networkijceronline
 
Neural networks for the prediction and forecasting of water resources variables
Neural networks for the prediction and forecasting of water resources variablesNeural networks for the prediction and forecasting of water resources variables
Neural networks for the prediction and forecasting of water resources variablesJonathan D'Cruz
 
Black-box modeling of nonlinear system using evolutionary neural NARX model
Black-box modeling of nonlinear system using evolutionary neural NARX modelBlack-box modeling of nonlinear system using evolutionary neural NARX model
Black-box modeling of nonlinear system using evolutionary neural NARX modelIJECEIAES
 
Adaptive network based fuzzy
Adaptive network based fuzzyAdaptive network based fuzzy
Adaptive network based fuzzyijaia
 
I041214752
I041214752I041214752
I041214752IOSR-JEN
 
Image Segmentation Using Two Weighted Variable Fuzzy K Means
Image Segmentation Using Two Weighted Variable Fuzzy K MeansImage Segmentation Using Two Weighted Variable Fuzzy K Means
Image Segmentation Using Two Weighted Variable Fuzzy K MeansEditor IJCATR
 
OPTIMIZED TASK ALLOCATION IN SENSOR NETWORKS
OPTIMIZED TASK ALLOCATION IN SENSOR NETWORKSOPTIMIZED TASK ALLOCATION IN SENSOR NETWORKS
OPTIMIZED TASK ALLOCATION IN SENSOR NETWORKSZac Darcy
 
An efficient technique for color image classification based on lower feature ...
An efficient technique for color image classification based on lower feature ...An efficient technique for color image classification based on lower feature ...
An efficient technique for color image classification based on lower feature ...Alexander Decker
 
Hyper-parameter optimization of convolutional neural network based on particl...
Hyper-parameter optimization of convolutional neural network based on particl...Hyper-parameter optimization of convolutional neural network based on particl...
Hyper-parameter optimization of convolutional neural network based on particl...journalBEEI
 
IRJET- Expert Independent Bayesian Data Fusion and Decision Making Model for ...
IRJET- Expert Independent Bayesian Data Fusion and Decision Making Model for ...IRJET- Expert Independent Bayesian Data Fusion and Decision Making Model for ...
IRJET- Expert Independent Bayesian Data Fusion and Decision Making Model for ...IRJET Journal
 
Optimization as a model for few shot learning
Optimization as a model for few shot learningOptimization as a model for few shot learning
Optimization as a model for few shot learningKaty Lee
 
Meta learning with memory augmented neural network
Meta learning with memory augmented neural networkMeta learning with memory augmented neural network
Meta learning with memory augmented neural networkKaty Lee
 
ISSUES RELATED TO SAMPLING TECHNIQUES FOR NETWORK TRAFFIC DATASET
ISSUES RELATED TO SAMPLING TECHNIQUES FOR NETWORK TRAFFIC DATASETISSUES RELATED TO SAMPLING TECHNIQUES FOR NETWORK TRAFFIC DATASET
ISSUES RELATED TO SAMPLING TECHNIQUES FOR NETWORK TRAFFIC DATASETijmnct
 
Expert system design for elastic scattering neutrons optical model using bpnn
Expert system design for elastic scattering neutrons optical model using bpnnExpert system design for elastic scattering neutrons optical model using bpnn
Expert system design for elastic scattering neutrons optical model using bpnnijcsa
 

Was ist angesagt? (19)

A Time Series ANN Approach for Weather Forecasting
A Time Series ANN Approach for Weather ForecastingA Time Series ANN Approach for Weather Forecasting
A Time Series ANN Approach for Weather Forecasting
 
Artificial neural networks in hydrology
Artificial neural networks in hydrology Artificial neural networks in hydrology
Artificial neural networks in hydrology
 
Targeted Visual Content Recognition Using Multi-Layer Perceptron Neural Network
Targeted Visual Content Recognition Using Multi-Layer Perceptron Neural NetworkTargeted Visual Content Recognition Using Multi-Layer Perceptron Neural Network
Targeted Visual Content Recognition Using Multi-Layer Perceptron Neural Network
 
Neural networks for the prediction and forecasting of water resources variables
Neural networks for the prediction and forecasting of water resources variablesNeural networks for the prediction and forecasting of water resources variables
Neural networks for the prediction and forecasting of water resources variables
 
Black-box modeling of nonlinear system using evolutionary neural NARX model
Black-box modeling of nonlinear system using evolutionary neural NARX modelBlack-box modeling of nonlinear system using evolutionary neural NARX model
Black-box modeling of nonlinear system using evolutionary neural NARX model
 
Adaptive network based fuzzy
Adaptive network based fuzzyAdaptive network based fuzzy
Adaptive network based fuzzy
 
Mnist soln
Mnist solnMnist soln
Mnist soln
 
I041214752
I041214752I041214752
I041214752
 
Image Segmentation Using Two Weighted Variable Fuzzy K Means
Image Segmentation Using Two Weighted Variable Fuzzy K MeansImage Segmentation Using Two Weighted Variable Fuzzy K Means
Image Segmentation Using Two Weighted Variable Fuzzy K Means
 
OPTIMIZED TASK ALLOCATION IN SENSOR NETWORKS
OPTIMIZED TASK ALLOCATION IN SENSOR NETWORKSOPTIMIZED TASK ALLOCATION IN SENSOR NETWORKS
OPTIMIZED TASK ALLOCATION IN SENSOR NETWORKS
 
2009 spie hmm
2009 spie hmm2009 spie hmm
2009 spie hmm
 
An efficient technique for color image classification based on lower feature ...
An efficient technique for color image classification based on lower feature ...An efficient technique for color image classification based on lower feature ...
An efficient technique for color image classification based on lower feature ...
 
Hyper-parameter optimization of convolutional neural network based on particl...
Hyper-parameter optimization of convolutional neural network based on particl...Hyper-parameter optimization of convolutional neural network based on particl...
Hyper-parameter optimization of convolutional neural network based on particl...
 
IRJET- Expert Independent Bayesian Data Fusion and Decision Making Model for ...
IRJET- Expert Independent Bayesian Data Fusion and Decision Making Model for ...IRJET- Expert Independent Bayesian Data Fusion and Decision Making Model for ...
IRJET- Expert Independent Bayesian Data Fusion and Decision Making Model for ...
 
6119ijcsitce01
6119ijcsitce016119ijcsitce01
6119ijcsitce01
 
Optimization as a model for few shot learning
Optimization as a model for few shot learningOptimization as a model for few shot learning
Optimization as a model for few shot learning
 
Meta learning with memory augmented neural network
Meta learning with memory augmented neural networkMeta learning with memory augmented neural network
Meta learning with memory augmented neural network
 
ISSUES RELATED TO SAMPLING TECHNIQUES FOR NETWORK TRAFFIC DATASET
ISSUES RELATED TO SAMPLING TECHNIQUES FOR NETWORK TRAFFIC DATASETISSUES RELATED TO SAMPLING TECHNIQUES FOR NETWORK TRAFFIC DATASET
ISSUES RELATED TO SAMPLING TECHNIQUES FOR NETWORK TRAFFIC DATASET
 
Expert system design for elastic scattering neutrons optical model using bpnn
Expert system design for elastic scattering neutrons optical model using bpnnExpert system design for elastic scattering neutrons optical model using bpnn
Expert system design for elastic scattering neutrons optical model using bpnn
 

Andere mochten auch

Andere mochten auch (20)

H0213337
H0213337H0213337
H0213337
 
C0941217
C0941217C0941217
C0941217
 
B012540914
B012540914B012540914
B012540914
 
F012142530
F012142530F012142530
F012142530
 
H017514655
H017514655H017514655
H017514655
 
A017660107
A017660107A017660107
A017660107
 
Fault detection and diagnosis of high speed switching devices in power inverter
Fault detection and diagnosis of high speed switching devices in power inverterFault detection and diagnosis of high speed switching devices in power inverter
Fault detection and diagnosis of high speed switching devices in power inverter
 
B0730406
B0730406B0730406
B0730406
 
89 jan-roger linna - 7032576 - capillary heating control and fault detectio...
89   jan-roger linna - 7032576 - capillary heating control and fault detectio...89   jan-roger linna - 7032576 - capillary heating control and fault detectio...
89 jan-roger linna - 7032576 - capillary heating control and fault detectio...
 
F010244149
F010244149F010244149
F010244149
 
L1303077783
L1303077783L1303077783
L1303077783
 
I1304026367
I1304026367I1304026367
I1304026367
 
K018217680
K018217680K018217680
K018217680
 
L012337275
L012337275L012337275
L012337275
 
T180203125133
T180203125133T180203125133
T180203125133
 
F1802043747
F1802043747F1802043747
F1802043747
 
F017633538
F017633538F017633538
F017633538
 
J018127176.publishing paper of mamatha (1)
J018127176.publishing paper of mamatha (1)J018127176.publishing paper of mamatha (1)
J018127176.publishing paper of mamatha (1)
 
G017633943
G017633943G017633943
G017633943
 
H010526975
H010526975H010526975
H010526975
 

Ähnlich wie F017533540

Neural network based numerical digits recognization using nnt in matlab
Neural network based numerical digits recognization using nnt in matlabNeural network based numerical digits recognization using nnt in matlab
Neural network based numerical digits recognization using nnt in matlabijcses
 
Comparison of Neural Network Training Functions for Hematoma Classification i...
Comparison of Neural Network Training Functions for Hematoma Classification i...Comparison of Neural Network Training Functions for Hematoma Classification i...
Comparison of Neural Network Training Functions for Hematoma Classification i...IOSR Journals
 
Machine Learning Algorithms for Image Classification of Hand Digits and Face ...
Machine Learning Algorithms for Image Classification of Hand Digits and Face ...Machine Learning Algorithms for Image Classification of Hand Digits and Face ...
Machine Learning Algorithms for Image Classification of Hand Digits and Face ...IRJET Journal
 
AN IMPROVED METHOD FOR IDENTIFYING WELL-TEST INTERPRETATION MODEL BASED ON AG...
AN IMPROVED METHOD FOR IDENTIFYING WELL-TEST INTERPRETATION MODEL BASED ON AG...AN IMPROVED METHOD FOR IDENTIFYING WELL-TEST INTERPRETATION MODEL BASED ON AG...
AN IMPROVED METHOD FOR IDENTIFYING WELL-TEST INTERPRETATION MODEL BASED ON AG...IAEME Publication
 
AN ANN APPROACH FOR NETWORK INTRUSION DETECTION USING ENTROPY BASED FEATURE S...
AN ANN APPROACH FOR NETWORK INTRUSION DETECTION USING ENTROPY BASED FEATURE S...AN ANN APPROACH FOR NETWORK INTRUSION DETECTION USING ENTROPY BASED FEATURE S...
AN ANN APPROACH FOR NETWORK INTRUSION DETECTION USING ENTROPY BASED FEATURE S...IJNSA Journal
 
Web Spam Classification Using Supervised Artificial Neural Network Algorithms
Web Spam Classification Using Supervised Artificial Neural Network AlgorithmsWeb Spam Classification Using Supervised Artificial Neural Network Algorithms
Web Spam Classification Using Supervised Artificial Neural Network Algorithmsaciijournal
 
Web spam classification using supervised artificial neural network algorithms
Web spam classification using supervised artificial neural network algorithmsWeb spam classification using supervised artificial neural network algorithms
Web spam classification using supervised artificial neural network algorithmsaciijournal
 
X-TREPAN: A MULTI CLASS REGRESSION AND ADAPTED EXTRACTION OF COMPREHENSIBLE D...
X-TREPAN: A MULTI CLASS REGRESSION AND ADAPTED EXTRACTION OF COMPREHENSIBLE D...X-TREPAN: A MULTI CLASS REGRESSION AND ADAPTED EXTRACTION OF COMPREHENSIBLE D...
X-TREPAN: A MULTI CLASS REGRESSION AND ADAPTED EXTRACTION OF COMPREHENSIBLE D...cscpconf
 
X-TREPAN : A Multi Class Regression and Adapted Extraction of Comprehensible ...
X-TREPAN : A Multi Class Regression and Adapted Extraction of Comprehensible ...X-TREPAN : A Multi Class Regression and Adapted Extraction of Comprehensible ...
X-TREPAN : A Multi Class Regression and Adapted Extraction of Comprehensible ...csandit
 
Improving face recognition by artificial neural network using principal compo...
Improving face recognition by artificial neural network using principal compo...Improving face recognition by artificial neural network using principal compo...
Improving face recognition by artificial neural network using principal compo...TELKOMNIKA JOURNAL
 
Q UANTUM C LUSTERING -B ASED F EATURE SUBSET S ELECTION FOR MAMMOGRAPHIC I...
Q UANTUM  C LUSTERING -B ASED  F EATURE SUBSET  S ELECTION FOR MAMMOGRAPHIC I...Q UANTUM  C LUSTERING -B ASED  F EATURE SUBSET  S ELECTION FOR MAMMOGRAPHIC I...
Q UANTUM C LUSTERING -B ASED F EATURE SUBSET S ELECTION FOR MAMMOGRAPHIC I...ijcsit
 
Pattern recognition system based on support vector machines
Pattern recognition system based on support vector machinesPattern recognition system based on support vector machines
Pattern recognition system based on support vector machinesAlexander Decker
 
Hand Written Digit Classification
Hand Written Digit ClassificationHand Written Digit Classification
Hand Written Digit Classificationijtsrd
 
Electricity Demand Forecasting Using Fuzzy-Neural Network
Electricity Demand Forecasting Using Fuzzy-Neural NetworkElectricity Demand Forecasting Using Fuzzy-Neural Network
Electricity Demand Forecasting Using Fuzzy-Neural NetworkNaren Chandra Kattla
 
Comparison Between Clustering Algorithms for Microarray Data Analysis
Comparison Between Clustering Algorithms for Microarray Data AnalysisComparison Between Clustering Algorithms for Microarray Data Analysis
Comparison Between Clustering Algorithms for Microarray Data AnalysisIOSR Journals
 
Survey on classification algorithms for data mining (comparison and evaluation)
Survey on classification algorithms for data mining (comparison and evaluation)Survey on classification algorithms for data mining (comparison and evaluation)
Survey on classification algorithms for data mining (comparison and evaluation)Alexander Decker
 
Survey on Artificial Neural Network Learning Technique Algorithms
Survey on Artificial Neural Network Learning Technique AlgorithmsSurvey on Artificial Neural Network Learning Technique Algorithms
Survey on Artificial Neural Network Learning Technique AlgorithmsIRJET Journal
 
Poster_Reseau_Neurones_Journees_2013
Poster_Reseau_Neurones_Journees_2013Poster_Reseau_Neurones_Journees_2013
Poster_Reseau_Neurones_Journees_2013Pedro Lopes
 
Text Recognition using Convolutional Neural Network: A Review
Text Recognition using Convolutional Neural Network: A ReviewText Recognition using Convolutional Neural Network: A Review
Text Recognition using Convolutional Neural Network: A ReviewIRJET Journal
 

Ähnlich wie F017533540 (20)

Neural network based numerical digits recognization using nnt in matlab
Neural network based numerical digits recognization using nnt in matlabNeural network based numerical digits recognization using nnt in matlab
Neural network based numerical digits recognization using nnt in matlab
 
Comparison of Neural Network Training Functions for Hematoma Classification i...
Comparison of Neural Network Training Functions for Hematoma Classification i...Comparison of Neural Network Training Functions for Hematoma Classification i...
Comparison of Neural Network Training Functions for Hematoma Classification i...
 
Machine Learning Algorithms for Image Classification of Hand Digits and Face ...
Machine Learning Algorithms for Image Classification of Hand Digits and Face ...Machine Learning Algorithms for Image Classification of Hand Digits and Face ...
Machine Learning Algorithms for Image Classification of Hand Digits and Face ...
 
AN IMPROVED METHOD FOR IDENTIFYING WELL-TEST INTERPRETATION MODEL BASED ON AG...
AN IMPROVED METHOD FOR IDENTIFYING WELL-TEST INTERPRETATION MODEL BASED ON AG...AN IMPROVED METHOD FOR IDENTIFYING WELL-TEST INTERPRETATION MODEL BASED ON AG...
AN IMPROVED METHOD FOR IDENTIFYING WELL-TEST INTERPRETATION MODEL BASED ON AG...
 
AN ANN APPROACH FOR NETWORK INTRUSION DETECTION USING ENTROPY BASED FEATURE S...
AN ANN APPROACH FOR NETWORK INTRUSION DETECTION USING ENTROPY BASED FEATURE S...AN ANN APPROACH FOR NETWORK INTRUSION DETECTION USING ENTROPY BASED FEATURE S...
AN ANN APPROACH FOR NETWORK INTRUSION DETECTION USING ENTROPY BASED FEATURE S...
 
Web Spam Classification Using Supervised Artificial Neural Network Algorithms
Web Spam Classification Using Supervised Artificial Neural Network AlgorithmsWeb Spam Classification Using Supervised Artificial Neural Network Algorithms
Web Spam Classification Using Supervised Artificial Neural Network Algorithms
 
Web spam classification using supervised artificial neural network algorithms
Web spam classification using supervised artificial neural network algorithmsWeb spam classification using supervised artificial neural network algorithms
Web spam classification using supervised artificial neural network algorithms
 
X-TREPAN: A MULTI CLASS REGRESSION AND ADAPTED EXTRACTION OF COMPREHENSIBLE D...
X-TREPAN: A MULTI CLASS REGRESSION AND ADAPTED EXTRACTION OF COMPREHENSIBLE D...X-TREPAN: A MULTI CLASS REGRESSION AND ADAPTED EXTRACTION OF COMPREHENSIBLE D...
X-TREPAN: A MULTI CLASS REGRESSION AND ADAPTED EXTRACTION OF COMPREHENSIBLE D...
 
X-TREPAN : A Multi Class Regression and Adapted Extraction of Comprehensible ...
X-TREPAN : A Multi Class Regression and Adapted Extraction of Comprehensible ...X-TREPAN : A Multi Class Regression and Adapted Extraction of Comprehensible ...
X-TREPAN : A Multi Class Regression and Adapted Extraction of Comprehensible ...
 
Improving face recognition by artificial neural network using principal compo...
Improving face recognition by artificial neural network using principal compo...Improving face recognition by artificial neural network using principal compo...
Improving face recognition by artificial neural network using principal compo...
 
Q UANTUM C LUSTERING -B ASED F EATURE SUBSET S ELECTION FOR MAMMOGRAPHIC I...
Q UANTUM  C LUSTERING -B ASED  F EATURE SUBSET  S ELECTION FOR MAMMOGRAPHIC I...Q UANTUM  C LUSTERING -B ASED  F EATURE SUBSET  S ELECTION FOR MAMMOGRAPHIC I...
Q UANTUM C LUSTERING -B ASED F EATURE SUBSET S ELECTION FOR MAMMOGRAPHIC I...
 
Pattern recognition system based on support vector machines
Pattern recognition system based on support vector machinesPattern recognition system based on support vector machines
Pattern recognition system based on support vector machines
 
Hand Written Digit Classification
Hand Written Digit ClassificationHand Written Digit Classification
Hand Written Digit Classification
 
Electricity Demand Forecasting Using Fuzzy-Neural Network
Electricity Demand Forecasting Using Fuzzy-Neural NetworkElectricity Demand Forecasting Using Fuzzy-Neural Network
Electricity Demand Forecasting Using Fuzzy-Neural Network
 
Comparison Between Clustering Algorithms for Microarray Data Analysis
Comparison Between Clustering Algorithms for Microarray Data AnalysisComparison Between Clustering Algorithms for Microarray Data Analysis
Comparison Between Clustering Algorithms for Microarray Data Analysis
 
O18020393104
O18020393104O18020393104
O18020393104
 
Survey on classification algorithms for data mining (comparison and evaluation)
Survey on classification algorithms for data mining (comparison and evaluation)Survey on classification algorithms for data mining (comparison and evaluation)
Survey on classification algorithms for data mining (comparison and evaluation)
 
Survey on Artificial Neural Network Learning Technique Algorithms
Survey on Artificial Neural Network Learning Technique AlgorithmsSurvey on Artificial Neural Network Learning Technique Algorithms
Survey on Artificial Neural Network Learning Technique Algorithms
 
Poster_Reseau_Neurones_Journees_2013
Poster_Reseau_Neurones_Journees_2013Poster_Reseau_Neurones_Journees_2013
Poster_Reseau_Neurones_Journees_2013
 
Text Recognition using Convolutional Neural Network: A Review
Text Recognition using Convolutional Neural Network: A ReviewText Recognition using Convolutional Neural Network: A Review
Text Recognition using Convolutional Neural Network: A Review
 

Mehr von IOSR Journals (20)

A011140104
A011140104A011140104
A011140104
 
M0111397100
M0111397100M0111397100
M0111397100
 
L011138596
L011138596L011138596
L011138596
 
K011138084
K011138084K011138084
K011138084
 
J011137479
J011137479J011137479
J011137479
 
I011136673
I011136673I011136673
I011136673
 
G011134454
G011134454G011134454
G011134454
 
H011135565
H011135565H011135565
H011135565
 
F011134043
F011134043F011134043
F011134043
 
E011133639
E011133639E011133639
E011133639
 
D011132635
D011132635D011132635
D011132635
 
C011131925
C011131925C011131925
C011131925
 
B011130918
B011130918B011130918
B011130918
 
A011130108
A011130108A011130108
A011130108
 
I011125160
I011125160I011125160
I011125160
 
H011124050
H011124050H011124050
H011124050
 
G011123539
G011123539G011123539
G011123539
 
F011123134
F011123134F011123134
F011123134
 
E011122530
E011122530E011122530
E011122530
 
D011121524
D011121524D011121524
D011121524
 

Kürzlich hochgeladen

The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024Rafal Los
 
A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024Results
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonetsnaman860154
 
Data Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonData Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonAnna Loughnan Colquhoun
 
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...Igalia
 
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | DelhiFULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhisoniya singh
 
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024BookNet Canada
 
Maximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxMaximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxOnBoard
 
IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsEnterprise Knowledge
 
Boost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityBoost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityPrincipled Technologies
 
Understanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitectureUnderstanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitecturePixlogix Infotech
 
Unblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesUnblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesSinan KOZAK
 
🐬 The future of MySQL is Postgres 🐘
🐬  The future of MySQL is Postgres   🐘🐬  The future of MySQL is Postgres   🐘
🐬 The future of MySQL is Postgres 🐘RTylerCroy
 
SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024Scott Keck-Warren
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)Gabriella Davis
 
Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...
Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...
Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...gurkirankumar98700
 
GenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationGenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationMichael W. Hawkins
 
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...shyamraj55
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerThousandEyes
 
Handwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsHandwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsMaria Levchenko
 

Kürzlich hochgeladen (20)

The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024
 
A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonets
 
Data Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonData Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt Robison
 
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
 
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | DelhiFULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
 
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
 
Maximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxMaximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptx
 
IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI Solutions
 
Boost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityBoost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivity
 
Understanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitectureUnderstanding the Laravel MVC Architecture
Understanding the Laravel MVC Architecture
 
Unblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesUnblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen Frames
 
🐬 The future of MySQL is Postgres 🐘
🐬  The future of MySQL is Postgres   🐘🐬  The future of MySQL is Postgres   🐘
🐬 The future of MySQL is Postgres 🐘
 
SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)
 
Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...
Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...
Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...
 
GenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationGenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day Presentation
 
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected Worker
 
Handwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsHandwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed texts
 

F017533540

  • 1. IOSR Journal of Computer Engineering (IOSR-JCE) e-ISSN: 2278-0661,p-ISSN: 2278-8727, Volume 17, Issue 5, Ver. III (Sep. – Oct. 2015), PP 35-40 www.iosrjournals.org DOI: 10.9790/0661-17533540 www.iosrjournals.org 35 | Page Improving Image Classification Result using Neural Networks 1 Seyed Peyman Zolnouri, 2 Fardad Farokhi, 3 Mehdi Nadiri Andabili 1,2,3 (Department of Electrical Engineering, Central Tehran Branch, Islamic Azad University, Tehran, Iran) Abstract: Image classification is among the complex processes which depend on several factors. In this paper, the most common existing methods for image classification are compared first and different structures of multilayer Perceptron (MLP) and Radial Basis neural networks (RBF) are then reviewed for classification. Finally, they are compared to non-neural algorithms. The function obtained by using the proposed method enjoys better results as compared to other methods and it has the best result among other methods with respect to accuracy with an average accuracy of 98.84%. Keywords: Image Classification, MLP Neural Network, RBF, KNN I. Introduction In the recent years, several methods have been presented for image classification. Some of these methods include minimum distance, maximum similarity, and support vector machine (SVM). The main steps in image classification involve determining the appropriate classification system, choosing the appropriate approach for classification, post-classification processing and accuracy assessment [1]. A successful classification requires an appropriate classification system and sufficient training samples. Another method which is widely used in image classification is the use of a neural network which includes error back-propagation networks as well as RBF. In this paper, we modify classification accuracy by MLP. The results show that the proposed method has the best answer among the referred algorithms. Order of the contents of paper is such that SVM method which has been already used for classification in previous articles is discussed in section II. The proposed algorithm which is based on back-propagation neural network is presented in section III. Simulation and comparison of the obtained results are described in sections VI- VII. Finally, the conclusion obtained from the paper is presented in section VIII. II. Image Classification Methods 1. Support Vector Machine (SVM) SVM is a multivariable machine learning system developed by Vapink [2]. An appropriate method for data classification includes SVM which uses learning methods with supervisor for data classification. SVM is a means for estimation of classification which uses machine learning theory, in the manner that the main idea of SVM is to maximize the margin between the two classes and to minimize the error. In this method, the two classes are separated by using a linear boundary. A number of learning points with the minimum distance to decision border can be considered as a subset to define decision boarders and as a support vector. Learning set is in terms of N iii yx 1 )},{(  with any input for  1i y . SVM algorithm is calculated as (1) for 0i  . minimize    N i i cw 1 2 2 1  (1) Where C is a parameter which is adjusted by user and i  measures the difference between ),( wxa i and i y [3]. 2. K NEAREST NEIGHBOR (KNN) Classification K nearest neighbor is the most common and most appropriate method for classification due to a high understanding and lack of any need to making any hypothesis on the data. 
In this classification, the sample existing in the test set belongs to a class with the highest number of votes among k of its nearest neighbors. Euclidean distance is used to calculate the nearest neighbor of a sample which is in terms of the following relation. ),(),( 1 txdtxd m i i eucleucl    (2) If the amounts are continuous, the amount of eucl d will be calculated as follows:
  • 2. Improving Image Classification Result using Neural Networks DOI: 10.9790/0661-17533540 www.iosrjournals.org 36 | Page 2 ))()((),( taxatxd ii i eucl  (3) Two major disadvantages of this method are as follows where Determination of k amount by user which is one of the important points of KNN method and All the nearest neighbors of test sample are considered with the same degree of importance [4]. 3. Neural Network Classification The book ―Classification Algorithm‖ which was written in 2003 by Landgrebe describes different image classification algorithms. In the recent years, advanced classification approaches such as artificial neural networks, fuzzy sets and expert systems which are highly used in image classification have been introduced. Classification approaches are classified in terms of with and without supervisor, parametric and non-parametric, soft and hard (fuzzy) classification and pre-pixel approaches. Neural networks are in terms of pre-pixel algorithms [1]. Neural network is an important means for classification. Researches show that neural networks can be a substitute alternative for traditional classification methods. Neural networks are used for classification in industry, trade and science. In most papers, multilayer perceptron neural networks (MLP) are used for classification. A comparison between neural networks and traditional classification methods which are generally in terms of statistical methods shows that neural networks for classification are in terms of linear methods and free models while statistical methods are linear and based on the model [5]. III. Implementation As we know, type of inputting data and order of data are very important for neural networks so that they will have a close relation with the final results. For this purpose, a series of processes shall first be made on the database so that finally the data is prepared to be applied to the network in two groups, namely learning data and test data. The following steps were taken in order to prepare the aforesaid data. First step: To make the data uniform, at first total data is changed into numerical data by removing the character data in Classes section of database and replacing that with numbers. The number attributed to each class is presented in table(1). Second step: As we know, if a feature has a constant amount in all instances, it will have no effect on the network performance. As a result, the third feature called region-pixel-count with a constant amount of 9 was removed at this stage aiming at preprocess of data. In this case, number of features has decreased to 18 features which are in turn useful for the network. Table 1. Replacing Characters with numbers Class Assigned Number BRICKFACE 1 SKY 2 FOLIAGE 3 CEMENT 4 WINDOW 5 PATH 6 GRASS 7 Third step: Then, by surveying each column of features and by finding maximum and minimum amounts, all the data of that column are normalized by passing through a linear function (4), in the manner that after completion of this stage, all the amounts will be placed in the interval of {-1,1}. 1 minmax min 2          x f (4) Fourth step: Finally, one-third of all data in which there are equal numbers of each class is considered as learning data and the rest is considered as test data. Moreover, all learning data are rearranged randomly to remove network sensitivity to order of instances.
  • 3. Improving Image Classification Result using Neural Networks DOI: 10.9790/0661-17533540 www.iosrjournals.org 37 | Page IV. MLP Network With Back Propagation Algorithm MLP neural networks are among the most widely used and most common methods for classification of inputs so that Several algorithms have been provided to adjust the parameters of this type of network and Back Propagation algorithm is one of them which modifies the weighs of network by calculating the error in terms of returning and calculation of error of each neuron. As shown in Fig.1, MLP networks are constituted of several sections: inputting section which is directly connected to data, middle layers which are also called Hidden Layers, output layers which are equal to the number of classes in view of number of neurons, neurons which are constituted of Activation Functions and weighs which indicate the degree of importance of one input. Figure 1. A scheme of MLP network Moreover, BP algorithm adds another parameter to the aforesaid parameters called Learning rate. Since this type of network and algorithm have equal and predefined principles, the difference seen in the results is due to the difference in the amounts of network parameters and adjusting them by error will result in questioning the type of network. Considering the aforesaid issues, the amounts of parameters are explained.  Input: number of inputs is always equal to the number of features and here it is 18.  Number of internal layers: In this paper, number of hidden layers starts from one layer and will finally end in two layers.  Number of neurons of external layer is equal to the number of classes which is 7 for this database.  Number of neurons of hidden layers is 5 and 10 neurons for each hidden layer, respectively. Moreover, in case activator function is Antisymmetric (symmetric against the origin, odd function); learning rate with BP algorithm will be higher. As we know, the standard logistic function has not this form while hyperbolic tangent function with a=1.7159 and b=2/3 coefficients enjoys these conditions [6]. First of all, some amounts should be considered for the weighs to start program performance. It is better that these amounts have zero averages and variance amounts of weighs are equal to 2 1   mw  (m: number of synoptic relations of a neuron). Here, initial amounts of weighs are selected within an interval of [-0.5 +0.5]. In addition to the above items, network learning method can also play an important role in the final results. Basically, Sequential learning method in which weighs are modified after providing each example, will offer better results in this algorithm. Further to this, training is preferred to sequential method in view of operations because it requires less local storage of weighs for each connection. Moreover, searching for a weight space is made randomly by providing instances to the network on a natural basis. This in turn lessens the probability of network to be trapped in the local minimums. This is while in Batch Learning method, weighs are modified after providing all the examples. However, learning speed in both methods depends on the parameter of learning rate. So that learning speed decreases by having a low learning rate and vice versa. Here, learning rate starts from 0.01 and then decreases in the middle of training. V. Radial Basis Function Network In fact, this type of network is a special type of MLP network with only one hidden layer which is shown in Fig.2. 
V. Radial Basis Function Network
In fact, this type of network is a special case of the MLP network with only one hidden layer, as shown in Fig. 2. Moreover, in contrast to the BP algorithm, the modification of the network parameters proceeds from the input towards the output. As the name indicates, the kernel of each neuron in this network is built on radial functions such as the Gaussian function. Since the number of hidden layers is limited to one, the most important parameters of this network are the number of neurons (hidden-layer kernels), the type of activation function in each neuron, the radii and centers of the activation functions, and the weights.

Figure 2. RBF Network
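To illustrate the structure just described, the short sketch below shows one forward pass through such a network with Gaussian kernels; the initialization of the centers and radii and the training of the three parameter groups are described next. The function signature and argument names are illustrative assumptions, not part of the paper.

```python
import numpy as np

def rbf_forward(x, centers, sigmas, weights, bias):
    """One forward pass through an RBF network with Gaussian kernels.

    centers : (M, d) hidden-unit centers
    sigmas  : (M,)   radius of each Gaussian unit
    weights : (K, M) output-layer weights, bias : (K,) output offsets
    """
    # Gaussian activation of each hidden unit around its center.
    dist2 = np.sum((centers - x) ** 2, axis=1)
    h = np.exp(-dist2 / (2.0 * sigmas ** 2))
    # Linear output layer, one output per class.
    return weights @ h + bias
```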
In this paper, RBF networks with 10, 15 and 20 neurons are tested, all of which use Gaussian functions. For the initial values, the centers are randomly selected from among the training examples, and from these the initial variance, or radius, of each neuron is calculated according to (5):

$\sigma = \frac{d}{\sqrt{2M}}$   (5)

where d is the distance between the selected centers and M is the number of neurons. Furthermore, the weights are initialized in the same way as in the MLP-BP network. Since three groups of parameters are modified during the learning process, three separate learning rates are required. According to the results obtained, the smaller the learning rate of the centers, the better the network performs. Here, the three learning rates are set according to Table 2.

Table 2. RBF learning rates
Learning Rate    Value
Weight rate      0.001
Center rate      0.0001
Variance rate    0.001

VI. K-Nearest Neighbors Algorithm
As stated at the beginning, this algorithm lies outside the family of neural networks; it operates on the Euclidean distance between samples. Unlike neural networks, which involve two phases, training and testing, this method performs everything in a single stage. The purpose of using this algorithm is to establish a baseline for the networks used in this paper. In addition, the different nature of this method requires a somewhat different preparation of the input data. Since the best results of this algorithm on the studied database were obtained with a K = 3 neighborhood, this value is used as the basis for comparison.
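A minimal sketch of this baseline, assuming the same normalized feature vectors prepared earlier, is given below: each test sample receives the majority label among its K = 3 nearest training samples under the Euclidean distance of (3). The function name is our own illustrative choice.

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, X_test, k=3):
    """Classify each test sample by majority vote of its k nearest
    training samples under Euclidean distance, eq. (3)."""
    preds = []
    for x in X_test:
        d = np.sqrt(np.sum((X_train - x) ** 2, axis=1))  # distances to all training samples
        nearest = y_train[np.argsort(d)[:k]]             # labels of the k closest samples
        preds.append(Counter(nearest).most_common(1)[0][0])
    return np.array(preds)
```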
VII. Results
1. Image classification database
The UCI image classification database is used to evaluate the results of the proposed method. The database has 2310 instances and 7 output classes, each instance corresponding to a 3×3 pixel region described by 19 features [3].
2. Simulation
Different network structures are tested, all on the same database. The results of the RBF and MLP-BP networks are compared, and the best of them is then compared with the results of the KNN algorithm. First, accuracy results are given for four different MLP-BP networks: the first has one hidden layer with 5 neurons and the second has one hidden layer with 10 neurons, while the third and fourth networks are obtained by adding a second hidden layer of the same size to the first and second networks, respectively. As seen in Fig. 3, the network with two hidden layers of 10 neurons each offers the best result among the configurations. From a practical point of view, however, the second network is preferred for implementation in embedded systems: since it has fewer layers and fewer neurons, it needs less memory.

Figure 3. MLP Network Results

The results for three different configurations of the RBF network are then described: the first has 10 neurons, and the second and third have 15 and 20 neurons, respectively. As seen in Fig. 4, the third configuration is better than the others in four classes and differs only slightly from the first and second networks in the remaining ones. Overall, however, the RBF network, with an average accuracy of 84.47%, falls well short of the MLP-BP network, with an average accuracy of 98.84%. To substantiate the performance of the MLP-BP network, its results are given in Table 3 alongside those of the KNN algorithm. Examining the results in Table 3, it can be concluded that the MLP-BP network performs better even than the KNN algorithm based on Euclidean distance: it reaches nearly 100% accuracy in four classes, and its minimum accuracy over the remaining classes is 96.7%. Moreover, comparison with the results reported in [3] shows that the proposed method also outperforms that approach.

Figure 4. RBF Network Results

Table 3. MLP versus KNN results
                   Class 1   Class 2   Class 3   Class 4   Class 5   Class 6   Class 7
MLP  Sensitivity    0.995     1.00      0.95      0.94      0.845     1.00      0.985
     Specificity    1.00      0.999     0.97      0.988     0.997     1.00      0.996
     Accuracy       0.999     0.999     0.969     0.981     0.975     1.00      0.995
KNN  Sensitivity    0.995     1.00      0.930     0.810     0.86      1.00      0.965
     Specificity    1.00      0.991     0.961     0.992     0.986     1.00      0.994
     Accuracy       0.999     0.992     0.957     0.966     0.968     1.00      0.990
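For reference, the per-class figures in Table 3 are the usual one-versus-rest measures. The sketch below shows the standard definitions of sensitivity, specificity and accuracy computed from true and predicted labels; whether the paper's numbers were produced exactly this way is not stated, and the function name is ours.

```python
import numpy as np

def per_class_metrics(y_true, y_pred, classes=range(1, 8)):
    """One-versus-rest sensitivity, specificity and accuracy for each class."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    out = {}
    for c in classes:
        tp = np.sum((y_pred == c) & (y_true == c))   # true positives
        tn = np.sum((y_pred != c) & (y_true != c))   # true negatives
        fp = np.sum((y_pred == c) & (y_true != c))   # false positives
        fn = np.sum((y_pred != c) & (y_true == c))   # false negatives
        out[c] = {"sensitivity": tp / (tp + fn),
                  "specificity": tn / (tn + fp),
                  "accuracy":    (tp + tn) / (tp + tn + fp + fn)}
    return out
```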
VIII. Conclusion
In this paper, a fresh look is taken at image classification using neural networks. By testing different networks with varying configurations and comparing the results with non-neural algorithms, it is shown that the MLP-BP algorithm performs better than the other networks considered. The importance of this result is further underlined by the description of the data preparation method for the network. As future work, we first intend to apply further preprocessing methods to the database in order to reduce the number of features wherever dependencies exist between them. We also intend to use evolutionary techniques such as genetic algorithms to widen the search space over the number of layers and the number of neurons in MLP-BP networks.

References
[1] D. Lu and Q. Weng, A survey of image classification methods and techniques for improving classification performance, International Journal of Remote Sensing, 2007, 823-870.
[2] M. He, F. M. Imran, B. Belkacem, and Sh. Mei, Improving hyperspectral image classification accuracy using iterative SVM with spatial-spectral information, IEEE Conference, 2013.
[3] J. T.-Y. Kwok, Moderating the outputs of support vector machine classifiers, IEEE Conference, 1999.
[4] N. Verbiest, C. Cornelis, and R. Jensen, Fuzzy rough positive region based nearest neighbour classification, IEEE World Congress on Computational Intelligence, 2012.
[5] G. P. Zhang, Neural networks for classification: a survey, IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 2000.
[6] S. Haykin, Multilayer Perceptrons (in Neural Networks, 2nd ed., New Jersey: Prentice Hall, 1999), 156-178.