FLANN, Artificial Bee Colony, Classification Problem
Abstract—Artificial Neural Networks have emerged as an important tool for classification and have been widely used to classify non-linearly separable patterns. The most popular artificial neural network model is the Multilayer Perceptron (MLP), as it is able to perform classification tasks with significant success. However, the complexity of the MLP structure, together with problems such as local minima trapping, overfitting and weight interference, has made neural network training difficult. An easy way to avoid these problems is to remove the hidden layers. This paper presents the ability of the Functional Link Neural Network (FLNN) to overcome the structural complexity of the MLP by using a single-layer architecture, and proposes an Artificial Bee Colony (ABC) optimization for training the FLNN. The proposed technique is expected to provide a better learning scheme for a classifier and thus more accurate classification results.
Index Terms—Neural Networks, Functional Link Neural Network, Learning, Artificial Bee Colony.
—————————— ——————————
9. If the onlookers are distributed:
   a. Go to step 11.
10. else
   a. Repeat steps 5 to 8 for the onlookers.
11. Apply the greedy selection process for the onlookers, vi.
12. Determine the abandoned solution for the scout, if it exists, and replace it with a new randomly produced solution xi using:
   x_i^j = x_min^j + rand(0,1)(x_max^j - x_min^j)
13. Memorize the best solution.
   a. Update the new weights and biases for the FLNN.
14. cycle = cycle + 1
15. Stop when cycle = Maximum cycle.
16. Save the updated FLNN weights and biases.

4 SIMULATION RESULT

In order to evaluate the performance of the proposed learning scheme, FLNN-ABC, on classification problems, simulation experiments were carried out on a 2.30 GHz Intel Core i5-2410M CPU with 8.0 GB RAM on a 64-bit operating system. The comparison of the standard BP training and ABC algorithms is discussed based on the simulation results, implemented in Matlab 2010b. In this work we considered three benchmark classification problems: Breast Cancer Wisconsin, PIMA Indian Diabetes and BUPA Liver Disorder. During the experiment, simulations were performed on the training of the MLP architecture with Backpropagation.

The reason for conducting a simulation on both the MLP and FLNN architectures is to provide a comparison of neural complexity between a standard Artificial Neural Network and a Higher Order Neural Network. In this simulation we used an MLP with a single hidden layer in which the number of hidden nodes equals the number of input features, while for the FLNN architecture we used a second-order input enhancement (2ndFLNN) to provide a nonlinear input-output mapping capability. Neural complexity here refers to the number of trainable parameters (weights and biases) needed to achieve a good approximation in each neural network architecture. A smaller number of parameters indicates that the network requires a lower computational load, as there are fewer weights and biases to be updated in every epoch or cycle, which leads to faster convergence. Table 3 below shows the comparison of network complexity for each benchmark problem.

Ten trials were performed for each simulation of MLP-BP, 2ndFLNN-BP and 2ndFLNN-ABC, and the best training accuracy among these 10 trials was recorded. To generate the Training and Test sets, each dataset was randomly divided into two equal sets (1st-Fold and 2nd-Fold). Each of these two sets was alternately used either as a Training set or as a Test set, and the average values of the results for each dataset were then used for comparison. Table 4 below presents the simulation results of the MLP-BP, 2ndFLNN-BP and 2ndFLNN-ABC architectures. It shows that the implementation of ABC as the training scheme for FLNN-ABC provides better accuracy on the Test set than FLNN-BP and MLP-BP.
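The onlooker and scout phases summarised in steps 9-16 above can be sketched as a generic optimization loop. This is a minimal illustration, not the authors' Matlab implementation: the function and parameter names are my own, the objective f stands in for the FLNN training error, and each food source would encode the FLNN weight/bias vector.

```python
import random

def abc_minimize(f, dim, bounds, n_food=10, limit=20, max_cycle=100, seed=1):
    """Minimal Artificial Bee Colony sketch (employed/onlooker/scout phases).

    f is assumed to return a non-negative cost (e.g. an MSE)."""
    rng = random.Random(seed)
    lo, hi = bounds
    # Scout equation: x_j = x_min + rand(0,1) * (x_max - x_min)
    new_solution = lambda: [rng.uniform(lo, hi) for _ in range(dim)]
    foods = [new_solution() for _ in range(n_food)]
    costs = [f(x) for x in foods]
    trials = [0] * n_food
    best_x, best_c = None, float("inf")

    def try_neighbour(i):
        # v_ij = x_ij + phi * (x_ij - x_kj): perturb one dimension relative
        # to a randomly chosen partner food source k.
        k = rng.choice([m for m in range(n_food) if m != i])
        j = rng.randrange(dim)
        v = foods[i][:]
        v[j] = min(hi, max(lo, v[j] + rng.uniform(-1, 1) * (v[j] - foods[k][j])))
        c = f(v)
        if c < costs[i]:               # step 11: greedy selection
            foods[i], costs[i], trials[i] = v, c, 0
        else:
            trials[i] += 1

    for _ in range(max_cycle):
        for i in range(n_food):        # employed-bee phase
            try_neighbour(i)
        fit = [1.0 / (1.0 + c) for c in costs]
        total = sum(fit)
        for _ in range(n_food):        # onlooker phase: roulette-wheel choice
            r, acc, i = rng.uniform(0.0, total), 0.0, 0
            for i, p in enumerate(fit):
                acc += p
                if acc >= r:
                    break
            try_neighbour(i)
        stale = max(range(n_food), key=lambda m: trials[m])
        if trials[stale] > limit:      # step 12: scout replaces abandoned source
            foods[stale], trials[stale] = new_solution(), 0
            costs[stale] = f(foods[stale])
        for i in range(n_food):        # step 13: memorize the best solution
            if costs[i] < best_c:
                best_x, best_c = foods[i][:], costs[i]
    return best_x, best_c
```

For instance, `abc_minimize(lambda x: sum(v * v for v in x), dim=3, bounds=(-5.0, 5.0))` drives a sphere objective towards zero; in the FLNN setting, f would evaluate the network error for a candidate weight vector and steps 13a and 16 would copy the best vector back into the network.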
TABLE 3
Comparison of architecture complexity between MLP and FLNN for each benchmark dataset

Datasets | Network type | Network structure | No. of parameters/dimensions (D)
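The dimension column of Table 3 can be reproduced from the architecture description, under assumptions the text does not fully pin down: a single output node, one bias per node, a second-order enhancement consisting of the original features plus their pairwise products, and 9, 8 and 6 input features for the Cancer, PIMA and BUPA datasets respectively.

```python
def mlp_params(d):
    """MLP with d inputs, d hidden nodes and 1 output, biases included:
    input-to-hidden weights (d*d) + hidden biases (d)
    + hidden-to-output weights (d) + output bias (1)."""
    return d * d + d + d + 1

def flnn2_params(d):
    """Second-order FLNN: the d original features plus all d*(d-1)/2
    pairwise products feed a single output node with one bias."""
    return d + d * (d - 1) // 2 + 1

# Assumed feature counts for the three benchmark datasets.
for name, d in [("Cancer", 9), ("PIMA", 8), ("BUPA", 6)]:
    print(name, mlp_params(d), flnn2_params(d))
```

Under these assumptions the FLNN needs roughly half as many trainable parameters as the MLP on each dataset, which is the complexity advantage the text describes.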
TABLE 4
Results obtained from the MLP-BP, 2ndFLNN-BP and 2ndFLNN-ABC architectures

Datasets  Average Values         MLP-BP     2ndFLNN-BP  2ndFLNN-ABC
Cancer    MSE on Training        0.19186    0.24311     0.071645
          Training Accuracy (%)  90.40625   87.84335    93.9638
          MSE on Test            0.21442    0.241233    0.180275
          Test Accuracy (%)      89.26655   75.5669     90.5899
PIMA      MSE on Training        0.697915   0.539665    0.41128
          Training Accuracy (%)  65.10415   72.7838     74.33805
          MSE on Test            0.6979165  0.5965245   0.580515
          Test Accuracy (%)      65.10415   39.5953     69.8524
BUPA      MSE on Training        0.84087    0.853635    0.51649
          Training Accuracy (%)  57.9564    57.06855    54.99575
          MSE on Test            0.840872   1.075618    0.726065
          Test Accuracy (%)      57.9564    17.08535    59.1484
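The alternating 2-fold protocol used to produce the averages in Table 4 can be sketched as follows. The helper names and the evaluate callback are illustrative placeholders, not code from the paper:

```python
import random

def two_fold_sets(samples, seed=0):
    """Randomly split a dataset into two equal halves (1st-Fold, 2nd-Fold).

    Returns the two train/test runs of the paper's protocol: each half is
    used once as the Training set and once as the Test set."""
    rng = random.Random(seed)
    shuffled = samples[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    fold1, fold2 = shuffled[:half], shuffled[half:]
    return [(fold1, fold2), (fold2, fold1)]

def average_accuracy(samples, evaluate, seed=0):
    """Average a test metric over the two alternating folds.

    evaluate(train, test) is any callback that trains a classifier on
    train and returns its accuracy (or MSE) on test."""
    runs = two_fold_sets(samples, seed)
    return sum(evaluate(train, test) for train, test in runs) / 2.0
```

Each cell of Table 4 would then be the `average_accuracy` of one architecture on one dataset, with the FLNN-ABC training loop supplying the evaluate callback.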
5 CONCLUSION
In this work, we evaluated the FLNN-ABC model on the task of 2-class pattern classification. The experiments demonstrated that FLNN-ABC performs the classification task well. For the Cancer, PIMA and BUPA datasets, the simulation results show that the proposed ABC algorithm can successfully train the FLNN to solve classification problems with a better accuracy rate on unseen data. The significance of this work is that it provides an alternative training scheme for the Functional Link Neural Network in place of the standard BP learning algorithm.
Yana Mazwin Mohmad Hassim is a Ph.D student at Universiti Tun Hussein Onn Malaysia (UTHM). Her current research focuses on the optimization of Higher Order Neural Networks using Swarm Intelligence Algorithms. She received her Master's Degree in Computer Science from the University of Malaya in 2005 and her Bachelor's Degree in Information Technology from the National University of Malaysia in 2001.

Rozaida Ghazali is a Deputy Dean (Research and Development) at the Faculty of Information Technology and Multimedia, Universiti Tun Hussein Onn Malaysia (UTHM). She graduated with a Ph.D. degree from the School of Computing and Mathematical Sciences at Liverpool John Moores University, United Kingdom, in 2007, on the topic of Higher Order Neural Networks for Financial Time Series Prediction. Earlier, in 2003, she completed her M.Sc. degree in Computer Science at Universiti Teknologi Malaysia (UTM). She received her B.Sc. (Hons) degree in Computer Science (Information Systems) from Universiti Sains Malaysia (USM) in 1997. In 2001, Rozaida joined the academic staff of UTHM. Her research areas include neural networks, data mining, financial time series prediction, data analysis, physical time series forecasting, and fuzzy logic.