A Fuzzy Classification Method Based On Support Vector Machine
Machine Learning Center, Faculty of Mathematics and Computer Science, Hebei University, Baoding, Hebei, China
E-mail: heq@mail.hbu.edu.cn
0-7803-7865-2/03/$17.00 ©2003 IEEE
Authorized licensed use limited to: SLIIT - Sri Lanka Institute of Information Technology. Downloaded on June 09,2023 at 08:31:08 UTC from IEEE Xplore. Restrictions apply.
Proceedings of the Second International Conference on Machine Learning and Cybernetics, Xi'an, 2-5 November 2003
2.2 One-against-one
Figure 1. The decision DAG
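The decision DAG of Figure 1 classifies a sample by walking a path of pairwise (one-against-one) decisions: each node tests one pair of candidate classes and eliminates the loser, so after K-1 tests a single class remains. A minimal sketch of that evaluation follows; the function names and the `pairwise_decide` interface are illustrative placeholders, not part of the paper.

```python
def ddag_predict(x, classes, pairwise_decide):
    """Classify x with a decision DAG over one-against-one classifiers.

    classes: list of candidate class labels, e.g. [1, 2, 3, 4].
    pairwise_decide(x, a, b): returns a or b, the winner according to
    the binary classifier trained to separate class a from class b.
    """
    remaining = list(classes)
    # Each DAG node tests the first remaining class against the last
    # and eliminates the loser; K-1 tests leave exactly one class.
    while len(remaining) > 1:
        a, b = remaining[0], remaining[-1]
        if pairwise_decide(x, a, b) == a:
            remaining.pop()       # b is eliminated
        else:
            remaining.pop(0)      # a is eliminated
    return remaining[0]
```

Only K-1 of the K(K-1)/2 trained pairwise classifiers are evaluated per sample, which is what makes the DAG evaluation cheap at test time.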
Given a set of training samples

    (x_1, y_1), (x_2, y_2), ..., (x_N, y_N)                     (1)

where x_i ∈ R^n, y_i ∈ R, i = 1, 2, ..., N. Since the decision attributes are numerical, they need to be fuzzified into linguistic terms (categorical terms). The membership functions can be approximately determined based on experts' opinion or people's common perception. In this paper, we fuzzify the decision attributes into K classes through fuzzy clustering based on self-organized learning [7, 8]. So we get a set of training points with categorical decision attributes and corresponding membership degrees:

    (x_1, ŷ_1, μ_1), (x_2, ŷ_2, μ_2), ..., (x_N, ŷ_N, μ_N)      (2)

where x_i ∈ R^n, ŷ_i ∈ {1, 2, ..., K}, i = 1, 2, ..., N; μ_ij (μ_ij ∈ [0, 1]) is the membership degree indicating with what degree x_i belongs to class j, j = 1, 2, ..., K; μ_i is the maximal value of μ_i1, μ_i2, ..., μ_iK, and ŷ_i is the class corresponding to this maximal membership degree, i.e.:

    ŷ_i = arg max_{j=1,...,K} {μ_i1, μ_i2, ..., μ_iK},
    μ_i = max{μ_i1, μ_i2, ..., μ_iK},    i = 1, 2, ..., N.

For any sample x, its distance to the hyperplane f_j(x) = 0 in equation (3) is

    d(x) = |f_j(x)| / ||ω_j||                                   (5)

In some sense, d(x) reflects the membership degree of x with respect to class j: if f_j(x) > 0, the larger d(x) is, the higher the membership degree with which sample x belongs to class j. So there is a functional relation between the membership degree and the distance. Since ||ω_j|| is a constant for the same hyperplane f_j(x) = 0, there is also a functional relation between the membership degree and f_j(x).

For the training samples in equation (1), we know the following:

    (d_1, μ_1), (d_2, μ_2), ..., (d_N, μ_N)                     (6)

where d_i = max_{j=1,...,K} {f_j(x_i)}, μ_i = max{μ_i1, μ_i2, ..., μ_iK}, i = 1, 2, ..., N.
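The relations above — ŷ_i = arg max_j μ_ij and μ_i = max_j μ_ij from equation (2), and d(x) = |f_j(x)| / ||ω_j|| from equation (5) — can be sketched as follows. The membership matrix and the linear decision function used here are illustrative placeholders, not the paper's trained SVM hyperplanes.

```python
import numpy as np

def categorize(membership):
    """From an N x K membership matrix mu[i, j], each entry in [0, 1],
    return for every sample the class with maximal membership (y_hat)
    and that maximal membership degree (mu_i), as in equation (2)."""
    membership = np.asarray(membership, dtype=float)
    y_hat = membership.argmax(axis=1) + 1   # classes numbered 1..K
    mu = membership.max(axis=1)
    return y_hat, mu

def distance_to_hyperplane(x, w, b):
    """Distance of sample x from the hyperplane f(x) = w.x + b = 0,
    i.e. d(x) = |f(x)| / ||w|| as in equation (5)."""
    f = np.dot(w, x) + b
    return abs(f) / np.linalg.norm(w)
```

Because ||ω_j|| is fixed for a given hyperplane, d(x) and f_j(x) differ only by a positive constant factor, which is why the pairs (d_i, μ_i) in equation (6) can be used to fit a functional relation from distance to membership degree.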
method. In this section, we will introduce two examples to see the benefits of this method.

We evaluated this method using the bodyfat data (from http://silver.sdsmt.edu/~rwjohnso) and the rice data (from UCI) listed in Table 1.

Table 1 Features of the experiment data

                                        FCM      FID3
    Test Accuracy Rate                  100%     79%
    Average Error of Membership Degree  0.15     0.17

The experimental results show that this new fuzzy classification method not only has high classification accuracy, but also has strong forecasting capability for the membership degree.

5. Conclusions

In conclusion, we have described a new fuzzy classification method. The design fundamentals and the methods of computation and realization are given. The experimental results show that the new method in this paper is more effective than Fuzzy ID3. The characteristic of this method is that, for data with numerical decision attributes, it first fuzzifies the samples into some classes (linguistic terms) and then learns the classifier; for a new sample, this method does not forecast the corresponding decision value, but gives the corresponding class and the membership degree with respect to this class. This fuzzy decision result is more objective and easier to understand than a precise decision result. Experimental results on two databases show that this new fuzzy classification method not only has high classification accuracy, but also has strong forecasting capability for the membership degree. This new method optimizes the classification result of the support vector machine and enhances its intelligent level. However, this method still needs to be improved, for some problems remain. For one thing, the method is sensitive to noise because of the overfitting problem. For another, the forecasting capability of the membership prediction needs to be enhanced further.

Acknowledgements

References

[1] C. Cortes and V. Vapnik, "Support-vector networks," Machine Learning, vol. 20, pp. 273-297, 1995.
[2] Chih-Wei Hsu and Chih-Jen Lin, "A comparison of methods for multiclass support vector machines," IEEE Transactions on Neural Networks, vol. 13, no. 2, March 2002.
[3] S. Knerr, L. Personnaz, and G. Dreyfus, "Single-layer learning revisited: A stepwise procedure for building and training a neural network," in Neurocomputing: Algorithms, Architectures and Applications, J. Fogelman, Ed. New York: Springer-Verlag, 1990.
[4] U. Kreßel, "Pairwise classification and support vector machines," in Advances in Kernel Methods - Support Vector Learning, B. Schölkopf, C. J. C. Burges, and A. J. Smola, Eds. Cambridge, MA: MIT Press, 1999, pp. 255-268.
[5] J. Friedman, "Another approach to polychotomous classification," Dep. Statist., Stanford Univ., Stanford, CA, 1996. [Online]. Available: http://www-stat.stanford.edu/reports/friedman/poly.ps.Z
[6] J. C. Platt, N. Cristianini, and J. Shawe-Taylor, "Large margin DAGs for multiclass classification," in Advances in Neural Information Processing Systems, vol. 12. Cambridge, MA: MIT Press, 2000, pp. 547-553.
[7] T. Kohonen, Self-Organization and Associative Memory. Berlin: Springer, 1988.
[8] Yufei Yuan and Michael J. Shaw, "Induction of fuzzy decision trees," Fuzzy Sets and Systems, vol. 69, pp. 125-139, 1995.