Volume 15, Issue 3 (12-2018)                   JSDP 2018, 15(3): 101-112


Khodagholi M, Dolati A, Hosseinzadeh A, Shamsolketabi K. A New Method to Determine Data Membership and Find Noise and Outlier Data Using Fuzzy Support Vector Machine. JSDP. 2018; 15 (3) :101-112
URL: http://jsdp.rcisp.ac.ir/article-1-394-en.html
Abstract:

Support Vector Machine (SVM) is an important classification technique that has recently attracted many researchers; the approach nevertheless has some limitations. Determining the hyperplane that separates the classes with the maximum margin, and computing the position of each training point relative to it, can be interpreted as assigning class membership with complete certainty. This raises a question: how far can the certainty of a classification based on this hyperplane be trusted? In standard SVM classification, the error of every training datum is weighted equally and each datum is assumed to belong to exactly one class. In many cases, however, some training data, including outliers and vague data with no defined model, cannot strictly be assigned to a single class. A training datum may not belong entirely to one class; its features may indicate 90 percent membership in one class and 10 percent in another. In such cases, a fuzzy SVM based on fuzzy logic lets us weight the importance of each datum in the training phase and ultimately determine its relative class membership.
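The weighting idea can be sketched with the fuzzy-SVM primal objective, in which each slack variable is scaled by its membership degree, so low-membership points contribute less to the error term. This is a minimal illustration; the weight vector, slacks, and memberships below are invented for the example:

```python
import numpy as np

def fuzzy_svm_objective(w, xi, s, C=1.0):
    """Primal objective of a fuzzy SVM: slack variables xi are weighted
    by fuzzy memberships s, so low-membership (noisy or outlier) points
    are penalized less than in a standard SVM."""
    return 0.5 * w @ w + C * np.sum(s * xi)

w  = np.array([1.0, -0.5])
xi = np.array([0.0, 0.3, 1.2])   # slacks of three training points
s  = np.array([1.0, 0.9, 0.1])   # the third point is likely noise

full  = fuzzy_svm_objective(w, xi, np.ones(3))  # standard (equal) weighting
fuzzy = fuzzy_svm_objective(w, xi, s)           # fuzzy weighting
```

With equal weights the noisy third point dominates the error term; with fuzzy weights its influence is scaled down by its low membership.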
The method proposed by Lin and Wang is a basic method that introduces a membership function for the fuzzy support vector machine. Their membership function is based on the distance between a point and the center of its class.
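A minimal numpy sketch of this class-center membership, assuming Euclidean distance and a small offset delta that keeps memberships strictly positive:

```python
import numpy as np

def lin_wang_membership(X, y, delta=1e-6):
    """Membership in the style of Lin and Wang (2002): each training
    point's membership decreases linearly with its distance from the
    mean (center) of its own class."""
    s = np.empty(len(X))
    for label in np.unique(y):
        idx = (y == label)
        center = X[idx].mean(axis=0)                 # class center
        d = np.linalg.norm(X[idx] - center, axis=1)  # distances to center
        r = d.max()                                  # class radius
        s[idx] = 1.0 - d / (r + delta)               # membership in (0, 1]
    return s

# toy data: two 2-D classes, each with one point far from its center
X = np.array([[0.0, 0.0], [0.2, 0.1], [2.0, 2.0],
              [5.0, 5.0], [5.1, 4.9], [7.0, 7.0]])
y = np.array([1, 1, 1, -1, -1, -1])
s = lin_wang_membership(X, y)
```

The point farthest from its class center (a candidate outlier) receives the lowest membership in each class.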
In this paper, we introduce a new method for assigning membership to training data based on their distance from the separating hyperplane. In this method, the SVM classification together with the primary membership of the training data is used to build a fuzzy membership function over the whole space using symmetric triangular fuzzy numbers. The fuzzy membership value of new data is chosen to deviate as little as possible from the primary membership of the training data while achieving the maximum level of fuzzification. We first formulate the problem as a nonlinear optimization problem; we then introduce an efficient algorithm based on critical points and obtain the final membership function of the training data. Under the proposed algorithm, data farther from the hyperplane receive a higher membership degree, and a datum lying on the hyperplane belongs to both classes with the same membership degree. Moreover, by comparing the primary membership degree of the training data with the computed final distribution, we estimate the noise level of each training datum. Finally, we give a numerical example to illustrate the efficiency of the proposed method and to compare its results with those of the Lin and Wang approach.
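The distance-from-hyperplane idea can be illustrated with a simplified sketch. The hyperplane (w, b) and the saturation distance D below are fixed by hand for the example, whereas the paper obtains the membership function by solving a nonlinear optimization over symmetric triangular fuzzy numbers:

```python
import numpy as np

# Assumed, illustrative values: in practice (w, b) comes from SVM training
# and the saturation distance from the optimization described in the paper.
w, b = np.array([1.0, 1.0]), -1.0   # hyperplane: x1 + x2 - 1 = 0
D = 2.0                             # distance at which membership saturates

def hyperplane_membership(x):
    """Memberships of x in classes (+1, -1): equal (0.5, 0.5) on the
    hyperplane, rising linearly with distance on the corresponding side."""
    d = (w @ x + b) / np.linalg.norm(w)   # signed distance to hyperplane
    t = min(abs(d) / D, 1.0)              # clipped to the support [0, D]
    mu = 0.5 + 0.5 * t                    # membership of the nearer class
    return (mu, 1 - mu) if d >= 0 else (1 - mu, mu)

on_plane = hyperplane_membership(np.array([0.5, 0.5]))  # lies on hyperplane
far_pos  = hyperplane_membership(np.array([3.0, 3.0]))  # deep in class +1
```

A point on the hyperplane belongs to both classes with degree 0.5, and membership in the nearer class grows toward 1 as the point moves away, matching the qualitative behavior of the proposed algorithm.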

Full-Text [PDF 4307 kb]
Type of Study: Research | Subject: Paper
Received: 2015/07/19 | Accepted: 2018/08/18 | Published: 2018/12/19 | ePublished: 2018/12/19

[1] Mehralian M. A., Fouladi K. The recognition of online handwritten Persian characters based on their main bodies using SVM. JSDP 2012; 9(1): 59-68.
[2] Montazer G. A., Shayestehfar M. Iranian license plate identification with fuzzy support vector machine. JSDP 2015; 12(1): 47-56.
[3] Abe S. Pattern Classification: Neuro-Fuzzy Methods and Their Comparison. Springer-Verlag, London, UK, 2001.
[4] Alpaydin E. Introduction to Machine Learning. The MIT Press, 2010.
[5] Bezdek J. C. Fuzzy Mathematics in Pattern Classification. PhD Dissertation, Cornell University, Ithaca, NY, 1973.
[6] Bishop C. M. Pattern Recognition and Machine Learning. Springer, 2006.
[7] Burges C. J. C. A tutorial on support vector machines for pattern recognition. Data Mining and Knowledge Discovery 1998; 2(2): 121-167. [DOI:10.1023/A:1009715923555]
[8] Cortes C., Vapnik V. Support-vector networks. Machine Learning 1995; 20(3): 273-297. [DOI:10.1007/BF00994018]
[9] Fisher R. A. The use of multiple measurements in taxonomic problems. Annals of Eugenics 1936; 7(2): 179-188. [DOI:10.1111/j.1469-1809.1936.tb02137.x]
[10] Huang H. P., Liu Y. H. Fuzzy support vector machines for pattern recognition and data mining. Int. J. Fuzzy Syst. 2002; 4(3): 826-835.
[11] Inoue T., Abe S. Fuzzy support vector machines for pattern classification. In Proceedings of IJCNN 2001; 2: 1449-1454. [DOI:10.1109/IJCNN.2001.939575]
[12] Jiang X. F., Yi Z., Lv J. C. Fuzzy SVM with a new fuzzy membership function. Neural Computing & Applications 2006; 15(3-4): 268-276. [DOI:10.1007/s00521-006-0028-z]
[13] Lee G. H., Taur J. S., Tao C. W. A robust fuzzy support vector machine for two-class pattern classification. International Journal of Fuzzy Systems 2006; 8(2): 76-87.
[14] Li M. Q., Chen F. Z., Kou J. S. Candidate vectors selection for training support vector machines. Third International Conference on Natural Computation (ICNC) 2007; 1: 538-542. [DOI:10.1109/ICNC.2007.292]
[15] Lin C. F., Wang S. D. Fuzzy support vector machines. IEEE Trans. on Neural Networks 2002; 13(2): 464-471. [DOI:10.1109/72.991432] [PMID]
[16] Mitchell T. Machine Learning. McGraw-Hill, ISBN 0-07-042807-7, 1997.
[17] Nilsson N. J. Introduction to Machine Learning. Robotics Laboratory, Department of Computer Science, Stanford University, Stanford, CA 94305, 2005.
[18] Pontil M., Verri A. Properties of support vector machines. Neural Computation 1998; 10(4): 955-974. [DOI:10.1162/089976698300017575]
[19] Schölkopf B., Burges C. J. C., Smola A. Advances in Kernel Methods: Support Vector Learning. Cambridge, MA: MIT Press, 1999.
[20] Shiry S., Sadatpoor S. S. Using Machine Learning Techniques in Homeopathy. Amirkabir University Press, 2010.
[21] Tang W. M. Fuzzy SVM with a new fuzzy membership function to solve the two-class problems. Neural Processing Letters 2011; 34(3): 209-219. [DOI:10.1007/s11063-011-9192-y]
[22] Le T., Tran D., Ma W., Sharma D. A new fuzzy membership computation method for fuzzy support vector machines. Third International Conference on Communications and Electronics (ICCE) 2010; 153-157.
[23] Vapnik V. N. Statistical Learning Theory. New York: Wiley, 1998.
[24] Vapnik V. N. The Nature of Statistical Learning Theory. New York: Springer-Verlag, 1995. [DOI:10.1007/978-1-4757-2440-0]
[25] Zadeh L. A. Fuzzy sets as a basis for a theory of possibility. Fuzzy Sets and Systems 1978; 1: 3-28. [DOI:10.1016/0165-0114(78)90029-5]
[26] Zadeh L. A. Fuzzy sets. Information and Control 1965; 8(3): 338-353. [DOI:10.1016/S0019-9958(65)90241-X]
[27] Zhang X. G. Using class-center vectors to build support vector machines. In: Proceedings of the IEEE Signal Processing Society Workshop 1999; 3-11.
