Volume 14, Issue 2 (9-2017)                   JSDP 2017, 14(2): 159-169


Chaghari A, Feizi-Derakhshi M. Automatic Clustering Using Improved Imperialist Competitive Algorithm. JSDP 2017; 14(2): 159-169
URL: http://jsdp.rcisp.ac.ir/article-1-453-en.html
University of Tabriz
Abstract:   (5699 Views)

The Imperialist Competitive Algorithm (ICA) is regarded as a prime meta-heuristic for finding the global optimum in optimization problems. This paper presents a use of ICA for the automatic clustering of large, unlabeled data sets. By combining a suitable structure for each chromosome with ICA, the proposed method (ACICA) finds the optimal number of clusters and the optimal partitioning of the data simultaneously at run time. To increase accuracy and speed of convergence, the structure of ICA is modified. Many applications call for clustering data when the number of clusters is not known in advance, so methods are needed that can cluster data without a correct prior estimate of that number. In other words, the proposed algorithm requires no background knowledge to group the data. In addition, the proposed method is more accurate than other clustering methods based on evolutionary algorithms. In ICA, the search steps should first be enlarged to raise the search rate and explore possible solutions, and then reduced as the algorithm approaches the globally optimal solution, so that the optimum is not missed and the search does not settle into a local optimum. To this end, and to improve the imperialist competitive algorithm, the mutation rate and the rate of the revolution operator are determined dynamically. DB and CS are cluster validity indexes, and in this paper the DB and CS validity measures are used as the objective function. To demonstrate the superiority of the proposed method, the average fitness value and the number of clusters it determines are compared with those of three automatic clustering algorithms based on evolutionary algorithms; these partitional clustering algorithms are built on three powerful, well-known optimizers, namely the genetic algorithm, particle swarm optimization, and differential evolution.
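
The abstract names its technical ingredients without implementation details: a chromosome structure from which the number of clusters is decoded at run time, the DB (Davies-Bouldin) validity index as the cost minimized by the imperialist competition, and a revolution/mutation rate that shrinks as the search converges. The Python/NumPy sketch below illustrates one plausible reading of these ingredients; the activation-threshold encoding, the 0.5 cutoff, the linear rate schedule, and all names (decode_country, cost, revolution_rate, k_max) are assumptions made for illustration, not the authors' exact design.

```python
import numpy as np

def davies_bouldin(data, labels, centers):
    """Davies-Bouldin (DB) validity index: lower values mean more
    compact and better-separated clusters."""
    k = len(centers)
    # Mean intra-cluster distance (scatter) of each cluster to its center.
    scatter = np.array([
        np.mean(np.linalg.norm(data[labels == i] - centers[i], axis=1))
        for i in range(k)
    ])
    db = 0.0
    for i in range(k):
        # Worst-case similarity ratio of cluster i against any other cluster.
        ratios = [
            (scatter[i] + scatter[j]) /
            (np.linalg.norm(centers[i] - centers[j]) + 1e-12)
            for j in range(k) if j != i
        ]
        db += max(ratios)
    return db / k

def decode_country(vector, k_max, dim, data):
    """Decode a candidate solution ('country') into active cluster centers.
    Assumed layout: k_max activation genes followed by k_max*dim center
    coordinates; a center is active when its activation gene exceeds 0.5."""
    activation = vector[:k_max]
    centers = vector[k_max:].reshape(k_max, dim)
    active = centers[activation > 0.5]
    if len(active) < 2:                        # keep at least two clusters
        active = centers[np.argsort(activation)[-2:]]
    # Assign every data point to its nearest active center.
    labels = np.argmin(
        np.linalg.norm(data[:, None, :] - active[None, :, :], axis=2), axis=1)
    return active, labels

def cost(vector, k_max, dim, data):
    """Objective minimized by the imperialist competition: the DB index of
    the clustering encoded in the candidate vector."""
    _, labels = decode_country(vector, k_max, dim, data)
    used = np.unique(labels)                   # drop centers left empty
    if len(used) < 2:                          # degenerate partition: penalize
        return np.inf
    centers = np.array([data[labels == i].mean(axis=0) for i in used])
    labels = np.searchsorted(used, labels)     # relabel to 0..len(used)-1
    return davies_bouldin(data, labels, centers)

def revolution_rate(it, max_it, start=0.3, end=0.05):
    """Illustrative dynamic schedule: a large revolution/mutation rate early
    (exploration) that shrinks as the run approaches the global optimum."""
    return start - (start - end) * it / max_it
```

An ICA loop would then evolve real-valued vectors of length k_max * (dim + 1), applying assimilation, revolution (at a rate given by a schedule such as revolution_rate), and imperialist competition, and keeping the vector with the lowest cost; the CS index could be substituted for davies_bouldin without changing the rest of the sketch.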

Full-Text [PDF 4416 kb]   (4419 Downloads)    
Type of Study: Research | Subject: Paper
Received: 2015/11/8 | Accepted: 2017/03/5 | Published: 2017/10/21 | ePublished: 2017/10/21


Rights and permissions
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
